US20180025649A1 - Unmanned aerial vehicle privacy controls - Google Patents
Unmanned aerial vehicle privacy controls
Info
- Publication number
- US20180025649A1 (application US15/088,005; US201615088005A)
- Authority
- US
- United States
- Prior art keywords
- uav
- geofence
- privacy
- flight
- boundary
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G08G5/55—Navigation or guidance aids for a single aircraft (traffic control systems for aircraft)
- G05D1/0033—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, associated with a remote control arrangement, by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
- G08G5/006
- B64C39/024—Aircraft not otherwise provided for, characterised by special use, of the remote controlled vehicle type, i.e. RPV
- G05D1/0061—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, with safety arrangements for transition from automatic pilot to manual pilot and vice versa
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
- G05D1/0274—Control of position or course in two dimensions, specially adapted to land vehicles, using internal positioning means, using mapping information stored in a memory device
- G06T5/70—Denoising; smoothing (image enhancement or restoration)
- G06V20/64—Three-dimensional objects (scenes; scene-specific elements; type of objects)
- G08G5/0069
- G08G5/57—Navigation or guidance aids for unmanned aircraft
- G08G5/59—Navigation or guidance aids in accordance with predefined flight zones, e.g. to avoid prohibited zones
- H04B7/18506—Communications with or from aircraft, i.e. aeronautical mobile service (space-based or airborne stations)
- B64D47/08—Arrangements of cameras (aircraft equipment not otherwise provided for)
- B64U10/13—Flying platforms (type of UAV: rotorcraft)
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals (arrangement of on-board electronics)
- B64U2101/30—UAVs specially adapted for imaging, photography or videography
- B64U2101/32—UAVs specially adapted for imaging, photography or videography, for cartography or topography
- B64U2201/10—UAVs characterised by their flight controls: autonomous, i.e. navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
- B64U2201/20—UAVs characterised by their flight controls: remote controls
- G01S19/13—Receivers (satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS, GLONASS or GALILEO)
- G06T2207/10016—Video; image sequence (image acquisition modality)
- G06T2207/10032—Satellite or aerial image; remote sensing (image acquisition modality)
- G06T2207/20012—Locally adaptive (adaptive image processing)
- G06T2207/30232—Surveillance (subject or context of image processing)
- G06T2207/30244—Camera pose (subject or context of image processing)
- G06V2201/10—Recognition assisted with metadata
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
Definitions
- UAV: unmanned aerial vehicle
- flight and post-flight operations of UAVs may be configured to address privacy concerns with regard to data collection and flight path operation.
- a UAV computer system may be configured to limit or disable the use of certain sensors or payload devices connected to the UAV in or about a geofence.
- the UAV can be configured to limit sensor data captured of another property that lies beyond the geofence but within the sensor's field of view (FOV).
- the UAV computer system may be configured so that operation of the sensors or payload devices is temporarily disabled, or their functionality restricted, when the UAV is in proximity to a geofence boundary.
- Subject matter described in this specification can be embodied in a system, method or computer program product including the action of navigating a UAV within or in proximity to a geofence.
- the UAV may obtain flight information from, for example, a global navigation satellite system (GNSS) receiver and determine a geospatial position of the UAV.
- a UAV computer system can then determine the UAV geospatial position relative to the boundary of the geofence (e.g., a closest boundary).
- the UAV computer system may perform one or more privacy operations if the UAV is within a particular distance to the boundary of the geofence.
- Some privacy operations include but are not limited to positioning a sensor (e.g., pointing a camera lens) away from the direction of the geofence boundary or disabling operation of a payload device.
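A minimal sketch of this boundary-distance check, assuming UAV positions and the geofence are expressed in a local planar metric frame (e.g., UTM, so distances are in meters) and using the shapely library; the threshold value and the payload-controller calls are illustrative placeholders, not taken from the patent:

```python
# Sketch: trigger privacy operations when the UAV nears a geofence boundary.
# Assumes coordinates are already projected into a planar metric frame (e.g., UTM).
from shapely.geometry import Point, Polygon

PRIVACY_STANDOFF_M = 15.0  # illustrative threshold, not a value from the patent

def distance_to_boundary(uav_xy, geofence: Polygon) -> float:
    """Distance from the UAV to the closest point on the geofence boundary."""
    return geofence.exterior.distance(Point(uav_xy))

def maybe_perform_privacy_operations(uav_xy, geofence: Polygon, payload) -> bool:
    """Disable or redirect the payload when the UAV is within the standoff distance."""
    if distance_to_boundary(uav_xy, geofence) <= PRIVACY_STANDOFF_M:
        payload.disable_capture()  # hypothetical payload-controller call
        return True
    return False
```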
- the positioning may be done by orienting the body of the UAV or a gimbaled camera so that the lens of the camera points in a direction away from the geofence boundary. If the UAV is flying near a geofence covering a private property, a gimbaled camera can be positioned such that the camera cannot capture images of the private property.
- the camera can be disabled, its functionality restricted, or its parameters adjusted (e.g., aperture, focal length) to obfuscate an object in the image (e.g., in the background of the image).
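One way to realize the "point the lens away from the boundary" behavior is to compute the bearing from the UAV to the nearest point on the geofence boundary and command the opposite heading to the gimbal or body-yaw controller. The sketch below assumes planar metric coordinates; the gimbal API named in the final comment is hypothetical:

```python
import math
from shapely.geometry import Point, Polygon
from shapely.ops import nearest_points

def away_from_boundary_heading(uav_xy, geofence: Polygon) -> float:
    """Heading in degrees (0 = +y "north", increasing clockwise) that points
    away from the closest point on the geofence boundary."""
    uav = Point(uav_xy)
    boundary_pt, _ = nearest_points(geofence.exterior, uav)
    dx, dy = boundary_pt.x - uav.x, boundary_pt.y - uav.y
    bearing_to_boundary = math.degrees(math.atan2(dx, dy)) % 360.0
    return (bearing_to_boundary + 180.0) % 360.0

# e.g., gimbal.set_yaw(away_from_boundary_heading((x, y), flight_geofence))  # hypothetical API
```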
- images obtained from an aerial inspection of a particular property may be modified to obfuscate portions of images that may contain imagery of other properties besides the property being inspected.
- the UAV computer system can be configured to allow for real-time graphics processing to obfuscate or delete portions of still images or videos taken by the UAV.
- post-flight processing may be performed on the still images or videos.
- a flight planning system (FPS) or a ground control system (GCS) may use UAV flight log data to correlate a still image or video frame with a particular location and an orientation of the camera with respect to a geofence boundary.
- the UAV, GCS, or FPS can utilize one or more of GNSS data, inertial measurement unit (IMU) data, ground altitude data and imagery data (e.g., satellite imagery) to determine portions of the imagery that are to be obfuscated (e.g., blurred).
- the FPS or GCS system can identify those images, video frames, or portions of images or video frames that may include imagery captured outside of the geofence boundary and obfuscate the images, video frames or portions of the images or video frames.
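The post-flight step can be sketched as follows: for each frame, the flight log supplies the ground coordinates of the frame's footprint corners; a homography maps the geofence polygon into image space, and pixels falling outside it are blurred. This is a simplified planar approximation using OpenCV and NumPy; the per-frame footprint input and the blur kernel size are assumptions, not the patent's method:

```python
import cv2
import numpy as np

def obfuscate_outside_geofence(image, footprint_geo, geofence_geo, ksize=51):
    """Blur pixels whose ground projection lies outside the geofence.

    image         : HxWx3 uint8 frame
    footprint_geo : 4x2 array of (x, y) ground coordinates of the image corners,
                    ordered top-left, top-right, bottom-right, bottom-left
    geofence_geo  : Nx2 array of geofence vertices in the same ground frame
    """
    h, w = image.shape[:2]
    img_corners = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    geo_to_px = cv2.getPerspectiveTransform(np.float32(footprint_geo), img_corners)

    # Project geofence vertices into pixel coordinates.
    fence = np.float32(geofence_geo).reshape(-1, 1, 2)
    fence_px = cv2.perspectiveTransform(fence, geo_to_px).reshape(-1, 2)

    # Keep pixels inside the geofence sharp; blur everything else.
    inside = np.zeros((h, w), dtype=np.uint8)
    cv2.fillPoly(inside, [np.int32(fence_px)], 255)
    blurred = cv2.GaussianBlur(image, (ksize, ksize), 0)
    return np.where(inside[..., None] == 255, image, blurred)
```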
- FIG. 1 is a block diagram of an example flight control system architecture for a UAV.
- FIG. 2 is a block diagram of an example flight planning system.
- FIG. 3 illustrates an example user interface for determining a flight boundary geofence.
- FIGS. 4A-4D illustrate examples of a UAV privacy operation where the UAV is navigating around and through two different privacy geofence types.
- FIG. 5 is a flowchart of an example process of the UAV performing a privacy operation.
- FIG. 6 illustrates an example of a UAV navigating and collecting images beyond a flight boundary geofence.
- FIG. 7 is a flowchart of an example process of the UAV performing an image obfuscation process.
- FIGS. 8A and 8B illustrate the process described in FIG. 7 .
- FIG. 1 is a block diagram of an example Unmanned Aerial Vehicle (UAV) architecture for implementing the features and processes described herein.
- UAV can include a primary computer system 100 and a secondary computer system 102 .
- the UAV primary computer system 100 can be a system of one or more computers, or software executing on a system of one or more computers, which is in communication with, or maintains, one or more databases.
- the UAV primary computer system 100 can include a processing subsystem 130 including one or more processors 135 , graphics processor units (GPUs) 136 , an input/output (I/O) subsystem 134 and an inertial measurement unit (IMU) 132 .
- the primary computer system 100 can include logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated input/output data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers.
- the primary computer system 100 can include memory 118 .
- Memory 118 may include non-volatile memory, such as one or more magnetic disk storage devices, solid-state hard drives, or flash memory. Other volatile memory such as RAM, DRAM, or SRAM may be used for temporary storage of data while the UAV is operational.
- Databases may store information describing UAV flight operations, flight plans, contingency events, geofence information, component information, and other information.
- the primary computer system 100 may be coupled to one or more sensors, such as GNSS receivers 150 (e.g., GPS receivers), temperature sensor 154 (e.g., a thermometer), gyroscopes 156 , accelerometers 158 , pressure sensors (static or differential) 152 , current sensors, voltage sensors, magnetometers, hydrometers, and motor sensors.
- the UAV may use the IMU 132 for inertial navigation.
- Sensors can be coupled to the primary computer system 100 or to controller boards coupled to the primary computer system 100 .
- One or more communication buses such as, for example, a controller area network (CAN) bus, or signal lines, may communicatively couple the various sensor and components.
- the primary computer system 100 may use various sensors to determine the UAV's current geospatial position, attitude, altitude, velocity, direction, pitch, roll, yaw and/or airspeed.
- the primary computer system 100 may also use various sensors to pilot the UAV along a specified flight path and/or to a specified location and/or to control the UAV's attitude, velocity, altitude, and/or airspeed (optionally even when not navigating the UAV along a specific flight path or to a specific location).
- the flight control module 122 handles flight control operations of the UAV.
- the module interacts with one or more controllers 140 that control operation of motors 142 and/or actuators 144 .
- the motors may be used for rotation of propellers
- the actuators may be used for flight surface control such as ailerons, rudders, flaps, landing gear, and parachute deployment.
- the contingency module 124 monitors and handles contingency events. For example, the contingency module may detect that the UAV has crossed a boundary of a geofence, and then instruct the flight control module 122 to return to a predetermined landing location. In some implementations, the contingency module 124 may detect that the UAV is flying out of a visual line-of-sight (VLOS) from a ground operator and instruct the flight control module 122 to perform a contingency action, e.g., to land at a landing location.
- Other contingency criteria may be the detection of a low battery or fuel state, malfunctioning of an onboard sensor, motor or actuator, or a deviation from the flight plan. The foregoing is not meant to be limiting, as other contingency events may be detected. In some instances, if equipped on the UAV, a parachute may be deployed if the motors or actuators fail.
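A compact illustration of how such a contingency monitor might be structured; the thresholds and action names are placeholders rather than values from the patent, and the VLOS test is reduced to a simple range check:

```python
from shapely.geometry import Point, Polygon

LOW_BATTERY_PCT = 20.0     # illustrative threshold
MAX_VLOS_RANGE_M = 400.0   # illustrative stand-in for a true line-of-sight test

def check_contingencies(state, geofence: Polygon, operator_xy):
    """Return a list of contingency actions suggested by the current UAV state."""
    actions = []
    uav = Point(state["x"], state["y"])
    if not geofence.contains(uav):
        actions.append("return_to_landing_location")
    if uav.distance(Point(operator_xy)) > MAX_VLOS_RANGE_M:
        actions.append("land_at_contingency_location")
    if state["battery_pct"] < LOW_BATTERY_PCT or state.get("sensor_fault"):
        actions.append("execute_contingency_plan")
    return actions
```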
- the mission module 129 processes the flight plan, waypoints, and other information associated with the flight plan, as provided to the UAV in a flight package.
- the mission module 129 works in conjunction with the flight control module 122 .
- the mission module 129 may send information concerning the flight plan to the flight control module 122 , for example waypoints (e.g., latitude, longitude, altitude) and flight velocity, so that the flight control module 122 can autopilot the UAV.
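The hand-off can be pictured as the mission module iterating over the flight plan's waypoints and passing each one, with its speed and any associated action, to the flight control module; the class and method names below are illustrative, not the patent's interfaces:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    latitude: float
    longitude: float
    altitude_m: float
    speed_mps: float = 5.0
    action: str = ""          # e.g., "trigger_camera"

def run_mission(waypoints: List[Waypoint], flight_control) -> None:
    """Feed each waypoint of the flight plan to the flight control module."""
    for wp in waypoints:
        flight_control.goto(wp.latitude, wp.longitude, wp.altitude_m, wp.speed_mps)
        if wp.action:
            flight_control.execute_action(wp.action)  # hypothetical interface
```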
- the UAV may have various sensors and other devices connected to the UAV for performing a variety of tasks, such as data collection.
- the UAV may carry a camera 149 , which can be, for example, a still image camera, a video camera, an infrared camera, or a multispectral camera.
- the UAV may carry a Lidar, radio transceiver, sonar, and traffic collision avoidance system (TCAS). Data collected by the devices may be stored on the sensor or device collecting the data, or the data may be stored on memory 118 of the primary computer system 100 .
- the primary computer system 100 may be coupled to various radios, e.g., transceivers 159 for manual control of the UAV, and for wireless or wired data transmission to and from the UAV primary computer system 100 , and optionally a UAV secondary computer system 102 .
- the UAV may use one or more communications subsystems, such as a wireless communication or wired subsystem, to facilitate communication to and from the UAV.
- Wireless communication subsystems may include radio transceivers, infrared, optical, ultrasonic and electromagnetic devices.
- Wired communication systems may include ports such as Ethernet, USB, or serial ports, or other types of ports to establish a wired connection between the UAV and other devices, such as a GCS, an FPS, or other devices, for example a mobile phone, tablet, personal computer, display monitor, or other network-enabled devices.
- the UAV may use a light-weight wire tethered to a GCS for communication.
- the wire may be affixed to the UAV, for example via a magnetic coupler.
- Flight data logs may be generated by reading various information from the UAV sensors and operating system 120 and storing the information in computer-readable media (e.g., memory 118 ).
- the data logs may include a combination of various data, such as time, altitude, heading, ambient temperature, processor temperatures, atmospheric pressure, battery level, fuel level, absolute or relative position, position coordinates (e.g., GPS coordinates), pitch, roll, yaw, ground speed, humidity level, velocity, acceleration and contingency information.
- the flight data logs may be stored on a removable medium. The medium can be installed on the GCS or onboard the UAV.
- the data logs, and individual data from the sensor or operating system may be wirelessly transmitted to the GCS or to the FPS.
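Writing such a flight data log can be as simple as appending timestamped records of the sensor readings to a file; the field set below mirrors the list above, while the JSON-lines format is only an assumption:

```python
import json
import time

def append_flight_log(path, state):
    """Append one timestamped flight-log record as a JSON line."""
    record = {
        "time": time.time(),
        "lat": state["lat"], "lon": state["lon"], "alt_m": state["alt_m"],
        "heading_deg": state["heading_deg"],
        "pitch": state["pitch"], "roll": state["roll"], "yaw": state["yaw"],
        "ground_speed_mps": state["ground_speed_mps"],
        "battery_pct": state["battery_pct"],
        "baro_pressure_hpa": state.get("baro_pressure_hpa"),
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
```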
- Modules, programs or instructions for performing flight operations, contingency maneuvers, and other functions may be performed with operating system 120 .
- the operating system 120 can be a real-time operating system (RTOS), UNIX, LINUX, OS X, WINDOWS, ANDROID or other operating system.
- other software modules and applications may run on the operating system 120 , such as the flight control module 122 , contingency module 124 , application module 126 , database module 128 and mission module 129 .
- flight critical functions will be performed using the primary computer system 100 .
- Operating system 120 may include instructions for handling basic system services and for performing hardware dependent tasks.
- the secondary computer system 102 may be used to run another operating system 172 to perform other functions.
- the UAV secondary computer system 102 can be a system of one or more computers, or software executing on a system of one or more computers, which is in communication with, or maintains, one or more databases.
- the UAV secondary computer system 102 can include a processing subsystem 190 of one or more processors 194 , GPUs 192 and I/O subsystem 193 .
- the UAV secondary computer system 102 can include logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated I/O data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers.
- the UAV secondary computer system 102 can include memory 170 .
- Memory 170 may include non-volatile memory, such as one or more magnetic disk storage devices, solid-state hard drives, or flash memory. Other volatile memory such as RAM, DRAM, or SRAM may be used for storage of data while the UAV is operational.
- the UAV secondary computer system 102 can include operating system 172 .
- the operating system 172 can be based on a real-time operating system (RTOS), UNIX, LINUX, OS X, WINDOWS, ANDROID or other operating system.
- other software modules and applications may run on the operating system 172 , such as an application module 174 , database module 176 , mission module 178 and contingency module 180 .
- Operating system 172 may include instructions for handling basic system services and for performing hardware dependent tasks.
- the UAV can include controllers 146 .
- Controllers 146 may be used to interact with and operate a payload device 148 , and other devices such as camera 149 .
- Camera 149 can include a still-image camera, video camera, infrared camera, multispectral camera, or stereo camera pair.
- controllers 146 may interact with a Lidar, radio transceiver, sonar, laser ranger, altimeter, TCAS and Automatic dependent surveillance-broadcast (ADS-B) transponder.
- the secondary computer system 102 may include controllers to control payload devices.
- FIG. 2 is a block diagram illustrating an example FPS 200 .
- the various illustrated components may communicate over wired and/or wireless communication channels (e.g., networks, peripheral buses, etc.).
- FPS 200 can be a system of one or more computer processors, or software executing on a system of one or more computers.
- the FPS 200 can maintain and communicate with one or more databases (e.g., databases 202 - 209 ), storing information describing prior implemented flight plans and information associated with each flight plan (e.g., information describing a UAV, an operator, property/map, mission, database, etc.).
- the databases can include operator database 202 , operational database 204 , UAV configuration database 206 , UAV mission information database 208 and property and map database 209 .
- the FPS 200 can be a system of one or more processors, graphics processors, logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated input/output data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers.
- the FPS 200 can be a component of, or be coupled to, one or more devices including one or more processors and configured to send data to and receive data from one or more UAVs 234 A, 234 B and 234 C.
- a GCS 213 can be a specialized user device 212 configured to control one or more aspects of a flight of UAVs 234 A, 234 B and 234 C.
- the FPS 200 may store, and maintain, flight operation information associated with a UAV. Flight operation information may include configuration information of each UAV, flight mission and planned flight path, operator information, the UAV's precise three-dimensional (3D) location in space, velocity information, UAV status (e.g., health of components included in the UAV), contingency plans, and so on.
- the FPS 200 can receive flight path data (e.g., from an operator's device or an interactive user interface), and determine information describing a flight plan.
- the FPS 200 can provide a flight package 244 associated with the flight plan to a UAV (e.g., UAV 234 A, 234 B, 234 C) to implement.
- the flight package 244 may be provided to the GCS 213 and transmitted to the UAV, or the FPS 200 may transmit the flight package 244 directly to the UAV.
- the FPS 200 can store flight plan information, flight data log information, and job information in the various databases.
- the example FPS 200 includes a flight description module 210 that can generate one or more interactive user interfaces (e.g., HTML or XML content for web pages) for rendering on a user device (e.g., user device 212 ).
- the interactive user interfaces may optionally be transmitted for display to the user device via a wireless network or other communication channel.
- User device 212 can receive, from an operator, information describing a flight plan to be performed by the UAV (e.g., performed by UAV 234 A, 234 B or 234 C).
- a user interface may be configured to receive, from an operator, location information associated with the flight plan (e.g., an address of a home or property, geospatial position coordinates of a structure to be inspected, etc.).
- the flight description module 210 can obtain information describing the location.
- the information can include property boundaries associated with an address (e.g., boundaries of a home, obtained from a database, or system that stores or configured to access property boundary information), obstacles associated with the location (e.g., nearby trees, electrical towers, telephone poles) and/or other information.
- the flight description module 210 can obtain imagery, such as georectified imagery (e.g., satellite imagery), associated with the entered location information.
- the flight description module 210 can include some or all of the information describing the location (e.g., the obtained imagery or boundary information) in an interactive user interface to be presented on the user device 212 to an operator.
- the operator of the user device 212 may interact with user interfaces to describe a flight boundary geofence (as described further below) for a UAV to enforce.
- the user device 212 can receive imagery associated with operator-entered location information, and present one or more geofence shapes layered on the imagery.
- the user interface provides functionality for the operator to select a presented shape (e.g., a polygon), and further functionality enabling the user to drag and/or drop the shape to surround an area of interest in the received imagery to limit allowable locations of a UAV to locations within the shape.
- the user interface may allow the user device 212 to receive input (e.g., of a finger or stylus) tracing a particular shape onto a touch-screen display of the user device 212 .
- the flight description module 210 can store information describing the trace as a flight boundary geofence. Accordingly, the user device 212 can provide information describing the traced shape to the flight description module 210 (e.g., coordinates associated with the imagery). The flight description module 210 can correlate the traced shape to location information in a real-world coordinate system (e.g., geospatial coordinates in a geodetic (lat/lon) coordinate frame that correspond to the traced shape).
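Because the displayed imagery is georectified, the traced pixel shape can be correlated to geodetic coordinates by interpolating between the image's known corner coordinates. A minimal sketch, assuming a north-up image whose lat/lon bounds are known:

```python
def pixels_to_geodetic(trace_px, image_size, bounds):
    """Convert traced pixel vertices to (lat, lon) pairs.

    trace_px   : list of (col, row) pixel vertices of the traced geofence
    image_size : (width, height) of the georectified image in pixels
    bounds     : (lat_north, lat_south, lon_west, lon_east) of the image
    """
    width, height = image_size
    lat_n, lat_s, lon_w, lon_e = bounds
    geofence = []
    for col, row in trace_px:
        lon = lon_w + (col / (width - 1)) * (lon_e - lon_w)
        lat = lat_n - (row / (height - 1)) * (lat_n - lat_s)
        geofence.append((lat, lon))
    return geofence
```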
- a user interface can enable the operator to describe safe locations for a UAV to begin the flight plan (e.g., a launching location where the UAV takes off from the ground) and end the flight plan (e.g., a landing location where the UAV lands).
- the flight description module 210 can analyze the obtained imagery associated with the entered location information and identify a geometric center of a convex area (e.g., the largest convex area) within the geofence boundary that does not include obstructions (e.g., trees). For example, the flight description module 210 can determine an open area, such as an open pasture.
- the flight description module 210 can obtain topographical information associated with the entered location information and can detect substantially flat areas (e.g., areas with less than a threshold of variance in height). For instance, the flight description module 210 can determine that an open space (e.g., an open clearing that is substantially flat) is a safe launching location for the UAV to take-off from, and can provide information recommending the open space in an interactive user interface presented on the user device 212 . Additionally, the flight description module 210 can analyze the obtained imagery and locate physical features that are generally known to be safe locations for take-off and landing. For example, the flight description module 210 can determine that a driveway of a home associated with the flight plan is safe and can select the driveway as a safe take off and landing location, or can recommend the driveway as a safe launching and landing location.
- the flight description module 210 can receive (e.g., from a user interface) survey or flight mission information, for instance information indicating a particular type of survey for a UAV to perform (e.g., damage inspection, inspection of a vertical structure, inspection of a rooftop).
- the flight description module 210 can receive waypoints for the UAV to travel to, including an order in which the waypoints are to be traveled to, a ranking or importance of each, or a group of, waypoints, and specific actions for the UAV to take while traveling to, or after reaching, each waypoint.
- a user interface can optionally enable the operator of the user device 212 to specify that upon reaching a particular waypoint, the UAV is to activate a particular sensor, or other payload devices, such as an infrared camera, a sensor measuring radiation, and so on. Additionally, a user interface can optionally enable the operator to specify transition speeds the UAV is to use when travelling between waypoints, or between particular waypoints.
- operations to be performed at a particular location, or waypoint may be identified by an operator using the FPS 200 or GCS 213 via a user interface.
- the user interface can allow an operator to plan a survey to obtain sensor information (e.g., photograph or videotape) of a specified location, or of a structure.
- Operations of the UAV may be automatically configured by either the FPS 200 or GCS 213 depending on the type of inspection to be performed.
- the flight description module 210 can receive information describing, or relevant to, configuration information of a UAV, such as a type of UAV (e.g., fixed-wing, single rotor, multi-rotor, and so on).
- the flight description module 210 can receive information describing, or relevant to, configuration information of sensors or other payload devices required for the survey or flight mission information, and general functionality to be performed.
- the flight description module 210 can then determine recommendations of particular UAVs (e.g., UAVs available to perform the flight plan) that comport with the received information.
- the flight description module 210 can determine that based on the received survey type, a UAV will require particular configuration information, and recommend the configuration information to the operator.
- the flight description module 210 can receive information identifying that hail damage is expected, or is to be looked for, and can determine that a UAV which includes particular sensors, and specific visual classifiers to identify hail damage, is needed. For example, the flight description module 210 can determine that a heat and/or thermal imaging sensor that includes specific visual classifiers that can discriminate hail damage from other types of damage (e.g., wind damage, rain damage, and so on) is needed.
- the flight description module 210 can utilize received survey or flight mission information to determine a flight pattern for a UAV to follow. For instance, the flight description module 210 can determine a path for the UAV to follow between each waypoint (e.g., ensuring that the UAV remains in the geofence boundary). Additionally, the flight description module 210 can determine, or receive information indicating, a safe minimum altitude for the UAV to enforce, the safe minimum altitude being an altitude at which the UAV is safe to travel between waypoints. The safe minimum altitude can be an altitude at which the UAV will not encounter obstacles within the geofence boundary (e.g., a height above buildings, trees, towers, poles, etc.).
- the safe minimum altitude can be based on a ground sampling distance (GSD) indicating a minimum resolution that will be required from imagery obtained by the UAV while implementing the flight plan (e.g., based in part on capabilities of an included camera, such as sensor resolution, sensor size, and so on).
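For a nadir-pointing camera, altitude and GSD are related by the usual pinhole model, GSD = (sensor_width × altitude) / (focal_length × image_width); inverting it gives the highest altitude that still satisfies the required resolution. The sketch below uses that standard formula with illustrative camera parameters:

```python
def max_altitude_for_gsd(required_gsd_m, sensor_width_m, focal_length_m, image_width_px):
    """Highest altitude (m) at which a nadir camera still achieves the required GSD."""
    return required_gsd_m * focal_length_m * image_width_px / sensor_width_m

# Example: about a 58 m ceiling for 2 cm/px with a 6.17 mm sensor, 4.5 mm lens, 4000 px images.
gsd_ceiling = max_altitude_for_gsd(0.02, 0.00617, 0.0045, 4000)
obstacle_clearance_m = 40.0   # illustrative: tallest obstacle in the geofence plus a margin
# A workable transit altitude must clear obstacles without exceeding the GSD ceiling.
safe_min_altitude = obstacle_clearance_m if obstacle_clearance_m <= gsd_ceiling else None
```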
- the flight description module 210 can receive a time that the flight plan is to be performed (e.g., a particular day, a particular time on a particular day, a range of times, and so on). The flight description module 210 can then determine an availability of UAVs and/or operators at the received time(s), for example by obtaining scheduling information. Additionally, the flight description module 210 can filter available UAVs according to determined configuration information (e.g., as described above). Optionally, the flight description module 210 can access weather information associated with the received time(s), and determine an optimal time or range of times for the job to be performed.
- a UAV that includes particular sensors can obtain better real-world information at particular times of day (e.g., imagery captured at noon on a sunny day can maximize image contrast and minimize the effects of shadows).
- the flight description module 210 can determine the flight plan accordingly.
- the FPS 200 can provide a flight package 244 that includes the determined flight plan directly to a UAV (e.g., the UAV 234 A, 234 B or 234 C).
- the FPS 200 can provide the flight package 244 to a user device 212 or GCS 213 .
- the user device 212 or GCS 213 can modify the flight plan or preserve the flight plan in the flight package 244 .
- the user device 212 or GCS 213 can transmit the flight package 244 to the UAV 234 A, 234 B, 234 C.
- the flight package 244 can include a flight manifest file (e.g., an XML file) identifying necessary application and version information to conduct the flight plan.
- the UAV can be required to execute a particular application (e.g., “app” downloaded from an electronic application store) that provides functionality necessary to conduct the flight plan.
- an application can effect a flight plan associated with inspecting vertical structures, and the UAV can be required to execute the application prior to initiation of the flight plan.
- the FPS 200 may create a flight plan for automated or partially automated flight of a UAV, taking into consideration structural data to avoid situations where the UAV may fly out of VLOS of a base location.
- the base location can include one or more locations of an operator of a UAV.
- the base location can be a geospatial position of the user device 212 or a launching location of the UAV.
- the FPS 200 may receive, via a user interface 214 , a location for an aerial survey to be conducted by an unmanned aerial vehicle.
- One or more images may be displayed depicting a view of the location.
- the interface allows for a selection of a launching location of the UAV. As the images have associated geospatial positions, the FPS 200 can determine an associated latitude/longitude for the launching location.
- the user interface 214 may receive an input or selections for one or more flight waypoints. Similar to the launching locations, the flight waypoints have an associated geospatial position.
- the FPS 200 may assign altitudes for the flight waypoints, or altitudes for the flight waypoints may be determined by a user, and specific numeric altitude values may be set.
- the FPS 200 may determine based on the launching location and altitude of the one or more flight waypoints whether a flight waypoint may cause a non-VLOS occurrence. From the launching location, a flight plan may be generated using waypoints having associated latitude and longitude coordinates, and an associated altitude. The FPS 200 may not allow a UAV waypoint where the VLOS from the base location (e.g., the launching location, or an area around the launching location) would be blocked because of a structure. The FPS 200 may use 3D polygonal data, topographical data or other structure data in generating the flight plan.
- the FPS 200 could use a 3D coordinate system to determine, based on a base location and each waypoint location, whether the UAV would likely enter into a non-VLOS situation.
- the FPS 200 can then generate a flight plan that avoids the non-VLOS situation and includes only the flight waypoints that would not cause a non-VLOS occurrence.
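One way to implement the VLOS screening is to test whether the straight 3D segment from the base location to a candidate waypoint passes through any known structure volume. The sketch below samples that segment against 2D building footprints with heights, which is a simplification of the 3D polygonal and topographical data mentioned above:

```python
from shapely.geometry import Point, Polygon

def waypoint_in_vlos(base_xyz, waypoint_xyz, structures, samples=50):
    """Return True if the base-to-waypoint sight line clears all structures.

    structures: list of (footprint: Polygon, height_m) pairs in a planar metric frame.
    """
    bx, by, bz = base_xyz
    wx, wy, wz = waypoint_xyz
    for i in range(1, samples):
        t = i / samples
        x, y, z = bx + t * (wx - bx), by + t * (wy - by), bz + t * (wz - bz)
        for footprint, height in structures:
            if z <= height and footprint.contains(Point(x, y)):
                return False   # the sight line intersects a structure volume
    return True

# The FPS could keep only waypoints for which waypoint_in_vlos(...) returns True.
```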
- the FPS 200 may determine a geofence boundary to limit flight of the UAV to a bounded area.
- the user interface 214 may display the geofence boundary over one or more location images.
- the flight planning system 200 may determine a survey area, and set the survey area within the geofence boundary.
- the FPS 200 receives, from a GCS 213 (or directly from the UAV), data (such as flight log data and collected sensor data).
- a user interface of the FPS 200 displays at least a portion of sensor data collected by the UAV, information describing a planned survey, and information associated with the flight data package.
- the GCS 213 may also be used for flight and contingency planning.
- the GCS 213 can receive flight plans from the FPS 200 for transmission to the UAV.
- the GCS 213 also allows for manual override of a UAV operating in an autopilot mode.
- a flight plan may be transmitted to the UAV either via a wireless or tethered connection.
- the GCS 213 is a mobile device, such as a laptop, mobile phone, or tablet device, with cellular or other wireless connections for data transmission over the Internet or other network.
- Each of user devices 212 can be a system of one or more computers, or software executing on a system of one or more computers, which is in communication with, or maintains, one or more databases storing information describing UAV flight operations and components.
- Each of the user devices 212 can be a system of one or more processors, graphics processors, logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated I/O data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers.
- the FPS 200 may be primarily used to create and transmit a flight package 244 to a UAV or GCS 213 .
- the UAV or GCS 213 can initiate the request for the flight package 244 from the FPS 200 .
- An operator may take the UAV or GCS 213 to a property location.
- the UAV or GCS 213 may then request the flight package 244 , or an updated flight package 244 using a current position of the UAV or GCS 213 .
- the UAV or GCS 213 can determine its geospatial position via a GNSS receiver (e.g., using GPS, GLONASS, Galileo, or Beidou system).
- the UAV or GCS 213 can then transmit its geospatial position to the FPS 200 , along with other identifying information about the requesting device, such as, for example, a unique identifier (UID) or media access control (MAC) address.
- the FPS 200 will receive the request and determine if an updated or changed flight package exists by comparing the device identifier with identifiers in a database storing new or updated flight package information. If the FPS 200 finds a new or updated flight package, it transmits that flight package to the requesting UAV or GCS 213 .
- the UAV or GCS 213 can receive the flight package. A confirmation acknowledging receipt of the flight package may then be transmitted from the UAV or GCS 213 to the FPS 200 .
- the flight planning system 200 will then update a database record to indicate that the particular flight package has been received. Moreover, the UAV or GCS 213 can supply the property location, and a new job request can be sent to the FPS 200 . The flight planning system 200 may create a new flight package for the UAV or GCS 213 .
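Server-side, the request/acknowledge exchange described above can be sketched as a lookup keyed by device identifier that returns a flight package only when a newer version exists; the data shapes and function names here are illustrative, not the patent's protocol:

```python
# Hypothetical FPS-side handling of flight-package requests.
pending_packages = {}    # device_id -> {"version": int, "package": dict}
delivered_versions = {}  # device_id -> last acknowledged version

def handle_package_request(device_id, geospatial_position):
    """Return a new or updated flight package for this device, or None.

    geospatial_position could also be used to create a new job/package
    for the property at the device's current location."""
    entry = pending_packages.get(device_id)
    if entry and entry["version"] > delivered_versions.get(device_id, -1):
        return entry["package"]
    return None

def handle_package_ack(device_id):
    """Record that the device confirmed receipt of the latest package."""
    entry = pending_packages.get(device_id)
    if entry:
        delivered_versions[device_id] = entry["version"]
```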
- a flight plan may be created and transmitted to the UAV.
- the flight plan instructs the UAV with regard to a particular flight path.
- a flight plan may be created using the flight planning system 200 , or GCS 213 .
- a flight plan instructs the UAV where it should fly in a 3D space.
- the flight plan includes a series of connected waypoints that define where the UAV should fly and what actions that the UAV should complete during a particular flight.
- the UAV may have an autopilot flight module operating on the UAV computer system that uses the flight plan to automatically fly the UAV.
- the flight plan information may be provided to the GCS 213 (and then to the UAV) or directly to the UAV, in a flight package 244 comprising the flight plan and other information (such as contingency event instructions).
- a user may select a series of geographically-based waypoints and a launching location for the UAV. Based on the waypoints, a flight plan may be constructed allowing the UAV to autonomously navigate itself. In some implementations, the flight planning system 200 , or GCS 213 may automatically define a flight plan based on various criteria, such as an inspection, or survey type.
- While the primary computer system 100 autopilot module is navigating the UAV according to a flight plan, certain aspects of the flight pattern may be controlled by the operator's user device 212 , or GCS 213 .
- the flight plan or pattern may be configured such that for a particular waypoint, a vertical ascent/descent rate, UAV altitude, horizontal UAV rotation, payload gimbal, payload direction, waypoint transition speed, or trigger of a payload sensor may be controlled by the UAV operator.
- the user device 212 may have a physical control device such as a toggle or joystick, or virtual control in a user interface that allows the operator to control vertical ascent/descent rate, UAV altitude, UAV attitude, horizontal UAV rotation, payload gimbal, payload direction.
- the user device 212 , or GCS 213 can trigger a payload sensor while conducting the inspection. For example, the UAV may navigate via autopilot to a position over an inspection location. An operator then can provide input to the user device 212 , or GCS 213 .
- the user device, or GCS may transmit a signal or information corresponding to the user input to the UAV via radio communication.
- the signal or information can control the vertical ascent/descent rate, UAV altitude, UAV attitude, horizontal UAV rotation, payload gimbal, or payload direction, or waypoint transition speed.
- the signal or information can trigger a payload sensor to turn on or turn off. This particular mode allows for partial autopilot control and partial or complete manual control of the UAV. Even though the operator may manually control certain aspects of the flight plan, the UAV can be required to remain within a geofence boundary envelope, if one has been set, and to remain within VLOS of the operator operating user device 212 .
- the UAV may be partially manually controlled by an operator using the user device 212 while the UAV is in autopilot mode.
- the UAV may receive a command from the user device 212 to nudge the UAV in a particular direction.
- the control input of the user device 212 causes the user device 212 to send a command to the UAV, instructing the UAV to move slightly, for example between 0.1 to 3 meters, in a particular direction (in an x, y, or z axis, or diagonally).
- the particular distance can be predetermined, or be variable based on the proximity to a structure. Nudging the UAV allows the operator to move the UAV away from the structure if the operator sees that the UAV is flying too close to the structure.
- the nudge command may be provided any time to the UAV while it is operating in an autopilot mode.
- the UAV should still enforce geofence boundaries (if any have been set) and not allow a nudge to cause the UAV to move beyond a geofence boundary.
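The nudge behavior can be expressed as adding a small, bounded offset to the current position and rejecting the move if it would leave the geofence. The 3 m cap mirrors the upper end of the range mentioned above; everything else is illustrative:

```python
from shapely.geometry import Point, Polygon

MAX_NUDGE_M = 3.0   # upper end of the 0.1-3 m range described above

def apply_nudge(position_xy, delta_xy, geofence: Polygon):
    """Return the nudged position, or the original position if the nudge would
    exceed the allowed magnitude or cross the geofence boundary."""
    dx, dy = delta_xy
    if (dx * dx + dy * dy) ** 0.5 > MAX_NUDGE_M:
        return position_xy
    candidate = (position_xy[0] + dx, position_xy[1] + dy)
    if not geofence.contains(Point(candidate)):
        return position_xy   # hold position rather than crossing the boundary
    return candidate
```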
- the flight planning system 200 can include a report generation module 230 and a permission control module 240 .
- the report generation module 230 is configured to generate one or more flight reports.
- the flight reports can include flight data (e.g., path, duration and actions of control surfaces), sensor data (e.g., air pressure, temperature and humidity), and payload data (e.g., information gathered by a payload camera).
- the permission control module 240 is configured to impose one or more limits on flights of the UAV. The limits can include, for example, that the UAV shall stay inside or outside an envelope defined by geofences or by geographic coordinates or that the UAV shall stay within VLOS of a base location (e.g., a location of user device 212 ).
- FIG. 3 illustrates an example user interface 300 for determining a geofence boundary.
- the user interface 300 is an example of an interactive user interface, generated by a system (e.g., the flight planning system 200 , or a presentation system in communication with the flight planning system 200 ) that is configured to receive user inputs, access one or more databases, and update the user interface 300 in response to received user inputs.
- the user interface 300 can include a document (e.g., an interactive document such as a web page), presented on a user device (e.g., a desktop, laptop, or tablet computer, a smart-phone, or a wearable device, etc.).
- the user interface 300 includes image 302 (e.g., satellite image as depicted) of a location entered by the user of the user interface 300 .
- the image 302 included in the user interface 300 can be interactive.
- a user can zoom in and out of the image 302 to target a greater or smaller real-world area.
- the user can interact with a zoom control, or the user can utilize a touch surface (e.g., a touch screen) to zoom in and out (e.g., the user can pinch to zoom).
- the user interface 300 enables the user to select areas on the image 302 that are defined by a user-specified shape.
- the user interface 300 can receive a user selection of particular vertices that define the illustrated polygon (e.g., vertices 304 A-E).
- the system can shade, or otherwise highlight, the internal portion of the user-specified shape.
- the user interface 300 enables the user to select a particular vertex of the illustrated polygon (e.g., vertex 304 A), and drag the shape into existence by moving a finger or stylus on a touch sensitive screen of the user device.
- the user interface 300 can receive input for generating a flight path 306 for the UAV to include a launching and landing location 310 .
- the user interface 300 may include a menu 308 for creating different representative layers of a flight plan.
- menu 308 shows a flight plan specifying a geofence, a photo survey area, a launch/land area, and a base map.
- the menu 308 includes a geofence menu item that refers to the geofence as represented by the connected vertices 304 A- 304 E.
- the menu 308 includes a photo survey area menu item representing the flight path 306 .
- the menu 308 includes a launch/land area menu item representing the launch/land locations 310 .
- the menu 308 includes a base map layer menu item that represents the base image layer, which includes image 302 .
- the image 302 includes a highlighted area that defines a geofence boundary to be enforced by a UAV when implementing a flight plan.
- a geofence can include a two-dimensional (2D) or 3D location-based boundary.
- a geofence can be understood as a virtual boundary for a geographic location or a virtual surface around a geographic location in 3D space.
- the geofence boundary can be represented on a map as one or more polygonal shapes or rounded shapes, for example a circle, rectangle, sphere, cylinder, cube, or other shapes or bodies.
- a geofence may also be a time-based (four-dimensional) virtual boundary where the geofence exists for a particular duration, for example, a number of hours or days, or for a specific time period, for example, from 2:00 PM to 4:00 PM occurring on certain days, or other periods of time.
- a 3D geofence may exist in a particular space above ground.
- a geofence may be represented by latitudinal and longitudinal connected points, or other coordinate systems.
- a geofence may be created such that the geofence has dynamic aspects where the geofence may increase or decrease in size based on various conditions. For UAV flight operations, geofence structures are received by the UAV and stored in non-volatile memory.
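A geofence of this kind can be represented as a footprint polygon plus optional altitude limits and an active time window, with a containment test that checks whichever dimensions are present. The structure below is a simplified sketch, not the on-board format described in the patent:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple
from shapely.geometry import Point, Polygon

@dataclass
class Geofence:
    footprint: Polygon                                         # 2D boundary
    altitude_range_m: Optional[Tuple[float, float]] = None     # 3D: (floor, ceiling)
    active_window: Optional[Tuple[datetime, datetime]] = None  # time-based boundary

    def contains(self, x, y, altitude_m=None, when: Optional[datetime] = None) -> bool:
        if self.active_window and when is not None:
            start, end = self.active_window
            if not (start <= when <= end):
                return False   # the geofence is not in effect at this time
        if self.altitude_range_m and altitude_m is not None:
            floor, ceiling = self.altitude_range_m
            if not (floor <= altitude_m <= ceiling):
                return False
        return self.footprint.contains(Point(x, y))
```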
- a 3D geofence may be created.
- Data representing the flight boundary geofence can be transmitted to the UAV operating system.
- the exemplary flight planning system or GCS may be used to create the geofence and transmit the geofence data structure to the UAV.
- the UAV can be limited to flight within a flight boundary geofence. If for example, an operator of the UAV in a manually controlled mode attempts to maneuver the UAV outside of the flight boundary geofence, the UAV may detect a contingency condition (e.g., the UAV is about to fly outside of the geofence) and then automatically direct the UAV to return to a specified predetermined landing location. Furthermore, if the UAV is capable of hovering, such as a multi-rotor UAV, the UAV may be inhibited from moving across a boundary of a flight boundary geofence and the UAV could be set to hover and not continue past the boundary of the geofence.
- the system can utilize property information, such as property boundaries, and automatically include a highlighted portion of the image 302 as being a possible flight boundary geofence. For instance, as illustrated in FIG. 3 , portions of the flight boundary geofence defined by connected vertices 304 A, 304 B, 304 D and 304 E abut roads included in the geographic area depicted in the image 302 .
- the primary computer system 100 can determine that the entered location information describes a particular property (e.g., an open clearing that borders the road), and can highlight the particular property.
- the primary computer system 100 can include a buffer from the property boundaries of the location to ensure that even when facing forces of nature (e.g., in a strong gust of wind) the UAV will remain within the property boundaries.
- Property boundary information from a database can be used to create the flight boundary geofence to limit flight of the UAV within the property's boundary.
- the UAV can then be constrained for flight operations only within this geofence.
- the property information used to create the flight boundary geofence can be of various data types, for example, parcel polygons, vector, rasterized, shape files, or other data types.
- the FPS 200 may create the flight boundary geofence based on the property shape data.
- the various data types ideally include geolocation and/or coordinate information, such as latitudinal/longitudinal points, for use in orienting and creating the flight boundary geofence.
- the geofence envelope may be identical in shape to the property boundary.
- the boundary of the geofence may be reduced in size.
- the flight boundary geofence may be reduced in size by a set distance, for example 5 meters, towards a centroid of the property. Reduction of the flight boundary geofence creates a buffer zone. The buffer zone may help avoid an unintentional flyover of an adjacent property boundary.
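- A minimal sketch of the buffer-zone reduction just described, assuming polygon vertices expressed in a local planar (meter) frame and a simple vertex-toward-centroid shrink; a production system would more likely use a proper inward polygon offset from a geometry library:

```python
import math
from typing import List, Tuple

def shrink_toward_centroid(vertices_m: List[Tuple[float, float]],
                           buffer_m: float = 5.0) -> List[Tuple[float, float]]:
    """Move every polygon vertex a fixed distance toward the polygon's vertex centroid.

    vertices_m: geofence vertices in a local (x, y) meter frame.
    buffer_m:   distance each vertex is pulled inward, e.g., 5 meters.
    """
    cx = sum(x for x, _ in vertices_m) / len(vertices_m)
    cy = sum(y for _, y in vertices_m) / len(vertices_m)
    shrunk = []
    for x, y in vertices_m:
        dx, dy = cx - x, cy - y
        dist = math.hypot(dx, dy)
        if dist <= buffer_m:
            # Degenerate case: vertex is closer to the centroid than the buffer distance.
            shrunk.append((cx, cy))
        else:
            scale = buffer_m / dist
            shrunk.append((x + dx * scale, y + dy * scale))
    return shrunk
```

For non-convex parcels this centroid-based shrink is only an approximation of a true inward offset, but it illustrates how the buffer zone keeps the flight boundary inside the property boundary.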
- the FPS 200 may display an area with parcel polygonal data. An interface of the FPS 200 may then receive a selection of one or more parcels. The FPS 200 can then use the selections to create one or more jobs and multiple geofence envelopes. For multiple parcels, the operator would travel to each parcel and conduct a separate job.
- the user interface 300 can be utilized by a UAV operator to indicate waypoints to be traveled to during the flight plan. For instance, the user can select portions of the image 302 to designate as waypoints, and the user interface 300 can be updated to present selectable options associated with each waypoint. As an example, the user can designate an order that each waypoint is to be traveled to, actions the UAV is to take at the waypoint, a transition speed between each or all waypoints, and so on.
- the system can determine the flight boundary geofence from the waypoints, such that the geofence perimeter encompasses the waypoints. The determined flight boundary geofence can be presented to the user for review, and the user can modify the boundary by interacting with the user interface 300 .
- the user interface 300 can include text provided by the user that describes the flight plan.
- a different user can access the user interface 300 , and quickly view the determined flight boundary geofence along with text describing the flight plan.
- a user can quickly describe flight plan information sufficient for a UAV to implement, and other users can quickly view graphical representations of the flight plan (e.g., graphical representation of the flight boundary geofence along with textual data describing the flight plan).
- a privacy geofence may be used independent of, or in conjunction with a flight boundary geofence.
- a privacy geofence can be created and transmitted to the UAV such that the UAV avoids flying over the privacy geofence, or avoids flying through the privacy geofence. Also, particular operations of the UAV while flying over, or flying through a privacy geofence may be performed or inhibited. Also, certain sensors or other devices connected to the UAV may be wholly or partially disabled, wholly or partially enabled, or otherwise prevented from being used while flying over or through the privacy geofence.
- the flight boundary geofence and the privacy geofence may have an associated buffer zone.
- the buffer zone allows the UAV to fly a particular distance over the boundary of the geofence without creating a contingency condition. Occasionally, a UAV's estimated location may have a certain level of error, caused by, for example, GNSS clock drift.
- An associated buffer zone allows the UAV to fly a certain distance to either side of the boundary of the geofence.
- the buffer zone could be a fixed distance, say 20 meters. So long as the UAV determines it is located within this buffer zone, the UAV will be considered to be operating within the geofence.
- the buffer zone can be a function of the size of the geofence: a smaller geofence will have a smaller buffer zone, for example, 2 meters, and a larger geofence will have a larger buffer zone.
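- One way to express this size-dependent buffer, purely as an illustrative assumption (only the 20 meter and 2 meter figures above come from the text; the scaling factor is invented for the sketch):

```python
import math

def buffer_for_geofence(area_m2: float,
                        min_buffer_m: float = 2.0,
                        max_buffer_m: float = 20.0) -> float:
    """Scale the buffer zone with the geofence size, clamped to a sensible range."""
    # Use a fraction of the geofence's characteristic length (square root of its area).
    characteristic_length_m = math.sqrt(max(area_m2, 0.0))
    return min(max_buffer_m, max(min_buffer_m, 0.02 * characteristic_length_m))
```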
- an electronic database may be referenced by the FPS 200 or GCS 213 to search for predetermined privacy geofences.
- a database may include a listing of residential addresses associated with privacy geofences.
- a user could enter an address, or latitudinal/longitudinal coordinates of the address, in the FPS 200 or GCS 213 .
- the search based on the entered address or coordinates will then return a result of privacy geofences that are associated with the address, or with the latitudinal/longitudinal coordinates of the address.
- the privacy geofence for example may be the bounds of the property associated with the address.
- the privacy geofence may also be added by a user of the FPS 200 or GCS 213 .
- the user may select an area over a displayed image of an area where the privacy geofence should be placed.
- the privacy geofence may be displayed in colors or patterns that distinguish the privacy geofence from a flight boundary geofence.
- a user may draw in a graphical user interface a shape on a displayed map, and the system identifies any privacy geofences within the area enclosed by the shape. After a flight boundary geofence is created, the FPS 200 or GCS 213 can query the database to find geographic areas enclosed within the flight boundary geofence.
- the FPS 200 can then create privacy geofences that would be transmitted to the UAV for use in autonomous or manual flight operations.
- the database may include home addresses, geographic coordinates, geographic polygonal shapes, property bounds (either 2- or 3-dimensional).
- the flight boundary geofence and the privacy geofence can be temporarily active for specified durations or for particular time periods.
- the geofence could be active during certain weather conditions, times of day, or lighting conditions.
- a database may be accessed to find addresses or other areas identified as private, and those addresses or areas may then be used to create a flight plan that takes privacy operations into consideration.
- flight plan contingencies may be created.
- a flight plan contingency can instruct the UAV to perform an operation based on certain contingency criteria.
- contingency criteria may be the detection of one or more of a low battery or fuel state, a malfunctioning of an onboard sensor, motor, actuator, a deviation from the flight plan, or a crossing over a geofence boundary by the UAV.
- Other contingency events may include a ground control system power or system failure, a lost or degraded telemetry link to/from the UAV and ground control system, a stuck motor or actuator, GNSS failure or degradation, autopilot sensor failure (e.g., airspeed, barometer, magnetometer, IMU sensors), a control surface failure, a gear failure, a parachute deployment failure, an adverse weather condition, a nearby aircraft in airspace, a vehicle vibration, an aircraft fly-away, and any other situation that would cause the UAV to alter its flight path or flight plan.
- FIGS. 4A-4D illustrate examples of a UAV performing privacy operations while flying in autopilot mode.
- UAVs 400, 420, 440 and 460 are shown navigating along paths 402, 422, 442 and 462, respectively.
- the primary computer system 100 periodically determines the geospatial position of the UAV using information received from an onboard GNSS receiver 150 .
- FIGS. 4A-4D illustrate different types of privacy geofences and how the UAV may take action.
- the primary computer system 100 may have received information about one or more of the different types of geofences from the GCS 213 or the FPS 200.
- FIG. 4A illustrates the UAV 400 traveling along flight path 402 .
- the primary computer system 100 of UAV 400 determines at a periodic frequency if the UAV 400 will be heading into or near a privacy geofence.
- the periodic frequency may be a set rate, or a variable rate based on the speed of the UAV 400: the faster the UAV 400 travels, the greater the sampling rate.
- the UAV 400 has determined that, at its current altitude and direction, the UAV 400 will enter into the privacy geofence 406 .
- the UAV 400 determines the type of geofence, for example, based on a geofence type number that indicates the type of privacy geofence.
- the geofence 406 is identified as either an avoid privacy geofence (more restrictive) or a fly-through geofence (less restrictive).
- geofence 406 is identified as an avoid privacy geofence and the UAV automatically calculates an avoidance flight path around, above, or below the geofence 406 , as shown in FIG. 4A .
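- As a minimal sketch of the avoid-type check, assuming the shapely geometry library and a ground track expressed in a local meter frame (the actual onboard routing logic is not specified at this level of detail):

```python
from shapely.geometry import LineString, Polygon

def crosses_avoid_geofence(path_xy, avoid_polygon_xy) -> bool:
    """Return True if the planned 2D ground track would enter an avoid-type geofence."""
    return LineString(path_xy).intersects(Polygon(avoid_polygon_xy))

def choose_avoidance(path_xy, avoid_polygon_xy, geofence_ceiling_m, max_altitude_m):
    """Pick a simple avoidance strategy: climb over the geofence if its ceiling allows,
    otherwise flag the segment for lateral replanning around the boundary."""
    if not crosses_avoid_geofence(path_xy, avoid_polygon_xy):
        return "no_change"
    if geofence_ceiling_m is not None and geofence_ceiling_m < max_altitude_m:
        return "climb_over"       # ascend above, fly over, then descend (as in FIG. 4B)
    return "replan_laterally"     # compute a path around the boundary (as in FIG. 4A)
```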
- the UAV may receive data identifying new geofences that recently became active. In such cases, the flight planning system of UAV 400 can pre-calculate a new flight path 402 to avoid the new geofences.
- the UAV 400 may orient its body (in the case of a fixed camera), or orient a gimbaled camera, so that the camera points away from the geofence, for example, in an eastern direction.
- the UAV computer system may refer to an onboard data store, to determine what operation should be taken for the particular geofence type.
- the FPS 200 may have generated a flight path to avoid geofence 406 , and at particular waypoints along the flight path, point the camera away from the geofence 406 , or cause the operation of certain sensors to be temporarily suspended or inactivated due to the proximity to the geofence 406 .
- FIG. 4B illustrates the UAV 420 flying along flight path 422 .
- the primary computer system 100 of UAV 420 determines that it will intersect with 3D geofence 426 (modeled as a cylinder). Based on the geofence type number, the UAV 420 determines that geofence 426 is an avoid type geofence (i.e., that entry into the geofence is prohibited). In this instance, the UAV 420 will determine an avoidance flight path around, above, or below the geofence 426. In this case, the UAV determined the flight path should ascend above, fly over, and then descend around the geofence 426, as shown in FIG. 4B.
- the primary computer system 100 of the UAV 420 will determine its proximity to the intersecting boundary of the flight path 422 and the geofence 426 .
- the UAV 420 may disable or limit operation of a connected payload device, such as a camera, so that as the UAV 420 flies over the privacy geofence 426 no image data can be taken of the private residence 429 .
- a gimbaled camera for example, may be placed into a horizontal position (e.g., with respect to a local level coordinate system or camera platform coordinate system) so that the camera field of view cannot take images of the private residence 429 below.
- This privacy operation can be set so that even if the UAV 420 is flown manually, the UAV 420 will not be able to take pictures of the private residence 429 designated with the privacy geofence 426 .
- the UAV 420 may have stored in non-volatile memory a database of privacy geofences for a general geographic area. While flying over that area, the primary computer system 100 of UAV 420 may refer to this onboard database and limit camera operations and other sensors accordingly. With regard to FIGS. 4C and 4D, the UAVs 440 and 460 are shown traveling through less restrictive fly-through geofences 446 and 466, respectively.
- the primary computer systems 100 of UAVs 440 , 460 will determine the respective positions of UAV 440 , 460 , and then limit or deactivate use of the respective sensors of the UAVs 440 , 460 before the UAVs fly through the geofences 446 , 466 .
- FIG. 5 is a flowchart of an example process of the UAV performing a privacy operation.
- the process 500 will be described as being performed by a system of one or more processors, such as the system 100 described in reference to FIG. 1 .
- the UAV obtains privacy geofence data describing one or more privacy geofences (block 502 ).
- the privacy geofence data may be received from a GCS 213 , FPS 200 , or the UAV may be configured to wirelessly access the Internet or other network to access sites that contain privacy geofences.
- the privacy geofences may also be created by the UAV computer system.
- Address information, and other geolocation information may be provided to the UAV for storage in onboard memory. If the geolocation information for an address, for example, only includes a point of information such as latitudinal/longitudinal coordinates, the UAV computer system may use a radial distance to avoid flying near the address.
- the radial distance may be a set distance, or may be a variable distance.
- multiple addresses may be located near each other.
- the computer system may use a distance generally greater than the distance that would be used for a single address.
- These radial avoid distances may also be generated by the GCS 213 or FPS 200 and provided to the UAV.
- the privacy geofence for different locations thus may have different sized boundaries that may be based on characteristics of the location.
- the UAV navigates along a flight path and periodically obtains flight information of the UAV from a GNSS receiver, where the flight information is a geospatial position identifying the location of the UAV in 3D-space.
- the UAV determines a distance of the UAV to a boundary (e.g., a closest boundary) of the nearest geofence (block 504 ).
- the UAV computer system determines the privacy geofence type and in response determines whether a new flight path should be calculated (block 506 ). The UAV will then continue along an existing flight path and enter the privacy geofence or, optionally, will calculate a path around the geofence (block 508 ).
- the UAV then performs a privacy operation if the UAV's primary computer system determines that the UAV is within a predetermined distance of the boundary of the privacy geofence (e.g., a closest boundary) or is flying through a geofence with privacy controls enabled (block 510). Also, the UAV primary computer system can log a sensor name/ID, and the date, time, and geospatial coordinates at which the UAV disabled the sensor (e.g., disabled a camera) or other device (e.g., a payload device).
- the UAV primary computer system allows the operation of the sensor to resume.
- a sensor may be enabled after the UAV has flown a predetermined distance from the geofence, or enabled if the direction of data capture of the sensor (e.g., the direction of a camera lens) is pointing away from the privacy geofence so that images of structures within the privacy geofence cannot be captured.
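- The decision logic of blocks 504-510, together with the re-enable conditions just described, might look roughly like the following sketch; the distance thresholds, the logging fields, the camera object interface, and the helper names are assumptions:

```python
import math
from datetime import datetime, timezone

def distance_to_boundary_m(uav_xy, boundary_xy):
    """Distance from the UAV to the closest vertex of the geofence boundary.
    (A real implementation would measure to the closest point on each edge.)"""
    return min(math.hypot(uav_xy[0] - bx, uav_xy[1] - by) for bx, by in boundary_xy)

def update_privacy_state(uav_xy, boundary_xy, camera, log,
                         disable_within_m=20.0, reenable_beyond_m=30.0,
                         camera_points_away=False):
    """Disable the camera near a privacy geofence, and re-enable it once the UAV is
    far enough away or the camera is pointed away from the geofence."""
    d = distance_to_boundary_m(uav_xy, boundary_xy)
    if camera.enabled and d <= disable_within_m:
        camera.enabled = False
        log.append({"sensor": camera.name, "event": "disabled",
                    "time": datetime.now(timezone.utc).isoformat(),
                    "position": uav_xy, "distance_m": d})
    elif not camera.enabled and (d >= reenable_beyond_m or camera_points_away):
        camera.enabled = True
        log.append({"sensor": camera.name, "event": "re-enabled",
                    "time": datetime.now(timezone.utc).isoformat(),
                    "position": uav_xy, "distance_m": d})
```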
- the position of a sensor or a payload device may be controlled based on proximity to the boundary of the privacy geofence.
- the UAV may be configured such that a sensor (e.g., a camera), points away from the boundary of the geofence, temporarily inactivates, or suspends operation based on a threshold distance from the boundary of the privacy geofence.
- the computer system of the UAV determines its geospatial position. Based on proximity of the geospatial position of the UAV to the boundary of the privacy geofence, the UAV can maintain the orientation of the sensor so that the sensor will point away from the boundary of the privacy geofence.
- the UAV may orient the body of the UAV, or in the case of a gimbaled sensor, position the sensor to point away from the boundary of the privacy geofence. Also, if the UAV is flying over a privacy geofence then a gimbaled sensor may be rotated upwardly (e.g., rotated upwardly with respect to a local level coordinate frame) so that the sensor cannot collect data over the privacy geofence. Additionally, the computer system may limit power or control of a gimbal system for the sensor or payload devices.
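- Keeping a gimbaled sensor pointed away from the nearest boundary could be sketched as below; the yaw/pitch conventions and the gimbal command format are assumptions:

```python
import math

def bearing_away_from_boundary(uav_xy, nearest_boundary_point_xy):
    """Yaw angle (radians, measured from the +x axis) pointing directly away from the
    nearest point on the privacy geofence boundary."""
    dx = uav_xy[0] - nearest_boundary_point_xy[0]
    dy = uav_xy[1] - nearest_boundary_point_xy[1]
    return math.atan2(dy, dx)

def gimbal_command(uav_xy, nearest_boundary_point_xy, flying_over_geofence: bool):
    """Return a (yaw, pitch) gimbal setpoint that keeps the sensor from imaging the
    privacy geofence: point away laterally, or rotate up when directly overhead."""
    if flying_over_geofence:
        return {"yaw_rad": 0.0, "pitch_rad": math.radians(10.0)}  # lens above the horizon
    yaw = bearing_away_from_boundary(uav_xy, nearest_boundary_point_xy)
    return {"yaw_rad": yaw, "pitch_rad": 0.0}  # level, facing away from the boundary
```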
- When the UAV conducts a flight operation, via autopilot or manual control, functional operation of sensors or payload devices may be controlled based on proximity to the boundary of the geofence.
- the UAV may be configured such that a connected sensor (e.g., a camera) will not trigger.
- the computer system of the UAV determines its geospatial position. Based on the geospatial position of the UAV in proximity to the boundary of the geofence, the UAV will not trigger the sensor. For example, for a UAV flight over a particular geofence, the UAV may inhibit operation of a camera or other sensor.
- Operation may be inhibited when the UAV approaches the geofence boundary within a threshold distance, when the UAV is flying at a particular distance above a privacy geofence (above ground level), or when the UAV is flying through a privacy geofence.
- the threshold distance can either be a predetermined distance from a geofence ceiling or from the ground.
- Table 1 shows an example of how particular sensors may be configured for operation near a privacy geofence. Table 1 is illustrative, and not meant to limit the available sensors or distance that may be associated with a sensor.
- the UAV may determine its position to a privacy geofence. If a camera is attached as a payload device, the UAV primary computer system 100 may inhibit operation of the camera when the UAV is within 20 meters of the boundary of the privacy geofence. In another example, when the UAV travels above a privacy geofence at 100 meters operation of the camera is inhibited.
- the UAV can determine a time that the UAV will be within the threshold distance of the boundary of the privacy geofence or will cross over the privacy geofence boundary.
- the time can be calculated based on the velocity and trajectory of the UAV towards the geofence.
- the distance can be variable based on the altitude of the UAV. For example, the field of view of a camera or a sensor is greater at a higher altitude.
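- The time-to-boundary estimate and the altitude-dependent threshold could be computed roughly as follows; the ground-footprint relationship is standard camera geometry, and the specific margin value is an assumption:

```python
import math

def time_to_boundary_s(distance_m: float, closing_speed_mps: float) -> float:
    """Time until the UAV reaches the boundary, given the component of its velocity
    directed toward the boundary. Returns infinity if the UAV is moving away."""
    if closing_speed_mps <= 0.0:
        return math.inf
    return distance_m / closing_speed_mps

def inhibit_distance_m(altitude_agl_m: float, camera_half_fov_deg: float,
                       base_margin_m: float = 20.0) -> float:
    """Threshold distance at which to inhibit the camera. The camera's ground footprint
    grows with altitude, so the threshold grows with altitude as well."""
    footprint_half_width = altitude_agl_m * math.tan(math.radians(camera_half_fov_deg))
    return base_margin_m + footprint_half_width
```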
- a geofence may be assigned to a privacy geofence type which determines how various sensors or payload devices may be operable or inoperable. For example, for a particular privacy geofence certain sensors or other devices may be allowed to operate while other sensors or devices may not be allowed to operate.
- An inoperable sensor is a sensor that is deactivated or a sensor that is restricted from collecting data.
- a flight path of a UAV may be configured where the UAV flies over, through, or near two different privacy geofences.
- a first geofence may be set around a first parcel of land.
- a second geofence may be set around a nearby second parcel of land.
- the first geofence may have a geofence type 1 and the second geofence may have a second geofence type 2.
- the geofence type associated with a geofence may specify how a sensor or payload device operates.
- Table 2 is illustrative, and not meant to limit the scope or functions that may be assigned to a geofence type.
- Based on the assigned geofence type, the UAV sensors may be allowed to perform certain operations. For example, if the first geofence is assigned Type 1, neither a camera, an infrared camera, nor a UV camera would be operable if the UAV flies over the first geofence. If the second geofence is assigned Type 2, a camera would not be operable, but an infrared camera or UV camera would be operable.
- In another example, each sensor is restricted to an azimuth of −60° to 60° and an elevation of 0° to 85°. Accordingly, a sensor can be operable but still have its functionality restricted based on a geofence type.
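- A policy table of the kind Table 2 describes could be encoded as follows; the two entries mirror the Type 1 / Type 2 behavior above and reuse the azimuth/elevation figures quoted in the text, while the dictionary layout itself is an assumption:

```python
GEOFENCE_TYPE_POLICY = {
    1: {  # Type 1: no imaging of any kind over the geofence
        "camera": False, "infrared_camera": False, "uv_camera": False,
    },
    2: {  # Type 2: visible-light camera off; infrared and UV allowed, but restricted in pointing
        "camera": False, "infrared_camera": True, "uv_camera": True,
        "azimuth_deg": (-60.0, 60.0),
        "elevation_deg": (0.0, 85.0),
    },
}

def sensor_allowed(geofence_type: int, sensor: str,
                   azimuth_deg: float = 0.0, elevation_deg: float = 0.0) -> bool:
    """Check whether a sensor may operate, at the given pointing angles, for a geofence type."""
    policy = GEOFENCE_TYPE_POLICY.get(geofence_type, {})
    if not policy.get(sensor, True):
        return False
    az_lo, az_hi = policy.get("azimuth_deg", (-180.0, 180.0))
    el_lo, el_hi = policy.get("elevation_deg", (-90.0, 90.0))
    return az_lo <= azimuth_deg <= az_hi and el_lo <= elevation_deg <= el_hi
```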
- the UAV may take into consideration whether the UAV is flying above an area with a certain population density, which is a measurement of human population per unit area or unit volume. For example, a flight planning system, or ground control system, may analyze population density at a given geographic location. If a flight path is set over a populated area having a threshold population density, then the UAV may be configured to limit certain flight operations, or sensor or payload device operations or functionality. For example, if the UAV flies over a threshold population density, then the UAV may be configured to disable or restrict use of its camera, or other sensors or payload devices. Moreover, an FPS 200 or GCS 213 may plan the flight path such that the UAV maintains a particular altitude or distance above the populated area.
- a privacy geofence may be generated based on the population density for a given area. For example, in a highly populated or dense area, a privacy geofence may be created by the FPS covering the dense areas. The altitude or above-ground limits of the privacy geofence may be adjusted according to the density of the area: the higher the population density, the higher the ceiling of the privacy geofence may be set. In other words, in dense areas the UAV would be required to fly at a higher altitude, or above-ground-level height, than in a less dense area. Moreover, the UAV may store onboard in memory information, such as heat maps, indicating population density levels, and the UAV may then dynamically adjust its altitude based on the density level. Also, the UAV may dynamically create a flight plan to fly around a densely populated area. Privacy controls may be automatically enabled, or "turned on," based on population density or other criteria.
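- Mapping population density to a privacy-geofence ceiling (or minimum flight altitude) might look like the following sketch; the density breakpoints, altitude values, and heat-map lookup interface are illustrative assumptions rather than values from the text:

```python
def privacy_ceiling_m(population_density_per_km2: float) -> float:
    """Return the above-ground ceiling of a density-based privacy geofence: the denser
    the area, the higher the UAV must fly (i.e., the higher the geofence ceiling)."""
    if population_density_per_km2 < 100:      # rural
        return 0.0                            # no density-based privacy geofence
    if population_density_per_km2 < 1000:     # suburban
        return 60.0
    if population_density_per_km2 < 5000:     # urban
        return 90.0
    return 120.0                              # dense urban core

def min_flight_altitude_m(density_heat_map, lat: float, lon: float) -> float:
    """Look up the local density from an onboard heat map and return the minimum
    altitude the UAV should hold over that cell (assumed heat-map interface)."""
    density = density_heat_map.lookup(lat, lon)
    return privacy_ceiling_m(density)
```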
- the UAV may optionally be configured to detect certain wireless signals from a signal source, such as a radio frequency (RF) or microwave signals source.
- the UAV may enter into a privacy mode where certain computer processing operations or flight operations are performed, or are disabled or restricted. For example, the UAV may disable or restrict operation of a camera.
- the UAV may measure or calculate received signal strength indicator (RSSI) values to determine how close the UAV is to the signal source. Based on an estimated distance to the signal source, the UAV may perform operations meant to maintain the privacy of the area around the signal source.
- a UAV may detect Wi-Fi router signals, cell phone signals or other radio tower signals, frequency modulation (FM) signals, amplitude modulation (AM) signals, microwave signals, Bluetooth signals or other wireless signals.
- the UAV may conduct a flight operation, such as increase altitude or change the flight path to maintain a certain distance from the signal source.
- the UAV may inhibit operation of certain sensors or payload devices.
- the UAV may detect proximity to a wireless signal emanating from a mobile device, such as a mobile phone or tablet computer.
- the UAV may be configured such that when the UAV detects the signal from the mobile device, the UAV stays a minimally safe distance from the mobile device, such as 2 meters from the mobile device.
- the UAV can determine a proximate distance to the mobile device.
- the UAV may maintain a distance from the signal source so that a determined signal strength is below a threshold value.
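- One way to act on RSSI, as a sketch: estimate range with a log-distance path-loss model and hold off until the measured RSSI drops below a threshold. The reference power, path-loss exponent, and threshold values are assumptions:

```python
def estimate_distance_m(rssi_dbm: float,
                        rssi_at_1m_dbm: float = -40.0,
                        path_loss_exponent: float = 2.5) -> float:
    """Estimate distance to a transmitter from RSSI using a log-distance path-loss model."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def too_close_to_source(rssi_dbm: float,
                        rssi_threshold_dbm: float = -60.0,
                        min_standoff_m: float = 2.0) -> bool:
    """The UAV should back away if the measured signal is stronger than the threshold
    or the estimated range is inside the minimum standoff distance."""
    return rssi_dbm > rssi_threshold_dbm or estimate_distance_m(rssi_dbm) < min_standoff_m
```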
- a signal source can be associated with a property and broadcast information pertaining to the property. Such information may include, for example, property boundaries and a preference for privacy for an individual or business.
- a residential property owner broadcasts their residential property boundaries (e.g., latitude, longitude) to assist a UAV to avoid flying over the property or provide a flight path or corridor over the property for the UAV.
- a signal source may broadcast landing coordinates defining a safe landing zone on the property, for example, to deliver a package or make a contingency landing.
- the FPS 200 or GCS 213 may also determine a flight plan with waypoints that include commands to cause sensors and devices to maintain privacy as described above, such as inhibiting operation of payload devices or maintaining the orientation or field of view of sensors.
- the FPS may calculate a flight path for a UAV within a flight boundary geofence.
- the FPS considers the boundary of the flight boundary geofence, and of any privacy geofences included wholly or partially within the flight boundary geofence.
- a flight path may be determined based on the type of privacy geofence where, for example, flight of the UAV is allowed anywhere within the flight boundary geofence, but the path of the UAV through a privacy geofence is inhibited.
- the FPS would not create a flight path, or allow a user manually determined flight path, that would pass through the privacy geofence.
- the flight path may be allowed through the privacy geofence, but commands to orient or inhibit the sensors would be included with the flight plan that is provided to the GCS or to the UAV.
- a waypoint may be added to the flight plan at a position before the UAV would enter into a privacy geofence.
- the waypoint could include an associated sensor command, such as inhibit the sensor, or orient the sensor such that the sensor cannot obtain sensor information describing anything within the privacy geofence.
- Another waypoint could be set when the UAV exits the privacy geofence that reorients the sensor, or allows the sensor, to resume obtaining sensor information.
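- The waypoint bracketing just described could be expressed in a flight plan roughly as follows; the waypoint schema and command names are assumptions:

```python
def bracket_privacy_geofence(waypoints, entry_index: int, exit_index: int):
    """Attach sensor commands to the waypoint just before the UAV enters a privacy
    geofence and to the waypoint just after it exits. Waypoints are assumed to be
    dicts such as {"lat": ..., "lon": ..., "alt_m": ..., "commands": [...]}."""
    waypoints[entry_index]["commands"].extend(
        ["inhibit_camera", "orient_sensor_away_from_geofence"])
    waypoints[exit_index]["commands"].extend(
        ["resume_camera", "resume_sensor_orientation"])
    return waypoints
```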
- When a UAV is collecting data close to a boundary of a geofence, images or other information may be collected beyond the geofence boundary.
- For example, while a UAV is hovering close to a geofence boundary, a still image or video taken may include an image or frame with image data that partially exceeds the boundary of the geofence.
- Images obtained from an aerial inspection can be modified to automatically blur or otherwise obfuscate portions that illustrate other properties besides the property being inspected.
- the UAV operating system can determine whether the camera will take, or has taken, still images or video, or whether the other sensors will collect image data beyond the geofence boundary. Additionally, the UAV operating system can limit the number of still images or video taken with another property in the camera's field of view.
- a mission can be simulated on a flight simulator (e.g., by the flight description module 210 ) prior to performing the actual mission so that the flight path or camera parameters can be adjusted appropriately along a mission timeline and/or flight path to ensure adherence to any privacy protocol or regulation.
- camera settings or UAV attitude or altitude adjustments may be made automatically according to a flight plan. For example, arrival of the UAV at a particular waypoint and/or at a particular time in the mission, the attitude or altitude of the UAV may be altered, or camera settings may be adjusted, based on privacy concerns.
- the camera settings can be set to change the depth of field (DOF) so that the building or object in the background of the image scene is blurred, thus allowing the image to be taken, so that an object of interest in the foreground of the image scene can be captured.
- object recognition algorithm can be applied to the image to detect the window, door or other private object so that the private object can be obfuscated (e.g., blurred or occluded) locally during the mission or remotely using, for example, a network server.
- a network server can be used by the primary computer system 100 to perform a mission.
- image data can be tagged with metadata indicating the postal address of a private residence that was captured in the image.
- the metadata may also include a timestamp and camera settings.
- the tags can be used to index images in a database for later retrieval by property owners.
- the database can be made accessible to property owners over a network (e.g., the Internet) who may view the images and make their own determination of whether to have images deleted. For example, a property owner can enter their property address in a search engine to retrieve and review images tagged with their home address.
- FIG. 6 illustrates an example of a UAV 606 navigating and collecting images, where the camera captures image data beyond a geofence 602 boundary.
- a UAV flight module controls UAV navigation along a flight path over a house 610 to be inspected.
- a ground operator 604 places the UAV at the starting location and then initiates an autopilot mode using a GCS.
- the UAV may initiate an autopilot mode after the UAV is placed in a starting location for a predetermined period of time.
- the UAV ascends to a safe vertical height above the roof of the house 610 .
- the UAV primary computer system 100 periodically determines the geospatial position of the UAV using information received from an onboard GNSS receiver.
- the UAV orients a camera so that the camera's field of view points towards the house 610 .
- the UAV takes a series of digital images or video of the roof of the house 610 as it flies along the flight path.
- the field of view 612 of an onboard data gathering device such as a digital camera, may include a portion of an adjacent property, or house, 614 .
- FIG. 7 is a flowchart of an example process 700 of the UAV performing an image obfuscation process.
- the process 700 will be described as being performed by a system of one or more processors, such as the primary computer system 100 described in reference to FIG. 1 .
- Privacy controls can be implemented to obfuscate or blur portions of images that are outside of a property boundary (e.g., a UAV can effect this while in flight, or a flight planning system can later process images after a flight operation is conducted).
- sensor data is captured with one or more sensors.
- An example sensor is a digital camera or video camera.
- the UAV computer operating system can utilize GNSS data, IMU data, ground altitude data, imagery (e.g., satellite imagery) to determine portions of captured images that are to be obfuscated. Based on the direction of the camera, and camera's field of view with respect to a boundary of the flight boundary geofence, the computer system can determine whether the digital image may include a portion of an image showing property outside of the flight boundary geofence.
- the UAV may, using onboard processors, obfuscate the portion of the image, or may delete the image if the image is determined to show only areas beyond the flight boundary geofence.
- the images may be transferred to another system, such as a flight planning system or ground control system, where the digital images may be similarly processed.
- the portion of the image that exceeds the flight boundary geofence can be obfuscated by a number of techniques.
- the image can be obfuscated via a process running on the UAV computer system, or can be performed post-image acquisition via a process on a separate computer or data system.
- An entire image may be blurred or scrambled, or image data modified such that the subject of the image cannot be determined.
- the UAV computer system may log various data associated with the digital images. For example, when the computer system instructs an attached digital camera to take a picture, various information can be recorded, for example: the heading of the UAV, the geospatial position of the UAV, the altitude of the UAV, the camera field of view, the focal length, the aperture settings, and the time the image was taken. If the camera is gimbaled (that is, rotatable or controllable on various axes), a sensor may keep track of the position of the camera to determine its orientation relative to the heading of the UAV.
- process 700 begins by combining one or more of GNSS coordinates, Above Ground Level (AGL) altitude, direction of the camera, camera field of view, camera focal length and other camera parameters to project the image onto a latitude/longitude coordinate system (block 702 ), warping it as needed, for example, with a Mercator Map projection.
- Process 700 then overlays the projection with a flight boundary geofence (which is already defined in a latitude/longitude coordinate system) (block 704 ).
- Process 700 determines the area in the latitude/longitude coordinate system that falls outside the flight boundary geofence but inside the projection of the taken image (block 706 ).
- Process 700 then projects this area back into the pixel coordinate system of the image to determine the pixels that are outside of the flight boundary geofence (block 708 ), as described in reference to FIGS. 8A and 8B .
- Process 700 then obfuscates the pixels determined to be outside the flight boundary geofence (block 710 ).
- the image area outside of the flight boundary geofence may be deleted, blurred, scrambled or removed from the image.
- the pixel Red Green Blue (RGB) values of the area outside of the flight boundary geofence may be overwritten to show a single color (e.g., black, white, red, etc.).
- a Gaussian blur could be used to blur the pixels determined to be outside the flight boundary geofence.
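- A sketch of blocks 702-710, assuming the numpy, OpenCV, and shapely libraries, a known homography from pixel coordinates to a ground coordinate frame, and a geofence polygon already expressed in that same ground frame (deriving the homography from GNSS, IMU, altitude, and camera parameters is omitted here):

```python
import cv2
import numpy as np
from shapely.geometry import Polygon

def obfuscate_outside_geofence(image, pixel_to_ground_h, geofence_ground_xy,
                               blur_kernel=(51, 51)):
    """Blur every pixel whose ground projection falls outside the flight boundary geofence."""
    h, w = image.shape[:2]
    # Block 702: project the four image corners onto the ground plane.
    corners_px = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    corners_ground = cv2.perspectiveTransform(corners_px, pixel_to_ground_h).reshape(-1, 2)
    # Blocks 704-706: area inside the image footprint but outside the geofence (B \ A).
    footprint = Polygon(corners_ground)
    geofence = Polygon(geofence_ground_xy)
    outside = footprint.difference(geofence)
    if outside.is_empty:
        return image
    # Block 708: project that area back into pixel coordinates as a mask.
    ground_to_pixel_h = np.linalg.inv(pixel_to_ground_h)
    mask = np.zeros((h, w), dtype=np.uint8)
    geoms = outside.geoms if hasattr(outside, "geoms") else [outside]
    for geom in geoms:
        if geom.geom_type != "Polygon":
            continue
        ground_pts = np.float32(list(geom.exterior.coords)).reshape(-1, 1, 2)
        pixel_pts = cv2.perspectiveTransform(ground_pts, ground_to_pixel_h)
        cv2.fillPoly(mask, [np.int32(pixel_pts.reshape(-1, 2))], 255)
    # Block 710: obfuscate the masked pixels (Gaussian blur; a solid fill also works).
    blurred = cv2.GaussianBlur(image, blur_kernel, 0)
    result = image.copy()
    result[mask == 255] = blurred[mask == 255]
    return result
```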
- a separate computer system, user device, or FPS may perform the image obfuscation process.
- These other systems may receive the digital images obtained by the UAV.
- the images may be obtained during flight of the UAV, or post-flight.
- log data from the UAV may be provided to these other systems during flight or post flight operations.
- the log data may include information identifying geospatial locations of where the one or more images where taken, time images were taken, the pose, or position of the camera, and other information used by the process to determine the field of view of the camera in relationship to a flight boundary, or privacy boundary geofence.
- These systems may determine that a portion of a digital image includes at least a portion of an object located outside of a boundary of the geofence, and obfuscate a portion of the digital image as discussed herein.
- the primary computer system of the UAV can determine the percentage of the digital image's pixels that fall outside the flight boundary geofence, and then delete the digital image, or identify the image for deletion, if the percentage exceeds a predetermined threshold value. For example, if the image area outside the geofence is greater than 50%, the system can be configured to delete the entire image.
- the flight planning system can automatically implement privacy controls when generating a 3D model. For instance, a UAV can capture imagery of a particular floor of a skyscraper. When generating a 3D model, other portions of the skyscraper (e.g., other floors) can be blurred out, and/or windows can be recognized and blurred to hide the inside. Additionally, sensors can be utilized that are opaque to glass (e.g., opaque to glass windows).
- FIGS. 8A and 8B illustrate the process described in FIG. 7 .
- FIG. 8A shows a projection 802 (e.g., using a homograph transformation) of a camera field of view (e.g., a sensor image) onto a geodetic (latitude/longitude) map projection 800 (e.g., Mercator map projection) that also depicts a flight boundary geofence 801 .
- An area 803 that lies within the projection 802 and outside the flight boundary geofence 801 is determined. If the flight boundary geofence 801 is labeled "A" and the projection 802 of the camera field of view is labeled "B", then, applying set theory mathematics, area 803 is the relative complement of A in B, or area 803 = B \ A = { x ∈ B : x ∉ A }.
- FIG. 8B illustrates captured image data in pixel coordinates 804 with pixels 805 corresponding to area 803 obfuscated.
- pixels 805 can be identified for obfuscation by using image segmentation or other edge detector algorithms and pixel occlusion algorithms.
- pixels 805 can be blurred using, for example, a Gaussian blur filter or scrambled or blocked out (e.g., with a solid color).
- objects in area 803 that are identified as private can be recognized using object recognition techniques (e.g., image segmentation, machine learning, edge detection, corner detection, blob detection, genetic algorithms, feature-based methods) and then obfuscated, locally or remotely using, for example, a network server.
- images to be processed can be batch processed.
- a recognized object can be labeled (e.g., labeled as a “window”) and compared to a predetermined list of labeled objects that are designated private and if there is a match between labels, the object can be obfuscated. In some implementations, only the object is obfuscated.
- images to be processed either locally or remotely can be post-processed using a batch process.
- the processes and operations are described above in terms of one or more processors.
- the processor or processors can be onboard a UAV, onboard a user device, or part of a cloud-based processing system.
- a user device can be designated as a GCS and perform functions of a GCS.
- a user device and a UAV computer system can be designated as a FPS and perform functions of an FPS.
- functions of both the GCS and FPS can be performed by a cloud-based processing system.
- the image obfuscation techniques described herein can be performed in whole or in part on one or more of the UAV, the GCS or other user device (e.g., a personal computer) or a network-based server.
- Various types of UAVs may be used to implement the inventions described herein (for example, a fixed-wing airplane, a helicopter, a multi-rotor vehicle (e.g., a quad-copter in single propeller and coaxial configurations), a vertical take-off and landing vehicle, or a lighter-than-air aircraft).
- a multi-rotor vehicle in a coaxial configuration may use the same propeller pitch and diameter propellers, use different pitch and diameter propellers, or variable pitch propellers.
- The term UAV includes drones, un-operated aerial vehicles, remotely operated aircraft, unmanned aircraft systems, any aircraft covered under Circular 328 AN/190 classified by the International Civil Aviation Organization, and so on.
- Sensors which are included in the general term payload (e.g., any hardware, software, module, and so on, that is not critical to the flight operation of the UAV) can include any device that captures real-world information, including cameras, radiation measuring instruments, distance detectors such as Lidar, and so on.
- The processes and algorithms described herein may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware.
- the code modules may be stored on any type of non-transitory computer-readable medium or computer storage device, such as hard drives, solid state memory, optical disc, and/or the like.
- the systems and modules may also be transmitted as generated data signals (for example, as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (for example, as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames).
- the processes and algorithms may be implemented partially or wholly in application-specific circuitry.
- the results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, for example, volatile or non-volatile storage.
- User interfaces described herein are optionally presented (and user instructions may be received) via a user computing device using a browser, other network resource viewer, a dedicated application, or otherwise.
- Various features described or illustrated as being present in different embodiments or user interfaces may be combined into the same embodiment or user interface.
- Commands and information received from the user may be stored and acted on by the various systems disclosed herein using the processes disclosed herein. While the disclosure may refer to a user hovering over, pointing at, or clicking on a particular item, other techniques may be used to detect an item of user interest. For example, the user may touch the item via a touch screen, or otherwise indicate an interest.
- the user interfaces described herein may be presented on a user terminal, such as a laptop computer, desktop computer, tablet computer, smart-phone, virtual reality headset, augmented reality headset, or other terminal type.
- the user terminals may be associated with user input devices, such as touch screens, microphones, touch pads, keyboards, mice, styluses, cameras, etc. While the foregoing discussion and figures may illustrate various types of menus, other types of menus may be used. For example, menus may be provided via a drop down menu, a toolbar, a pop up menu, interactive voice response system, or otherwise.
- The terms "engine" and "module", as used herein, refer to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++.
- a software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts.
- Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium.
- Such software code may be stored, partially or fully, on a memory device of the executing computing device.
- Software instructions may be embedded in firmware, such as an EPROM.
- hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors.
- the modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
- Electronic data sources can include databases, volatile/non-volatile memory, and any memory system or subsystem that maintains information.
- a general purpose computer comprising one or more processors should not be interpreted as excluding other computer components, and may possibly include such components as memory, input/output devices, and/or network interfaces, among others.
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Astronomy & Astrophysics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- This application claims the benefit of priority from U.S. Provisional Application No. 62/292,783, filed Feb. 8, 2016, and U.S. Provisional Application No. 62/298,429, filed Feb. 22, 2016, which provisional applications are incorporated by reference herein in their entirety.
- Many unmanned aerial vehicles are configured to collect images, video and other data. As unmanned aerial vehicle (UAV, also referred to as a “drone”) operations become more prevalent, UAVs may increasingly fly around areas where privacy is of a heightened concern. Flight and post-flight operations of UAVs may be configured to address privacy concerns with regard to data collection and flight path operation.
- A UAV computer system may be configured to limit or disable the use of certain sensors or payload devices connected to the UAV in or about a geofence. The UAV can be configured to limit sensor data being captured of another property beyond the geofence within the sensor's field of view (FOV). The UAV computer system may be configured so UAV operation of the sensors or payload devices are temporarily disabled or their functionality restricted when in proximity to a geofence boundary.
- Subject matter described in this specification can be embodied in a system, method or computer program product including the action of navigating a UAV within or in proximity to a geofence. While the UAV is in flight, the UAV may obtain flight information from, for example, a global navigation satellite system (GNSS) receiver and determine a geospatial position of the UAV. A UAV computer system can then determine the UAV geospatial position relative to the boundary of the geofence (e.g., a closest boundary). The UAV computer system may perform one or more privacy operations if the UAV is within a particular distance to the boundary of the geofence. Some privacy operations include but are not limited to positioning a sensor (e.g., pointing a camera lens) away from the direction of the geofence boundary or disabling operation of a payload device. In the context of the camera, the positioning may be done by orienting the body of the UAV or a gimbaled camera to a position where the lens of the camera points in a direction away from the geofence boundary. If the UAV is flying above a geofence over a private property, a gimbaled camera can be positioned such that the camera cannot capture images of the private property. In some implementations, the camera can be disabled, its functionality restricted or the camera parameters adjusted (e.g., adjust aperture, focal length) to obfuscate an object in the image (e.g., in the background of the image).
- Additionally, images obtained from an aerial inspection of a particular property may be modified to obfuscate portions of images that may contain imagery of other properties besides the property being inspected. The UAV computer system can be configured to allow for real-time graphics processing to obfuscate or delete portions of still images or videos taken by the UAV. Additionally, post-flight processing may be performed on the still images or videos. For example, a flight planning system (FPS) or a ground control system (GCS) may use UAV flight log data to correlate a still image or video frame with a particular location and an orientation of the camera with respect to a geofence boundary. The UAV, GCS, or FPS, can utilize one or more of GNSS data, inertial measurement unit (IMU) data, ground altitude data and imagery data (e.g., satellite imagery) to determine portions of the imagery that are to be obfuscated (e.g., blurred). The FPS or GCS system can identify those images, video frames, or portions of images or video frames that may include imagery captured outside of the geofence boundary and obfuscate the images, video frames or portions of the images or video frames.
- The details of one or more embodiments of the subject matter of this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and from the claims.
- FIG. 1 is a block diagram of an example flight control system architecture for a UAV.
- FIG. 2 is a block diagram of an example flight planning system.
- FIG. 3 illustrates an example user interface for determining a flight boundary geofence.
- FIGS. 4A-4D illustrate examples of a UAV privacy operation where the UAV is navigating around and through two different privacy geofence types.
- FIG. 5 is a flowchart of an example process of the UAV performing a privacy operation.
- FIG. 6 illustrates an example of a UAV navigating and collecting images beyond a flight boundary geofence.
- FIG. 7 is a flowchart of an example process of the UAV performing an image obfuscation process.
- FIGS. 8A and 8B illustrate the process described in FIG. 7.
FIG. 1 is a block diagram of an example Unmanned Aerial Vehicle (UAV) architecture for implementing the features and processes described herein. A UAV can include aprimary computer system 100 and asecondary computer system 102. The UAVprimary computer system 100 can be a system of one or more computers, or software executing on a system of one or more computers, which is in communication with, or maintains, one or more databases. The UAVprimary computer system 100 can include aprocessing subsystem 130 including of one ormore processors 135, graphics processor units (GPUs) 136, input/output (I/O)subsystem 134 and an inertial measurement unit (IMU) 132. In addition, theprimary computer system 100 can include logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated input/output data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers. Theprimary computer system 100 can includememory 118.Memory 118 may include non-volatile memory, such as one or more magnetic disk storage devices, solid-state hard drives, or flash memory. Other volatile memory such a RAM, DRAM, SRAM may be used for temporary storage of data while the UAV is operational. Databases may store information describing UAV flight operations, flight plans, contingency events, geofence information, component information, and other information. - The
primary computer system 100 may be coupled to one or more sensors, such as GNSS receivers 150 (e.g., GPS receivers), temperature sensor 154 (e.g., a thermometer),gyroscopes 156,accelerometers 158, pressure sensors (static or differential) 152, current sensors, voltage sensors, magnetometers, hydrometers, and motor sensors. The UAV may use IMU 132 for use in inertial navigation of the UAV. Sensors can be coupled to theprimary computer system 100 or to controller boards coupled to theprimary computer system 100. One or more communication buses, such as, for example, a controller area network (CAN) bus, or signal lines, may communicatively couple the various sensor and components. - Various sensors, devices, firmware and other systems may be interconnected to support multiple functions and operations of the UAV. For example, the
primary computer system 100 may use various sensors to determine the UAV's current geospatial position, attitude, altitude, velocity, direction, pitch, roll, yaw and/or airspeed. Theprimary computer system 100 may also use various sensors to pilot the UAV along a specified flight path and/or to a specified location and/or to control the UAV's attitude, velocity, altitude, and/or airspeed (optionally even when not navigating the UAV along a specific flight path or to a specific location). - The
flight control module 122 handles flight control operations of the UAV. The module interacts with one ormore controllers 140 that control operation ofmotors 142 and/oractuators 144. For example, the motors may be used for rotation of propellers, and the actuators may be used for flight surface control such as ailerons, rudders, flaps, landing gear, and parachute deployment. - The
contingency module 124 monitors and handles contingency events. For example, the contingency module may detect that the UAV has crossed a boundary of a geofence, and then instruct theflight control module 122 to return to a predetermined landing location. In some implementations, thecontingency module 124 may detect that the UAV is flying out of a visual line-of-sight (VLOS) from a ground operator and instruct theflight control module 122 to perform a contingency action, e.g., to land at a landing location. Other contingency criteria may be the detection of a low battery or fuel state, malfunctioning of an onboard sensor, motor or actuator, or a deviation from the flight plan. The foregoing is not meant to be limiting, as other contingency events may be detected. In some instances, if equipped on the UAV, a parachute may be deployed if the motors or actuators fail. - The
mission module 129 processes the flight plan, waypoints, and other associated information with the flight plan as provided to the UAV in a flight package. Themission module 129 works in conjunction with theflight control module 122. For example, themission module 129 may send information concerning the flight plan to theflight control module 122, for example waypoints (e.g., latitude, longitude, altitude), flight velocity, so that theflight control module 122 can autopilot the UAV. - The UAV may have various sensor and other devices connected to the UAV for performing a variety of task, such as data collection. For example, the UAV may carry a
camera 149, which can be, for example, a still image camera, a video camera, an infrared camera, or a multispectral camera. In addition, the UAV may carry a Lidar, radio transceiver, sonar, and traffic collision avoidance system (TCAS). Data collected by the devices may be stored on the sensor or device collecting the data, or the data may be stored onmemory 118 of theprimary computer system 100. - The
primary computer system 100 may be coupled to various radios, e.g.,transceivers 159 for manual control of the UAV, and for wireless or wired data transmission to and from the UAVprimary computer system 100, and optionally a UAVsecondary computer system 102. The UAV may use one or more communications subsystems, such as a wireless communication or wired subsystem, to facilitate communication to and from the UAV. Wireless communication subsystems may include radio transceivers, infrared, optical ultrasonic and electromagnetic devices. Wired communication systems may include ports such as Ethernet, USB ports, serial ports, or other types of port to establish a wired connection to the UAV with other devices, such as a GCS, an FPS, or other devices, for example a mobile phone, tablet, personal computer, display monitor, or other network-enabled devices. The UAV may use a light-weight wire tethered to a GCS for communication with the UAV. The wire may be affixed to the UAV, for example via a magnetic coupler. - Flight data logs may be generated by reading various information from the UAV sensors and
operating system 120 and storing the information in computer-readable media (e.g., memory 118). The data logs may include a combination of various data, such as time, altitude, heading, ambient temperature, processor temperatures, atmospheric pressure, battery level, fuel level, absolute or relative position, position coordinates (e.g., GPS coordinates), pitch, roll, yaw, ground speed, humidity level, velocity, acceleration and contingency information. This foregoing is not meant to be limiting, and other data may be captured and stored in the flight data logs. The flight data logs may be stored on a removable medium. The medium can be installed on the GCS or onboard the UAV. The data logs, and individual data from the sensor or operating system, may be wirelessly transmitted to the GCS or to the FPS. - Modules, programs or instructions for performing flight operations, contingency maneuvers, and other functions may be performed with
operating system 120. In some implementations, theoperating system 120 can be a real-time operating system (RTOS), UNIX, LINUX, OS X, WINDOWS, ANDROID or other operating system. Additionally, other software modules and applications may run on theoperating system 120, such as theflight control module 122,contingency module 124,application module 126,database module 128 andmission module 129. Typically flight critical functions will be performed using theprimary computer system 100.Operating system 120 may include instructions for handling basic system services and for performing hardware dependent tasks. - In addition to the UAV
primary computer system 100, the secondary computer system 102 may be used to run another operating system 172 to perform other functions. The UAV secondary computer system 102 can be a system of one or more computers, or software executing on a system of one or more computers, which is in communication with, or maintains, one or more databases. The UAV secondary computer system 102 can include a processing subsystem 190 of one or more processors 194, GPUs 192 and an I/O subsystem 193. The UAV secondary computer system 102 can include logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated I/O data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers. The UAV secondary computer system 102 can include memory 170. Memory 170 may include non-volatile memory, such as one or more magnetic disk storage devices, solid state hard drives, or flash memory. Other volatile memory, such as RAM, DRAM, or SRAM, may be used for storage of data while the UAV is operational. - Ideally, modules, applications and other functions running on the
secondary computer system 102 will be non-critical in nature, such that if a function fails, the UAV will still be able to operate safely. The UAV secondary computer system 102 can include operating system 172. In some implementations, the operating system 172 can be based on a real-time operating system (RTOS), UNIX, LINUX, OS X, WINDOWS, ANDROID or another operating system. Additionally, other software modules and applications may run on the operating system 172, such as an application module 174, database module 176, mission module 178 and contingency module 180. Operating system 172 may include instructions for handling basic system services and for performing hardware-dependent tasks. - The UAV can include
controllers 146.Controllers 146 may be used to interact with and operate apayload device 148, and other devices such ascamera 149.Camera 149 can include a still-image camera, video camera, infrared camera, multispectral camera, stereo camera pair. In addition,controllers 146 may interact with a Lidar, radio transceiver, sonar, laser ranger, altimeter, TCAS and Automatic dependent surveillance-broadcast (ADS-B) transponder. Optionally, thesecondary computer system 102 may include controllers to control payload devices. -
FIG. 2 is a block diagram illustrating anexample FPS 200. The various illustrated components may communicate over wired and/or wireless communication channels (e.g., networks, peripheral buses, etc.).FPS 200 can be a system of one or more computer processors, or software executing on a system of one or more computers. TheFPS 200 can maintain and communicate with one or more databases (e.g., databases 202-209), storing information describing prior implemented flight plans and information associated with each flight plan (e.g., information describing a UAV, an operator, property/map, mission, database, etc.). The databases can includeoperator database 202,operational database 204,UAV configuration database 206, UAVmission information database 208 and property andmap database 209. - The
FPS 200 can be a system of one or more processors, graphics processors, logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated input/output data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers. The FPS 200 can be a component of, or be coupled to, one or more devices including one or more processors and configured to send data to and receive data from one or more UAVs. The GCS 213 can be a specialized user device 212 configured to control one or more aspects of a flight of the UAVs. - The
FPS 200 may store, and maintain, flight operation information associated with a UAV. Flight operation information may include configuration information of each UAV, flight mission and planned flight path, operator information, the UAV's precise three-dimensional (3D) location in space, velocity information, UAV status (e.g., health of components included in the UAV), contingency plans, and so on. The FPS 200 can receive flight path data (e.g., from an operator's device or an interactive user interface) and determine information describing a flight plan. The FPS 200 can provide a flight package 244 associated with the flight plan to a UAV. The flight package 244 may be provided to the GCS 213 and transmitted to the UAV, or the FPS 200 may transmit the flight package 244 directly to the UAV. Additionally, the FPS 200 can store flight plan information, flight data log information, and job information in the various databases. - The
example FPS 200 includes a flight description module 210 that can generate one or more interactive user interfaces (e.g., HTML or XML content for web pages) for rendering on a user device (e.g., user device 212). The interactive user interfaces may optionally be transmitted for display to the user device via a wireless network or other communication channel. User device 212 can receive, from an operator, information describing a flight plan to be performed by the UAV. - To describe one or more locations where the flight plan is to be conducted, a user interface may be configured to receive, from an operator, location information associated with the flight plan (e.g., an address of a home or property, geospatial position coordinates of a structure to be inspected, etc.). The
flight description module 210 can obtain information describing the location. For instance, the information can include property boundaries associated with an address (e.g., boundaries of a home, obtained from a database, or system that stores or configured to access property boundary information), obstacles associated with the location (e.g., nearby trees, electrical towers, telephone poles) and/or other information. Additionally, theflight description module 210 can obtain imagery, such as georectified imagery (e.g., satellite imagery), associated with the entered location information. Theflight description module 210 can include some or all of the information describing the location (e.g., the obtained imagery or boundary information) in an interactive user interface to be presented on theuser device 212 to an operator. - The operator of the
user device 212 may interact with user interfaces to describe a flight boundary geofence (as described further below) for a UAV to enforce. For instance, the user device 212 can receive imagery associated with operator-entered location information, and present one or more geofence shapes layered on the imagery. The user interface provides functionality for the operator to select a presented shape (e.g., a polygon), and further functionality enabling the user to drag and/or drop the shape to surround an area of interest in the received imagery, limiting allowable locations of the UAV to locations within the shape. Optionally, the user interface may allow the user device 212 to receive input (e.g., of a finger or stylus) tracing a particular shape onto a touch-screen display of the user device 212. The user device 212 can provide information describing the traced shape to the flight description module 210 (e.g., coordinates associated with the imagery), and the flight description module 210 can store information describing the trace as a flight boundary geofence. The flight description module 210 can correlate the traced shape to location information in a real-world coordinate system (e.g., geospatial coordinates in a geodetic (lat/lon) coordinate frame that correspond to the traced shape). - Similarly, a user interface can enable the operator to describe safe locations for a UAV to begin the flight plan (e.g., a launching location where the UAV takes off from the ground) and end the flight plan (e.g., a landing location where the UAV lands).
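To make the correlation step concrete, the following is a minimal sketch, assuming the displayed imagery is georectified, north-up, and small enough that linear interpolation between its corner coordinates is acceptable; the function name and corner-coordinate parameters are illustrative and not part of this disclosure.

```python
# Illustrative sketch: convert a shape traced in image pixel coordinates into
# geodetic (lat/lon) vertices for a flight boundary geofence. Assumes the
# displayed image is georectified and north-up, so linear interpolation
# between the image's corner coordinates is adequate over a small area.

def pixels_to_geodetic(trace_px, image_size, nw_corner, se_corner):
    """Map (x, y) pixel vertices to (lat, lon) vertices.

    trace_px:   list of (x, y) pixels, origin at the image's top-left corner
    image_size: (width, height) of the displayed image in pixels
    nw_corner:  (lat, lon) of the image's north-west corner
    se_corner:  (lat, lon) of the image's south-east corner
    """
    width, height = image_size
    nw_lat, nw_lon = nw_corner
    se_lat, se_lon = se_corner
    geofence = []
    for x, y in trace_px:
        lon = nw_lon + (x / width) * (se_lon - nw_lon)
        lat = nw_lat + (y / height) * (se_lat - nw_lat)  # latitude decreases as y grows
        geofence.append((lat, lon))
    return geofence

if __name__ == "__main__":
    traced = [(100, 80), (400, 90), (380, 300), (120, 310)]
    vertices = pixels_to_geodetic(traced, (512, 512),
                                  nw_corner=(37.7760, -122.4200),
                                  se_corner=(37.7700, -122.4100))
    print(vertices)
```

For larger areas or oblique imagery, a proper map projection or the imagery's geotransform would be used instead of linear interpolation.

As an example, the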
flight description module 210 can analyze the obtained imagery associated with the entered location information and identify a geometric center of a convex area (e.g., a biggest convex area) within the geofence boundary that does not include obstructions (e.g., trees). For example, theflight description module 210 can determine an open area, such as an open pasture. Similarly, theflight description module 210 can obtain topographical information associated with the entered location information and can detect substantially flat areas (e.g., areas with less than a threshold of variance in height). For instance, theflight description module 210 can determine that an open space (e.g., an open clearing that is substantially flat) is a safe launching location for the UAV to take-off from, and can provide information recommending the open space in an interactive user interface presented on theuser device 212. Additionally, theflight description module 210 can analyze the obtained imagery and locate physical features that are generally known to be safe locations for take-off and landing. For example, theflight description module 210 can determine that a driveway of a home associated with the flight plan is safe and can select the driveway as a safe take off and landing location, or can recommend the driveway as a safe launching and landing location. - The
flight description module 210 can receive (e.g., from a user interface) survey or flight mission information, for instance information indicating a particular type of survey for a UAV to perform (e.g., damage inspection, inspection of a vertical structure, inspection of a rooftop). Theflight description module 210 can receive waypoints for the UAV to travel to, including an order in which the waypoints are to be traveled to, a ranking or importance of each, or a group of, waypoints, and specific actions for the UAV to take while traveling to, or after reaching, each waypoint. For instance, a user interface can optionally enable the operator of theuser device 212 to specify that upon reaching a particular waypoint, the UAV is to activate a particular sensor, or other payload devices, such as an infrared camera, a sensor measuring radiation, and so on. Additionally, a user interface can optionally enable the operator to specify transition speeds the UAV is to use when travelling between waypoints, or between particular waypoints. - In addition to the navigation of the UAV to the waypoints, operations to be performed at a particular location, or waypoint, may be identified by an operator using the
FPS 200 or GCS 213 via a user interface. The user interface can allow an operator to plan a survey to obtain sensor information (e.g., photographs or video) of a specified location or structure. Operations of the UAV may be automatically configured by either the FPS 200 or GCS 213 depending on the type of inspection to be performed. - The
flight description module 210 can receive information describing, or relevant to, configuration information of a UAV, such as a type of UAV (e.g., fixed-wing, single rotor, multi-rotor, and so on). In addition, theflight description module 210 can receive information describing, or relevant to, configuration information of sensors or other payload devices required for the survey or flight mission information, and general functionality to be performed. Theflight description module 210 can then determine recommendations of particular UAVs (e.g., UAVs available to perform the flight plan) that comport with the received information. Similarly, theflight description module 210 can determine that based on the received survey type, a UAV will require particular configuration information, and recommend the configuration information to the operator. For instance, theflight description module 210 can receive information identifying that hail damage is expected, or is to be looked for, and can determine that a UAV which includes particular sensors, and specific visual classifiers to identify hail damage, is needed. For example, theflight description module 210 can determine that a heat and/or thermal imaging sensor that includes specific visual classifiers that can discriminate hail damage from other types of damage (e.g., wind damage, rain damage, and so on) is needed. - The
flight description module 210 can utilize received survey or flight mission information to determine a flight pattern for a UAV to follow. For instance, the flight description module 210 can determine a path for the UAV to follow between each waypoint (e.g., ensuring that the UAV remains in the geofence boundary). Additionally, the flight description module 210 can determine, or receive information indicating, a safe minimum altitude for the UAV to enforce, the safe minimum altitude being an altitude at which the UAV is safe to travel between waypoints. The safe minimum altitude can be an altitude at which the UAV will not encounter obstacles within the geofence boundary (e.g., a height above buildings, trees, towers, poles, etc.). Similarly, the safe minimum altitude can be based on a ground sampling distance (GSD) indicating a minimum resolution that will be required from imagery obtained by the UAV while implementing the flight plan (e.g., based in part on capabilities of an included camera, such as sensor resolution, sensor size, and so on).
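The relationship between ground sampling distance and altitude can be sketched as follows; the camera parameters below are example values chosen for illustration, not values prescribed by this disclosure.

```python
# Illustrative sketch of the ground sampling distance (GSD) relationship a
# planner could use to bound survey altitude. All camera parameters are examples.

def gsd_at_altitude(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance (meters/pixel) for a nadir-pointing camera."""
    return (sensor_width_mm * altitude_m) / (focal_length_mm * image_width_px)

def max_altitude_for_gsd(required_gsd_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Highest altitude that still achieves the required GSD."""
    return (required_gsd_m * focal_length_mm * image_width_px) / sensor_width_mm

if __name__ == "__main__":
    # Example camera: 8.8 mm sensor width, 8.8 mm focal length, 5472-pixel-wide image.
    alt = max_altitude_for_gsd(required_gsd_m=0.02, focal_length_mm=8.8,
                               sensor_width_mm=8.8, image_width_px=5472)
    print(f"Fly at or below {alt:.1f} m to keep GSD <= 2 cm/px")
    print(f"GSD at 60 m: {gsd_at_altitude(60, 8.8, 8.8, 5472) * 100:.2f} cm/px")
```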
- The flight description module 210 can receive a time that the flight plan is to be performed (e.g., a particular day, a particular time on a particular day, a range of times, and so on). The flight description module 210 can then determine an availability of UAVs and/or operators at the received time(s) (for example, the module 210 can obtain scheduling information). Additionally, the flight description module 210 can filter available UAVs according to determined configuration information (e.g., as described above). Optionally, the flight description module 210 can access weather information associated with the received time(s), and determine an optimal time or range of times for the job to be performed. For instance, a UAV that includes particular sensors (e.g., electro-optic sensors) can obtain better real-world information at particular times of day (e.g., imaging at noon on a sunny day can provide better imagery by maximizing image contrast and minimizing the effects of shadows). The flight description module 210 can determine the flight plan accordingly. - The
FPS 200 can provide a flight package 244 that includes the determined flight plan directly to a UAV. Alternatively, the FPS 200 can provide the flight package 244 to a user device 212 or GCS 213. The user device 212 or GCS 213 can modify the flight plan or preserve the flight plan in the flight package 244, and can then transmit the flight package 244 to the UAV. The flight package 244 can include a flight manifest file (e.g., an XML file) identifying necessary application and version information to conduct the flight plan. For instance, the UAV can be required to execute a particular application (e.g., an “app” downloaded from an electronic application store) that provides functionality necessary to conduct the flight plan. As an example, an application can effect a flight plan associated with inspecting vertical structures, and the UAV can be required to execute the application prior to initiation of the flight plan. - In particular, the
FPS 200 may create a flight plan for automated or partially automated flight of a UAV, taking into consideration structural data to avoid situations where the UAV may fly out of VLOS of a base location. The base location can include one or more locations of an operator of a UAV. In some implementations, the base location can be a geospatial position of theuser device 212 or a launching location of the UAV. - The
FPS 200 may receive, via a user interface 214, a location for an aerial survey to be conducted by an unmanned aerial vehicle. One or more images may be displayed depicting a view of the location. The interface allows for a selection of a launching location of the UAV. As the images have associated geospatial positions, the FPS 200 can determine an associated latitude/longitude for the launching location. The user interface 214 may receive an input or selections for one or more flight waypoints. Similar to the launching location, the flight waypoints have an associated geospatial position. The FPS 200 may assign altitudes for the flight waypoints, or altitudes for the flight waypoints may be determined by a user, and specific numeric altitude values may be set. - The
FPS 200 may determine, based on the launching location and the altitude of the one or more flight waypoints, whether a flight waypoint may cause a non-VLOS occurrence. From the launching location, a flight plan may be generated using waypoints having associated latitude and longitude coordinates and an associated altitude. The FPS 200 may not allow a UAV waypoint where the VLOS from the base location (e.g., the launching location, or an area around the launching location) would be blocked by a structure. The FPS 200 may use 3D polygonal data, topographical data or other structure data in generating the flight plan. The FPS 200 could use a 3D coordinate system to determine, based on a base location and each waypoint location, whether the UAV would likely enter into a non-VLOS situation. The FPS 200 can then generate a flight plan that avoids the non-VLOS situation and includes only the flight waypoints that would not cause a non-VLOS occurrence.
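A simplified sketch of such a non-VLOS check is shown below, assuming a height lookup (terrain plus structures) is available for the survey area; the sampling approach and names are illustrative only, and a production system would use the 3D polygonal or topographical data described above.

```python
# Simplified sketch of a visual line-of-sight (VLOS) check between a base
# location and a candidate waypoint. Assumes a height lookup (terrain plus
# structures) is available; names and the sampling approach are illustrative.

def has_vlos(base, waypoint, height_at, samples=50, observer_height_m=1.7):
    """Return True if the straight line from base to waypoint clears all obstructions.

    base, waypoint: (x_m, y_m, alt_m) in a local east-north-up style frame
    height_at:      function (x_m, y_m) -> ground/structure height in meters
    """
    bx, by, bz = base[0], base[1], base[2] + observer_height_m
    wx, wy, wz = waypoint
    for i in range(1, samples):
        t = i / samples
        x, y = bx + t * (wx - bx), by + t * (wy - by)
        line_alt = bz + t * (wz - bz)
        if height_at(x, y) >= line_alt:
            return False  # a structure or terrain blocks the sight line
    return True

if __name__ == "__main__":
    # Toy height field: a 20 m tall building occupying x in [40, 60] meters.
    def height_at(x, y):
        return 20.0 if 40.0 <= x <= 60.0 else 0.0

    base = (0.0, 0.0, 0.0)
    print(has_vlos(base, (100.0, 0.0, 10.0), height_at))  # blocked -> False
    print(has_vlos(base, (100.0, 0.0, 60.0), height_at))  # clears the building -> True
```

- Additionally, the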
FPS 200 may determine a geofence boundary to limit flight of the UAV to a bounded area. The user interface 214 may display the geofence boundary over one or more location images. Additionally, theflight planning system 200 may determine a survey area, and set the survey area within the geofence boundary. - During or after a flight plan by a UAV is conducted, the
FPS 200 receives, from a GCS 213 (or directly from the UAV), data (such as flight log data and collected sensor data). A user interface of theFPS 200 then displays at least a portion of sensor data collected by the UAV, information describing a planned survey, and information associated with the flight data package. - Similar to the
FPS 200, the GCS 213 may also be used for flight and contingency planning. The GCS 213 can receive flight plans from the FPS 200 for transmission to the UAV. The GCS 213 also allows for manual override of a UAV operating in an autopilot mode. A flight plan may be transmitted to the UAV either via a wireless or a tethered connection. Ideally, the GCS 213 is a mobile device, such as a laptop, mobile phone, or tablet device, with a cellular or other wireless connection for data transmission over the Internet or another network. - Each of
user devices 212, includingspecialized user device 212 designated asGCS 213, can be a system of one or more computers, or software executing on a system of one or more computers, which is in communication with, or maintains, one or more databases, e.g., databases, storing information describing UAV flight operations and components. Each of theuser devices 212 can be a system of one or more processors, graphics processors, logic circuits, analog circuits, associated volatile and/or non-volatile memory, associated I/O data ports, power ports, etc., and include one or more software processes executing on one or more processors or computers. - Although in one embodiment of the invention, the
FPS 200 may be primarily used to create and transmit a flight package 244 to a UAV or GCS 213, the UAV or GCS 213 can initiate the request for the flight package 244 from the FPS 200. An operator may take the UAV or GCS 213 to a property location. The UAV or GCS 213 may then request the flight package 244, or an updated flight package 244, using a current position of the UAV or GCS 213. For example, the UAV or GCS 213 can determine its geospatial position via a GNSS receiver (e.g., using the GPS, GLONASS, Galileo, or Beidou system). The UAV or GCS 213 can then transmit its geospatial position to the FPS 200, along with other identifying information about the requesting device, such as, for example, a unique identifier (UID) or media access control (MAC) address. The FPS 200 will receive the request, and determine if an updated or changed flight package exists by comparing the device identifier with identifiers in a database storing new or updated flight package information. If the flight planning system 200 finds a new or updated flight package, the flight planning system 200 transmits the flight package, and the UAV or GCS 213 receives it. A confirmation acknowledging receipt of the flight package may then be transmitted from the UAV or GCS 213 to the FPS 200. The flight planning system 200 will then update a database record to indicate that the particular flight package has been received. Moreover, the UAV or GCS 213 can supply the property location, and a new job request can be sent to the FPS 200. The flight planning system 200 may create a new flight package for the UAV or GCS 213. - For autonomous flight of a UAV, a flight plan may be created using the flight planning system 200 or the GCS 213. A flight plan instructs the UAV where it should fly in a 3D space. The flight plan includes a series of connected waypoints that define where the UAV should fly and what actions the UAV should complete during a particular flight. The UAV may have an autopilot flight module operating on the UAV computer system that uses the flight plan to automatically fly the UAV. The flight plan information may be provided to the GCS 213 (and then to the UAV) or directly to the UAV, in a flight package 244 comprising the flight plan and other information (such as contingency event instructions). - Using the
flight planning system 200, or aGCS 213, a user may select a series of geographically-based waypoints and a launching location for the UAV. Based on the waypoints, a flight plan may be constructed allowing the UAV to autonomously navigate itself. In some implementations, theflight planning system 200, orGCS 213 may automatically define a flight plan based on various criteria, such as an inspection, or survey type. - While the
primary computer system 100 autopilot module is navigating the UAV according to a flight plan, certain aspects of the flight pattern may be controlled by the operator's user device 212, or GCS 213. The flight plan or pattern may be configured such that, for a particular waypoint, a vertical ascent/descent rate, UAV altitude, horizontal UAV rotation, payload gimbal, payload direction, waypoint transition speed, or trigger of a payload sensor may be controlled by the UAV operator. The user device 212, or GCS 213, may have a physical control device, such as a toggle or joystick, or a virtual control in a user interface that allows the operator to control the vertical ascent/descent rate, UAV altitude, UAV attitude, horizontal UAV rotation, payload gimbal, or payload direction. The user device 212, or GCS 213, can trigger a payload sensor while conducting the inspection. For example, the UAV may navigate via autopilot to a position over an inspection location. An operator then can provide input to the user device 212, or GCS 213. The user device, or GCS, may transmit a signal or information corresponding to the user input to the UAV via radio communication. The signal or information can control the vertical ascent/descent rate, UAV altitude, UAV attitude, horizontal UAV rotation, payload gimbal, payload direction, or waypoint transition speed. The signal or information can trigger a payload sensor to turn on or turn off. This particular mode allows for partial autopilot control and partial or complete manual control of the UAV. Even though the operator may manually control certain aspects of the flight plan, the UAV can be constrained to remain within a geofence boundary envelope (if one has been set) and to remain within VLOS of the operator operating the user device 212. - In another example, the UAV may be partially manually controlled by an operator using the
user device 212 while the UAV is in autopilot mode. The UAV may receive a command from theuser device 212 to nudge the UAV in a particular direction. In this case, the control input of theuser device 212 causes theuser device 212 to send a command to the UAV, instructing the UAV to move slightly, for example between 0.1 to 3 meters, in a particular direction (in an x, y, or z axis, or diagonally). The particular distance can be predetermined, or be variable based on the proximity to a structure. Nudging the UAV allows the operator to move the UAV away from the structure if the operator sees that the UAV is flying too close to the structure. The nudge command may be provided any time to the UAV while it is operating in an autopilot mode. The UAV should still enforce geofence boundaries (if one has been set) and not allow a nudge to cause the UAV to move beyond a geofence boundary. - The
flight planning system 200 can include areport generation module 230 and apermission control module 240. Thereport generation module 230 is configured to generate one or more flight reports. The flight reports can include flight data (e.g., path, duration and actions of control surfaces), sensor data (e.g., air pressure, temperature and humidity), and payload data (e.g., information gathered by a payload camera). Thepermission control module 240 is configured to impose one or more limits on flights of the UAV. The limits can include, for example, that the UAV shall stay inside or outside an envelope defined by geofences or by geographic coordinates or that the UAV shall stay within VLOS of a base location (e.g., a location of user device 212). -
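The kinds of limits imposed by the permission control module 240 can be illustrated with a small sketch that checks a commanded position against a geofence envelope and a maximum VLOS range from the base location; the ray-casting test, the 500-meter range, and the helper names are assumptions for illustration, not elements of this disclosure.

```python
# Illustrative sketch of enforcing two flight limits before accepting a commanded
# position: stay inside a geofence envelope and stay within a VLOS range of the
# base location. The containment test and the 500 m VLOS range are assumptions.

import math

def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def position_allowed(position_xy, geofence_xy, base_xy, max_vlos_range_m=500.0):
    """True if the commanded position honors both the geofence and the VLOS limit."""
    x, y = position_xy
    if not point_in_polygon(x, y, geofence_xy):
        return False
    return math.dist(position_xy, base_xy) <= max_vlos_range_m

if __name__ == "__main__":
    fence = [(0, 0), (800, 0), (800, 800), (0, 800)]
    base = (10.0, 10.0)
    print(position_allowed((400.0, 300.0), fence, base))   # True
    print(position_allowed((790.0, 790.0), fence, base))   # False: beyond VLOS range
```

The same kind of check could also be used to reject a manual "nudge" command that would push the UAV past a geofence boundary.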
FIG. 3 illustrates anexample user interface 300 for determining a geofence boundary. Theuser interface 300 is an example of an interactive user interface, generated by a system (e.g., theflight planning system 200, or a presentation system in communication with the flight planning system 200) that is configured to receive user inputs, access one or more databases, and update theuser interface 300 in response to received user inputs. Theuser interface 300 can include a document (e.g., an interactive document such as a web page), presented on a user device (e.g., a desktop, laptop, or tablet computer, a smart-phone, or a wearable device, etc.). - The
user interface 300 includes image 302 (e.g., satellite image as depicted) of a location entered by the user of theuser interface 300. Theimage 302 included in theuser interface 300 can be interactive. A user can zoom in and out of theimage 302 to target a greater or smaller real-world area. For instance, the user can interact with a zoom control, or the user can utilize a touch surface (e.g., a touch screen) to zoom in and out (e.g., the user can pinch to zoom). - The
user interface 300 enables the user to select areas on theimage 302 that are defined by a user-specified shape. For example, theuser interface 300 can receive a user selection of particular vertices that define the illustrated polygon (e.g.,vertices 304A-E). The system can shade, or otherwise highlight, the internal portion of the user-specified shape. Additionally, theuser interface 300 enables the user to select a particular vertex of the illustrated polygon (e.g.,vertex 304A), and drag the shape into existence by moving a finger or stylus on a touch sensitive screen of the user device. - The
user interface 300 can receive input for generating aflight path 306 for the UAV to include a launching andlanding location 310. Theuser interface 300 may include amenu 308 for creating different representative layers of a flight plan. For example,menu 308 shows a flight plan specifying a geofence, a photo survey area, a launch/land area, and a base map. Themenu 308 includes a geofence menu item that refers to the geofence as represented by theconnected vertices 304A-304E. Themenu 308 includes a photo survey area menu item representing theflight path 306. Themenu 308 includes a launch/land area menu item representing the launch/land locations 310. Themenu 308 includes a base map layer menu item that represents thebase image layer 302, which includesimage 302. - As illustrated in
FIG. 3 , theimage 302 includes a highlighted area that defines a geofence boundary to be enforced by a UAV when implementing a flight plan. Different types of geofences may be used by the UAV during flight operations. A geofence can include a two-dimensional (2D) or 3D location-based boundary. A geofence can be understood as a virtual boundary for a geographic location or a virtual surface around a geographic location in 3D space. The geofence boundary can be represented on a map as one or more polygonal shapes or rounded shapes, for example a circle, rectangle, sphere, cylinder, cube, or other shapes or bodies. A geofence may also be a time-based (four-dimensional) virtual boundary where the geofence exists for a particular duration, for example, a number of hours or days, or for a specific time period, for example, from 2:00 PM to 4:00 PM occurring on certain days, or other periods of time. A 3D geofence may exist in a particular space above ground. A geofence may be represented by latitudinal and longitudinal connected points, or other coordinate systems. A geofence may be created such that the geofence has dynamic aspects where the geofence may increase or decrease in size based on various conditions. For UAV flight operations, geofence structures are received by the UAV and stored in non-volatile memory. - For UAV operations, different types of geofences may be created. To limit flight operations within a particular volumetric space, a 3D geofence may be created. Data representing the flight boundary geofence can be transmitted to the UAV operating system. The exemplary flight planning system or GCS may be used to create the geofence and transmit the geofence data structure to the UAV.
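One way to represent such a geofence in software is sketched below as a simple data structure with a boundary, optional altitude bounds, an optional active time window, and a type number; the field names are assumptions made for illustration rather than a structure defined by this disclosure.

```python
# Illustrative data structure for a geofence as described above: a 2D/3D
# boundary, an optional altitude range, an optional active time window, and a
# type number indicating how restrictive the geofence is. Field names are
# assumptions for this sketch.

from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class Geofence:
    vertices: List[Tuple[float, float]]      # (lat, lon) boundary polygon
    floor_m: float = 0.0                      # lower altitude bound
    ceiling_m: Optional[float] = None         # upper altitude bound, None = unbounded
    active_from: Optional[datetime] = None    # time-based (four-dimensional) bounds
    active_until: Optional[datetime] = None
    geofence_type: int = 1                    # e.g., 1 = avoid, 2 = fly-through with limits

    def is_active(self, now: datetime) -> bool:
        if self.active_from and now < self.active_from:
            return False
        if self.active_until and now > self.active_until:
            return False
        return True

    def contains_altitude(self, altitude_m: float) -> bool:
        if altitude_m < self.floor_m:
            return False
        return self.ceiling_m is None or altitude_m <= self.ceiling_m

if __name__ == "__main__":
    fence = Geofence(vertices=[(37.77, -122.42), (37.77, -122.41), (37.76, -122.41)],
                     ceiling_m=120.0,
                     active_from=datetime(2016, 4, 1, 14, 0),
                     active_until=datetime(2016, 4, 1, 16, 0),
                     geofence_type=1)
    print(fence.is_active(datetime(2016, 4, 1, 15, 0)))   # True: within the time window
    print(fence.contains_altitude(150.0))                 # False: above the ceiling
```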
- For both autonomous UAV flight operations and manually controlled flight operations, the UAV can be limited to flight within a flight boundary geofence. If for example, an operator of the UAV in a manually controlled mode attempts to maneuver the UAV outside of the flight boundary geofence, the UAV may detect a contingency condition (e.g., the UAV is about to fly outside of the geofence) and then automatically direct the UAV to return to a specified predetermined landing location. Furthermore, if the UAV is capable of hovering, such as a multi-rotor UAV, the UAV may be inhibited from moving across a boundary of a flight boundary geofence and the UAV could be set to hover and not continue past the boundary of the geofence.
- Optionally, the system can utilize property information, such as property boundaries, and automatically include a highlighted portion of the
image 302 as being a possible flight boundary geofence. For instance, as illustrated in FIG. 3, portions of the flight boundary geofence defined by the connected vertices 304A-304E can be automatically determined and displayed over the image 302. The primary computer system 100 can determine that the entered location information describes a particular property (e.g., an open clearing that borders the road), and can highlight the particular property. Optionally, the primary computer system 100 can include a buffer from the property boundaries of the location to ensure that, even when facing forces of nature (e.g., a strong gust of wind), the UAV will remain within the property boundaries. - Property boundary information from a database can be used to create the flight boundary geofence to limit flight of the UAV to within the property's boundary. The UAV can then be constrained for flight operations only within this geofence. The property information used to create the flight boundary geofence can be of various data types, for example, parcel polygons, vector, rasterized, shape files, or other data types. For the particular property, the
FPS 200 may create the flight boundary geofence based on the property shape data. The various data types ideally have geolocation and/or coordinate information, such as latitudinal/longitudinal points, for use in orienting and creating the flight boundary geofence. The geofence envelope may be identical in shape to the property boundary. Optionally, the boundary of the geofence may be reduced in size. For example, the flight boundary geofence may be reduced in size by a set distance, for example 5 meters, towards a centroid of the property. Reduction of the flight boundary geofence creates a buffer zone, which may help avoid an unintentional flyover of an adjacent property. Optionally, the FPS 200 may display an area with parcel polygonal data. An interface of the FPS 200 may then receive a selection of one or more parcels, and the FPS 200 can use the selections to create one or more jobs and multiple geofence envelopes. For the multiple parcels, the operator would go to each parcel property and conduct multiple jobs.
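Shrinking a property boundary toward its interior to form the buffer zone can be sketched with a polygon library; the example below uses the third-party shapely package (an assumption, since no library is named here) and assumes coordinates in a local meters-based projection so that the 5-meter distance is meaningful.

```python
# Sketch of shrinking a property-boundary polygon toward its interior to create
# a buffer zone, using the third-party shapely library (an assumption).
# Coordinates are assumed to be in a local meters-based projection.

from shapely.geometry import Polygon

def flight_boundary_with_buffer(property_vertices_m, buffer_m=5.0):
    """Return geofence vertices pulled in by buffer_m meters from the property boundary."""
    property_polygon = Polygon(property_vertices_m)
    geofence_polygon = property_polygon.buffer(-buffer_m)  # negative buffer shrinks inward
    if geofence_polygon.is_empty:
        raise ValueError("Property too small for the requested buffer distance")
    return list(geofence_polygon.exterior.coords)

if __name__ == "__main__":
    parcel = [(0, 0), (60, 0), (60, 40), (0, 40)]  # a 60 m x 40 m parcel
    print(flight_boundary_with_buffer(parcel, buffer_m=5.0))
```

- Optionally, the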
user interface 300 can be utilized by a UAV operator to indicate waypoints to be traveled to during the flight plan. For instance, the user can select portions of theimage 302 to designate as waypoints, and theuser interface 300 can be updated to present selectable options associated with each waypoint. As an example, the user can designate an order that each waypoint is to be traveled to, actions the UAV is to take at the waypoint, a transition speed between each or all waypoints, and so on. The system can determine the flight boundary geofence from the waypoints, such that the geofence perimeter encompasses the waypoints. The determined flight boundary geofence can be presented to the user for review, and the user can modify the boundary by interacting with theuser interface 300. - Additionally, the
user interface 300 can include text provided by the user that describes the flight plan. A different user can access theuser interface 300, and quickly view the determined flight boundary geofence along with text describing the flight plan. In this way, a user can quickly describe flight plan information sufficient for a UAV to implement, and other users can quickly view graphical representations of the flight plan (e.g., graphical representation of the flight boundary geofence along with textual data describing the flight plan). - Another type of geofence that may be created is a privacy geofence. A privacy geofence may be used independent of, or in conjunction with a flight boundary geofence. A privacy geofence can be created and transmitted to the UAV such that the UAV avoids flying over the privacy geofence, or avoids flying through the privacy geofence. Also, particular operations of the UAV while flying over, or flying through a privacy geofence may be performed or inhibited. Also, certain sensors or other devices connected to the UAV may be wholly or partially disabled, wholly or partially enabled, or otherwise prevented from being used while flying over or through the privacy geofence.
- The flight boundary geofence and the privacy geofence may have an associated buffer zone. The buffer zone allows the UAV to fly a particular distance over the boundary of the geofence, and not create a contingency condition. Occasionally, a UAVs estimated location may have a certain level of error, for example, possibly caused by, for example, GNSS clock drift. An associated buffer zone allows the UAV to fly a certain distance to either side of the boundary of the geofence. For example, the buffer zone could be a fixed distance, say 20 meters. So long as the UAV determines it is located within this buffer zone, the UAV will be considered to be operating within the geofence. The buffer zone can be a function of the size of the geofence. A smaller geofence will have a smaller buffer zone, for example maybe 2 meters and a larger geofence will have a larger buffer zone.
- Additionally, using databases having property boundary information, a user can query the database and create a flight boundary geofence to ensure that the UAV stays within a particular property boundary. Also, an electronic database may be referenced by the
FPS 200 orGCS 213 to search for predetermined privacy geofences. For example, a database may include a listing of residential addresses associated with privacy geofences. A user could enter an address, or latitudinal/longitudinal coordinates of the address, in theFPS 200 orGCS 213. The search based on the entered address or coordinates will then return a result of privacy geofences that are associated with the address, or with the latitudinal/longitudinal coordinates of the address. The privacy geofence for example may be the bounds of the property associated with the address. The privacy geofence may also be added by a user of theFPS 200 orGCS 213. For example, the user may select an area over a displayed image of an area where the privacy geofence should be placed. The privacy geofence may be displayed in colors or patterns that distinguish the privacy geofence from a flight boundary geofence. Also, in an implementation, a user may draw in a graphical user interface a shape on a displayed map, and the system identifies any privacy geofences within the area enclosed by the shape. After a flight boundary geofence is created, theFPS 200 orGCS 213 can query the database to find geographic areas enclosed within the flight boundary geofence. TheFPS 200 can then create privacy geofences that would be transmitted to the UAV for use in autonomous or manual flight operations. The database may include home addresses, geographic coordinates, geographic polygonal shapes, property bounds (either 2- or 3-dimensional). In some implementations, the flight boundary geofence and the privacy geofence can be temporarily active for specified durations or for particular time periods. In some implementations, the geofence could be active during certain weather conditions, times of day, or lighting conditions. - Also, for a flight path over a populated area, a database may be accessed to find addresses or other areas identified as private, and use those addresses or other areas identified as private to create a flight plan with privacy operations taken into consideration.
- Additionally, flight plan contingencies may be created. In some implementations, a flight plan contingency can instruct the UAV to perform an operation based on certain contingency criteria. For example, contingency criteria may be the detection of one or more of a low battery or fuel state, a malfunctioning of an onboard sensor, motor, actuator, a deviation from the flight plan, or a crossing over a geofence boundary by the UAV. Other contingency events may include a ground control system power or system failure, a lost or degraded telemetry link to/from the UAV and ground control system, a stuck motor or actuator, GNSS failure or degradation, autopilot sensor failure (e.g., airspeed, barometer, magnetometer, IMU sensors), a control surface failure, a gear failure, a parachute deployment failure, an adverse weather condition, a nearby aircraft in airspace, a vehicle vibration, an aircraft fly-away) and any other situation that would cause the UAV to alter its flight path or flight plan.
-
FIGS. 4A-4D illustrate examples of a UAV performing privacy operations while flying in autopilot mode. A UAV flies along a flight path, and the primary computer system 100 periodically determines the geospatial position of the UAV using information received from an onboard GNSS receiver 150. FIGS. 4A-4D illustrate different types of privacy geofences and how the UAV may take action. The primary computer system 100 may have received information about one or more of the different types of geofences from the GCS 213 or the FPS 200. -
FIG. 4A illustrates the UAV 400 traveling along flight path 402. In this example, the primary computer system 100 of UAV 400 determines at a periodic frequency whether the UAV 400 will be heading into or near a privacy geofence. The periodic frequency may be a set rate, or a variable rate based on the speed of the UAV 400; the faster the UAV 400 travels, the greater the sampling rate. At point 404 the UAV 400 has determined that, at its current altitude and direction, the UAV 400 will enter into the privacy geofence 406. The UAV 400 determines the type of geofence, for example, based on a geofence type number that indicates the type of privacy geofence. Based on the geofence type number, the geofence 406 is identified as an avoid privacy geofence (more restrictive) or a fly-through geofence (less restrictive). In this example, geofence 406 is identified as an avoid privacy geofence and the UAV automatically calculates an avoidance flight path around, above, or below the geofence 406, as shown in FIG. 4A. In some implementations, while in flight the UAV may receive data identifying new geofences that recently became active. In such cases, the flight planning system of UAV 400 can pre-calculate a new flight path 402 to avoid the new geofences. - As the UAV 400 flies closer to the privacy geofence 406, the UAV will pass near the boundary of the privacy geofence 406. In this case, the UAV may orient itself (for a fixed camera), or orient a gimbaled camera, to point away from the geofence, for example, in an easterly direction. The UAV computer system may refer to an onboard data store to determine what operation should be taken for the particular geofence type. Additionally, the FPS 200 may have generated a flight path to avoid geofence 406 and, at particular waypoints along the flight path, point the camera away from the geofence 406, or cause the operation of certain sensors to be temporarily suspended or inactivated due to the proximity to the geofence 406.
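The periodic look-ahead check described for FIG. 4A can be sketched as follows; the check-interval scaling, the projection step, and the use of the shapely library for the containment test are assumptions made for this sketch.

```python
# Illustrative sketch: project the UAV's current track forward and test whether
# it would enter an "avoid" privacy geofence, checking more often at higher
# speeds. Values and the use of shapely are assumptions for this sketch.

from shapely.geometry import Point, Polygon

def check_interval_s(ground_speed_mps, base_interval_s=2.0, min_interval_s=0.25):
    """Faster flight leads to more frequent geofence checks."""
    if ground_speed_mps <= 0:
        return base_interval_s
    return max(min_interval_s, base_interval_s / (1.0 + ground_speed_mps / 5.0))

def will_enter_geofence(position_xy, velocity_xy, geofence: Polygon,
                        lookahead_s=30.0, step_s=1.0):
    """Return (True, seconds) if the projected track enters the geofence."""
    steps = int(lookahead_s / step_s)
    for i in range(1, steps + 1):
        t = i * step_s
        projected = Point(position_xy[0] + velocity_xy[0] * t,
                          position_xy[1] + velocity_xy[1] * t)
        if geofence.contains(projected):
            return True, t
    return False, None

if __name__ == "__main__":
    avoid_fence = Polygon([(200, -50), (300, -50), (300, 50), (200, 50)])
    print(check_interval_s(15.0))                                     # 0.5 s between checks
    print(will_enter_geofence((0.0, 0.0), (10.0, 0.0), avoid_fence))  # (True, 21.0)
```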
FIG. 4B illustrates the UAV 420 flying along flight path 422. At point 424 the primary computer system 100 of UAV 420 determines that it will intersect with 3D geofence 426 (modeled as a cylinder). Based on the geofence type number, the UAV 420 determines that geofence 426 is an avoid type geofence (i.e., that entry into the geofence is prohibited). In this instance, the UAV 420 will determine an avoidance flight path around, above, or below the geofence 426. In this case, the UAV determined that the flight path should ascend, fly over, and then descend past the geofence 426, as shown in FIG. 4B. - As the UAV 420 flies along the flight path 422, the primary computer system 100 of the UAV 420 will determine its proximity to the intersection of the flight path 422 and the boundary of the geofence 426. Depending on how the privacy controls are configured, the UAV 420 may disable or limit operation of a connected payload device, such as a camera, so that as the UAV 420 flies over the privacy geofence 426 no image data can be taken of the private residence 429. Optionally, a gimbaled camera, for example, may be placed into a horizontal position (e.g., with respect to a local level coordinate system or camera platform coordinate system) so that the camera field of view cannot capture images of the private residence 429 below. This privacy operation can be set so that even if the UAV 420 is flown manually, the UAV 420 will not be able to take pictures of the private residence 429 designated with the privacy geofence 426. Optionally, the UAV 420 may have stored in non-volatile memory a database of privacy geofences for a general geographic area. While flying over a general geographic area, the primary computer system 100 of UAV 420 may refer to this onboard database, and limit camera operations and other sensors accordingly. With regard to FIGS. 4C and 4D, the UAVs there similarly determine their proximity to the illustrated geofences, and the primary computer systems 100 of those UAVs perform the configured privacy operations as each UAV approaches, flies over, or flies around the geofences.
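Pointing a gimbaled camera away from a nearby privacy geofence can be sketched as finding the nearest point on the geofence boundary and commanding the opposite azimuth; the use of shapely's nearest-point query and the local east-north angle convention are assumptions of this sketch.

```python
# Sketch of pointing a gimbaled camera away from a privacy geofence: find the
# nearest point on the geofence boundary, compute the bearing toward it, and
# command the gimbal to the opposite azimuth. Uses shapely (an assumption);
# angles are in a local east-north frame.

import math
from shapely.geometry import Point, Polygon
from shapely.ops import nearest_points

def gimbal_azimuth_away_from(position_xy, privacy_fence: Polygon) -> float:
    """Azimuth (degrees, 0 = east, counterclockwise) pointing away from the geofence."""
    uav = Point(position_xy)
    nearest_on_boundary, _ = nearest_points(privacy_fence.exterior, uav)
    toward = math.atan2(nearest_on_boundary.y - uav.y, nearest_on_boundary.x - uav.x)
    away = math.degrees(toward) + 180.0
    return away % 360.0

if __name__ == "__main__":
    fence = Polygon([(50, -20), (100, -20), (100, 20), (50, 20)])
    # UAV west of the fence: the camera is commanded to face further west (180 degrees).
    print(gimbal_azimuth_away_from((0.0, 0.0), fence))
```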
FIG. 5 is a flowchart of an example process of the UAV performing a privacy operation. For convenience, theprocess 500 will be described as being performed by a system of one or more processors, such as thesystem 100 described in reference toFIG. 1 . - The UAV obtains privacy geofence data describing one or more privacy geofences (block 502). The privacy geofence data may be received from a
GCS 213,FPS 200, or the UAV may be configured to wirelessly access the Internet or other network to access sites that contain privacy geofences. The privacy geofences may also be created by the UAV computer system. Address information, and other geolocation information may be provided to the UAV for storage in onboard memory. If the geolocation information for an address, for example, only includes a point of information such as latitudinal/longitudinal coordinates, the UAV computer system may use a radial distance to avoid flying near the address. The radial distance may be a set distance, or may be a variable distance. In certain situations, multiple addresses may be located near each other. In such instances, the computer system may use a distance general greater than a distance that may be used for a single address. These radial avoid distances may also be generated by theGCS 213 orFPS 200 and provided to the UAV. The privacy geofence for different locations thus may have different sized boundaries that may be based on characteristics of the location. - Next the UAV navigates along a flight path and periodically obtains flight information of the UAV from a GNSS receiver, where the flight information is a geospatial position identifying the location of the UAV in 3D-space. As the UAV travels along the flight path, the UAV determines a distance of the UAV to a boundary (e.g., a closest boundary) of the nearest geofence (block 504).
- The UAV computer system determines the privacy geofence type and in response determines whether a new flight path should be calculated (block 506). The UAV will then continue along an existing flight path and enter the privacy geofence or, optionally, will calculate a path around the geofence (block 508).
- The UAV then performs a privacy operation if the UAV's primary computer system determines that the UAV is within a predetermined distance to the boundary of the privacy geofence (e.g., a closest boundary) or is flying through a geofence with privacy controls enabled. (block 510). Also, the UAV primary computer system can log a sensor name/ID, date, time, geospatial coordinates of when the UAV disabled the sensor (e.g., disable a camera) or other device (e.g., a payload device).
- Once the UAV has flown through the geofence, or around the geofence, the UAV primary computer system allows the operation of the sensor to resume. In the case of the UAV, having to avoid the geofence, a sensor may be enabled after the UAV has flown a predetermined distance from the geofence, or enabled if the direction of data capture of the sensor (e.g., the direction of a camera lens) is pointing away from the privacy geofence so that images of structures within the privacy geofence cannot be captured.
- When the UAV conducts a flight operation, via autopilot or manual control, the position of a sensor or a payload device may be controlled based on proximity to the boundary of the privacy geofence. For example, the UAV may be configured such that a sensor (e.g., a camera), points away from the boundary of the geofence, temporarily inactivates, or suspends operation based on a threshold distance from the boundary of the privacy geofence. The computer system of the UAV determines its geospatial position. Based on proximity of the geospatial position of the UAV to the boundary of the privacy geofence, the UAV can maintain the orientation of the sensor so that the sensor will point away from the boundary of the privacy geofence. The UAV may orient the body of the UAV, or in the case of a gimbaled sensor, position the sensor to point away from the boundary of the privacy geofence. Also, if the UAV is flying over a privacy geofence then a gimbaled sensor may be rotated upwardly (e.g., rotated upwardly with respect to a local level coordinate frame) so that the sensor cannot collect data over the privacy geofence. Additionally, the computer system may limit power or control of a gimbal system for the sensor or payload devices.
- When the UAV conducts a flight operation, via autopilot or manual control, functional operation of sensors or payload devices may be controlled based on proximity to the boundary of the geofence. For example, the UAV may be configured such that a connected sensor (e.g., a camera) will not trigger. The computer system of the UAV determines its geospatial position. Based on the geospatial position of the UAV in proximity to the boundary of the geofence, the UAV will not trigger the sensor. For example, for a UAV flight over a particular geofence, the UAV may inhibit operation of a camera or other sensor. Operation may be inhibited when the UAV is approaching the geofence boundary within a threshold distance, or when the UAV is flying at a particular distance above a privacy geofence (Above Ground), or when the UAV is flying through a privacy geofence.
-
TABLE 1 Sensor Threshold Distance AG Limits Photographic Camera 20 meters 100 meters Video Camera 30 meters 100 meters Infrared Camera 5 meters 200 meters UV Camera 10 meters 200 meters - The threshold distance can either be a predetermined distance from a geofence ceiling or from the ground. Table 1 shows an example of how particular sensors may be configured for operation near a privacy geofence. Table 1 is illustrative, and not meant to limit the available sensors or distance that may be associated with a sensor.
- As shown in Table 1, the UAV may determine its position to a privacy geofence. If a camera is attached as a payload device, the UAV
primary computer system 100 may inhibit operation of the camera when the UAV is within 20 meters of the boundary of the privacy geofence. In another example, when the UAV travels above a privacy geofence at 100 meters operation of the camera is inhibited. - Based on a current position of the UAV, the UAV can determine a time that the UAV will be within the threshold distance of the boundary of the privacy geofence or will cross over the privacy geofence boundary. The time can be calculated based on the velocity and trajectory of the UAV towards the geofence. Also, the distance can be variable based on the altitude of the UAV. For example, the field of view of a camera or a sensor is greater at a higher altitude.
- Additionally, a geofence may be assigned to a privacy geofence type which determines how various sensors or payload devices may be operable or inoperable. For example, for a particular privacy geofence certain sensors or other devices may be allowed to operate while other sensors or devices may not be allowed to operate. An inoperable sensor is a sensor that is deactivated or a sensor that is restricted from collecting data. A flight path of a UAV may be configured where the UAV flies over, through, or near two different privacy geofences. As an example, a first geofence may be set around a first parcel of land. A second geofence may be set around a nearby second parcel of land. The first geofence may have a
geofence type 1 and the second geofence may have a second geofence type 2. - As shown in Table 2, the geofence type associated with a geofence may specify how a sensor or payload device operates. Table 2 is illustrative, and not meant to limit the scope or functions that may be assigned to a geofence type.
-
TABLE 2 AZ Geofence Type Sensor Operable Range EL Range Type 1 Camera NO — — Type 1Infrared Camera NO — — Type 1UV Camera NO — — Type 2 Camera NO — — Type 2 Infrared Camera YES — — Type 2 UV Camera YES — — Type 3 Camera YES −60° to 60° 0° to 85° Type 3 Infrared Camera YES −60° to 60° 0° to 85° Type 3 UV Camera YES −60° to 60° 0° to 85° - If a flight path is set where the UAV flies over the first, second geofence and third, the UAV sensors may allow for certain operations. As an example referring to Table 2, if the first geofence is assigned
Type 1, then neither a camera, infrared camera, or UV camera would be operable if the UAV flies over the first geofence. If the UAV flies over the second geofence, then a camera would not be operable, but an infrared camera or UV camera would be operable. If the UAV flies over the third geofence, the camera, infrared camera and UV camera would be operable but the direction of each camera's field of view would be restricted to, for example, a range of azimuth (AZ) and elevation (EL) angles in a suitable coordinate frame (e.g., local level coordinate frame, camera platform coordinates). In Table 2, each sensor is restricted to an azimuth of −60° to 60° and an elevation of 0° to 85°. Accordingly, a sensor can be operable and but still have its functionality restricted based on a geofence type. - The UAV may take into consideration whether the UAV is flying above an area with a certain population density, which is a measurement of human population per unit area or unit volume. For example, a flight planning system, or ground control system, may analyze population density at a given geographic location. If a flight path is set over a populated area having a threshold population density, then the UAV may be configured to limit certain flight operations, or sensor or payload device operations or functionality. For example, if the UAV flies over a threshold population density, then the UAV may be configured to disable or restrict use of its camera, or other sensors or payload devices. Moreover, an
FPS 200 orGCS 213 may plan the flight path such that the UAV maintains a particular altitude or distance above the populated area. Population density for geographic areas may be found via available public data sources or estimated. A privacy geofence may be generated based on the population density for a given area. For example, in a highly populated or dense area, a privacy geofence may be created by the FPS covering the dense areas. The altitude or above ground limits of the privacy geofence may be adjusted according to the density of the area. The higher the population density, the higher ceiling of the privacy geofence may be set. In other words, in dense areas the UAV would be required to fly at a higher altitude, or above ground level distance height, than in a less dense area. Moreover, the UAV may store on board in memory, information, or heat maps, indicating population density level, and then the UAV may dynamically adjust its altitude based on the density level. Also, the UAV may dynamically create a flight plan to fly around a densely populated area. Privacy controls may be automatically enabled, or “turned-on” based on population density, or criteria. - The UAV may optionally be configured to detect certain wireless signals from a signal source, such as a radio frequency (RF) or microwave signals source. When the UAV receives indication of a particular wireless signal, the UAV may enter into a privacy mode where certain computer processing operations or flight operations are performed, or are disabled or restricted. For example, the UAV may disable or restrict operation of a camera. The UAV may receive or calculate signal strength indicator (RSSI) values to determine how close the UAV is to the signal source. Based on an estimated distance to the signal source, the UAV may perform operations meant to maintain the privacy of the area around the signal source. For example, a UAV may detect Wi-Fi router signals, cell phone signals or other radio tower signals, frequency modulation (FM) signals, amplitude modulation (AM) signals, microwave signals, Bluetooth signals or other wireless signals. When the UAV detects proximity to the signal source, the UAV may conduct a flight operation, such as increase altitude or change the flight path to maintain a certain distance from the signal source. Additionally, the UAV may inhibit operation of certain sensors or payload devices. As a safety consideration, the UAV may detect proximity to a wireless signal emanating from a mobile device, such as a mobile phone or tablet computer. The UAV may be configured such that when the UAV detects the signal from the mobile device, the UAV stays a minimally safe distance from the mobile device, such as 2 meters from the mobile device. If the device uses BTLE (Bluetooth low energy) for example, the UAV can determine a proximate distance to the mobile device. The UAV may maintain a distance from the signal source so that a determined signal strength is below a threshold value. In some implementations, a signal source can be associated with a property and broadcast information pertaining to the property. Such information may include, for example, property boundaries and a preference for privacy for an individual or business. In an example scenario, a residential property owner broadcasts their residential property boundaries (e.g., latitude, longitude) to assist a UAV to avoid flying over the property or provide a flight path or corridor over the property for the UAV. 
Other preferences can include that only certain types of UAVs (e.g., fixed wing) may fly over the property and only at certain altitudes above the property. In another example scenario, a signal source may broadcast landing coordinates defining a safe landing zone on the property, for example, to deliver a package or make a contingency landing.
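The distance estimate described above can be approximated from RSSI alone. The following is a minimal sketch, assuming a log-distance path-loss model; the function names, the reference power at 1 meter, the path-loss exponent, and the 2-meter standoff are illustrative assumptions rather than values specified by this disclosure:

```python
import math

def estimate_distance_m(rssi_dbm: float, tx_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Estimate distance to a signal source (e.g., a BTLE beacon) from RSSI
    using a log-distance path-loss model. tx_power_dbm is the expected RSSI
    at 1 meter; both calibration values are deployment-specific assumptions."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def should_enter_privacy_mode(rssi_dbm: float, standoff_m: float = 2.0) -> bool:
    """Return True when the estimated distance to the source falls below the
    configured minimum standoff distance (2 m in the example above)."""
    return estimate_distance_m(rssi_dbm) < standoff_m

# Example: a strong Bluetooth low-energy reading near the vehicle.
if should_enter_privacy_mode(rssi_dbm=-55.0):
    # A real controller might climb, alter the flight path, or inhibit a camera here.
    print("Privacy mode: maintain standoff distance and restrict sensors")
```

In practice the path-loss exponent and reference power vary with the environment and would be calibrated per deployment.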
- While the above discusses various in-flight privacy operations of a UAV, the
FPS 200 or GCS 213 may also determine a flight plan with waypoints that include commands to cause sensors and devices to maintain privacy as described above, such as inhibiting operation or maintaining the orientation of payload devices or the field of view of sensors. For example, the FPS may calculate a flight path for a UAV within a flight boundary geofence. The FPS considers the boundary of the flight boundary geofence, and of any privacy geofences included wholly or partially within the flight boundary geofence. A flight path may be determined based on the type of privacy geofence where, for example, flight of the UAV is allowed anywhere within the flight boundary geofence but is inhibited through a privacy geofence. In such a case, the FPS would not create a flight path, or allow a manually determined flight path, that would pass through the privacy geofence. In those instances where the privacy geofence is of a type that allows pass-through but requires inhibition of certain sensors or payload devices, the flight path may be allowed through the privacy geofence, but commands to orient or inhibit the sensors would be included with the flight plan that is provided to the GCS or to the UAV. For example, a waypoint may be added to the flight plan at a position before the UAV would enter the privacy geofence. The waypoint could include an associated sensor command, such as a command to inhibit the sensor, or to orient the sensor such that the sensor cannot obtain sensor information describing anything within the privacy geofence. Another waypoint could be set where the UAV exits the privacy geofence that reorients the sensor, or allows the sensor, to resume obtaining sensor information (a sketch of this waypoint insertion appears after this passage). - There may be instances when a UAV is collecting data close to a boundary of a geofence. In such instances, images or other information may be collected beyond the geofence boundary. For example, if a UAV is hovering close to a geofence boundary, a still image or video taken may include a frame with image data that partially exceeds the boundary of the geofence. Images obtained from an aerial inspection can be modified to automatically blur or otherwise obfuscate portions that depict properties other than the property being inspected. Based on the position and attitude of the UAV and/or the direction/heading of the camera, the UAV operating system can determine whether the camera will take, or has taken, still images or video, or whether other sensors will collect image data, beyond the geofence boundary. Additionally, the UAV operating system can limit the number of still images or videos taken with another property in the camera's field of view.
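A minimal sketch of how such sensor commands might be attached to waypoints follows. The data structure and command strings (e.g., "inhibit_camera") are illustrative assumptions, not a schema defined by this disclosure:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float
    # Hypothetical per-waypoint sensor command, e.g. "inhibit_camera" or "resume_camera".
    sensor_command: Optional[str] = None

def add_privacy_commands(flight_path: List[Waypoint],
                         entry_index: int, exit_index: int) -> List[Waypoint]:
    """Attach sensor commands around a path segment that transits a pass-through
    privacy geofence: inhibit before entry, resume after exit."""
    planned = list(flight_path)
    planned[entry_index].sensor_command = "inhibit_camera"
    planned[exit_index].sensor_command = "resume_camera"
    return planned
```

The entry and exit indices would come from intersecting the planned path with the privacy geofence boundary during flight planning.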
- To reduce the amount of real-time computation performed by the UAV, in some applications a mission can be simulated on a flight simulator (e.g., by the flight description module 210) prior to performing the actual mission so that the flight path or camera parameters can be adjusted appropriately along a mission timeline and/or flight path to ensure adherence to any privacy protocol or regulation. For example, camera settings or UAV attitude or altitude adjustments may be made automatically according to a flight plan. For example, on arrival of the UAV at a particular waypoint and/or at a particular time in the mission, the attitude or altitude of the UAV may be altered, or camera settings may be adjusted, based on privacy concerns. If the mission simulation (using a 3D building model) reveals that at a particular waypoint and sensor orientation the camera field of view will capture a building not within the geofence boundary that has a "facet" that includes a window, door or other object (e.g., a swimming pool), then the camera settings can be adjusted to change the depth of field (DOF) so that the building or object in the background of the image scene is blurred, allowing the image to be taken while an object of interest in the foreground of the image scene is still captured. Alternatively, an object recognition algorithm can be applied to the image to detect the window, door or other private object so that the private object can be obfuscated (e.g., blurred or occluded) locally during the mission or remotely using, for example, a network server. The latter allows more complex and time-consuming algorithms to be applied without interfering with the mission objectives or timeline. The
flight description module 210 can generate a flight description or flight plan that can be loaded into UAV memory and used by the primary computer system 200 to perform a mission. - In some implementations, image data can be tagged with metadata indicating the postal address of a private residence that was captured in the image. The metadata may also include a timestamp and camera settings. The tags can be used to index images in a database for later retrieval by property owners. In an example scenario, the database can be made accessible over a network (e.g., the Internet) to property owners, who may view the images and make their own determination of whether to have images deleted. For example, a property owner can enter their property address in a search engine to retrieve and review images tagged with their home address.
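A minimal sketch of such a metadata record follows, assuming a simple JSON-style record keyed by postal address; the field names, paths, and addresses are illustrative assumptions rather than a schema defined by this disclosure:

```python
import json
from datetime import datetime, timezone

def tag_image_metadata(image_path: str, postal_address: str,
                       camera_settings: dict) -> dict:
    """Build a metadata record for an image that captured a private residence.
    The record can be stored alongside the image and indexed by address so a
    property owner can later retrieve and review the imagery."""
    return {
        "image_path": image_path,
        "postal_address": postal_address,
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "camera_settings": camera_settings,  # e.g., focal length, aperture
    }

# Example index entry keyed by address for later owner-initiated lookup.
index_entry = tag_image_metadata(
    "mission_042/img_0013.jpg",
    "123 Example Street, Springfield",
    {"focal_length_mm": 35, "aperture": "f/5.6"},
)
print(json.dumps(index_entry, indent=2))
```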
-
FIG. 6 illustrates an example of a UAV 606 navigating and collecting images, where the camera captures image data beyond a geofence 602 boundary. A UAV flight module controls UAV navigation along a flight path over a house 610 to be inspected. A ground operator 604 places the UAV at the starting location and then initiates an autopilot mode using a GCS. Optionally, the UAV may initiate an autopilot mode after the UAV has been placed in a starting location for a predetermined period of time. The UAV ascends to a safe vertical height above the roof of the house 610. During flight, the UAV primary computer system 100 periodically determines the geospatial position of the UAV using information received from an onboard GNSS receiver. Initially, the UAV orients a camera so that the camera's field of view points towards the house 610. The UAV takes a series of digital images or video of the roof of the house as it flies along the flight path. Occasionally, depending on the orientation, position and altitude of the UAV, the field of view 612 of an onboard data gathering device, such as a digital camera, may include a portion of an adjacent property or house 614. -
FIG. 7 is a flowchart of an example process 700 of the UAV performing an image obfuscation process. For convenience, the process 700 will be described as being performed by a system of one or more processors, such as the primary computer system 100 described in reference to FIG. 1 . Privacy controls can be implemented to obfuscate or blur portions of images that are outside of a property boundary (e.g., a UAV can effect this while in flight, or a flight planning system can later process images after a flight operation is conducted). While the UAV is flying within a flight boundary geofence, sensor data is captured with one or more sensors. An example sensor is a digital camera or video camera. The UAV computer operating system, or flight planning system, can utilize GNSS data, IMU data, ground altitude data, and imagery (e.g., satellite imagery) to determine portions of captured images that are to be obfuscated. Based on the direction of the camera and the camera's field of view with respect to a boundary of the flight boundary geofence, the computer system can determine whether a digital image may include a portion showing property outside of the flight boundary geofence. The UAV may, with on-board processors, obfuscate the portion of the image, or may delete the image if the image is determined to include only data beyond the flight boundary geofence. Also, the images may be transferred to another system, such as a flight planning system or ground control system, where the digital images may be similarly processed. - The portion of the image that exceeds the flight boundary geofence can be obfuscated by a number of techniques. The obfuscation can be performed by a process running on the UAV computer system, or post-acquisition by a process on a separate computer or data system. An entire image may be blurred or scrambled, or image data modified such that the subject of the image cannot be determined.
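One way to make the "does this image reach beyond the boundary?" decision is to compare the camera footprint, projected to ground coordinates, against the geofence polygon. The sketch below is a minimal illustration assuming the shapely geometry library and assuming the footprint has already been projected from the camera pose; the function name and return labels are assumptions:

```python
from shapely.geometry import Polygon

def classify_image_footprint(footprint_lonlat, geofence_lonlat) -> str:
    """Classify a projected camera footprint against a flight boundary geofence.
    Both arguments are sequences of (lon, lat) vertices; projecting the camera
    field of view to ground coordinates is assumed to have happened upstream."""
    footprint = Polygon(footprint_lonlat)
    geofence = Polygon(geofence_lonlat)
    if geofence.contains(footprint):
        return "inside"             # nothing to obfuscate
    if footprint.intersects(geofence):
        return "partially_outside"  # obfuscate the portion beyond the boundary
    return "fully_outside"          # candidate for deletion
```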
- While the UAV is conducting a flight operation, via autopilot navigation or via manually controlled operation by a wireless control device, the UAV computer system may log various data associated with the digital images. For example, when the computer system instructs an attached digital camera to take a picture, various information can be recorded, for example: the heading of the UAV, the geospatial position of the UAV, the altitude of the UAV, the camera field of view, the focal length, the aperture settings, and the time the image was taken. If the camera is gimbaled (that is, rotatable or controllable on various axes), a sensor may keep track of the position of the camera to determine its orientation relative to the heading of the UAV.
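A minimal sketch of one such per-capture log record follows; the field names are illustrative assumptions rather than a schema defined by this disclosure:

```python
from dataclasses import dataclass

@dataclass
class CaptureLogEntry:
    """One log record written when the camera is triggered."""
    timestamp_utc: str
    uav_heading_deg: float
    latitude: float
    longitude: float
    altitude_m_agl: float
    camera_fov_deg: float
    focal_length_mm: float
    aperture: str
    gimbal_azimuth_deg: float    # camera orientation relative to UAV heading
    gimbal_elevation_deg: float
```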
- In some implementations,
process 700 begins by combining one or more of GNSS coordinates, Above Ground Level (AGL) altitude, direction of the camera, camera field of view, camera focal length, and other camera parameters to project the image onto a latitude/longitude coordinate system (block 702), warping it as needed, for example, with a Mercator map projection. -
Process 700 then overlays the projection with a flight boundary geofence (which is already defined in a latitude/longitude coordinate system) (block 704). -
Process 700 then determines the area in the latitude/longitude coordinate system that falls outside the flight boundary geofence but inside the projection of the taken image (block 706). -
Process 700 then projects this area back into the pixel coordinate system of the image to determine the pixels that are outside of the flight boundary geofence (block 708), as described in reference to FIGS. 8A and 8B . -
Process 700 then obfuscates the pixels determined to be outside the flight boundary geofence (block 710). For example, the image area outside of the flight boundary geofence may be deleted, blurred, scrambled or removed from the image. Also, the pixel Red Green Blue (RGB) values of the area outside of the flight boundary geofence may be overwritten to show a single color (e.g., black, white, red, etc.). Additionally, a Gaussian blur could be used to blur the pixels determined to be outside the flight boundary geofence. - Although the above process is described as being performed by a UAV computer system, a separate computer system, user device, or FPS may perform the image obfuscation process. These other systems may receive the digital images obtained by the UAV. The images may be obtained during flight of the UAV, or post-flight. Additionally, log data from the UAV may be provided to these other systems during flight or post-flight operations. The log data may include information identifying the geospatial locations where the one or more images were taken, the times the images were taken, the pose or position of the camera, and other information used by the process to determine the field of view of the camera in relation to a flight boundary, or privacy boundary, geofence. These systems may determine that a portion of a digital image includes at least a portion of an object located outside of a boundary of the geofence, and obfuscate a portion of the digital image as discussed herein.
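A minimal sketch of blocks 706-708, determining which pixels fall outside the geofence, is shown below. It assumes OpenCV and NumPy, and assumes a 3x3 homography mapping ground (longitude, latitude) coordinates into pixel coordinates has already been derived from the UAV pose, AGL altitude, camera field of view, and map projection described above; all names are illustrative:

```python
import cv2
import numpy as np

def pixels_outside_geofence(image_shape, geofence_lonlat, homography):
    """Return a boolean mask of pixels whose ground locations fall outside the
    flight boundary geofence. `homography` maps (lon, lat) ground coordinates
    into image pixel coordinates (assumed to be computed upstream)."""
    h, w = image_shape[:2]
    pts = np.asarray(geofence_lonlat, dtype=np.float32).reshape(-1, 1, 2)
    geofence_px = cv2.perspectiveTransform(pts, homography).reshape(-1, 2)

    inside = np.zeros((h, w), dtype=np.uint8)
    cv2.fillPoly(inside, [np.round(geofence_px).astype(np.int32)], 255)
    return inside == 0  # True where the pixel lies outside the geofence
```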
- In some implementations, the primary computer system of the UAV can determine the percentage of the pixels of the digital image that fall outside the flight boundary geofence, and then delete the digital image, or identify the image for deletion, if the percentage exceeds a predetermined threshold value. For example, if the image area outside the geofence is greater than 50%, then the system can be configured to delete the entire image.
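Continuing the sketch above, the deletion decision could be as simple as the following; the 50% default mirrors the example, and the function name is an assumption:

```python
import numpy as np

def should_delete_image(outside_mask: np.ndarray, threshold: float = 0.5) -> bool:
    """Flag an image for deletion when the fraction of pixels outside the
    flight boundary geofence exceeds the threshold (50% in the example)."""
    fraction_outside = float(np.count_nonzero(outside_mask)) / outside_mask.size
    return fraction_outside > threshold
```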
- The flight planning system can automatically implement privacy controls when generating a 3D model. For instance, a UAV can capture imagery of a particular floor of a skyscraper. When generating the 3D model, other portions of the skyscraper (e.g., other floors) can be blurred out, and/or windows can be recognized and blurred to hide the interior. Additionally, sensors can be utilized that are opaque to glass (e.g., that cannot image through glass windows).
-
FIGS. 8A and 8B illustrate the process described in FIG. 7 . FIG. 8A shows a projection 802 (e.g., using a homography transformation) of a camera field of view (e.g., a sensor image) onto a geodetic (latitude/longitude) map projection 800 (e.g., Mercator map projection) that also depicts a flight boundary geofence 801. An area 803 that lies within the projection 802 and outside the flight boundary geofence 801 is determined. If the flight boundary geofence 801 is labeled "A" and the projection 802 of the camera field of view is labeled "B", then applying set theory, area 803 is the relative complement of A (left) in B (right) or, -
B ∩ Aᶜ = B \ A. -
FIG. 8B illustrates captured image data in pixel coordinates 804 with pixels 805 corresponding to area 803 obfuscated. In some implementations, pixels 805 can be identified for obfuscation by using image segmentation or other edge detection algorithms and pixel occlusion algorithms. In some implementations, pixels 805 can be blurred using, for example, a Gaussian blur filter, or scrambled or blocked out (e.g., with a solid color). In some implementations, objects in area 803 that are identified as private (e.g., windows, doors, swimming pools, sun roofs, skylights) can be recognized using object recognition techniques (e.g., image segmentation, machine learning, edge detection, corner detection, blob detection, genetic algorithms, feature-based methods) and then obfuscated, locally or remotely using, for example, a network server. In some implementations, a recognized object can be labeled (e.g., labeled as a "window") and compared to a predetermined list of labeled objects that are designated private, and if there is a match between labels, the object can be obfuscated. In some implementations, only the object is obfuscated. In some implementations, images to be processed either locally or remotely can be post-processed using a batch process. - The processes and operations are described above in terms of one or more processors. The processor or processors can be onboard a UAV, onboard a user device, or part of a cloud-based processing system. In particular, a user device can be designated as a GCS and perform functions of a GCS. A user device and a UAV computer system can be designated as an FPS and perform functions of an FPS. Likewise, functions of both the GCS and FPS can be performed by a cloud-based processing system. The image obfuscation techniques described herein can be performed in whole or in part on one or more of the UAV, the GCS or other user device (e.g., a personal computer), or a network-based server.
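As one concrete, simplified possibility for block 710, the sketch below applies a Gaussian blur or a solid-color block-out only to the masked pixels computed earlier. It assumes OpenCV and NumPy; the kernel size and function name are illustrative choices, not values specified by this disclosure:

```python
import cv2
import numpy as np

def obfuscate_outside_pixels(image: np.ndarray, outside_mask: np.ndarray,
                             mode: str = "blur") -> np.ndarray:
    """Obfuscate pixels flagged as outside the geofence. `mode` selects a
    Gaussian blur or a solid-color block-out; both are simplified stand-ins
    for the techniques described above."""
    result = image.copy()
    if mode == "blur":
        blurred = cv2.GaussianBlur(image, (51, 51), 0)  # kernel size is illustrative
        result[outside_mask] = blurred[outside_mask]
    else:
        result[outside_mask] = (0, 0, 0)  # block out with a single color
    return result
```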
- Various types of UAVs may be used to implement the inventions described herein (for example, a fixed-wing airplane, a helicopter, a multi-rotor vehicle (e.g., a quad-copter in single propeller and coaxial configurations), a vertical take-off and landing vehicle, or a lighter-than-air aircraft). A multi-rotor vehicle in a coaxial configuration may use the same propeller pitch and diameter propellers, use different pitch and diameter propellers, or use variable pitch propellers. In this specification, UAVs include drones, un-operated aerial vehicles, remotely operated aircraft, unmanned aircraft systems, any aircraft covered under Circular 328 AN/190 classified by the International Civil Aviation Organization, and so on. In addition, certain aspects of the disclosure can be utilized with other types of unmanned vehicles (e.g., wheeled, tracked, and/or water vehicles). Sensors, which are included in the general term payload (e.g., any hardware, software, module, and so on, that is not critical to the flight operation of the UAV), can include any device that captures real-world information, including cameras, radiation measuring instruments, distance detectors such as Lidar, and so on.
- Each of the processes, methods, instructions, applications and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The code modules (or “engines”) may be stored on any type of non-transitory computer-readable medium or computer storage device, such as hard drives, solid state memory, optical disc, and/or the like. The systems and modules may also be transmitted as generated data signals (for example, as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (for example, as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, for example, volatile or non-volatile storage.
- User interfaces described herein are optionally presented (and user instructions may be received) via a user computing device using a browser, other network resource viewer, a dedicated application, or otherwise. Various features described or illustrated as being present in different embodiments or user interfaces may be combined into the same embodiment or user interface. Commands and information received from the user may be stored and acted on by the various systems disclosed herein using the processes disclosed herein. While the disclosure may refer to a user hovering over, pointing at, or clicking on a particular item, other techniques may be used to detect an item of user interest. For example, the user may touch the item via a touch screen, or otherwise indicate an interest. The user interfaces described herein may be presented on a user terminal, such as a laptop computer, desktop computer, tablet computer, smart-phone, virtual reality headset, augmented reality headset, or other terminal type. The user terminals may be associated with user input devices, such as touch screens, microphones, touch pads, keyboards, mice, styluses, cameras, etc. While the foregoing discussion and figures may illustrate various types of menus, other types of menus may be used. For example, menus may be provided via a drop down menu, a toolbar, a pop up menu, an interactive voice response system, or otherwise.
- In general, the terms “engine” and “module”, as used herein, refer to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the executing computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage. Electronic data sources can include databases, volatile/non-volatile memory, and any memory system or subsystem that maintains information.
- The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
- Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “for example,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be either X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.
- The term “a” as used herein should be given an inclusive rather than exclusive interpretation. For example, unless specifically noted, the term “a” should not be understood to mean “exactly one” or “one and only one”; instead, the term “a” means “one or more” or “at least one,” whether used in the claims or elsewhere in the specification and regardless of uses of quantifiers such as “at least one,” “one or more,” or “a plurality” elsewhere in the claims or specification.
- The term “comprising” as used herein should be given an inclusive rather than exclusive interpretation. For example, a general purpose computer comprising one or more processors should not be interpreted as excluding other computer components, and may possibly include such components as memory, input/output devices, and/or network interfaces, among others.
- While certain example embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the disclosure. Nothing in the description is intended to imply that any particular element, feature, characteristic, step, module, or block is necessary or indispensable. The novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions, and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions disclosed herein. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of certain of the inventions disclosed herein.
- Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
- It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of the disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated.
Claims (30)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/088,005 US20180025649A1 (en) | 2016-02-08 | 2016-03-31 | Unmanned aerial vehicle privacy controls |
PCT/US2017/016860 WO2017139282A1 (en) | 2016-02-08 | 2017-02-07 | Unmanned aerial vehicle privacy controls |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662292783P | 2016-02-08 | 2016-02-08 | |
US201662298429P | 2016-02-22 | 2016-02-22 | |
US15/088,005 US20180025649A1 (en) | 2016-02-08 | 2016-03-31 | Unmanned aerial vehicle privacy controls |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180025649A1 true US20180025649A1 (en) | 2018-01-25 |
Family
ID=58162231
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/088,005 Abandoned US20180025649A1 (en) | 2016-02-08 | 2016-03-31 | Unmanned aerial vehicle privacy controls |
US15/088,042 Active US10762795B2 (en) | 2016-02-08 | 2016-03-31 | Unmanned aerial vehicle privacy controls |
US15/094,802 Active US9588516B1 (en) | 2016-02-08 | 2016-04-08 | Unmanned aerial vehicle visual line of sight control |
US15/449,846 Active US11189180B2 (en) | 2016-02-08 | 2017-03-03 | Unmanned aerial vehicle visual line of sight control |
US17/008,344 Active US11361665B2 (en) | 2016-02-08 | 2020-08-31 | Unmanned aerial vehicle privacy controls |
US17/512,323 Active US11854413B2 (en) | 2016-02-08 | 2021-10-27 | Unmanned aerial vehicle visual line of sight control |
US18/394,152 Pending US20240257654A1 (en) | 2016-02-08 | 2023-12-22 | Unmanned Aerial Vehicle Visual Line Of Sight Control |
Family Applications After (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/088,042 Active US10762795B2 (en) | 2016-02-08 | 2016-03-31 | Unmanned aerial vehicle privacy controls |
US15/094,802 Active US9588516B1 (en) | 2016-02-08 | 2016-04-08 | Unmanned aerial vehicle visual line of sight control |
US15/449,846 Active US11189180B2 (en) | 2016-02-08 | 2017-03-03 | Unmanned aerial vehicle visual line of sight control |
US17/008,344 Active US11361665B2 (en) | 2016-02-08 | 2020-08-31 | Unmanned aerial vehicle privacy controls |
US17/512,323 Active US11854413B2 (en) | 2016-02-08 | 2021-10-27 | Unmanned aerial vehicle visual line of sight control |
US18/394,152 Pending US20240257654A1 (en) | 2016-02-08 | 2023-12-22 | Unmanned Aerial Vehicle Visual Line Of Sight Control |
Country Status (1)
Country | Link |
---|---|
US (7) | US20180025649A1 (en) |
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9959772B2 (en) * | 2016-06-10 | 2018-05-01 | ETAK Systems, LLC | Flying lane management systems and methods for unmanned aerial vehicles |
US20180121878A1 (en) * | 2016-10-27 | 2018-05-03 | International Business Machines Corporation | Intelligent package delivery |
US20180150984A1 (en) * | 2016-11-30 | 2018-05-31 | Gopro, Inc. | Map View |
US20180182168A1 (en) * | 2015-09-02 | 2018-06-28 | Thomson Licensing | Method, apparatus and system for facilitating navigation in an extended scene |
US20180265194A1 (en) * | 2014-12-17 | 2018-09-20 | Picpocket, Inc. | Drone based systems and methodologies for capturing images |
US20180324662A1 (en) * | 2017-05-03 | 2018-11-08 | Qualcomm Incorporated | Determining whether a drone-coupled user equipment is engaged in a flying state |
US20190035287A1 (en) * | 2016-06-10 | 2019-01-31 | ETAK Systems, LLC | Drone collision avoidance via Air Traffic Control over wireless networks |
US20190067812A1 (en) * | 2017-08-23 | 2019-02-28 | The United States Of America, As Represented By The Secretary Of The Navy | Search Track Acquire React System (STARS) Drone Integrated Acquisition Tracker (DIAT) |
US20190197254A1 (en) * | 2017-12-27 | 2019-06-27 | Honeywell International Inc. | Systems and methods for dynamically masking video and images captured a drone device camera |
US20190210722A1 (en) * | 2016-09-30 | 2019-07-11 | Optim Corporation | System, method, and program for controlling drone |
US20190235501A1 (en) * | 2018-01-31 | 2019-08-01 | Walmart Apollo, Llc | System and method for identifying vehicle delivery locations utilizing scout autonomous vehicles |
US10377487B2 (en) * | 2016-03-29 | 2019-08-13 | Brother Kogyo Kabushiki Kaisha | Display device and display control method |
US20200036886A1 (en) * | 2019-08-16 | 2020-01-30 | Lg Electronics Inc. | Method for photographing an unmanned aerial robot and a device for supporting the same in an unmanned aerial vehicle system |
EP3671679A1 (en) * | 2018-12-17 | 2020-06-24 | Robert Bosch GmbH | Dynamic masking or hiding of areas of a field of view |
US10721375B1 (en) * | 2016-08-26 | 2020-07-21 | Amazon Technologies, Inc. | Vehicle camera contamination detection |
US10749952B2 (en) * | 2016-06-01 | 2020-08-18 | Cape Mcuas, Inc. | Network based operation of an unmanned aerial vehicle based on user commands and virtual flight assistance constraints |
US10762795B2 (en) | 2016-02-08 | 2020-09-01 | Skydio, Inc. | Unmanned aerial vehicle privacy controls |
US10794986B2 (en) * | 2017-06-02 | 2020-10-06 | Apple Inc. | Extending a radio map |
US10866597B1 (en) * | 2018-05-07 | 2020-12-15 | Securus Technologies, Llc | Drone detection and interception |
US10878679B2 (en) * | 2017-07-31 | 2020-12-29 | Iain Matthew Russell | Unmanned aerial vehicles |
US20210005092A1 (en) * | 2018-03-23 | 2021-01-07 | SZ DJI Technology Co., Ltd. | Control method, device, and system |
US10909861B2 (en) * | 2016-12-23 | 2021-02-02 | Telefonaktiebolaget Lm Ericsson (Publ) | Unmanned aerial vehicle in controlled airspace |
US10962650B2 (en) | 2017-10-31 | 2021-03-30 | United States Of America As Represented By The Administrator Of Nasa | Polyhedral geofences |
US10979854B2 (en) | 2017-06-02 | 2021-04-13 | Apple Inc. | Extending a radio map |
US20210266461A1 (en) * | 2018-07-04 | 2021-08-26 | c/o H3 DYNAMICS PTE. LTD. | Defect detection system using a camera equipped uav for building facades on complex asset geometry with optimal automatic obstacle deconflicted flightpath |
US11112249B1 (en) * | 2018-09-24 | 2021-09-07 | Rockwell Collins, Inc. | Systems and methods for four-dimensional routing around concave polygon avoidances |
US11122424B1 (en) | 2019-05-14 | 2021-09-14 | Hood Mountain, LLC | Systems, methods and apparatus for data privacy protection based on geofence networks |
US11150654B2 (en) * | 2016-06-30 | 2021-10-19 | Skydio, Inc. | Dynamically adjusting UAV flight operations based on radio frequency signal data |
US20210375143A1 (en) * | 2015-03-31 | 2021-12-02 | SZ DJI Technology Co., Ltd. | Systems and methods for geo-fencing device communications |
US11242143B2 (en) | 2016-06-13 | 2022-02-08 | Skydio, Inc. | Unmanned aerial vehicle beyond visual line of sight control |
US11288107B2 (en) * | 2016-11-23 | 2022-03-29 | Google Llc | Selective obfuscation of notifications |
US11295621B2 (en) * | 2016-12-01 | 2022-04-05 | SZ DJI Technology Co., Ltd. | Methods and associated systems for managing 3D flight paths |
US11292602B2 (en) * | 2016-11-04 | 2022-04-05 | Sony Corporation | Circuit, base station, method, and recording medium |
US20220130259A1 (en) * | 2020-10-22 | 2022-04-28 | Toyota Jidosha Kabushiki Kaisha | Control device, system, program, control instrument, flying object, sensor, and method of operating system |
US20220147066A1 (en) * | 2020-03-02 | 2022-05-12 | Clrobur Co., Ltd. | Drone control system and intelligent drone flight planning method thereof |
US20220166939A1 (en) * | 2019-03-13 | 2022-05-26 | Sony Group Corporation | Information processing apparatus, method, and recording medium |
US20220201253A1 (en) * | 2020-12-22 | 2022-06-23 | Axis Ab | Camera and a method therein for facilitating installation of the camera |
US11377231B2 (en) * | 2019-02-06 | 2022-07-05 | Honeywell International Inc. | Automatically adjustable landing lights for aircraft |
IT202100009311A1 (en) * | 2021-04-14 | 2022-10-14 | Vlab S R L | MONITORING METHOD FOR MONITORING ENVIRONMENTS AND RELATED MONITORING DEVICE |
US11518510B1 (en) * | 2016-10-06 | 2022-12-06 | Gopro, Inc. | Systems and methods for controlling an unmanned aerial vehicle |
US20230038872A1 (en) * | 2019-12-19 | 2023-02-09 | 4Dream Co., Ltd. | Method, apparatus, and system for protecting private information from illegal photography by unmanned aerial vehicle |
US11579611B1 (en) * | 2020-03-30 | 2023-02-14 | Amazon Technologies, Inc. | Predicting localized population densities for generating flight routes |
US11640764B1 (en) | 2020-06-01 | 2023-05-02 | Amazon Technologies, Inc. | Optimal occupancy distribution for route or path planning |
US20230171318A1 (en) * | 2021-12-01 | 2023-06-01 | International Business Machines Corporation | Management of devices in a smart environment |
US20230328477A1 (en) * | 2022-04-07 | 2023-10-12 | Caterpillar Paving Products Inc. | System and method for defining an area of a worksite |
US11868145B1 (en) | 2019-09-27 | 2024-01-09 | Amazon Technologies, Inc. | Selecting safe flight routes based on localized population densities and ground conditions |
US11961093B2 (en) | 2015-03-31 | 2024-04-16 | SZ DJI Technology Co., Ltd. | Authentication systems and methods for generating flight regulations |
US12254779B2 (en) | 2016-02-08 | 2025-03-18 | Skydio, Inc. | Unmanned aerial vehicle privacy controls |
Families Citing this family (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220187472A1 (en) * | 2013-10-31 | 2022-06-16 | Invisible Intelligence, Llc | Recording system and apparatus including user-defined polygon geofencing |
EP3435188B1 (en) | 2014-01-10 | 2021-11-10 | Pictometry International Corp. | Structure evaluation system and method using an unmanned aircraft |
US12007763B2 (en) | 2014-06-19 | 2024-06-11 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US9678506B2 (en) | 2014-06-19 | 2017-06-13 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US9798322B2 (en) | 2014-06-19 | 2017-10-24 | Skydio, Inc. | Virtual camera interface and other user interaction paradigms for a flying digital assistant |
US10134135B1 (en) * | 2015-08-27 | 2018-11-20 | Hrl Laboratories, Llc | System and method for finding open space efficiently in three dimensions for mobile robot exploration |
JP6633460B2 (en) | 2015-09-04 | 2020-01-22 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Notification method, notification device and terminal |
DE102015012477A1 (en) * | 2015-09-29 | 2017-03-30 | Airbus Defence and Space GmbH | Unmanned aerial vehicle and method for the safe landing of an unmanned aerial vehicle |
US9740200B2 (en) | 2015-12-30 | 2017-08-22 | Unmanned Innovation, Inc. | Unmanned aerial vehicle inspection system |
US10083616B2 (en) * | 2015-12-31 | 2018-09-25 | Unmanned Innovation, Inc. | Unmanned aerial vehicle rooftop inspection system |
CA3001023A1 (en) | 2016-01-08 | 2017-07-13 | Pictometry International Corp. | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
US10627821B2 (en) * | 2016-04-22 | 2020-04-21 | Yuneec International (China) Co, Ltd | Aerial shooting method and system using a drone |
US10157546B2 (en) * | 2016-06-10 | 2018-12-18 | ETAK Systems, LLC | Anti-drone flight protection systems and methods |
US10460279B2 (en) | 2016-06-28 | 2019-10-29 | Wing Aviation Llc | Interactive transport services provided by unmanned aerial vehicles |
WO2018020659A1 (en) * | 2016-07-29 | 2018-02-01 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッド | Moving body, method for controlling moving body, system for controlling moving body, and program for controlling moving body |
US10520943B2 (en) | 2016-08-12 | 2019-12-31 | Skydio, Inc. | Unmanned aerial image capture platform |
US10555133B1 (en) | 2016-09-22 | 2020-02-04 | Apple Inc. | Systems and methods for locating mobile devices within a vehicle |
US10353388B2 (en) * | 2016-10-17 | 2019-07-16 | X Development Llc | Drop-off location planning for delivery vehicle |
CN106444848B (en) * | 2016-11-28 | 2018-11-30 | 广州极飞科技有限公司 | Control the method and device of unmanned plane during flying |
JP6575493B2 (en) * | 2016-11-30 | 2019-09-18 | 株式会社豊田中央研究所 | Control device, distributed control program for moving body |
KR102706191B1 (en) * | 2016-11-30 | 2024-09-13 | 삼성전자주식회사 | Unmanned flying vehicle and flying control method thereof |
US11295458B2 (en) | 2016-12-01 | 2022-04-05 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
US11017679B2 (en) * | 2017-01-13 | 2021-05-25 | Skydio, Inc. | Unmanned aerial vehicle visual point cloud navigation |
WO2018170797A1 (en) * | 2017-03-22 | 2018-09-27 | Nokia Technologies Oy | Systems and apparatuses for detecting unmanned aerial vehicle |
US10455520B2 (en) | 2017-03-30 | 2019-10-22 | At&T Intellectual Property I, L.P. | Altitude based device management in a wireless communications system |
RU2731942C1 (en) | 2017-03-31 | 2020-09-09 | Телефонактиеболагет Лм Эрикссон (Пабл) | Broadcasting transmission of geolocation information in radio frame transmitted from unmanned aerial vehicle |
WO2018178758A1 (en) * | 2017-03-31 | 2018-10-04 | Telefonaktiebolaget Lm Ericsson (Publ) | Enhanced flight plan for unmanned traffic aircraft systems |
US11218840B2 (en) | 2017-03-31 | 2022-01-04 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods and systems for using network location services in a unmanned aircraft systems traffic management framework |
CN114326811B (en) * | 2017-04-09 | 2024-08-02 | 深圳市大疆创新科技有限公司 | Flight processing method and control equipment |
WO2018189576A1 (en) | 2017-04-14 | 2018-10-18 | Telefonaktiebolaget Lm Ericsson (Publ) | Optimal unmanned aerial vehicle flight route planning based on quality-of-service requirements for data, telemetry, and command and control requirements in 3gpp networks |
US10472090B2 (en) * | 2017-04-27 | 2019-11-12 | Qualcomm Incorporated | Environmentally aware status LEDs for use in drones |
AT16013U1 (en) * | 2017-04-28 | 2018-10-15 | Ars Electronica Linz Gmbh & Co Kg | Unmanned aerial vehicle with a modular swarm control unit |
US11166208B2 (en) | 2017-05-05 | 2021-11-02 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods and systems for using an unmanned aerial vehicle (UAV) flight path to coordinate an enhanced handover in 3rd generation partnership project (3GPP) networks |
US11073830B2 (en) * | 2017-05-31 | 2021-07-27 | Geomni, Inc. | System and method for mission planning and flight automation for unmanned aircraft |
US10736154B2 (en) * | 2017-06-13 | 2020-08-04 | Rumfert, Llc | Wireless real-time data-link sensor method and system for small UAVs |
EP3652985B1 (en) | 2017-07-10 | 2020-11-18 | Telefonaktiebolaget LM Ericsson (publ) | Optimization of radio resource allocation based on unmanned aerial vehicle flight path information |
JP6943674B2 (en) * | 2017-08-10 | 2021-10-06 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Mobiles, control methods and control programs |
GB2560396B (en) * | 2017-09-01 | 2019-01-30 | Matthew Russell Iain | Unmanned aerial vehicles |
US10952113B2 (en) | 2017-09-05 | 2021-03-16 | Telefonaktiebolaget Lm Ericsson (Publ) | Planned continuity of unmanned aerial vehicle (UAV) link connectivity in UAV traffic management systems |
USD861968S1 (en) | 2017-10-06 | 2019-10-01 | Talon Aerolytics (Holding), Inc. | Strobe component |
CN111225843B (en) * | 2017-10-27 | 2023-04-21 | 日产自动车株式会社 | Parking control method and parking control device |
CN108011659B (en) * | 2017-10-30 | 2024-02-09 | 歌尔股份有限公司 | UAV communication method, device and UAV |
FR3074347B1 (en) * | 2017-11-24 | 2022-10-14 | Thales Sa | ELECTRONIC SYSTEM FOR REMOTE CONTROL OF DRONES, METHOD FOR COMPUTER PROGRAM ASSOCIATED |
WO2019132898A1 (en) * | 2017-12-27 | 2019-07-04 | Intel Corporation | Dynamic generation of restricted flight zones for drones |
WO2019130050A1 (en) | 2017-12-29 | 2019-07-04 | Telefonaktiebolaget Lm Ericsson (Publ) | Using a cellular interface for unmanned aerial vehicle communications |
WO2019152693A2 (en) * | 2018-01-31 | 2019-08-08 | Walmart Apollo, Llc | System and method for autonomous remote drone control |
CN110262540A (en) * | 2018-03-12 | 2019-09-20 | 杭州海康机器人技术有限公司 | The method and apparatus that flight control is carried out to aircraft |
WO2019186245A1 (en) | 2018-03-30 | 2019-10-03 | Telefonaktiebolaget Lm Ericsson (Publ) | Network coverage and policy information generation and distribution for unmanned aerial vehicle flight planning |
US11867529B2 (en) | 2018-06-01 | 2024-01-09 | Rumfert, Llc | Altitude initialization and monitoring system and method for remote identification systems (remote Id) monitoring and tracking unmanned aircraft systems (UAS) in the national airspace system (NAS) |
FR3084199B1 (en) * | 2018-07-20 | 2021-04-16 | Thales Sa | FLIGHT CONTROL FROM A NON-AVIONIC DEVICE |
US10692389B2 (en) * | 2018-07-20 | 2020-06-23 | Aurora Flight Services Corporation, a subsidiary of The Boeing Company | Flight control systems for aerial vehicles and related methods |
DE102018120010A1 (en) * | 2018-08-16 | 2020-02-20 | Autel Robotics Europe Gmbh | ROUTE DISPLAY METHOD, DEVICE AND SYSTEM, GROUND STATION AND COMPUTER READABLE STORAGE MEDIUM |
US11170656B2 (en) * | 2018-11-28 | 2021-11-09 | The Boeing Company | Predicting low visibility set-up options for an airport moving map |
US12177734B2 (en) * | 2019-01-09 | 2024-12-24 | Whelen Engineering Company, Inc. | System and method for velocity-based geofencing for emergency vehicle |
KR102235589B1 (en) * | 2019-02-19 | 2021-04-02 | 주식회사 아르고스다인 | UAV landing system |
CN110597283A (en) * | 2019-09-06 | 2019-12-20 | 深圳市道通智能航空技术有限公司 | Flight method, terminal device, aircraft and flight system |
WO2021053670A1 (en) * | 2019-09-20 | 2021-03-25 | Parazero Ltd. | Damage mitigating for an aerial vehicle having a deployable parachute |
US11238292B2 (en) * | 2019-11-26 | 2022-02-01 | Toyota Research Institute, Inc. | Systems and methods for determining the direction of an object in an image |
US11280608B1 (en) * | 2019-12-11 | 2022-03-22 | Sentera, Inc. | UAV above ground level determination for precision agriculture |
KR102263152B1 (en) | 2020-03-06 | 2021-06-09 | 주식회사 카르타 | Method and apparatus for object detection in 3d point clouds |
US11747832B2 (en) | 2020-07-14 | 2023-09-05 | Easy Aerial Inc. | Unmanned aerial vehicle (UAV) systems and methods for maintaining continuous UAV operation |
US11661190B2 (en) | 2020-07-24 | 2023-05-30 | Easy Aerial Inc. | Rapid aircraft inspection with autonomous drone base station systems |
US11220335B1 (en) | 2020-08-03 | 2022-01-11 | Easy Aerial Inc. | Hybrid unmanned aerial vehicle systems with quick release tether assembly |
US20220068145A1 (en) * | 2020-08-26 | 2022-03-03 | Michael A. Cummings | System, apparatus and method for improved airport and related vehicle operations and tracking |
US11797896B2 (en) | 2020-11-30 | 2023-10-24 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle assisted viewing location selection for event venue |
US11443518B2 (en) | 2020-11-30 | 2022-09-13 | At&T Intellectual Property I, L.P. | Uncrewed aerial vehicle shared environment privacy and security |
US12183110B2 (en) | 2020-11-30 | 2024-12-31 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle projection zone selection |
US11726475B2 (en) | 2020-11-30 | 2023-08-15 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle airspace claiming and announcing |
US12012094B2 (en) * | 2020-12-07 | 2024-06-18 | Ford Global Technologies, Llc | Detecting vehicle presence at a site |
US11673690B2 (en) | 2021-01-22 | 2023-06-13 | Easy Aerial Inc. | Modular collapsible and portable drone in a box |
US11582392B2 (en) * | 2021-03-25 | 2023-02-14 | International Business Machines Corporation | Augmented-reality-based video record and pause zone creation |
US12106462B2 (en) | 2021-04-01 | 2024-10-01 | Allstate Insurance Company | Computer vision methods for loss prediction and asset evaluation based on aerial images |
US12051114B2 (en) | 2021-04-01 | 2024-07-30 | Allstate Insurance Company | Computer vision methods for loss prediction and asset evaluation based on aerial images |
US11834199B2 (en) | 2021-04-08 | 2023-12-05 | Easy Aerial Inc. | Hybrid unmanned aerial vehicle systems with automated tether assembly |
US11805424B2 (en) * | 2021-06-02 | 2023-10-31 | Verizon Patent And Licensing Inc. | System and method for wireless equipment deployment |
CN113362487A (en) * | 2021-06-21 | 2021-09-07 | 广西电网有限责任公司电力科学研究院 | Intelligent autonomous inspection unified management and control system for distribution line unmanned aerial vehicle |
US12214862B2 (en) | 2021-10-30 | 2025-02-04 | Beta Air Llc | Systems and methods for hybrid autonomous control of an electric aircraft |
WO2023139628A1 (en) * | 2022-01-18 | 2023-07-27 | 株式会社RedDotDroneJapan | Area setting system and area setting method |
JP7270199B1 (en) | 2022-10-13 | 2023-05-10 | 九州電力株式会社 | Drone Visible Area Display System |
JP7270198B1 (en) * | 2022-10-13 | 2023-05-10 | 九州電力株式会社 | Drone control position search system |
CN118999586B (en) * | 2024-10-24 | 2025-01-21 | 西安山外信息科技有限公司 | A 5G-based visual recognition and path planning system for unmanned mining trucks |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160364579A1 (en) * | 2014-02-24 | 2016-12-15 | Hewlett-Packard Development Company, L.P. | Privacy Zone |
US20160373699A1 (en) * | 2013-10-18 | 2016-12-22 | Aerovironment, Inc. | Privacy Shield for Unmanned Aerial Systems |
US20170148328A1 (en) * | 2015-11-25 | 2017-05-25 | International Business Machines Corporation | Dynamic geo-fence for drone |
US20170169713A1 (en) * | 2015-03-31 | 2017-06-15 | SZ DJI Technology Co., Ltd | Authentication systems and methods for generating flight regulations |
US20170235018A1 (en) * | 2016-01-08 | 2017-08-17 | Pictometry International Corp. | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
Family Cites Families (78)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6509926B1 (en) * | 2000-02-17 | 2003-01-21 | Sensormatic Electronics Corporation | Surveillance apparatus for camera surveillance system |
US7127334B2 (en) | 2002-12-03 | 2006-10-24 | Frink Bentley D | System and methods for preventing the unauthorized use of aircraft |
US20040174434A1 (en) * | 2002-12-18 | 2004-09-09 | Walker Jay S. | Systems and methods for suggesting meta-information to a camera user |
US7460148B1 (en) | 2003-02-19 | 2008-12-02 | Rockwell Collins, Inc. | Near real-time dissemination of surveillance video |
US7574220B2 (en) | 2004-12-06 | 2009-08-11 | Interdigital Technology Corporation | Method and apparatus for alerting a target that it is subject to sensing and restricting access to sensed content associated with the target |
US8666661B2 (en) | 2006-03-31 | 2014-03-04 | The Boeing Company | Video navigation |
US8838289B2 (en) | 2006-04-19 | 2014-09-16 | Jed Margolin | System and method for safely flying unmanned aerial vehicles in civilian airspace |
TWI319676B (en) * | 2006-10-18 | 2010-01-11 | Quanta Comp Inc | Image processing apparatus and method |
US7916177B2 (en) * | 2007-08-03 | 2011-03-29 | Panasonic Corporation | Image-capturing apparatus, image-capturing method and program for detecting and correcting image blur |
US8225208B2 (en) * | 2007-08-06 | 2012-07-17 | Apple Inc. | Interactive frames for images and videos displayed in a presentation application |
JP2009043047A (en) * | 2007-08-09 | 2009-02-26 | Seiko Epson Corp | Image display device, image display method, and program |
US8386175B2 (en) | 2008-02-15 | 2013-02-26 | Kutta Technologies, Inc. | Unmanned aerial system position reporting system |
US8244469B2 (en) * | 2008-03-16 | 2012-08-14 | Irobot Corporation | Collaborative engagement for target identification and tracking |
WO2010123611A1 (en) | 2009-02-02 | 2010-10-28 | Aerovironment | Multimode unmanned aerial vehicle |
US8515609B2 (en) | 2009-07-06 | 2013-08-20 | Honeywell International Inc. | Flight technical control management for an unmanned aerial vehicle |
IL200637A0 (en) * | 2009-08-30 | 2011-08-01 | Rafael Advanced Defense Sys | System and method for virtual range estimation |
IL201682A0 (en) * | 2009-10-22 | 2010-11-30 | Bluebird Aero Systems Ltd | Imaging system for uav |
US8366054B2 (en) | 2009-12-17 | 2013-02-05 | The United States Of America As Represented By The Secretary Of The Navy | Hand launchable unmanned aerial vehicle |
US9036861B2 (en) | 2010-04-22 | 2015-05-19 | The University Of North Carolina At Charlotte | Method and system for remotely inspecting bridges and other structures |
US9104202B2 (en) * | 2010-05-11 | 2015-08-11 | Irobot Corporation | Remote vehicle missions and systems for supporting remote vehicle missions |
US8477190B2 (en) * | 2010-07-07 | 2013-07-02 | Pictometry International Corp. | Real-time moving platform management system |
US8843156B2 (en) * | 2010-07-21 | 2014-09-23 | Raytheon Company | Discovering and preventing a communications disruption in a mobile environment |
US8644512B2 (en) | 2011-03-17 | 2014-02-04 | Massachusetts Institute Of Technology | Mission planning interface for accessing vehicle resources |
JP5389879B2 (en) * | 2011-09-20 | 2014-01-15 | 株式会社日立製作所 | Imaging apparatus, surveillance camera, and camera screen masking method |
US8994578B1 (en) * | 2011-09-27 | 2015-03-31 | Rockwell Collins, Inc. | Adjusting a target value for generating a vertical profile view in a weather radar system |
CA2872698C (en) | 2012-05-04 | 2018-07-24 | Aeryon Labs Inc. | System and method for controlling unmanned aerial vehicles |
US9384668B2 (en) | 2012-05-09 | 2016-07-05 | Singularity University | Transportation using network of unmanned aerial vehicles |
US8798922B2 (en) | 2012-11-16 | 2014-08-05 | The Boeing Company | Determination of flight path for unmanned aircraft in event of in-flight contingency |
US20140140575A1 (en) * | 2012-11-19 | 2014-05-22 | Mace Wolf | Image capture with privacy protection |
US20140316614A1 (en) | 2012-12-17 | 2014-10-23 | David L. Newman | Drone for collecting images and system for categorizing image data |
US8862285B2 (en) * | 2013-02-15 | 2014-10-14 | Disney Enterprises, Inc. | Aerial display system with floating pixels |
US9466219B1 (en) * | 2014-06-27 | 2016-10-11 | Rockwell Collins, Inc. | Unmanned vehicle mission planning, coordination and collaboration |
US9058673B2 (en) | 2013-03-15 | 2015-06-16 | Oracle International Corporation | Image mosaicking using a virtual grid |
JP6132636B2 (en) | 2013-04-10 | 2017-05-24 | 東芝メディカルシステムズ株式会社 | RF coil and magnetic resonance imaging apparatus |
WO2014177882A1 (en) * | 2013-05-02 | 2014-11-06 | Bae Systems Plc | Goal-based planning system |
JP5882951B2 (en) | 2013-06-14 | 2016-03-09 | 株式会社トプコン | Aircraft guidance system and aircraft guidance method |
US20150062339A1 (en) | 2013-08-29 | 2015-03-05 | Brian Ostrom | Unmanned aircraft system for video and data communications |
US9363645B2 (en) * | 2013-10-17 | 2016-06-07 | Symbol Technologies, Llc | Locationing system performance in non-line of sight conditions |
EP2997768B1 (en) | 2014-02-10 | 2018-03-14 | SZ DJI Technology Co., Ltd. | Adaptive communication mode switching |
US9643722B1 (en) * | 2014-02-28 | 2017-05-09 | Lucas J. Myslinski | Drone device security system |
US9407881B2 (en) * | 2014-04-10 | 2016-08-02 | Smartvue Corporation | Systems and methods for automated cloud-based analytics for surveillance systems with unmanned aerial devices |
WO2016025044A2 (en) * | 2014-05-12 | 2016-02-18 | Unmanned Innovation, Inc. | Distributed unmanned aerial vehicle architecture |
US9256225B2 (en) * | 2014-05-12 | 2016-02-09 | Unmanned Innovation, Inc. | Unmanned aerial vehicle authorization and geofence envelope determination |
US9412279B2 (en) * | 2014-05-20 | 2016-08-09 | Verizon Patent And Licensing Inc. | Unmanned aerial vehicle network-based recharging |
US10115277B2 (en) * | 2014-07-29 | 2018-10-30 | GeoFrenzy, Inc. | Systems and methods for geofence security |
WO2016015311A1 (en) | 2014-07-31 | 2016-02-04 | SZ DJI Technology Co., Ltd. | System and method for enabling virtual sightseeing using unmanned aerial vehicles |
US10515416B2 (en) * | 2014-09-03 | 2019-12-24 | Infatics, Inc. | System and methods for hosting missions with unmanned aerial vehicles |
WO2016033796A1 (en) | 2014-09-05 | 2016-03-10 | SZ DJI Technology Co., Ltd. | Context-based flight mode selection |
US20160224766A1 (en) * | 2014-10-30 | 2016-08-04 | Chad Steelberg | Apparatus, system, and method for obfuscation and de-obfuscation of digital content |
US9563201B1 (en) * | 2014-10-31 | 2017-02-07 | State Farm Mutual Automobile Insurance Company | Feedback to facilitate control of unmanned aerial vehicles (UAVs) |
US10609270B2 (en) * | 2014-11-18 | 2020-03-31 | The Invention Science Fund Ii, Llc | Devices, methods and systems for visual imaging arrays |
US9752878B2 (en) | 2014-12-09 | 2017-09-05 | Sikorsky Aircraft Corporation | Unmanned aerial vehicle control handover planning |
US9945931B2 (en) | 2014-12-12 | 2018-04-17 | University Of Kansas | Techniques for navigating UAVs using ground-based transmitters |
US20160328983A1 (en) | 2014-12-15 | 2016-11-10 | Kelvin H. Hutchinson | Navigation and collission avoidance systems for unmanned aircraft systems |
US10777000B2 (en) * | 2014-12-27 | 2020-09-15 | Husqvarna Ab | Garden street view |
US20160307447A1 (en) * | 2015-02-13 | 2016-10-20 | Unmanned Innovation, Inc. | Unmanned aerial vehicle remote flight planning system |
WO2016140985A1 (en) * | 2015-03-02 | 2016-09-09 | Izak Van Cruyningen | Flight planning for unmanned aerial tower inspection |
JP6961886B2 (en) | 2015-03-31 | 2021-11-05 | SZ DJI Technology Co., Ltd. | Flight control unit |
CN112908038A (en) | 2015-03-31 | 2021-06-04 | SZ DJI Technology Co., Ltd. | Method for determining position of unmanned aerial vehicle and air traffic control system |
EP3158553B1 (en) | 2015-03-31 | 2018-11-28 | SZ DJI Technology Co., Ltd. | Authentication systems and methods for identification of authorized participants |
US20160335898A1 (en) | 2015-04-09 | 2016-11-17 | Vulcan, Inc. | Automated drone management system |
US20160332739A1 (en) * | 2015-05-15 | 2016-11-17 | Disney Enterprises, Inc. | Impact absorption apparatus for unmanned aerial vehicle |
EP3313731B1 (en) * | 2015-06-24 | 2021-05-05 | Ent. Services Development Corporation LP | Control aerial movement of drone based on line-of-sight of humans using devices |
WO2017019571A1 (en) * | 2015-07-29 | 2017-02-02 | Lattice Semiconductor Corporation | Angular velocity sensing using arrays of antennas |
US9734723B1 (en) * | 2015-07-15 | 2017-08-15 | Exelis Inc. | Process and system to register and regulate unmanned aerial vehicle operations |
EP3345064A4 (en) * | 2015-09-03 | 2019-05-01 | Commonwealth Scientific and Industrial Research Organisation | Unmanned aerial vehicle control techniques |
US9830706B2 (en) * | 2015-09-17 | 2017-11-28 | Skycatch, Inc. | Generating georeference information for aerial images |
EP3357040A4 (en) | 2015-09-30 | 2019-06-26 | Alarm.com Incorporated | Drone detection systems |
AU2016331221B2 (en) | 2015-09-30 | 2020-02-27 | Alarm.Com Incorporated | Drone-augmented emergency response services |
US9721158B2 (en) | 2015-10-19 | 2017-08-01 | Caterpillar Inc. | 3D terrain mapping system and method |
US9508263B1 (en) * | 2015-10-20 | 2016-11-29 | Skycatch, Inc. | Generating a mission plan for capturing aerial images with an unmanned aerial vehicle |
US10008123B2 (en) * | 2015-10-20 | 2018-06-26 | Skycatch, Inc. | Generating a mission plan for capturing aerial images with an unmanned aerial vehicle |
US20170187993A1 (en) * | 2015-12-29 | 2017-06-29 | Echostar Technologies L.L.C. | Unmanned aerial vehicle integration with home automation systems |
US10083616B2 (en) * | 2015-12-31 | 2018-09-25 | Unmanned Innovation, Inc. | Unmanned aerial vehicle rooftop inspection system |
US9744665B1 (en) * | 2016-01-27 | 2017-08-29 | X Development Llc | Optimization of observer robot locations |
US20180025649A1 (en) | 2016-02-08 | 2018-01-25 | Unmanned Innovation Inc. | Unmanned aerial vehicle privacy controls |
CN107195167B (en) | 2016-03-15 | 2019-11-08 | 天津远翥科技有限公司 | Communication system and method for a controlled device, and application of the controlled device |
US11242143B2 (en) | 2016-06-13 | 2022-02-08 | Skydio, Inc. | Unmanned aerial vehicle beyond visual line of sight control |
- 2016
  - 2016-03-31 US US15/088,005 patent/US20180025649A1/en not_active Abandoned
  - 2016-03-31 US US15/088,042 patent/US10762795B2/en active Active
  - 2016-04-08 US US15/094,802 patent/US9588516B1/en active Active
- 2017
  - 2017-03-03 US US15/449,846 patent/US11189180B2/en active Active
- 2020
  - 2020-08-31 US US17/008,344 patent/US11361665B2/en active Active
- 2021
  - 2021-10-27 US US17/512,323 patent/US11854413B2/en active Active
- 2023
  - 2023-12-22 US US18/394,152 patent/US20240257654A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160373699A1 (en) * | 2013-10-18 | 2016-12-22 | Aerovironment, Inc. | Privacy Shield for Unmanned Aerial Systems |
US20160364579A1 (en) * | 2014-02-24 | 2016-12-15 | Hewlett-Packard Development Company, L.P. | Privacy Zone |
US20170169713A1 (en) * | 2015-03-31 | 2017-06-15 | SZ DJI Technology Co., Ltd | Authentication systems and methods for generating flight regulations |
US20170148328A1 (en) * | 2015-11-25 | 2017-05-25 | International Business Machines Corporation | Dynamic geo-fence for drone |
US20170235018A1 (en) * | 2016-01-08 | 2017-08-17 | Pictometry International Corp. | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
Cited By (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10640208B2 (en) * | 2014-12-17 | 2020-05-05 | Picpocket Labs, Inc. | Drone based systems and methodologies for capturing images |
US20180265194A1 (en) * | 2014-12-17 | 2018-09-20 | Picpocket, Inc. | Drone based systems and methodologies for capturing images |
US20210375143A1 (en) * | 2015-03-31 | 2021-12-02 | SZ DJI Technology Co., Ltd. | Systems and methods for geo-fencing device communications |
US12067885B2 (en) * | 2015-03-31 | 2024-08-20 | SZ DJI Technology Co., Ltd. | Systems and methods for geo-fencing device communications |
US11961093B2 (en) | 2015-03-31 | 2024-04-16 | SZ DJI Technology Co., Ltd. | Authentication systems and methods for generating flight regulations |
US11699266B2 (en) * | 2015-09-02 | 2023-07-11 | Interdigital Ce Patent Holdings, Sas | Method, apparatus and system for facilitating navigation in an extended scene |
US20180182168A1 (en) * | 2015-09-02 | 2018-06-28 | Thomson Licensing | Method, apparatus and system for facilitating navigation in an extended scene |
US12254779B2 (en) | 2016-02-08 | 2025-03-18 | Skydio, Inc. | Unmanned aerial vehicle privacy controls |
US11854413B2 (en) | 2016-02-08 | 2023-12-26 | Skydio, Inc. | Unmanned aerial vehicle visual line of sight control |
US10762795B2 (en) | 2016-02-08 | 2020-09-01 | Skydio, Inc. | Unmanned aerial vehicle privacy controls |
US11361665B2 (en) | 2016-02-08 | 2022-06-14 | Skydio, Inc. | Unmanned aerial vehicle privacy controls |
US11189180B2 (en) | 2016-02-08 | 2021-11-30 | Skydio, Inc. | Unmanned aerial vehicle visual line of sight control |
US10377487B2 (en) * | 2016-03-29 | 2019-08-13 | Brother Kogyo Kabushiki Kaisha | Display device and display control method |
US10749952B2 (en) * | 2016-06-01 | 2020-08-18 | Cape Mcuas, Inc. | Network based operation of an unmanned aerial vehicle based on user commands and virtual flight assistance constraints |
US10789853B2 (en) * | 2016-06-10 | 2020-09-29 | ETAK Systems, LLC | Drone collision avoidance via air traffic control over wireless networks |
US9959772B2 (en) * | 2016-06-10 | 2018-05-01 | ETAK Systems, LLC | Flying lane management systems and methods for unmanned aerial vehicles |
US20190035287A1 (en) * | 2016-06-10 | 2019-01-31 | ETAK Systems, LLC | Drone collision avoidance via Air Traffic Control over wireless networks |
US11242143B2 (en) | 2016-06-13 | 2022-02-08 | Skydio, Inc. | Unmanned aerial vehicle beyond visual line of sight control |
US11897607B2 (en) | 2016-06-13 | 2024-02-13 | Skydio, Inc. | Unmanned aerial vehicle beyond visual line of sight control |
US20220075375A1 (en) * | 2016-06-30 | 2022-03-10 | Skydio, Inc. | Dynamically adjusting uav flight operations based on radio frequency signal data |
US12153430B2 (en) | 2016-06-30 | 2024-11-26 | Skydio, Inc. | Adjusting a UAV flight plan based on radio frequency signal data |
US11150654B2 (en) * | 2016-06-30 | 2021-10-19 | Skydio, Inc. | Dynamically adjusting UAV flight operations based on radio frequency signal data |
US11709491B2 (en) * | 2016-06-30 | 2023-07-25 | Skydio, Inc. | Dynamically adjusting UAV flight operations based on radio frequency signal data |
US10721375B1 (en) * | 2016-08-26 | 2020-07-21 | Amazon Technologies, Inc. | Vehicle camera contamination detection |
US10472062B2 (en) * | 2016-09-30 | 2019-11-12 | Optim Corporation | System, method, and program for controlling drone |
US20190210722A1 (en) * | 2016-09-30 | 2019-07-11 | Optim Corporation | System, method, and program for controlling drone |
US12122515B2 (en) | 2016-10-06 | 2024-10-22 | Gopro, Inc. | Systems and methods for controlling an unmanned aerial vehicle |
US11518510B1 (en) * | 2016-10-06 | 2022-12-06 | Gopro, Inc. | Systems and methods for controlling an unmanned aerial vehicle |
US11157866B2 (en) * | 2016-10-27 | 2021-10-26 | International Business Machines Corporation | Intelligent package delivery |
US20180121878A1 (en) * | 2016-10-27 | 2018-05-03 | International Business Machines Corporation | Intelligent package delivery |
US20220185487A1 (en) * | 2016-11-04 | 2022-06-16 | Sony Group Corporation | Circuit, base station, method, and recording medium |
US11292602B2 (en) * | 2016-11-04 | 2022-04-05 | Sony Corporation | Circuit, base station, method, and recording medium |
US12060154B2 (en) * | 2016-11-04 | 2024-08-13 | Sony Group Corporation | Circuit, base station, method, and recording medium |
US11288107B2 (en) * | 2016-11-23 | 2022-03-29 | Google Llc | Selective obfuscation of notifications |
US10977846B2 (en) | 2016-11-30 | 2021-04-13 | Gopro, Inc. | Aerial vehicle map determination |
US11704852B2 (en) | 2016-11-30 | 2023-07-18 | Gopro, Inc. | Aerial vehicle map determination |
US10198841B2 (en) * | 2016-11-30 | 2019-02-05 | Gopro, Inc. | Map view |
US20180150984A1 (en) * | 2016-11-30 | 2018-05-31 | Gopro, Inc. | Map View |
US11295621B2 (en) * | 2016-12-01 | 2022-04-05 | SZ DJI Technology Co., Ltd. | Methods and associated systems for managing 3D flight paths |
US11961407B2 (en) | 2016-12-01 | 2024-04-16 | SZ DJI Technology Co., Ltd. | Methods and associated systems for managing 3D flight paths |
US10909861B2 (en) * | 2016-12-23 | 2021-02-02 | Telefonaktiebolaget Lm Ericsson (Publ) | Unmanned aerial vehicle in controlled airspace |
US20180324662A1 (en) * | 2017-05-03 | 2018-11-08 | Qualcomm Incorporated | Determining whether a drone-coupled user equipment is engaged in a flying state |
US11496884B2 (en) | 2017-05-03 | 2022-11-08 | Qualcomm Incorporated | Exchanging a message including drone-coupled capability information between a drone-coupled user equipment and a component of a terrestrial wireless communication subscriber network |
US11490246B2 (en) * | 2017-05-03 | 2022-11-01 | Qualcomm Incorporated | Determining whether a drone-coupled user equipment is engaged in a flying state |
US11438760B2 (en) | 2017-05-03 | 2022-09-06 | Qualcomm Incorporated | Exchanging a message including an in-flight status indicator between a drone-coupled user equipment and a component of a terrestrial wireless communication subscriber network |
US10979854B2 (en) | 2017-06-02 | 2021-04-13 | Apple Inc. | Extending a radio map |
US10794986B2 (en) * | 2017-06-02 | 2020-10-06 | Apple Inc. | Extending a radio map |
US10878679B2 (en) * | 2017-07-31 | 2020-12-29 | Iain Matthew Russell | Unmanned aerial vehicles |
US10938102B2 (en) * | 2017-08-23 | 2021-03-02 | The United States Of America, As Represented By The Secretary Of The Navy | Search track acquire react system (STARS) drone integrated acquisition tracker (DIAT) |
US20190067812A1 (en) * | 2017-08-23 | 2019-02-28 | The United States Of America, As Represented By The Secretary Of The Navy | Search Track Acquire React System (STARS) Drone Integrated Acquisition Tracker (DIAT) |
US10962650B2 (en) | 2017-10-31 | 2021-03-30 | United States Of America As Represented By The Administrator Of Nasa | Polyhedral geofences |
US20190197254A1 (en) * | 2017-12-27 | 2019-06-27 | Honeywell International Inc. | Systems and methods for dynamically masking video and images captured a drone device camera |
US10922431B2 (en) * | 2017-12-27 | 2021-02-16 | Honeywell International Inc. | Systems and methods for dynamically masking video and images captured by a drone device camera |
US20190235501A1 (en) * | 2018-01-31 | 2019-08-01 | Walmart Apollo, Llc | System and method for identifying vehicle delivery locations utilizing scout autonomous vehicles |
US11948465B2 (en) * | 2018-03-23 | 2024-04-02 | SZ DJI Technology Co., Ltd. | Control method, device, and system for locking load carried by flight platform |
US20210005092A1 (en) * | 2018-03-23 | 2021-01-07 | SZ DJI Technology Co., Ltd. | Control method, device, and system |
US10866597B1 (en) * | 2018-05-07 | 2020-12-15 | Securus Technologies, Llc | Drone detection and interception |
US20210266461A1 (en) * | 2018-07-04 | 2021-08-26 | c/o H3 DYNAMICS PTE. LTD. | Defect detection system using a camera equipped uav for building facades on complex asset geometry with optimal automatic obstacle deconflicted flightpath |
US11112249B1 (en) * | 2018-09-24 | 2021-09-07 | Rockwell Collins, Inc. | Systems and methods for four-dimensional routing around concave polygon avoidances |
EP3671679A1 (en) * | 2018-12-17 | 2020-06-24 | Robert Bosch GmbH | Dynamic masking or hiding of areas of a field of view |
US11377231B2 (en) * | 2019-02-06 | 2022-07-05 | Honeywell International Inc. | Automatically adjustable landing lights for aircraft |
US20220166939A1 (en) * | 2019-03-13 | 2022-05-26 | Sony Group Corporation | Information processing apparatus, method, and recording medium |
US11122424B1 (en) | 2019-05-14 | 2021-09-14 | Hood Mountain, LLC | Systems, methods and apparatus for data privacy protection based on geofence networks |
US11910185B1 (en) * | 2019-05-14 | 2024-02-20 | Bennett Hill Branscomb | Systems, methods and apparatus for data privacy protection based on geofence networks |
US20200036886A1 (en) * | 2019-08-16 | 2020-01-30 | Lg Electronics Inc. | Method for photographing an unmanned aerial robot and a device for supporting the same in an unmanned aerial vehicle system |
US11868145B1 (en) | 2019-09-27 | 2024-01-09 | Amazon Technologies, Inc. | Selecting safe flight routes based on localized population densities and ground conditions |
US12039846B2 (en) * | 2019-12-19 | 2024-07-16 | 4Dream Co., Ltd. | Method, apparatus, and system for protecting private information from illegal photography by unmanned aerial vehicle |
US20230038872A1 (en) * | 2019-12-19 | 2023-02-09 | 4Dream Co., Ltd. | Method, apparatus, and system for protecting private information from illegal photography by unmanned aerial vehicle |
US12164309B2 (en) * | 2020-03-02 | 2024-12-10 | Clrobur Co., Ltd. | Drone control system and intelligent drone flight planning method thereof |
US20220147066A1 (en) * | 2020-03-02 | 2022-05-12 | Clrobur Co., Ltd. | Drone control system and intelligent drone flight planning method thereof |
US11579611B1 (en) * | 2020-03-30 | 2023-02-14 | Amazon Technologies, Inc. | Predicting localized population densities for generating flight routes |
US11640764B1 (en) | 2020-06-01 | 2023-05-02 | Amazon Technologies, Inc. | Optimal occupancy distribution for route or path planning |
JP7420048B2 (en) | 2020-10-22 | 2024-01-23 | トヨタ自動車株式会社 | Control devices, systems, programs, control equipment, aircraft, sensors and system operation methods |
US12159542B2 (en) * | 2020-10-22 | 2024-12-03 | Toyota Jidosha Kabushiki Kaisha | Control device, system, program, control instrument, flying object, sensor, and method of operating system |
US20220130259A1 (en) * | 2020-10-22 | 2022-04-28 | Toyota Jidosha Kabushiki Kaisha | Control device, system, program, control instrument, flying object, sensor, and method of operating system |
JP2022068718A (en) * | 2020-10-22 | 2022-05-10 | トヨタ自動車株式会社 | How to operate controls, systems, programs, control equipment, flying objects, sensors and systems |
US20220201253A1 (en) * | 2020-12-22 | 2022-06-23 | Axis Ab | Camera and a method therein for facilitating installation of the camera |
US11825241B2 (en) * | 2020-12-22 | 2023-11-21 | Axis Ab | Camera and a method therein for facilitating installation of the camera |
WO2022219655A1 (en) * | 2021-04-14 | 2022-10-20 | Vlab S.R.L. | Monitoring method for monitoring environments and related monitoring device |
IT202100009311A1 (en) * | 2021-04-14 | 2022-10-14 | Vlab S.R.L. | Monitoring method for monitoring environments and related monitoring device |
US20230171318A1 (en) * | 2021-12-01 | 2023-06-01 | International Business Machines Corporation | Management of devices in a smart environment |
US11805175B2 (en) * | 2021-12-01 | 2023-10-31 | International Business Machines Corporation | Management of devices in a smart environment |
US11997561B2 (en) * | 2022-04-07 | 2024-05-28 | Caterpillar Paving Products Inc. | System and method for defining an area of a worksite |
US20230328477A1 (en) * | 2022-04-07 | 2023-10-12 | Caterpillar Paving Products Inc. | System and method for defining an area of a worksite |
Also Published As
Publication number | Publication date |
---|---|
US20180025473A1 (en) | 2018-01-25 |
US11361665B2 (en) | 2022-06-14 |
US10762795B2 (en) | 2020-09-01 |
US20200402410A1 (en) | 2020-12-24 |
US20220148438A1 (en) | 2022-05-12 |
US11189180B2 (en) | 2021-11-30 |
US20240257654A1 (en) | 2024-08-01 |
US20170229022A1 (en) | 2017-08-10 |
US11854413B2 (en) | 2023-12-26 |
US9588516B1 (en) | 2017-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11361665B2 (en) | Unmanned aerial vehicle privacy controls | |
US11897607B2 (en) | Unmanned aerial vehicle beyond visual line of sight control | |
US20220388656A1 (en) | Unmanned Aerial Vehicle Area Surveying | |
US20210358315A1 (en) | Unmanned aerial vehicle visual point cloud navigation | |
US12153430B2 (en) | Adjusting a UAV flight plan based on radio frequency signal data | |
US12039875B2 (en) | Unmanned aerial vehicle rooftop inspection system | |
US20220176846A1 (en) | Unmanned Aerial Vehicle Remote Flight Planning System | |
WO2017139282A1 (en) | Unmanned aerial vehicle privacy controls | |
US9915946B2 (en) | Unmanned aerial vehicle rooftop inspection system | |
US12254779B2 (en) | Unmanned aerial vehicle privacy controls | |
WO2017147142A1 (en) | Unmanned aerial vehicle visual line of sight control | |
US12235639B1 (en) | Unmanned aerial vehicle flight control system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: UNMANNED INNOVATION INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CONTRERAS, DANA LIVONIA;DOYLE, LUCAS PALAGE;KUEHN, JUSTIN EUGENE;AND OTHERS;REEL/FRAME:038166/0506
Effective date: 20160331
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: SKYDIO, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AIRWARE, LLC;REEL/FRAME:053144/0633
Effective date: 20200626
Owner name: AIRWARE, LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:UNMMANED INNOVATION, INC.;REEL/FRAME:053144/0591
Effective date: 20190411
|
AS | Assignment |
Owner name: AIRWARE, LLC, CALIFORNIA
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE THE CONVEYING PARTY DATA WAS ERRONEOUSLY ENTER AS UNMMANED INNOVATIONS, INC. IT SHOULD READ UNMANNED INNOVATIONS, INC PREVIOUSLY RECORDED AT REEL: 053144 FRAME: 0591. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:UNMANNED INNOVATION, INC.;REEL/FRAME:053210/0586
Effective date: 20190411
|
AS | Assignment |
Owner name: SILICON VALLEY BANK, CALIFORNIA
Free format text: SECURITY INTEREST;ASSIGNOR:SKYDIO, INC.;REEL/FRAME:058053/0768
Effective date: 20211108