US20160042637A1 - Drone Safety Alert Monitoring System and Method - Google Patents
- Publication number
- US20160042637A1 (application Ser. No. 14/824,011, filed 2015)
- Authority
- US
- United States
- Prior art keywords
- alarm
- drone
- mobile computing
- computing device
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G08B25/10—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium, using wireless transmission systems
- G08G5/0069
- G08G5/0078
- G16H40/67—ICT specially adapted for the management or operation of medical equipment or devices, for remote operation
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
- H04W76/023
- H04W76/50—Connection management for emergency connections
- B64U10/13—Rotorcraft UAVs: flying platforms
- B64U2101/30—UAVs specially adapted for imaging, photography or videography
- B64U2201/104—UAVs with autonomous flight controls using satellite radio beacon positioning systems, e.g. GPS
- B64U2201/20—UAVs characterised by their flight controls: remote controls
- G08B13/1965—Intrusion detection using television cameras, the system being specially adapted for intrusion detection in or around an aircraft
- G08B25/001—Alarm cancelling procedures or alarm forwarding decisions, e.g. based on absence of alarm confirmation
Definitions
- the present systems and methods relate generally to systems and methods for monitoring by a drone or unmanned aerial vehicle (UAV) that is triggered by an alarm notification message sent by a wearable device and/or a mobile computing device. More particularly, the systems and methods receive an alarm trigger and send an alarm notification message, including location information and a unique identifier representing identifying information, to a server. The server sends the location information to the drone. The drone travels to a location using the location information and begins monitoring the location.
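- as an illustration of this flow, the minimal sketch below (Python) shows how a safety application might assemble and send such an alarm notification message; the endpoint URL, JSON field names, and transport details are assumptions for illustration only and are not specified by the disclosure.

```python
# Minimal sketch (assumed field names and endpoint) of sending an alarm notification
# message containing location information and a unique identifier to the server.
import json
import urllib.request

ALARM_SERVER_URL = "https://alarm-response.example.com/alarm-notifications"  # hypothetical

def send_alarm_notification(unique_identifier: str, latitude: float, longitude: float) -> int:
    """Package the current location and the unique identifier and post them to the
    alarm response server; returns the HTTP status code."""
    message = {
        "unique_identifier": unique_identifier,   # identifies stored identifying information
        "location": {"latitude": latitude, "longitude": longitude},
    }
    request = urllib.request.Request(
        ALARM_SERVER_URL,
        data=json.dumps(message).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example: an alarm triggered from a device registered under member id "M-1024".
# send_alarm_notification("M-1024", 33.7490, -84.3880)
```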
- Mobile computing devices have gradually become a ubiquitous part of daily life. Traditionally, a mobile computing device such as a smartphone may be carried on a person in a pocket, a purse, a briefcase, a backpack, a messenger bag, etc. In other situations, the mobile computing device may be located nearby a person, such as on a table or in a car. In nearly all of these instances, users of smartphones and tablets have access to a portable device that is capable of communicating with others, capable of executing applications, and capable of sending and receiving information to other devices.
- While mobile computing devices provide users the ability to communicate with others and reach out for help in the event of an emergency, it may be difficult or impossible to efficiently and accurately provide critical information to an emergency dispatch center when time is of the essence.
- first responders may have to travel to the person to provide assistance. While the first responders attempt to arrive as soon as possible, there is typically a period of time before the first responders are able to arrive. During this period of time, valuable evidence may be lost.
- aspects of the present disclosure generally relate to methods and systems for monitoring by a drone or unmanned aerial vehicle (UAV) that is triggered by an alarm notification provided through an application provided through a mobile device.
- a mobile device may be any mobile computing platform, including a smartphone, a tablet computer, a wearable device, etc.
- a user provides identifying information to a wearable safety application executed by a wearable device and/or a mobile safety application executed by a mobile computing device.
- the wearable device and/or the mobile computing device send the identifying information to an alarm response server and the alarm response server stores the identifying information in a database.
- the alarm response server associates the identifying information with a unique identifier and sends the unique identifier to the wearable device and/or the mobile computing device. If an emergency occurs, the user may trigger the wearable safety application and/or the mobile safety application.
- the wearable safety application and/or the mobile safety application send an alarm notification message including location information and the unique identifier to the alarm response server.
- the alarm response server determines one or more personal safety answering points (PSAP) based on the location information. If the alarm notification is verified, e.g., the alarm notification is not a false alarm, the alarm response server sends the location information and the identifying information to a call center server associated with the one or more PSAPs for further action by emergency responders.
- the alarm response server sends the identifying information and the location information to one or more drones.
- the one or more drones travel to a location based on the location information and begin monitoring.
- the call center server also may send the location information and the identifying information to one or more lifelines, e.g., a person to contact in the event of an emergency.
- a drone safety alert monitoring system includes one or more processors to receive identifying information and transmit the identifying information to an alarm response server from a mobile computing device, receive, by the mobile computing device, a unique identifier that identifies the identifying information in a database associated with the alarm response server, receive a trigger of an alarm notification by one of a wearable device and the mobile computing device, determine a current location of the mobile computing device, transmit an alarm notification message to the alarm response server, the alarm notification message including the current location of the mobile computing device and the unique identifier, and transmit the current location of the mobile computing device to at least one drone.
- aspects of the present disclosure may provide police and other emergency responders with a system that supplements or complements their work.
- a police officer (e.g., through a mobile or wearable device, including a computer in a police car) may issue an alarm notification. Location information is sent to an alarm response server, which communicates the location information to a drone, which is deployed to the location to capture monitoring information (e.g., photographs, video, audio, etc.) at the location.
- a police officer may issue an alarm notification for a location at which a car has been pulled over.
- the officer may place a target on the car that has been pulled over.
- the drone may identify the target as the object to follow. Then, if the car starts moving, the drone may follow the car, and target, without further instruction from the alarm response server or other outside party. Using a target the drone can follow moving objects without the need for further instruction.
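- a sketch of this target-following behavior is shown below; the camera and flight-controller interfaces (locate, goto) are hypothetical placeholders, and the distance check is a rough planar approximation.

```python
# Sketch (with hypothetical sensor/flight-control helpers) of the target-following
# behavior described above: once a target is designated, the drone keeps itself over
# the target using successive position estimates, with no further instruction from
# the alarm response server or other outside party.
import time
from dataclasses import dataclass

@dataclass
class Position:
    latitude: float
    longitude: float

class TargetFollower:
    def __init__(self, camera, flight_controller, target_id: str):
        self.camera = camera                          # assumed to expose locate(target_id)
        self.flight_controller = flight_controller    # assumed to expose goto(lat, lon)
        self.target_id = target_id

    def follow(self, poll_seconds: float = 1.0, moved_threshold_m: float = 5.0):
        last_position = None
        while True:
            position = self.camera.locate(self.target_id)  # estimate target position
            if position is None:
                break  # target lost; a real system would search or report back
            if last_position is None or self._distance_m(last_position, position) > moved_threshold_m:
                # Target moved: command the drone toward the new position.
                self.flight_controller.goto(position.latitude, position.longitude)
                last_position = position
            time.sleep(poll_seconds)

    @staticmethod
    def _distance_m(a: Position, b: Position) -> float:
        # Rough planar approximation (0.77 is roughly cos(latitude) at mid-latitudes),
        # adequate for short following distances.
        return (((a.latitude - b.latitude) * 111_000) ** 2
                + ((a.longitude - b.longitude) * 111_000 * 0.77) ** 2) ** 0.5
```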
- FIG. 1 illustrates a block diagram of a drone safety alert monitoring system according to an example embodiment.
- FIG. 2 illustrates example information in an alarm response (PSAP) database according to an example embodiment.
- FIG. 3 is a flowchart illustrating a process for monitoring by the drone safety alert monitoring system according to an example embodiment.
- FIG. 4 illustrates a block diagram of an example computer device for use with the example embodiments.
- FIGS. 5-7 illustrate example screenshots of a mobile safety application executed by a mobile computing device according to an example embodiment.
- FIG. 8 illustrates a perspective view of a drone according to an example embodiment.
- FIG. 9 illustrates a perspective view of a wearable device according to an example embodiment.
- FIG. 10 illustrates another perspective view of a wearable device according to an example embodiment.
- FIG. 11 illustrates a command center graphical user interface (GUI) according to an example embodiment.
- FIG. 1 illustrates a block diagram of a drone safety alert monitoring system 100 according to an example embodiment.
- the drone safety alert monitoring system 100 includes one or more drones 102 .
- the drone safety alert monitoring system 100 further includes one or more optional wearable devices 104 , one or more mobile computing devices 106 , one or more alarm response servers 108 , one or more databases 110 , one or more call center servers 112 , and a communication network 114 .
- the one or more computing devices communicate and coordinate their actions by passing messages over the communication network 114 .
- the communication network 114 can be one or more of the Internet, an intranet, a cellular communications network, a WiFi network, a packet network, or another wired or wireless communication network.
- the one or more computing devices communicate data in packets, messages, or other communications using a common protocol, e.g., Hypertext Transfer Protocol (HTTP) and/or Hypertext Transfer Protocol Secure (HTTPS).
- the drone safety alert monitoring system 100 may be a cloud-based computer system or a distributed computer system.
- the one or more computing devices may communicate based on representational state transfer (REST) and/or Simple Object Access Protocol (SOAP). As an example, a first computer (e.g., a client computer) may send a request to a second computer (e.g., a server computer), and the second computer may respond with data formatted using JavaScript Object Notation (JSON) and/or Extensible Markup Language (XML).
- the embodiments described herein may be based on OAuth, an open standard for authorization.
- OAuth allows producers of web services to grant third-party access to web resources without sharing usernames and/or passwords.
- the web resources may be the one or more drones 102 , the one or more alarm response servers 108 , the one or more databases 110 , and the one or more call center servers 112 .
- OAuth provides one application with one access token providing access to a subset of web resources on behalf of one user, similar to a valet key.
- the embodiments may be related to OAuth 2.0. While discussed in the context of OAuth, the present disclosure is not limited to OAuth.
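- as a hedged illustration of how OAuth 2.0 might be used here, the sketch below obtains an access token with the client-credentials grant and presents it as a bearer token when requesting alarm resources; the token endpoint, client credentials, scope, and API URL are hypothetical.

```python
# Sketch of an OAuth 2.0 client-credentials token request, as one possible way a call
# center server or drone could be granted scoped access to alarm resources without
# sharing user passwords. The endpoint, client id/secret, and scope are placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"          # hypothetical
ALARM_API = "https://alarm-response.example.com/api/alarms"  # hypothetical

def fetch_access_token(client_id: str, client_secret: str, scope: str) -> str:
    response = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials", "scope": scope},
        auth=(client_id, client_secret),  # HTTP Basic client authentication
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["access_token"]

def list_active_alarms(token: str) -> list:
    # The bearer token grants access only to the scoped subset of web resources.
    response = requests.get(
        ALARM_API,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```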
- the drone safety alert monitoring system 100 may be deployed or located at a particular site including a city, a town, a college campus, a corporate campus, outdoor venue, e.g., a concert venue, an indoor venue, e.g., an arena, and other locations.
- the drone safety alert monitoring system 100 may include one or more hangars to house the one or more drones 102 .
- a college campus may have a single hangar housing the one or more drones 102 .
- the one or more hangars may be distributed throughout the college campus.
- Each hangar may be located equidistant from other hangars, e.g., each hangar may cover a particular grid on the particular site.
- each hangar also may be located in a particular location at the particular site based on previously reported emergencies and/or population density.
- the particular site may include four grids each having an equal size of 1000 feet × 1000 feet.
- One hangar may be located in the center of each grid.
- Each hangar may house one or more drones 102 to quickly and efficiently service any particular location in each grid.
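- one possible way to pick which hangar's drone to dispatch, assuming the grid layout described above, is sketched below; the coordinates and grid size are illustrative only.

```python
# Sketch of selecting the hangar (and thus the drone) closest to an alarm location
# when a site is divided into equal grids with one hangar at the center of each.
import math

GRID_FEET = 1000.0  # each grid is 1000 feet x 1000 feet, per the example above

# Hangar positions expressed in feet from the site's south-west corner (x, y).
HANGARS = {
    "hangar-1": (500.0, 500.0),    # center of grid 1
    "hangar-2": (1500.0, 500.0),   # center of grid 2
    "hangar-3": (500.0, 1500.0),   # center of grid 3
    "hangar-4": (1500.0, 1500.0),  # center of grid 4
}

def nearest_hangar(alarm_x_ft: float, alarm_y_ft: float) -> str:
    """Return the hangar whose drone can reach the alarm location most quickly,
    assuming equal drone speeds and straight-line flight."""
    return min(
        HANGARS,
        key=lambda name: math.hypot(HANGARS[name][0] - alarm_x_ft,
                                    HANGARS[name][1] - alarm_y_ft),
    )

# Example: an alarm raised at (1200 ft, 300 ft) would be served by hangar-2.
# print(nearest_hangar(1200.0, 300.0))
```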
- Each hangar may be outfitted with one or more alternating current (AC) power sockets and one or more chargers for charging the one or more batteries of the drone 102 .
- the drone 102 may be housed in a hangar located on a roof of a building, a garage, or another location.
- FIG. 1 illustrates a block diagram of the drone 102 according to an example embodiment.
- the drone 102 may be a computer having one or more processors 116 and memory 118 , including but not limited to an unmanned aerial system (UAS) and an unmanned aerial vehicle (UAV).
- the drone 102 is not limited to an unmanned aircraft device or a UAV and may be other types of unmanned vehicles.
- the drone 102 may be an unmanned ground vehicle (UGV) having wheels, legs, or a continuous track, an unmanned vehicle traveling on rails, e.g., an unmanned train, an unmanned boat, and an unmanned hovercraft, among other vehicles.
- the drone 102 may be an autonomous or remote-controlled vehicle, an autonomous or remote-controlled car, an autonomous or remote-controlled train, an autonomous or remote-controlled boat, or an autonomous or remote-controlled hovercraft, among other vehicles.
- the majority of the description provided herein refers to the drone 102 as being a UAV.
- the flight and operation of the drone 102 may be controlled autonomously by the one or more processors 116 , another computer (e.g., the one or more alarm response servers 108 or another mobile computing device), and/or by one or more users via a remote control.
- the drone 102 may further include one or more cameras 103 , one or more light sources, one or more microphones 105 , one or more sensors including a gyroscope, an accelerometer, a magnetometer, an ultrasound sensor, an altimeter, an air pressure sensor, a motion sensor, and other sensors, one or more rotors, one or more motors, and one or more batteries for powering the drone 102 .
- the camera 103 may be a high-definition camera capable of recording high-definition video (e.g., any video image with more than 480 horizontal lines and/or captured at rates greater than 60 frames per second).
- the camera 103 may include analog zoom and/or digital zoom and may zoom in or out during operation.
- the camera 103 also may be a thermal vision camera or a night vision camera.
- the drone 102 may be battery-powered and/or powered by another source, e.g., gasoline.
- the drone 102 may have a hull that comprises carbon fiber components, plastic components, metal components, and other components.
- the drone 102 may communicate with another drone 102 , the wearable device 104 , the mobile computing device 106 , the alarm response server 108 , and/or the call center server 112 using at least one of Bluetooth, WiFi, a wired network, a wireless network, and a cellular network. According to an example embodiment, at least the drone 102 and the mobile computing device 106 may communicate wirelessly.
- the drone 102 reverse geocodes a current location of the drone 102 using global positioning system (GPS) hardware.
- the GPS hardware communicates with a GPS satellite-based positioning system.
- the GPS hardware may be an assisted GPS system, e.g., A-GPS or aGPS, or may be a standalone GPS. Standalone GPS only uses radio signals from the satellite-based positioning system.
- An assisted GPS system uses network resources available to the drone 102 to locate and use the satellite-based positioning system in poor signal conditions, such as in a city where signals bounce off of buildings or pass through walls or tree cover.
- the one or more processors 116 may process machine/computer-readable executable instructions and data, and the memory 118 may store machine/computer-readable executable instructions and data including one or more applications, including a monitoring safety application 120 .
- the processor 116 and memory 118 are hardware.
- the memory 118 includes random access memory (RAM) and non-transitory memory, e.g., a non-transitory computer-readable medium such as one or more flash disks or hard drives.
- the non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like.
- the monitoring safety application 120 may be a component of an application and/or service executable by the drone 102 .
- the monitoring safety application 120 may be a single unit of deployable executable code.
- the monitoring safety application 120 may also be one application and/or a suite of applications for monitoring a person that triggers an alarm notification.
- the monitoring safety application 120 receives a location of the person, determines a route for the drone 102 to fly to the person using the GPS hardware, routes the drone 102 to the person based on the route, and monitors the person using video, photographs, and/or audio from the camera 103 .
- upon receipt of the alarm notification, the drone 102 receives the location and determines a shortest and/or quickest route to the location.
- the route also may be determined by another computing device and transmitted to the drone 102 .
- the alarm response server 108 may determine the route.
- the route may be determined based on weather conditions and obstacles including buildings, trees, power lines, and other obstacles.
- the drone 102 takes off and may travel at a first particular altitude and a particular speed to the location.
- the drone 102 may also travel at a variable altitude that could change during the flight and a variable speed that could change during the flight.
- the drone 102 may pass through one or more waypoints on the route to the destination.
- the waypoints may be automatically assigned or may be assigned by an operator or user before or during flight.
- the waypoints may be used to avoid obstacles, avoid a populated area, or for another reason.
- the particular altitude may be 50 feet, 100 feet, 200 feet, 1000 feet, 2000 feet, and other altitudes.
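- a simple routing sketch along these lines is shown below: it computes the great-circle distance through optional waypoints and an estimated flight time at an assumed cruise speed and altitude; a deployed planner would also weigh weather conditions and obstacles as described above.

```python
# Sketch of a simple route computation for the drone: great-circle distance to the
# alarm location, optional intermediate waypoints, and an estimated travel time at a
# chosen cruise speed and altitude. Speed and altitude values are illustrative.
import math
from typing import List, Tuple

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Great-circle distance in meters between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(h))

def plan_route(start: Tuple[float, float],
               destination: Tuple[float, float],
               waypoints: List[Tuple[float, float]] = (),
               cruise_speed_mps: float = 15.0,
               cruise_altitude_ft: float = 200.0) -> dict:
    """Return the ordered route, total distance, and an estimated flight time."""
    legs = [start, *waypoints, destination]
    distance_m = sum(haversine_m(legs[i], legs[i + 1]) for i in range(len(legs) - 1))
    return {
        "route": legs,
        "cruise_altitude_ft": cruise_altitude_ft,
        "distance_m": round(distance_m, 1),
        "eta_seconds": round(distance_m / cruise_speed_mps, 1),
    }

# Example: one waypoint inserted to route around a populated area or obstacle.
# plan_route((33.7756, -84.3963), (33.7810, -84.3880), waypoints=[(33.7790, -84.3950)])
```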
- the drone 102 travels to the location based on the route and upon arrival the drone 102 begins monitoring. Once the drone 102 arrives at the location, the drone 102 may hover at a second particular altitude above the person for a particular period of time. The second particular altitude may be the same altitude as the first particular altitude or a different altitude from the first particular altitude.
- the video, photographs, and/or the audio may be based on an aerial view of the person.
- the drone 102 may land at the location or near the location and the video, photographs, and/or the audio may be based on a terrestrial view of the person.
- the monitoring safety application 120 determines and builds one or more data structures comprising a three-dimensional environment, tracks objects including the person and obstacles, and records information.
- the monitoring safety application 120 monitors the person using the one or more cameras 103 and the one or more microphones 105 .
- the monitoring safety application 120 records video, photographs, and/or audio.
- the drone 102 may determine whether the mobile computing device 106 , the wearable device 104 , and/or the person are currently moving or stationary. If the mobile computing device 106 , the wearable device 104 , and/or the person is moving, the drone 102 tracks and follows the person and continues to record video, photographs, and/or audio.
- the monitoring safety application 120 streams the video and/or the audio to the alarm response server 108 and/or the call center server 112 .
- the monitoring safety application 120 stores the video, photographs, and/or the audio in the memory 118 .
- the monitoring safety application 120 communicates data and messages with the mobile computing device 106 , the alarm response server 108 , and/or the call center server 112 using the communication network 114 .
- the drone 102 may further include an optional display/output device 107 and an input device 109 .
- the display/output device 107 is used to provide status information about the drone 102 including a current battery level or fuel level, a flying status (e.g., ascending/descending), and other information.
- the output device 107 may be one or more light emitting diodes, e.g., a light emitting diode that flashes while the drone 102 is in operation.
- the display may indicate the status information.
- the display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays.
- the input device 109 is used to interact with the drone 102 and may include one or more hardware buttons.
- the hardware buttons may include an on/off button and other buttons.
- the input device 109 may be included within the display if the display is a touch screen display. The input device 109 allows a user of the drone 102 to manipulate and interact with the monitoring safety application 120 .
- the drone 102 may also include an optional remote control receiver that operates with the input device 109 for receiving information from an optional remote control transmitter.
- the remote control transmitter transmits information to the remote control receiver to monitor, control, and operate the drone 102 .
- the remote control transmitter may be a dedicated device comprising one or more processors and memory or a computer such as the alarm response server 108 or the call center server 112 .
- a first drone 102 may communicate with a second drone 102 .
- the first drone 102 and the second drone 102 may travel to the location and cooperate to simultaneously monitor video, photographs, and/or the audio from multiple vantage points and/or multiple angles.
- the first drone 102 and the second drone 102 may stream the video and/or the audio to the alarm response server 108 and/or the call center server 112 .
- the first drone 102 and the second drone 102 may store the video, photographs, and/or the audio in the memory 118 .
- the first drone 102 and the second drone 102 may communicate data and messages with the mobile computing device 106 , the alarm response server 108 , and/or the call center server 112 using the communication network 114 .
- the first drone 102 may travel to the location at a first time and monitor video, photographs, and/or the audio.
- the first drone 102 may send a message to the alarm response server 108 and/or a second drone 102 .
- the second drone 102 may travel to the location at a second time and monitor video, photographs, and/or the audio.
- the first drone 102 and the second drone 102 may cooperate to seamlessly monitor video, photographs, and/or the audio for an extended period of time that may be longer than the life of a battery of a single drone.
- the second drone 102 may send a message to the alarm response server 108 and/or a third drone 102 and the monitoring process may continue by the third drone 102 , and so on.
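- a sketch of this relay behavior is shown below; the battery-level thresholds, the drone interface, and the request_relief_drone message are assumptions used only to illustrate the handoff.

```python
# Sketch of the relay behavior described above: the monitoring drone asks the alarm
# response server to dispatch the next drone when its battery runs low, so monitoring
# can continue seamlessly beyond the life of a single battery. The battery interface
# and messaging helper are hypothetical.
import time

HANDOFF_BATTERY_PCT = 25   # request a relief drone at this battery level
RETURN_BATTERY_PCT = 15    # return to the hangar at this battery level

def monitor_with_handoff(drone, alarm_server, alarm_id: str, poll_seconds: float = 5.0):
    handoff_requested = False
    drone.start_recording()                      # video, photographs, and/or audio
    while True:
        level = drone.battery_percent()
        if level <= HANDOFF_BATTERY_PCT and not handoff_requested:
            # Ask the server (or a peer drone) to send the next drone to the location.
            alarm_server.request_relief_drone(alarm_id, drone.current_location())
            handoff_requested = True
        if level <= RETURN_BATTERY_PCT:
            # The relief drone should now be on station; stop recording and recharge.
            drone.stop_recording()
            drone.return_to_hangar()
            break
        time.sleep(poll_seconds)
```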
- FIG. 1 illustrates a block diagram of the optional wearable device 104 according to an example embodiment.
- the wearable device 104 may be a computer having one or more processors 122 and memory 124 , including but not limited to a watch, a necklace, a pendant, a hair clip, a hair tie, a pin, a tie clip/tack, a ring, a cufflink, a belt clip, a scarf, a pashmina, a wrap, a shawl, a garment, a keychain, another small mobile computing device, or a dedicated electronic device having a processor 122 and memory 124 .
- the wearable device 104 may be a Bluetooth Low Energy (BLE, Bluetooth LE, Bluetooth Smart) Device based on the Bluetooth 4.0 specification or another specification. According to an example embodiment, the wearable device 104 and the mobile computing device 106 are paired and communicate wirelessly using a short-range wireless network, e.g., Bluetooth.
- the wearable device 104 may create a personal area network and/or a mesh network for communicating with the one or more mobile computing devices 106 and/or the one or more drones 102 . Additionally, the wearable device 104 , the mobile computing device 106 , and the one or more drones 102 may communicate using Zigbee, Wi-Fi, near field magnetic inductance, sonic (sound) waves, and/or infrared (light) waves.
- the wearable device 104 may be a smart watch such as a GARMIN™ smart watch, a Pebble™ smart watch, a SAMSUNG™ Galaxy Gear smart watch, an ANDROID™ based smart watch, an APPLE™ and/or iOS™-based smart watch, a Tizen™ smart watch, and a VALRT™ wearable device, among others.
- the one or more processors 122 may process machine/computer-readable executable instructions and data, and the memory 124 may store machine/computer-readable executable instructions and data including one or more applications, including a wearable safety application 126 .
- the processor 122 and memory 124 are hardware.
- the memory 124 includes random access memory (RAM) and non-transitory memory, e.g., a non-transitory computer-readable medium such as one or more flash disks or hard drives.
- the non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like.
- the wearable safety application 126 may be a component of an application and/or service executable by the wearable device 104 .
- the wearable safety application 126 may be a single unit of deployable executable code.
- the wearable safety application 126 may also be one application and/or a suite of applications for triggering an alarm notification.
- the wearable safety application 126 sends an alarm notification directly to the alarm response server 108 .
- the wearable safety application 126 sends the alarm notification to the mobile computing device 106 and the mobile computing device 106 forwards the alarm notification to the alarm response server 108 .
- the wearable safety application 126 may be a web-based application viewed in a browser on the wearable device 104 and/or a native application executed by the wearable device 104 .
- the wearable safety application 126 may be downloaded from the Internet and/or digital distribution platforms, e.g., directly from a website or an app store such as the Pebble™ appstore, the (iOS™) App Store, and GOOGLE PLAY™, among others.
- the wearable safety application 126 communicates messages with the mobile computing device 106 and/or the alarm response server 108 using the communication network 114 .
- the wearable device 104 may further include an optional display and an input device.
- the display is used to display visual components of the wearable safety application 126 , such as at a user interface.
- the user interface may display a user interface of the wearable safety application 126 .
- the display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays.
- the input device is used to interact with the wearable safety application 126 and may include one or more hardware buttons.
- the input device may be included within the display if the display is a touch screen display.
- the input device allows a user of the wearable device 104 to manipulate and interact with the user interface of the wearable safety application 126 .
- FIG. 1 also illustrates a block diagram of the mobile computing device 106 according to an example embodiment.
- the mobile computing device 106 may be a computer having one or more processors 128 and memory 130 , including but not limited to a server, laptop, desktop, tablet computer, smartphone, or a dedicated electronic device having a processor 128 and memory 130 .
- the one or more processors 128 may process machine/computer-readable executable instructions and data
- the memory 130 may store machine/computer-readable executable instructions and data including one or more applications, including a mobile safety application 132 .
- the processor 128 and memory 130 are hardware.
- the memory 130 includes random access memory (RAM) and non-transitory memory, e.g., a non-transitory computer-readable medium such as one or more flash disks or hard drives.
- the non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like.
- the mobile safety application 132 may be a component of an application and/or service executable by the mobile computing device 106 .
- the mobile safety application 132 may be a single unit of deployable executable code.
- the mobile safety application 132 may also be one application and/or a suite of applications for triggering an alarm notification.
- the mobile safety application 132 sends an alarm notification directly to the alarm response server 108 .
- the mobile safety application 132 receives the alarm notification from the wearable device 104 and the mobile computing device 106 forwards the alarm notification to the alarm response server 108 .
- the mobile safety application 132 may be a web-based application viewed in a browser on the mobile computing device 106 and/or a native application executed by the mobile computing device 106 .
- the application may be downloaded from the Internet and/or digital distribution platforms, e.g., directly from a website, the Mac™ App Store, the (iOS™) App Store, and/or GOOGLE PLAY™, among others.
- the mobile safety application 132 is an iOS™ application, an Android™ application, or a Windows™ Phone application.
- the mobile safety application 132 communicates messages with the drone 102 , the wearable device 104 and/or the alarm response server 108 using the communication network 114 .
- the mobile computing device 106 includes global positioning system (GPS) hardware.
- the GPS hardware communicates with a GPS satellite-based positioning system.
- the mobile computing device 106 may further include an optional display and an input device.
- the display is used to display visual components of the mobile safety application 132 , such as at a user interface.
- the user interface may display a user interface of the mobile safety application 132 .
- the display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays.
- the input device is used to interact with the mobile safety application 132 and may include a mouse, a keyboard, a trackpad, and/or the like.
- the input device may be included within the display if the display is a touch screen display.
- the input device allows a user of the mobile computing device 106 to manipulate and interact with the user interface of the mobile safety application 132 .
- FIG. 1 further illustrates a block diagram of the alarm response server 108 according to an example embodiment.
- the alarm response server 108 is a computer having one or more processors 134 and memory 136 .
- the alarm response server 108 may be, for example, a laptop, desktop, a server, tablet computer, mobile computing device (e.g., a smart phone) or a dedicated electronic device having a processor 134 and memory 136 .
- the alarm response server 108 includes one or more processors 134 to process data and memory 136 to store machine/computer-readable executable instructions and data including an alarm response application 138 .
- the processor 134 and memory 136 are hardware.
- the memory 136 includes non-transitory memory, e.g., random access memory (RAM) and one or more hard disks.
- the non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like.
- the data associated with the alarm response application 138 may be stored in a structured query language (SQL) server database or another appropriate database management system within memory 136 and/or in the one or more databases 110 .
- the memory 136 and/or the databases 110 may also include a dedicated file server having one or more dedicated processors, random access memory (RAM), a Redundant Array of Inexpensive Disks (RAID) hard drive configuration, an Ethernet interface or other communication interface, and a server-based operating system.
- the alarm response server 108 may further include an optional display and an input device.
- the display is used to display visual components of the alarm response application 138 , such as at a user interface.
- the user interface may display a user interface of the alarm response application 138 .
- the display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays.
- the input device is used to interact with the alarm response application 138 and may include a mouse, a keyboard, a trackpad, and/or the like.
- the input device may be included within the display if the display is a touch screen display.
- the input device allows a user of the alarm response server 108 to manipulate and interact with the user interface of the alarm response application 138 .
- the one or more databases 110 may store user information associated with one or more users of the wearable safety application 126 and/or the mobile safety application 132 such as identifying information.
- the one or more databases 110 may store alarm notification information including a record of each alarm notification received by the alarm response server 108 .
- Each record may include a unique alarm notification identifier and the unique identifier associated with corresponding identifying information.
- the record also may include location information and other information.
- the one or more databases 110 may store PSAP information as shown in FIG. 2 .
- FIG. 1 illustrates a block diagram of the call center server 112 according to an example embodiment.
- the call center server 112 may be associated with a PSAP, e.g., a 911 emergency dispatch center.
- the call center server 112 is a computer having one or more processors 140 and memory 142 .
- the call center server 112 may be, for example, a laptop, desktop, a server, tablet computer, mobile computing device (e.g., a smart phone) or a dedicated electronic device having a processor 140 and memory 142 .
- the call center server 112 includes one or more processors 140 to process data and memory 142 to store machine/computer-readable executable instructions and data including an emergency dispatch application 144 .
- the processor 140 and memory 142 are hardware.
- the memory 142 includes non-transitory memory, e.g., random access memory (RAM) and one or more hard disks.
- the non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like.
- the data associated with the emergency dispatch application 144 may be stored in a structured query language (SQL) server database or another appropriate database management system within memory 142 and/or in one or more databases associated with the call center server 112 .
- the memory 142 and/or the databases associated with the call center server 112 may also include a dedicated file server having one or more dedicated processors, random access memory (RAM), a Redundant Array of Inexpensive Disks (RAID) hard drive configuration, an Ethernet interface or other communication interface, and a server-based operating system.
- the call center server 112 may further include an optional display and an input device.
- the display is used to display visual components of the emergency dispatch application 144 , such as at a user interface.
- the user interface may display a user interface of the emergency dispatch application 144 .
- the display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays.
- the input device is used to interact with the emergency dispatch application 144 and may include a mouse, a keyboard, a trackpad, and/or the like.
- the input device may be included within the display if the display is a touch screen display.
- the input device allows a user of the call center server 112 to manipulate and interact with the user interface of the emergency dispatch application 144 .
- a user may configure the wearable device 104 and/or the mobile computing device 106 .
- the user may download and/or install the wearable safety application 126 in memory 124 on the wearable device 104 and the mobile safety application 132 in memory 130 on the mobile computing device 106 .
- the user downloads and installs the wearable safety application 126 on a Pebble™ wearable device and the user downloads and installs the mobile safety application 132 on an iOS™-based smart phone. Once installed, the user may configure the wearable safety application 126 and the mobile safety application 132 for use.
- the user may enter setup and/or configuration information comprising identifying information.
- the identifying information may include one or more of a name (first and last), one or more email addresses, one or more telephone numbers including a telephone number of the mobile computing device 106 or the wearable device 104 , one or more addresses, a height, a weight, an eye color, a hair color, a gender, a photograph, an alarm code for disabling an alarm notification, and a secret code for discreetly indicating that the user is in immediate need of assistance, among other information.
- the secret code may be automatically derived from the alarm code.
- as an example, if the alarm code is 1234, the secret code may be automatically set by the mobile safety application 132 as 1235.
- the user may provide information associated with one or more lifelines, e.g., a person to contact in the event of an emergency.
- the information associated with the one or more lifelines may include a name, one or more email addresses, and one or more telephone numbers, among other information.
- the wearable device 104 , the mobile computing device 106 , or another computer sends the identifying information to the alarm response server 108 via the communication network 114 .
- the alarm response server 108 receives the identifying information and stores the identifying information in the memory 136 and/or the database 110 .
- the alarm response server 108 associates the identifying information with a unique identifier (e.g., a member identifier) and transmits the unique identifier to the wearable device 104 and/or the mobile computing device 106 .
- the wearable safety application 126 and/or the mobile safety application 132 receive the unique identifier and store the unique identifier in memory 124 and/or memory 130 . At this point, the wearable safety application 126 and the mobile safety application 132 are configured and ready for use.
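- a sketch of this configuration exchange is shown below; the registration endpoint, field names, and example values are illustrative assumptions, not part of the disclosed system.

```python
# Sketch of the configuration flow described above: the safety application collects
# identifying information, sends it to the alarm response server, and stores the
# unique (member) identifier returned by the server.
import json
import urllib.request

REGISTRATION_URL = "https://alarm-response.example.com/members"  # hypothetical

def register_user(identifying_information: dict) -> str:
    """Send identifying information to the alarm response server and return the
    unique identifier the server associates with it."""
    request = urllib.request.Request(
        REGISTRATION_URL,
        data=json.dumps(identifying_information).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["unique_identifier"]

# Illustrative identifying information (names and values are made up):
identifying_information = {
    "name": "Jane Doe",
    "telephone": "+1-404-555-0100",
    "height_in": 66,
    "eye_color": "brown",
    "alarm_code": "1234",    # disables an alarm notification
    "secret_code": "1235",   # discreetly signals a real emergency
    "lifelines": [{"name": "John Doe", "telephone": "+1-404-555-0101"}],
}
# unique_identifier = register_user(identifying_information)
# The application would then persist unique_identifier locally for later alarms.
```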
- the user may trigger an alarm notification representing an instant emergency alarm that deploys the drone 102 and notifies first responders (e.g., a 911 PSAP) using the wearable device 104 and/or the mobile computing device 106 .
- the mobile safety application 132 may operate in one of two exemplary operation modes.
- in a first monitoring mode, the mobile safety application 132 continually determines whether the user is touching the touchscreen of the mobile computing device 106 .
- the user may keep a finger on the touchscreen while the mobile computing device 106 is located in a pocket.
- the user may keep a finger on the touchscreen while holding the mobile computing device 106 as if the mobile computing device 106 is being used to place a telephone call. If the user stops touching the touchscreen of the mobile computing device 106 , an alarm notification may be triggered. This may occur if the user is attacked and/or the user drops the mobile computing device 106 .
- the alarm notification also may be triggered if the user enters the secret passcode.
- a countdown may begin after the alarm notification is triggered. During this countdown, the user may stop the countdown or disarm the mobile safety application 132 . However, if the user does not stop the countdown or disarm the mobile safety application 132 , the alarm notification is confirmed.
- in a second monitoring mode, the mobile safety application 132 may automatically trigger an alarm notification after a particular preset period of time, e.g., ten minutes. While in the second monitoring mode, the mobile safety application 132 may display a timer that indicates how much of the particular period of time is left until the alarm notification is triggered. As an example, it may take the user approximately six minutes to travel from their car or a train station to their apartment. The user may desire to use the second monitoring mode of the mobile safety application 132 while traveling from their car or the train station to their apartment. The user may disarm the mobile safety application 132 upon arrival at the apartment. However, after the particular period of time ends, the alarm notification is triggered. The alarm notification also may be triggered if the user enters the secret passcode.
- a countdown may begin after the alarm notification is triggered. During this countdown, the user may stop the countdown or disarm the mobile safety application 132 . However, if the user does not stop the countdown or disarm the mobile safety application 132 , the alarm notification is confirmed.
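- both monitoring modes and the countdown/disarm behavior can be sketched as below; the touchscreen and disarm helpers are hypothetical, and the ten-second grace period is an assumed value.

```python
# Sketch of the two monitoring modes described above. In the first mode, releasing the
# touchscreen triggers the alarm; in the second, the alarm triggers when a preset
# period (e.g., ten minutes) elapses without the user disarming. In either mode a
# short countdown lets the user cancel before the alarm notification is confirmed.
import time

COUNTDOWN_SECONDS = 10  # assumed grace period before the alarm is confirmed

def countdown_then_confirm(app) -> bool:
    """Return True if the alarm should be confirmed (user did not disarm in time)."""
    deadline = time.monotonic() + COUNTDOWN_SECONDS
    while time.monotonic() < deadline:
        if app.disarmed():          # correct alarm passcode entered
            return False
        time.sleep(0.2)
    return True

def touch_monitoring_mode(app):
    # First monitoring mode: trigger when the finger leaves the touchscreen.
    while app.screen_touched():
        time.sleep(0.1)
    if countdown_then_confirm(app):
        app.send_alarm_notification()

def timer_monitoring_mode(app, preset_minutes: float = 10.0):
    # Second monitoring mode: trigger when the preset period expires undisarmed.
    deadline = time.monotonic() + preset_minutes * 60
    while time.monotonic() < deadline:
        if app.disarmed():          # e.g., the user arrived home safely
            return
        time.sleep(1.0)
    if countdown_then_confirm(app):
        app.send_alarm_notification()
```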
- the wearable safety application 126 also may trigger the alarm notification.
- the user may press and hold two hardware buttons on the wearable device 104 for a particular period of time, e.g., four seconds.
- the user may press and hold one hardware button on the wearable device 104 for the particular period of time.
- the user may press a hardware button on the wearable device 104 a particular number of times consecutively in a particular period of time, e.g., three to ten times in twenty seconds.
- the user may press the touch screen of the wearable device 104 a particular number of times consecutively in a particular period of time.
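- the press-and-hold and repeated-press triggers might be detected as sketched below; the button-event interface is hypothetical and the thresholds mirror the examples above.

```python
# Sketch of detecting the wearable trigger gestures described above: holding a button
# for a number of seconds, or pressing it a number of times within a time window.
import time

def held_long_enough(button, hold_seconds: float = 4.0) -> bool:
    """True if the button is held continuously for at least hold_seconds."""
    start = time.monotonic()
    while button.is_pressed():          # hypothetical button interface
        if time.monotonic() - start >= hold_seconds:
            return True
        time.sleep(0.05)
    return False

def pressed_repeatedly(press_timestamps, required_presses: int = 3,
                       window_seconds: float = 20.0) -> bool:
    """True if at least required_presses presses fall within any window_seconds span."""
    stamps = sorted(press_timestamps)
    for i in range(len(stamps) - required_presses + 1):
        if stamps[i + required_presses - 1] - stamps[i] <= window_seconds:
            return True
    return False

# Example: three presses at t=0s, 7s, and 18s fall within a 20-second window,
# so pressed_repeatedly([0.0, 7.0, 18.0]) returns True and the alarm is triggered.
```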
- the wearable device 104 may include a radio frequency (RF) transceiver or another transceiver for transmitting the alarm notification to the mobile computing device 106 and/or the alarm response server 108 .
- the wearable device 104 may include a microphone for receiving a voice activated alarm notification, an accelerometer for detecting an acceleration greater than a particular threshold to generate an alarm notification (e.g., a hard fall), a gyroscope for detecting rotation greater than a particular threshold to generate an alarm notification (e.g., a hard fall), and a biometric device to receive an alarm notification.
- the biometric device may be a fingerprint recognition device to determine unique patterns in one or more fingers of the user or a retina scanner to determine unique patterns associated with a retina of the user.
- the biometric device may be a heart rate monitor to measure and/or record a heart rate of a user.
- the biometric device also may detect a heart attack and/or an abnormal heart rate.
- the biometric device may store information associated with the heart rate in memory 124 and memory 130 to provide historic contextual data for a normal and an abnormal heart rate. If the heart rate is lower than a particular threshold or higher than a particular threshold, the heart rate monitor may detect distressed health conditions, a heart attack and/or conditions indicative of a heart attack and generate an alarm notification that may be sent to one or more PSAPs and first responders.
- this is just one example of user health monitoring that may be executed using the systems and methods taught herein. There are numerous monitored conditions that may be used to generate an alarm notification, including temperature, breathing rate, etc.
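- a threshold-based sketch of such monitoring is shown below; the heart-rate and acceleration thresholds are illustrative defaults only, not values taken from the disclosure.

```python
# Sketch of threshold-based alarm generation from wearable sensor readings, per the
# hard-fall and abnormal heart-rate examples above. The thresholds are illustrative
# defaults, not medical guidance; a deployed system would use per-user historic data.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorSample:
    heart_rate_bpm: Optional[float] = None
    acceleration_g: Optional[float] = None

def check_for_alarm(sample: SensorSample,
                    low_hr: float = 40.0,
                    high_hr: float = 150.0,
                    fall_g: float = 3.0) -> Optional[str]:
    """Return a reason string if an alarm notification should be generated."""
    if sample.heart_rate_bpm is not None:
        if sample.heart_rate_bpm < low_hr:
            return "heart rate below threshold"
        if sample.heart_rate_bpm > high_hr:
            return "heart rate above threshold"
    if sample.acceleration_g is not None and sample.acceleration_g > fall_g:
        return "acceleration consistent with a hard fall"
    return None

# Example: a reading of 32 bpm would generate an alarm notification.
# check_for_alarm(SensorSample(heart_rate_bpm=32.0))  # -> "heart rate below threshold"
```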
- after the wearable device 104 triggers the alarm, the wearable device 104 sends an alarm notification message to the mobile computing device 106 .
- the alarm notification message may be sent by the wearable device 104 using a Bluetooth network or another short-range wireless network.
- the mobile computing device 106 reverse geocodes a current location of the mobile computing device 106 using the global positioning system (GPS) hardware.
- the GPS hardware communicates with a GPS satellite-based positioning system.
- the GPS hardware may be an assisted GPS system, e.g., A-GPS or aGPS, or may be a standalone GPS. Standalone GPS only uses radio signals from the satellite-based positioning system.
- An assisted GPS system uses network resources available to the mobile computing device 106 and/or the wearable device 104 to locate and use the satellite-based positioning system in poor signal conditions, such as in a city where signals bounce off of buildings or pass through walls or tree cover.
- the mobile computing device 106 sends or forwards the alarm notification message with the current location information and the unique identifier to the alarm response server 108 via the communication network 114 .
- the alarm response server 108 receives the alarm notification message, transmits a unique alarm identifier to the mobile computing device 106 that corresponds with this particular alarm notification, and determines one or more PSAPs based on the current location information.
- the alarm response application 138 of the alarm response server 108 determines three PSAPs that are closest to the current location of the mobile computing device 106 by querying the one or more databases 110 using the current location information, e.g., a latitude value and a longitude value.
- the alarm response application 138 of the alarm response server 108 determines three PSAPs that have a highest safety score.
- the safety score may be based on the current location of the mobile computing device 106 , a historical response time of the PSAP, a PSAP service rating (e.g., one to five stars), and other service-level agreement based factors.
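- one way such a ranking might be computed is sketched below; the weighting of distance, historical response time, and service rating is an assumption chosen only for illustration.

```python
# Sketch of selecting PSAPs for an alarm: rank candidates by distance to the caller
# and a composite "safety score" built from historical response time and a service
# rating, then keep the top three. Field names and weights are assumptions.
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def safety_score(psap, caller_lat, caller_lon):
    """Higher is better: close, fast-responding, well-rated PSAPs score highest."""
    d = distance_km(caller_lat, caller_lon, psap["latitude"], psap["longitude"])
    return (psap["rating_stars"] * 2.0            # one-to-five-star service rating
            - psap["avg_response_minutes"] * 0.5  # historical response time penalty
            - d * 0.1)                            # distance penalty

def top_three_psaps(psaps, caller_lat, caller_lon):
    return sorted(psaps, key=lambda p: safety_score(p, caller_lat, caller_lon),
                  reverse=True)[:3]

# Example record, as it might be stored in the PSAP database:
# {"name": "Fulton County 911", "latitude": 33.75, "longitude": -84.39,
#  "avg_response_minutes": 6.0, "rating_stars": 4}
```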
- the alarm response application 138 may generate a user interface on the display of the alarm response server 108 .
- the user interface may include information associated with the one or more PSAPs, the identifying information, a map showing the current location of the user, and the monitoring information from the drone 102 , among other information.
- the user interface may include a button or other user interface element for indicating that the alarm notification is a false alarm, one or more buttons or other user interface elements to control and monitor the one or more drones 102 , and another button or other user interface element for forwarding the alarm notification to the call center server 112 .
- the alarm response application 138 determines a telephone number and/or email address in the one or more databases 110 associated with the unique identifier.
- the alarm response application 138 of the alarm response server 108 initiates one or more automated telephone calls, sends an email, and/or sends a text message (SMS/MMS) to the mobile computing device 106 or the wearable device 104 to verify a condition of the instant emergency alarm.
- the user of the wearable device 104 and/or the mobile computing device 106 may indicate that the instant emergency alarm was a false alarm by providing the alarm passcode, e.g., one or more numbers such as 1234.
- the alarm passcode may be provided to a human call representative associated with the alarm response server 108 .
- the text message and the email may include a uniform resource locator (URL) to direct the user to a web page having a form to receive the alarm passcode.
- the user of the mobile computing device 106 may view the web page and transmit the alarm passcode to the alarm response server 108 .
- the alarm response server 108 confirms that the alarm passcode is correct, e.g., this is a false alarm, and the process may end.
- the user of the wearable device 104 and/or the mobile computing device 106 may indicate that the instant emergency alarm was not a false alarm by providing the secret passcode, e.g., one or more numbers such as 911 or 1235.
- the secret passcode may be provided to the human call representative associated with the alarm response server 108 .
- the text message and the email may include the URL that directs the user to the web page having the form to receive the secret passcode.
- the user of the mobile computing device 106 may view the web page and transmit the secret passcode to the alarm response server 108 .
- the alarm response server 108 confirms that the secret passcode is correct or not correct.
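- The passcode handling described above can be summarized in a short sketch, shown below purely for illustration. The stored codes and return values are placeholders; the disclosure only requires that the alarm passcode cancels the alarm and the secret passcode escalates it.
```python
# Minimal sketch of the server-side passcode check. Codes shown are the
# examples from the text ("1234" and "1235"); return values are illustrative.
def handle_passcode(entered, alarm_passcode="1234", secret_passcode="1235"):
    if entered == alarm_passcode:
        return "false_alarm"       # cancel the alarm and end the process
    if entered == secret_passcode:
        return "confirmed_duress"  # forward to the call center and lifelines
    return "unverified"            # keep waiting; escalate after the timeout

assert handle_passcode("1234") == "false_alarm"
assert handle_passcode("1235") == "confirmed_duress"
```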
- the alarm response server 108 sends the alarm notification with the identifying information and the location information to the emergency dispatch application 144 of the call center server 112 via the communication network.
- the alarm response server 108 sends the alarm notification with the identifying information and the location information to the one or more lifelines by initiating an automated telephone call, sending an email, and/or sending a text message (SMS/MMS) to the one or more lifelines.
- the email and text message may include a URL that provides detailed information about the alarm notification including a map showing the current location of the user.
- If the alarm response server 108 does not receive the correct alarm passcode after a particular period of time (e.g., one minute), the alarm response server 108 sends the alarm notification with the identifying information and the location information to the emergency dispatch application 144 of the call center server 112 via the communication network.
- the alarm response server 108 sends the alarm notification with the identifying information and the location information to the one or more lifelines by initiating an automated telephone call, sending an email, and/or sending a text message (SMS/MMS) to the one or more lifelines.
- the email and text message may include a URL that provides detailed information about the alarm notification including a map showing the current location of the user.
- the alarm response server 108 sends the identifying information and the current location information to the drone 102 .
- the drone 102 receives the identifying information and the current location information and stores the identifying information and the current location information in the memory 118 .
- the drone 102 determines the quickest and/or shortest route to the current location using the current location information, weather conditions, and obstacles.
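- One way such a route choice could be made is sketched below as an illustration only. The cost weights and the Route fields are assumptions; the disclosure states only that the route may account for weather conditions and obstacles.
```python
# Hedged sketch of choosing among candidate routes using distance, wind, and
# obstacle penalties. Weights and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Route:
    distance_m: float        # path length in meters
    headwind_mps: float      # average headwind along the path
    obstacle_count: int      # buildings, trees, power lines crossed

def route_cost(route: Route) -> float:
    return route.distance_m + 50.0 * route.headwind_mps + 200.0 * route.obstacle_count

def pick_route(candidates: list[Route]) -> Route:
    return min(candidates, key=route_cost)
```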
- the drone 102 travels to the current location using the route and upon arrival begins monitoring activity at the current location. As an example, the drone 102 hovers at a particular altitude and records video, photographs, and/or audio using the one or more cameras and the one or more microphones.
- the drone 102 may stream and/or transmit the video, photographs, and/or the audio to the alarm response server 108 and/or the call center server 112 . If the person, the mobile computing device 106 , and/or the wearable device 104 begins moving while the drone 102 is monitoring, the drone 102 tracks and follows the person and continues to record video, photographs, and/or audio.
- the drone 102 continues to record and/or stream the at least one of video, audio, and photographic information for a particular period of time, e.g., ten minutes, or until the drone battery level reaches a critical level.
- the critical level may be based upon a distance that the drone 102 is from the hangar.
- the drone 102 stops recording and/or streaming to have sufficient battery power to return to the hangar.
- the drone 102 continues to record and/or stream the at least one of video, audio, and photographic information until the drone 102 receives a message from one of the alarm response server 108 and/or the call center server 112 to stop recording.
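- The monitoring behavior and its stop conditions can be combined into a single loop, sketched below for illustration. The drone methods and energy figures are hypothetical; the disclosure specifies only the three stop conditions (elapsed time, a battery level that depends on the distance back to the hangar, and a stop message).
```python
# Illustrative monitoring loop combining the stop conditions described above.
# The drone.* helper methods and the energy constants are placeholders.
import time

def critical_battery(distance_to_hangar_m, battery_wh, wh_per_meter=0.02, reserve=1.2):
    """True when remaining energy barely covers the return leg plus a 20% reserve."""
    return battery_wh <= distance_to_hangar_m * wh_per_meter * reserve

def monitor(drone, max_seconds=600):
    start = time.monotonic()
    drone.start_recording()
    while time.monotonic() - start < max_seconds:
        if drone.received_stop_message():
            break
        if critical_battery(drone.distance_to_hangar(), drone.battery_wh()):
            break
        if drone.target_is_moving():
            drone.follow_target()   # keep the person in frame while recording
        time.sleep(1)
    drone.stop_recording()
    drone.return_to_hangar()
```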
- After the drone 102 stops recording, the drone 102 follows a reverse route or another route back to its hangar.
- the reverse route may be a route that is opposite of the route that the drone used to reach the location.
- Upon arrival at the hangar and/or upon connecting to the communication network 114 , the drone 102 transmits the at least one of video, audio, and photographic information to the alarm response server 108 and/or the call center server 112 .
- the video, audio, and photographic information may be stored in the database 110 and associated with the unique identifier and the unique alarm identifier.
- the one or more drones 102 , the one or more wearable devices 104 , the one or more mobile computing devices 106 , the one or more alarm response servers 108 , the one or more databases 110 , and the one or more call center servers 112 communicate using a web application programming interface (API) comprising a defined request/response message system.
- the message system is based on Javascript Object Notation (JSON) and the web API is a RESTful web API based on Representational State Transfer (REST).
- the web API includes one or more methods invoked over HTTP, including alert activation, alert cancel, alert triggered, alert silent alarm, and alert location update, among other methods.
- Alert activation may be called when the user activates one of the monitoring mode and the timer mode.
- a record is created in the database 110 having a unique alert/alarm identifier.
- the alert activation input parameters include an alert latitude, an alert longitude, a member ID (unique identifier), an alert type (monitoring or timer), and an alarm minutes value.
- the alarm minutes value is associated with the second timer mode.
- the alert activation output parameters include a status code, a status description, and an alert ID (e.g., a unique alert/alarm identifier that represents this particular alert notification).
- the unique alert identifier may be used to reference a particular alarm notification, e.g., 27307.
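- Purely as an illustration, the alert activation exchange might look like the JSON payloads below, which follow the input and output parameters listed above. The JSON field spellings and the endpoint path are assumptions modeled on the other method URLs; the disclosure does not state the activation URL or wire format.
```python
# Illustrative request/response payloads for alert activation.
# Field names are assumed; only the parameter list comes from the disclosure.
import json

activation_request = {
    "alertLatitude": 33.7490,      # assumed field names
    "alertLongitude": -84.3880,
    "memberId": "A1B2C3",          # the unique identifier for the user
    "alertType": "monitoring",     # "monitoring" or "timer"
    "alarmMinutes": 0,             # used only with the timer mode
}

activation_response = {
    "statusCode": 0,
    "statusDescription": "OK",
    "alertId": 27307,              # unique alert/alarm identifier example from the text
}

print(json.dumps(activation_request, indent=2))
```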
- Alert cancel may be called when the user correctly enters the alarm passcode to deactivate the alert.
- Alert cancel is applicable to both the monitoring mode and the timer mode.
- the alert cancel URL comprises http://a.llr1.com/rest/AlertCancel.
- the alert cancel input parameters include a unique alert identifier.
- the alert cancel output parameters include a status code and a status description.
- Alert trigger may be called when the user is in monitoring mode and the user ends monitoring mode. Monitoring mode may end when the user removes a finger from a touchscreen of the mobile computing device 106 .
- the alert trigger URL comprises http://a.llr1.com/rest/AlertTrigger.
- the alert trigger input parameters include a unique alert identifier, an alert latitude, and an alert longitude.
- the alert trigger output parameters include a status code and a status description.
- Alert silent alarm may be called when the user is in monitoring mode and the user enters the secret passcode to trigger the alarm.
- the alert silent alarm URL comprises http://a.llr1.com/rest/AlertSilentAlarm.
- the alert silent alarm input parameters include a unique alert identifier.
- the alert silent alarm output parameters include a status code and a status description.
- Alert location update may be called to update location information associated with a particular alarm notification.
- the alert location update may be called at a particular interval of time after the alarm notification, e.g., every ten seconds.
- the alert location update may be called when the mobile computing device 106 and/or the wearable device 104 moves a particular distance, e.g., every 37 feet of movement.
- the drone 102 , the alarm response server 108 , and the call center server 112 may determine how fast the mobile computing device 106 and/or the wearable device 104 are moving by evaluating the difference between each alert location update.
- the drone 102 , the alarm response server 108 , and the call center server 112 may determine an instantaneous speed of the mobile computing device 106 and/or the wearable device 104 based on the distance traveled with respect to time.
- the alert location update URL comprises http://a.llr1.com/rest/AlertLocationUpdate.
- the alert location update input parameters include a unique alert identifier, an alert location latitude, and an alert location longitude.
- the alert location update output parameters include a status code and a status description.
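- A short client-side sketch of the alert location update call, together with the speed estimate described above, is shown below for illustration. The URL is the one given in the disclosure; the JSON field names and the use of HTTP POST are assumptions.
```python
# Sketch of posting periodic location updates and estimating speed from
# consecutive updates. Field names and the HTTP verb are assumptions.
import requests
from math import radians, sin, cos, asin, sqrt

UPDATE_URL = "http://a.llr1.com/rest/AlertLocationUpdate"

def haversine_feet(lat1, lon1, lat2, lon2):
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 20902231 * 2 * asin(sqrt(a))   # Earth radius expressed in feet

def send_update(alert_id, lat, lon):
    payload = {"alertId": alert_id, "alertLocationLatitude": lat, "alertLocationLongitude": lon}
    return requests.post(UPDATE_URL, json=payload, timeout=5).json()

def speed_fps(prev, curr):
    """Instantaneous speed in feet per second between two (lat, lon, timestamp) fixes."""
    distance = haversine_feet(prev[0], prev[1], curr[0], curr[1])
    return distance / max(curr[2] - prev[2], 1e-6)
```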
- FIG. 2 illustrates example information in the alarm response database 110 according to an example embodiment.
- the alarm response database may store PSAP information.
- Each PSAP in the United States and throughout the world may have database fields/attributes stored in the alarm response database 110 .
- the database fields/attributes may include one or more of a PSAP ID, a PSAP RedID, a PSAP Segment, a PSAP First Name, a PSAP Middle Initial, a PSAP Last Name, a PSAP Department, a PSAP Mailing Address (1), a PSAP Mailing Address (2), a PSAP Mailing City, a PSAP Mailing State, a PSAP Mailing Zip Code, a PSAP Physical Address (1), a PSAP Physical Address (2), a PSAP Physical City, a PSAP Physical State, a PSAP Physical Zip Code, a PSAP Phone Number, a PSAP Phone Extension, a PSAP Fax Number, a PSAP Fax Extension, a PSAP911 Phone Number, a PSAP Longitude, a PSAP Latitude, a PSAP InvalidCount, a PSAP County, and a PSAP Region, among others.
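- As an illustration only, a PSAP row with the attributes above might be represented in application code as a record like the one below; only a subset of the listed fields is shown, and the types are assumptions.
```python
# Compact, assumed representation of a PSAP record (subset of the listed fields).
from dataclasses import dataclass

@dataclass
class PsapRecord:
    psap_id: int
    department: str
    phone_number: str
    psap911_phone_number: str
    latitude: float
    longitude: float
    county: str
    region: str
    invalid_count: int = 0
```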
- FIG. 3 illustrates a flowchart of a process for triggering an alarm notification and monitoring by the drone 102 , according to an example embodiment.
- the process 300 shown in FIG. 3 begins in step 302 .
- the user of the wearable device 104 and/or the mobile computing device 106 provides setup information to the wearable safety application 126 and/or the mobile safety application 132 .
- the setup information comprises the identifying information.
- the wearable safety application 126 of the wearable device 104 and/or the mobile safety application 132 of the mobile computing device 106 send the setup information including the identifying information to the alarm response server 108 via the communication network 114 .
- the alarm response server 108 stores the identifying information in the one or more databases 110 and sends a unique identifier that represents the identifying information to the wearable device 104 and/or the mobile computing device 106 .
- the wearable device 104 and/or the mobile computing device 106 receive the unique identifier and store the unique identifier in memory 124 and/or memory 130 .
- the user triggers the wearable device 104 and/or the mobile computing device 106 .
- the wearable safety application 126 receives the trigger and sends an alarm notification message to the mobile computing device 106 via Bluetooth or another short-range wireless protocol.
- the mobile safety application 132 receives the trigger via the monitoring mode or the timer mode.
- the mobile computing device 106 reverse geocodes a current location of the mobile computing device 106 .
- the wearable device 104 reverse geocodes a current location of the wearable device 104 and provides this current location with the alarm notification message.
- the mobile computing device 106 sends the alarm notification message including current location information and the unique identifier to the alarm response server 108 .
- the alarm response server 108 receives the alarm notification message having the current location information and based on the current location information and the PSAP information in the database 110 determines one or more PSAPs.
- the alarm response server 108 may send the mobile computing device 106 and/or the wearable device 104 a unique alarm identifier that represents the alarm notification.
- the alarm response server 108 notifies the user to determine whether the alarm notification is a false alarm.
- the alarm response server 108 may send one or more of a telephone call, an email, and a message to the mobile computing device 106 and/or the wearable device 104 . If the user provides a correct alarm code, the process may end. However, if the alarm notification is not a false alarm and if the user does not provide a correct alarm code or provides a secret code, in step 312 , the alarm response server 108 sends the alarm notification message including the identifying information and the current location information to the call center server 112 . In addition, the alarm response server 108 may send the identifying information and the current location information to the one or more lifelines.
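- A hedged sketch of this verification step is shown below for illustration: the server waits up to the timeout for a correct alarm passcode and then either ends the process or escalates to the call center, lifelines, and drone. The helper callables are hypothetical.
```python
# Illustrative verification/timeout logic for steps 310-312. Helper callables
# (wait_for_passcode, escalate) are placeholders, not part of the disclosure.
import time

def verify_and_escalate(wait_for_passcode, escalate, timeout_s=60):
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        entry = wait_for_passcode(remaining=deadline - time.monotonic())
        if entry == "alarm_passcode":
            return "ended_false_alarm"
        if entry == "secret_passcode":
            return escalate(reason="duress")
    return escalate(reason="no_response")
```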
- the alarm response server 108 sends the identifying information and the current location information to the drone 102 .
- the drone 102 receives the identifying information and the current location information and stores the identifying information and the current location information in the memory 118 .
- the drone 102 determines a shortest and/or quickest route from its hangar to the current location of the mobile computing device 106 and/or the wearable device 104 .
- the route may be based on weather conditions and obstacles.
- the drone follows the route to the current location of the mobile computing device 106 and/or the wearable device 104 .
- Upon arrival, the drone 102 records at least one of video, audio, and photographic information using the one or more cameras and the one or more microphones.
- the drone 102 streams the at least one of video, audio, and photographic information to the alarm response server 108 and/or the call center server 112 .
- the drone 102 continues to record and/or stream the at least one of video, audio, and photographic information for a particular period of time, e.g., ten minutes, or until the drone battery level reaches a critical level.
- the drone 102 continues to record and/or stream the at least one of video, audio, and photographic information until the drone 102 receives a message from the alarm response server 108 , the call center server 112 , a remote control, or another computing device to stop recording. After the drone 102 stops recording, the drone 102 follows a reverse route or another route back to its hangar. In one aspect, upon arrival at the hangar, the drone 102 transmits the at least one of video, audio, and photographic information to the alarm response server 108 and/or the call center server 112 .
- the video, audio, and photographic information may be stored in the database 110 and associated with the unique identifier and/or the unique alarm identifier.
- the wearable device 104 may directly send the alarm notification message to the alarm response server 108 .
- FIG. 4 illustrates an example computing system 400 that may implement portions of the various systems described herein, such as the drone 102 , the wearable device 104 , the mobile computing device 106 , the alarm response server 108 , the call center server 112 , and methods discussed herein, such as process 300 .
- a general-purpose computer system 400 is capable of executing a computer program product to execute a computer process. Data and program files may be input to the computer system 400 , which reads the files and executes the programs therein such as the monitoring safety application 120 , the wearable safety application 126 , the mobile safety application 132 , the alarm response application 138 , and the emergency dispatch application 144 .
- Some of the elements of a general-purpose computer system 400 are shown in FIG. 4 .
- a processor 402 is shown having an input/output (I/O) section 404 , a central processing unit (CPU) 406 , and a memory section 408 .
- There may be one or more processors 402 , such that the processor 402 of the computer system 400 comprises a single central-processing unit 406 , or a plurality of processing units, commonly referred to as a parallel processing environment.
- the computer system 400 may be a conventional computer, a server, a distributed computer, or any other type of computer, such as one or more external computers made available via a cloud computing architecture.
- the presently described technology is optionally implemented in software devices loaded in memory 408 , stored on a configured DVD/CD-ROM 410 or storage unit 412 , and/or communicated via a wired or wireless network link 414 , thereby transforming the computer system 400 in FIG. 4 to a special purpose machine for implementing the described operations.
- the memory section 408 may be volatile media, nonvolatile media, removable media, non-removable media, and/or other media or mediums that can be accessed by a general purpose or special purpose computing device.
- the memory section 408 may include non-transitory computer storage media and communication media.
- Non-transitory computer storage media further may include volatile, nonvolatile, removable, and/or non-removable media implemented in a method or technology for the storage (and retrieval) of information, such as computer/machine-readable/executable instructions, data and data structures, engines, program modules, and/or other data.
- Communication media may, for example, embody computer/machine-readable/executable instructions, data structures, program modules, algorithms, and/or other data.
- the communication media may also include an information delivery technology.
- the communication media may include wired and/or wireless connections and technologies and be used to transmit and/or receive wired and/or wireless communications.
- the I/O section 404 is connected to one or more user-interface devices (e.g., a keyboard 416 and a display unit 418 ), a disc storage unit 412 , and a disc drive unit 420 .
- the disc drive unit 420 is a DVD/CD-ROM drive unit capable of reading the DVD/CD-ROM medium 410 , which typically contains programs and data 422 .
- Computer program products containing mechanisms to effectuate the systems and methods in accordance with the presently described technology may reside in the memory section 408 , on a disc storage unit 412 , on the DVD/CD-ROM medium 410 of the computer system 400 , or on external storage devices made available via a cloud computing architecture with such computer program products, including one or more database management products, web server products, application server products, and/or other additional software components.
- a disc drive unit 420 may be replaced or supplemented by a floppy drive unit, a tape drive unit, or other storage medium drive unit.
- the network adapter 424 is capable of connecting the computer system 400 to a network via the network link 414 , through which the computer system can receive instructions and data.
- Examples of computing systems include personal computers, Intel or PowerPC-based computing systems, AMD-based computing systems, and other systems running a Windows-based, a UNIX-based, or other operating system. It should be understood that computing systems may also embody devices such as Personal Digital Assistants (PDAs), mobile phones, tablets or slates, multimedia consoles, gaming consoles, set top boxes, etc.
- When used in a LAN-networking environment, the computer system 400 is connected (by wired connection and/or wirelessly) to a local network through the network interface or adapter 424 , which is one type of communications device.
- When used in a WAN-networking environment, the computer system 400 typically includes a modem, a network adapter, or any other type of communications device for establishing communications over the wide area network.
- program modules depicted relative to the computer system 400 or portions thereof may be stored in a remote memory storage device. It is appreciated that the network connections shown are examples and that other communications devices for establishing a communications link between the computers may be used.
- Source code executed by the drone 102 , the wearable device 104 , the mobile computing device 106 , the alarm response server 108 , and the call center server 112 , as well as a plurality of internal and external databases including the database 110 , source databases, and/or cached data on servers, may be stored in the memory 118 of the drone 102 , the memory 124 of the wearable device 104 , the memory 130 of the mobile computing device 106 , the memory 136 of the alarm response server 108 , the memory 142 of the call center server 112 , or other storage systems, such as the disk storage unit 412 or the DVD/CD-ROM medium 410 , and/or other external storage devices made available and accessible via a network architecture.
- the source code executed by the drone 102 , the wearable device 104 , the mobile computing device 106 , the alarm response server 108 , and the call center server 112 may be embodied by instructions stored on such storage systems and executed by the processor 402 .
- the processor 402 which is hardware, may perform some or all of the operations described herein. Further, local computing systems, remote data sources and/or services, and other associated logic represent firmware, hardware, and/or software configured to control operations of the drone safety alert monitoring system 100 and/or other components. Such services may be implemented using a general-purpose computer and specialized software (such as a server executing service software), a special purpose computing system and specialized software (such as a mobile device or network appliance executing service software), or other computing configurations.
- one or more functionalities disclosed herein may be generated by the processor 402 and a user may interact with a Graphical User Interface (GUI) using one or more user-interface devices connected to the I/O section 404 (e.g., the keyboard 416 and the display unit 418 ), with some of the data in use directly coming from online sources and data stores.
- FIG. 4 is but one possible example of a computer system that may employ or be configured in accordance with aspects of the present disclosure.
- FIG. 5 illustrates an example screenshot 500 of the mobile safety application 132 executed by the mobile computing device 106 according to an example embodiment.
- the mobile safety application 132 may operate in the first monitoring mode (e.g., thumb mode) or the second timer mode. If the user selects the thumb mode user interface button, the mobile safety application 132 enters the first monitoring mode. If the user selects the timer mode user interface button, the mobile safety application 132 enters the second timer mode.
- FIG. 6 illustrates another example screenshot 600 of the mobile safety application 132 executed by the mobile computing device 106 according to an example embodiment.
- the mobile safety application 132 is operating in the first monitoring mode. In the first monitoring mode, the mobile safety application 132 continually determines whether the user is touching the touchscreen of the mobile computing device 106 . If the user stops touching the touchscreen of the mobile computing device 106 , an alarm notification may be triggered. This may occur if the user is attacked and/or the user drops the mobile computing device 106 . The alarm notification also may be triggered if the user enters the secret passcode. A countdown may begin after the alarm notification is triggered. During this countdown, the user may stop the countdown or disarm the mobile safety application 132 . However, if the user does not stop the countdown or disarm the mobile safety application 132 , the alarm notification is confirmed.
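- The touch-release and countdown behavior described above can be summarized in the short sketch below, provided for illustration only. The countdown length and the helper callables are assumptions; the disclosure does not specify them.
```python
# Illustrative sketch of the first monitoring mode: releasing the touchscreen
# starts a countdown, and the alarm is confirmed unless the user disarms in
# time. Callables and countdown length are placeholders.
import time

def monitoring_mode(is_touching, disarm_entered, send_alarm, countdown_s=10):
    while is_touching():
        time.sleep(0.1)                   # finger still on screen; keep waiting
    deadline = time.monotonic() + countdown_s
    while time.monotonic() < deadline:    # countdown window to cancel
        if disarm_entered():
            return "disarmed"
        time.sleep(0.1)
    send_alarm()                          # countdown expired; confirm the alarm
    return "alarm_confirmed"
```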
- FIG. 7 illustrates another example screenshot 700 of the mobile safety application 132 executed by the mobile computing device 106 according to an example embodiment.
- the mobile safety application 132 is operating in the second timer mode.
- the user interface of the mobile safety application 132 includes a user interface element for selecting an amount of time to wait before triggering the alarm notification (e.g., a distress alert).
- FIG. 8 illustrates an example of a drone 102 according to an example embodiment.
- the drone 102 includes a camera system 103 , a microphone system 105 , an output system 107 , and an input system 109 .
- FIG. 9 illustrates a keychain including an example wearable device 900 according to an example embodiment.
- This example wearable device 900 is a VALRT™ wearable device.
- FIG. 10 illustrates another view of the example wearable device 1000 on a wristband according to an example embodiment.
- FIG. 11 illustrates a command center graphical user interface (GUI) 1100 based on an alert notification that includes one or more aerial video streams 1102 according to an example embodiment.
- the alarm response server 108 may display the command center GUI using the alarm response application 138 and/or the call center server 112 may display the command center GUI using the emergency dispatch application 144 .
- the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed are instances of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter.
- the accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
- the described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon executable instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure.
- a non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer).
- the non-transitory machine-readable medium may include, but is not limited to, magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or other types of medium suitable for storing electronic executable instructions.
Abstract
A wearable safety alarm system includes one or more processors to receive identifying information and transmit the identifying information to an alarm response server from a mobile computing device, receive, by the mobile computing device, a unique identifier that identifies the identifying information in a database associated with the alarm response server, receive a trigger of an alarm notification by one of a wearable device and the mobile computing device, determine a current location of the mobile computing device, transmit an alarm notification message to the alarm response server, the alarm notification message including the current location of the mobile computing device and the unique identifier, and transmit the current location of the mobile computing device to at least one drone.
Description
- This application claims the benefit of priority to U.S. Patent Application No. 62/035,762, filed Aug. 11, 2014, which is hereby incorporated herein by reference.
- The present systems and methods relate generally to systems and methods for monitoring by a drone or unmanned aerial vehicle (UAV) that is triggered by an alarm notification message sent by a wearable device and/or a mobile computing device. More particularly, the systems and methods receive an alarm trigger and send an alarm notification message, including location information and a unique identifier representing identifying information, to a server. The server sends the location information to the drone. The drone travels to a location using the location information and begins monitoring the location.
- Mobile computing devices have gradually become a ubiquitous part of daily life. Traditionally, a mobile computing device such as a smartphone may be carried on a person in a pocket, a purse, a briefcase, a backpack, a messenger bag, etc. In other situations, the mobile computing device may be located nearby a person, such as on a table or in a car. In nearly all of these instances, users of smartphones and tablets have access to a portable device that is capable of communicating with others, capable of executing applications, and capable of sending and receiving information to other devices.
- However, when a life-threatening emergency strikes, it may not be possible to dial "911" and/or reach out for help as quickly as necessary because the mobile computing device may not be within arm's reach and/or may be inaccessible. In other dangerous situations, even if a person is able to dial "911" and/or reach out for help, the person may not be able to relay information during a telephone call for a variety of reasons, e.g., an incapacitating injury or an attacker/intruder is nearby.
- While mobile computing devices provide users the ability to communicate with others and reach out for help in the event of an emergency, it may be difficult or impossible to efficiently and accurately provide critical information to an emergency dispatch center when time is of the essence.
- In addition, after the emergency dispatch center is notified of the emergency, first responders may have to travel to the person to provide assistance. While the first responders attempt to arrive as soon as possible, there is typically a period of time before the first responders are able to arrive. During this period of time, valuable evidence may be lost.
- Accordingly, to meet these needs and others, there is a need for systems and methods as described herein.
- Briefly described, aspects of the present disclosure generally relate to methods and systems for monitoring by a drone or unmanned aerial vehicle (UAV) that is triggered by an alarm notification provided through an application on a mobile device. As used throughout the present disclosure, a mobile device may be any mobile computing platform, including a smartphone, a tablet computer, a wearable device, etc.
- In one aspect, a user provides identifying information to a wearable safety application executed by a wearable device and/or a mobile safety application executed by a mobile computing device. The wearable device and/or the mobile computing device send the identifying information to an alarm response server and the alarm response server stores the identifying information in a database. The alarm response server associates the identifying information with a unique identifier and sends the unique identifier to the wearable device and/or the mobile computing device. If an emergency occurs, the user may trigger the wearable safety application and/or the mobile safety application. The wearable safety application and/or the mobile safety application send an alarm notification message including location information and the unique identifier to the alarm response server. The alarm response server determines one or more public safety answering points (PSAPs) based on the location information. If the alarm notification is verified, e.g., the alarm notification is not a false alarm, the alarm response server sends the location information and the identifying information to a call center server associated with the one or more PSAPs for further action by emergency responders.
- In addition, the alarm response server sends the identifying information and the location information to one or more drones. The one or more drones travel to a location based on the location information and begin monitoring. In addition, the call center server also may send the location information and the identifying information to one or more lifelines, e.g., a person to contact in the event of an emergency.
- In one aspect, a drone safety alert monitoring system includes one or more processors to receive identifying information and transmit the identifying information to an alarm response server from a mobile computing device, receive, by the mobile computing device, a unique identifier that identifies the identifying information in a database associated with the alarm response server, receive a trigger of an alarm notification by one of a wearable device and the mobile computing device, determine a current location of the mobile computing device, transmit an alarm notification message to the alarm response server, the alarm notification message including the current location of the mobile computing device and the unique identifier, and transmit the current location of the mobile computing device to at least one drone.
- There are numerous examples in which the features and functions of the present subject matter may be embodied, and the solutions provided herein may be applied in various use contexts. It is understood that aspects of the present disclosure may provide police and other emergency responders with a system that supplements or complements their work. In one example, upon the issuance of an alarm notification by a police officer (e.g., through a mobile or wearable device, including a computer in a police car), location information is sent to an alarm response server, which communicates the location information to a drone, which is deployed to the location to capture monitoring information (i.e., photographs, video, audio, etc.) at the location. When at the location, the drone can start to track and follow an object from the location. For example, a police officer may issue an alarm notification for a location at which a car has been pulled over. At the scene, the officer may place a target on the car that has been pulled over. The drone may identify the target as the object to follow. Then, if the car starts moving, the drone may follow the car, and target, without further instruction from the alarm response server or other outside party. Using a target, the drone can follow moving objects without the need for further instruction.
- These and other aspects, features, and benefits of the present disclosure will become apparent from the following detailed written description in conjunction with the accompanying drawings, although variations and modifications thereto may be implemented without departing from the spirit and scope of the novel concepts of the disclosure.
- The accompanying drawings illustrate embodiments of the disclosure and, together with the written description, serve to explain the teachings, principles, and solutions provided by the disclosure. Wherever possible, the same reference numbers are used throughout the drawings to refer to the same or similar elements across the various embodiments.
- FIG. 1 illustrates a block diagram of a drone safety alert monitoring system according to an example embodiment.
- FIG. 2 illustrates example information in an alarm response (PSAP) database according to an example embodiment.
- FIG. 3 is a flowchart illustrating a process for monitoring by the drone safety alert monitoring system according to an example embodiment.
- FIG. 4 illustrates a block diagram of an example computer device for use with the example embodiments.
- FIGS. 5-7 illustrate example screenshots of a mobile safety application executed by a mobile computing device according to an example embodiment.
- FIG. 8 illustrates a perspective view of a drone according to an example embodiment.
- FIG. 9 illustrates a perspective view of a wearable device according to an example embodiment.
- FIG. 10 illustrates another perspective view of a wearable device according to an example embodiment.
- FIG. 11 illustrates a command center graphical user interface (GUI) according to an example embodiment.
- For the purpose of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will, nevertheless, be understood that no limitation of the scope of the disclosure is intended; alterations and further modifications of the described and illustrated embodiments, and further applications of the principles of the disclosure as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the disclosure relates.
- FIG. 1 illustrates a block diagram of a drone safety alert monitoring system 100 according to an example embodiment. According to an aspect of the present disclosure, the drone safety alert monitoring system 100 includes one or more drones 102. The drone safety alert monitoring system 100 further includes one or more optional wearable devices 104, one or more mobile computing devices 106, one or more alarm response servers 108, one or more databases 110, one or more call center servers 112, and a communication network 114. The one or more computing devices communicate and coordinate their actions by passing messages over the communication network 114. The communication network 114 can be one or more of the Internet, an intranet, a cellular communications network, a WiFi network, a packet network, or another wired or wireless communication network. As an example, the one or more computing devices communicate data in packets, messages, or other communications using a common protocol, e.g., Hypertext Transfer Protocol (HTTP) and/or Hypertext Transfer Protocol Secure (HTTPS). As an example, the drone safety alert monitoring system 100 may be a cloud-based computer system or a distributed computer system.
- The one or more computing devices may communicate based on representational state transfer (REST) and/or Simple Object Access Protocol (SOAP). As an example, a first computer (e.g., a client computer) may send a request message that is a REST and/or a SOAP request formatted using Javascript Object Notation (JSON) and/or Extensible Markup Language (XML). In response to the request message, a second computer (e.g., a server computer) may transmit a REST and/or SOAP response formatted using JSON and/or XML.
- The embodiments described herein may be based on Oauth, an open standard for authorization. Oauth allows producers of web services to grant third-party access to web resources without sharing usernames and/or passwords. In this case, the web resources may be the one or more drones 102, the one or more alarm response servers 108, the one or more databases 110, and the one or more call center servers 112. Oauth provides one application with one access token providing access to a subset of web resources on behalf of one user, similar to a valet key. In particular, the embodiments may be related to Oauth 2.0. While discussed in the context of Oauth, the present disclosure is not limited to Oauth.
- The drone safety alert monitoring system 100 may be deployed or located at a particular site including a city, a town, a college campus, a corporate campus, an outdoor venue, e.g., a concert venue, an indoor venue, e.g., an arena, and other locations. The drone safety alert monitoring system 100 may include one or more hangars to house the one or more drones 102. In one example, a college campus may have a single hangar housing the one or more drones 102. In another example, the one or more hangars may be distributed throughout the college campus. Each hangar may be located equidistant from other hangars, e.g., each hangar may cover a particular grid on the particular site. However, each hangar also may be located in a particular location at the particular site based on previously reported emergencies and/or population density. As an example, the particular site may include four grids each having an equal size of 1000 feet×1000 feet. One hangar may be located in the center of each grid. Each hangar may house one or more drones 102 to quickly and efficiently service any particular location in each grid.
- Each hangar may be outfitted with one or more alternating current (AC) power sockets and one or more chargers for charging the one or more batteries of the drone 102. As an example, the drone 102 may be housed in a hangar located on a roof of a building, a garage, or another location.
- FIG. 1 illustrates a block diagram of the drone 102 according to an example embodiment. The drone 102 may be a computer having one or more processors 116 and memory 118, including but not limited to an unmanned aerial system (UAS) and an unmanned aerial vehicle (UAV). The drone 102 is not limited to an unmanned aircraft device or a UAV and may be other types of unmanned vehicles. The drone 102 may be an unmanned ground vehicle (UGV) having wheels, legs, or a continuous track, an unmanned vehicle traveling on rails, e.g., an unmanned train, an unmanned boat, and an unmanned hovercraft, among other vehicles. As an example, the drone 102 may be an autonomous or remote-controlled vehicle, an autonomous or remote-controlled car, an autonomous or remote-controlled train, an autonomous or remote-controlled boat, or an autonomous or remote-controlled hovercraft, among other vehicles. However, for purposes of clarity, the majority of the description provided herein refers to the drone 102 as being a UAV.
- The flight and operation of the drone 102 may be controlled autonomously by the one or more processors 116, another computer (e.g., the one or more alarm response servers 108 or another mobile computing device), and/or by one or more users via a remote control. The drone 102 may further include one or more cameras 103, one or more light sources, one or more microphones 105, one or more sensors including a gyroscope, an accelerometer, a magnetometer, an ultrasound sensor, an altimeter, an air pressure sensor, a motion sensor, and other sensors, one or more rotors, one or more motors, and one or more batteries for powering the drone 102. The camera 103 may be a high-definition camera capable of recording high-definition video (e.g., any video image with more than 480 horizontal lines and/or captured at rates greater than 60 frames per second). The camera 103 may include analog zoom and/or digital zoom and may zoom in or out during operation. The camera 103 also may be a thermal vision camera or a night vision camera. The drone 102 may be battery-powered and/or powered by another source, e.g., gasoline. The drone 102 may have a hull that comprises carbon fiber components, plastic components, metal components, and other components.
- The drone 102 may communicate with another drone 102, the wearable device 104, the mobile computing device 106, the alarm response server 108, and/or the call center server 112 using at least one of Bluetooth, WiFi, a wired network, a wireless network, and a cellular network. According to an example embodiment, at least the drone 102 and the mobile computing device 106 may communicate wirelessly.
- The drone 102 reverse geocodes a current location of the drone 102 using global positioning system (GPS) hardware. The GPS hardware communicates with a GPS satellite-based positioning system. The GPS hardware may be an assisted GPS system, e.g., A-GPS or aGPS, or may be a standalone GPS. Standalone GPS only uses radio signals from the satellite-based positioning system. An assisted GPS system uses network resources available to the drone 102 to locate and use the satellite-based positioning system in poor signal conditions, such as in a city where signals bounce off of buildings or pass through walls or tree cover.
- The one or more processors 116 may process machine/computer-readable executable instructions and data, and the memory 118 may store machine/computer-readable executable instructions and data including one or more applications, including a monitoring safety application 120. The processor 116 and memory 118 are hardware. The memory 118 includes random access memory (RAM) and non-transitory memory, e.g., a non-transitory computer-readable medium such as one or more flash disks or hard drives. The non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like.
- The monitoring safety application 120 may be a component of an application and/or service executable by the drone 102. For example, the monitoring safety application 120 may be a single unit of deployable executable code. The monitoring safety application 120 may also be one application and/or a suite of applications for monitoring a person that triggers an alarm notification. In a primary example, the monitoring safety application 120 receives a location of the person, determines a route for the drone 102 to fly to the person using the GPS hardware, routes the drone 102 to the person based on the route, and monitors the person using video, photographs, and/or audio from the camera 103.
- For example, upon receipt of the alert notification, the drone 102 receives the location and determines a shortest and/or quickest route to the location. The route also may be determined by another computing device and transmitted to the drone 102. As an example, the alarm response server 108 may determine the route. The route may be determined based on weather conditions and obstacles including buildings, trees, power lines, and other obstacles.
- After the route is determined, the drone 102 takes off and may travel at a first particular altitude and a particular speed to the location. The drone 102 may also travel at a variable altitude that could change during the flight and a variable speed that could change during the flight. The drone 102 may pass through one or more waypoints on the route to the destination. The waypoints may be automatically assigned or may be assigned by an operator or user before or during flight. The waypoints may be used to avoid obstacles, avoid a populated area, or for another reason. As an example, the particular altitude may be 50 feet, 100 feet, 200 feet, 1000 feet, 2000 feet, and other altitudes.
- The drone 102 travels to the location based on the route and upon arrival the drone 102 begins monitoring. Once the drone 102 arrives at the location, the drone 102 may hover at a second particular altitude above the person for a particular period of time. The second particular altitude may be the same altitude as the first particular altitude or a different altitude from the first particular altitude. The video, photographs, and/or the audio may be based on an aerial view of the person. In another embodiment, the drone 102 may land at the location or near the location and the video, photographs, and/or the audio may be based on a terrestrial view of the person.
- The monitoring safety application 120 determines and builds one or more data structures comprising a three-dimensional environment, tracks objects including the person and obstacles, and records information. The monitoring safety application 120 monitors the person using the one or more cameras 103 and the one or more microphones 105. As an example, the monitoring safety application 120 records video, photographs, and/or audio. The drone 102 may determine whether the mobile computing device 106, the wearable device 104, and/or the person are currently moving or stationary. If the mobile computing device 106, the wearable device 104, and/or the person is moving, the drone 102 tracks and follows the person and continues to record video, photographs, and/or audio. In one embodiment, the monitoring safety application 120 streams the video and/or the audio to the alarm response server 108 and/or the call center server 112. In another embodiment, the monitoring safety application 120 stores the video, photographs, and/or the audio in the memory 118. The monitoring safety application 120 communicates data and messages with the mobile computing device 106, the alarm response server 108, and/or the call center server 112 using the communication network 114.
- The drone 102 may further include an optional display/output device 107 and an input device 109. The display/output device 107 is used to provide status information about the drone 102 including a current battery level or fuel level, a flying status (e.g., ascending/descending), and other information. The output device 107 may be one or more light emitting diodes, e.g., a light emitting diode that flashes while the drone 102 is in operation. The display may indicate the status information. The display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays. The input device 109 is used to interact with the drone 102 and may include one or more hardware buttons. The hardware buttons may include an on/off button and other buttons. The input device 109 may be included within the display if the display is a touch screen display. The input device 109 allows a user of the drone 102 to manipulate and interact with the monitoring safety application 120.
- The drone 102 may also include an optional remote control receiver that operates with the input device 109 for receiving information from an optional remote control transmitter. The remote control transmitter transmits information to the remote control receiver to monitor, control, and operate the drone 102. The remote control transmitter may be a dedicated device comprising one or more processors and memory or a computer such as the alarm response server 108 or the call center server 112.
- In one exemplary embodiment, a first drone 102 may communicate with a second drone 102. In one example, the first drone 102 and the second drone 102 may travel to the location and cooperate to simultaneously monitor video, photographs, and/or the audio from multiple vantage points and/or multiple angles. The first drone 102 and the second drone 102 may stream the video and/or the audio to the alarm response server 108 and/or the call center server 112. In addition, the first drone 102 and the second drone 102 may store the video, photographs, and/or the audio in the memory 118. The first drone 102 and the second drone 102 may communicate data and messages with the mobile computing device 106, the alarm response server 108, and/or the call center server 112 using the communication network 114.
- In a second example, the first drone 102 may travel to the location at a first time and monitor video, photographs, and/or the audio. When the first drone 102 determines that the battery level reaches a particular level, the first drone 102 may send a message to the alarm response server 108 and/or a second drone 102. The second drone 102 may travel to the location at a second time and monitor video, photographs, and/or the audio. The first drone 102 and the second drone 102 may cooperate to seamlessly monitor video, photographs, and/or the audio for an extended period of time that may be longer than the life of a battery of a single drone. The second drone 102 may send a message to the alarm response server 108 and/or a third drone 102 and the monitoring process may continue by the third drone 102, and so on.
- FIG. 1 illustrates a block diagram of the optional wearable device 104 according to an example embodiment. The wearable device 104 may be a computer having one or more processors 122 and memory 124, including but not limited to a watch, a necklace, a pendant, a hair clip, a hair tie, a pin, a tie clip/tack, a ring, a cufflink, a belt clip, a scarf, a pashmina, a wrap, a shawl, a garment, a keychain, another small mobile computing device, or a dedicated electronic device having a processor 122 and memory 124. The wearable device 104 may be a Bluetooth Low Energy (BLE, Bluetooth LE, Bluetooth Smart) Device based on the Bluetooth 4.0 specification or another specification. According to an example embodiment, the wearable device 104 and the mobile computing device 106 are paired and communicate wirelessly using a short-range wireless network, e.g., Bluetooth.
- In another example, the wearable device 104 may create a personal area network and/or a mesh network for communicating with the one or more mobile computing devices 106 and/or the one or more drones 102. Additionally, the wearable device 104, the mobile computing device 106, and the one or more drones 102 may communicate using Zigbee, Wi-Fi, near field magnetic inductance, sonic (sound) waves, and/or infrared (light) waves. According to an example embodiment, the wearable device 104 may be a smart watch such as a GARMIN™ smart watch, a Pebble™ smart watch, a SAMSUNG™ Galaxy Gear smart watch, an ANDROID™ based smart watch, an APPLE™ and/or iOS™-based smart watch, a Tizen™ smart watch, and a VALRT™ wearable device, among others.
- The one or more processors 122 may process machine/computer-readable executable instructions and data, and the memory 124 may store machine/computer-readable executable instructions and data including one or more applications, including a wearable safety application 126. The processor 122 and memory 124 are hardware. The memory 124 includes random access memory (RAM) and non-transitory memory, e.g., a non-transitory computer-readable medium such as one or more flash disks or hard drives. The non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like.
- The wearable safety application 126 may be a component of an application and/or service executable by the wearable device 104. For example, the wearable safety application 126 may be a single unit of deployable executable code. The wearable safety application 126 may also be one application and/or a suite of applications for triggering an alarm notification. In one embodiment, the wearable safety application 126 sends an alarm notification directly to the alarm response server 108. In another embodiment, the wearable safety application 126 sends the alarm notification to the mobile computing device 106 and the mobile computing device 106 forwards the alarm notification to the alarm response server 108. The wearable safety application 126 may be a web-based application viewed in a browser on the wearable device 104 and/or a native application executed by the wearable device 104. The wearable safety application 126 may be downloaded from the Internet and/or digital distribution platforms, e.g., directly from a website or an app store such as the Pebble™ appstore, the (iOS™) App Store, and GOOGLE PLAY™, among others. The wearable safety application 126 communicates messages with the mobile computing device 106 and/or the alarm response server 108 using the communication network 114.
- The wearable device 104 may further include an optional display and an input device. The display is used to display visual components of the wearable safety application 126, such as at a user interface. In one example, the user interface may display a user interface of the wearable safety application 126. The display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays. The input device is used to interact with the wearable safety application 126 and may include one or more hardware buttons. The input device may be included within the display if the display is a touch screen display. The input device allows a user of the wearable device 104 to manipulate and interact with the user interface of the wearable safety application 126.
FIG. 1 also illustrates a block diagram of themobile computing device 106 according to an example embodiment. Themobile computing device 106 may be a computer having one ormore processors 128 andmemory 130, including but not limited to a server, laptop, desktop, tablet computer, smartphone, or a dedicated electronic device having aprocessor 128 andmemory 130. The one ormore processors 128 may process machine/computer-readable executable instructions and data, and thememory 130 may store machine/computer-readable executable instructions and data including one or more applications, including amobile safety application 132. Theprocessor 128 andmemory 130 are hardware. Thememory 130 includes random access memory (RAM) and non-transitory memory, e.g., a non-transitory computer-readable medium such as one or more flash disks or hard drives. The non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like. - The
mobile safety application 132 may be a component of an application and/or service executable by themobile computing device 106. For example, themobile safety application 132 may be a single unit of deployable executable code. Themobile safety application 132 may also be one application and/or a suite of applications for triggering an alarm notification. In one embodiment, themobile safety application 132 sends an alarm notification directly to thealarm response server 108. In another embodiment, themobile safety application 132 receives the alarm notification from thewearable device 104 and themobile computing device 106 forwards the alarm notification to thealarm response server 108. Themobile safety application 132 may be a web-based application viewed in a browser on themobile computing device 106 and/or a native application executed by themobile computing device 106. The application may be downloaded from the Internet and/or digital distribution platforms, e.g., directly from a website, the Mac™ App Store, the (iOS™) App Store, and/or GOOGLE PLAY™, among others. According to an example embodiment, themobile safety application 132 is an iOS™ application, an Android™ application, or a Windows™ Phone application. Themobile safety application 132 communicates messages with thedrone 102, thewearable device 104 and/or thealarm response server 108 using thecommunication network 114. - The
mobile computing device 106 includes global positioning system (GPS) hardware. The GPS hardware communicates with a GPS satellite-based positioning system. Themobile computing device 106 may further include an optional display and an input device. The display is used to display visual components of themobile safety application 132, such as at a user interface. In one example, the user interface may display a user interface of themobile safety application 132. The display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays. The input device is used to interact with themobile safety application 132 and may include a mouse, a keyboard, a trackpad, and/or the like. The input device may be included within the display if the display is a touch screen display. The input device allows a user of themobile computing device 106 to manipulate and interact with the user interface of themobile safety application 132. -
FIG. 1 further illustrates a block diagram of thealarm response server 108 according to an example embodiment. According to an aspect of the present disclosure, thealarm response server 108 is a computer having one ormore processors 134 andmemory 136. Thealarm response server 108 may be, for example, a laptop, desktop, a server, tablet computer, mobile computing device (e.g., a smart phone) or a dedicated electronic device having aprocessor 134 andmemory 136. Thealarm response server 108 includes one ormore processors 134 to process data andmemory 136 to store machine/computer-readable executable instructions and data including analarm response application 138. Theprocessor 134 andmemory 136 are hardware. Thememory 136 includes non-transitory memory, e.g., random access memory (RAM) and one or more hard disks. The non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like. The data associated with thealarm response application 138 may be stored in a structured query language (SQL) server database or another appropriate database management system withinmemory 136 and/or in the one ormore databases 110. Additionally, thememory 136 and/or thedatabases 110 may also include a dedicated file server having one or more dedicated processors, random access memory (RAM), a Redundant Array of Inexpensive Disks (RAID) hard drive configuration, an Ethernet interface or other communication interface, and a server-based operating system. - The
alarm response server 108 may further include an optional display and an input device. The display is used to display visual components of thealarm response application 138, such as at a user interface. In one example, the user interface may display a user interface of thealarm response application 138. The display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays. The input device is used to interact with thealarm response application 138 and may include a mouse, a keyboard, a trackpad, and/or the like. The input device may be included within the display if the display is a touch screen display. The input device allows a user of thealarm response server 108 to manipulate and interact with the user interface of thealarm response application 138. - According to an example embodiment, the one or
more databases 110 may store user information associated with one or more users of the wearable safety application 126 and/or the mobile safety application 132, such as identifying information. In addition, the one or more databases 110 may store alarm notification information including a record of each alarm notification received by the alarm response server 108. Each record may include a unique alarm notification identifier and the unique identifier associated with corresponding identifying information. The record also may include location information and other information. In addition, the one or more databases 110 may store PSAP information as shown in FIG. 2. -
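As a rough illustration of the record described above, the following sketch models an alarm notification entry as it might be held in the one or more databases 110. The field names and types are assumptions for illustration only; the disclosure does not specify a schema.

```python
# Illustrative sketch only: field names are assumptions based on the record
# contents described above (unique alarm identifier, member identifier,
# location, and PSAP routing data), not an actual schema from the disclosure.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class AlarmRecord:
    alert_id: str                  # unique alarm notification identifier, e.g. "27307"
    member_id: str                 # unique identifier tied to the stored identifying information
    latitude: float                # current location reported with the alarm
    longitude: float
    created_at: datetime = field(default_factory=datetime.utcnow)
    psap_id: Optional[str] = None  # PSAP selected for this alarm, if any
    status: str = "triggered"      # e.g. triggered, cancelled, dispatched

# Example record as it might sit alongside the PSAP data of FIG. 2.
record = AlarmRecord(alert_id="27307", member_id="1",
                     latitude=41.903507, longitude=-87.987227)
```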
FIG. 1 illustrates a block diagram of thecall center server 112 according to an example embodiment. Thecall center server 112 may be associated with a PSAP, e.g., a 911 emergency dispatch center. According to an aspect of the present disclosure, thecall center server 112 is a computer having one ormore processors 140 andmemory 142. Thecall center server 112 may be, for example, a laptop, desktop, a server, tablet computer, mobile computing device (e.g., a smart phone) or a dedicated electronic device having aprocessor 140 andmemory 142. Thecall center server 112 includes one ormore processors 140 to process data andmemory 142 to store machine/computer-readable executable instructions and data including anemergency dispatch application 144. Theprocessor 140 andmemory 142 are hardware. Thememory 142 includes non-transitory memory, e.g., random access memory (RAM) and one or more hard disks. The non-transitory memory may include any tangible computer-readable medium including, for example, magnetic and/or optical disks, flash drives, and the like. The data associated with theemergency dispatch application 144 may be stored in a structured query language (SQL) server database or another appropriate database management system withinmemory 142 and/or in one or more databases associated with thecall center server 112. Additionally, thememory 142 and/or the databases associated with thecall center server 112 may also include a dedicated file server having one or more dedicated processors, random access memory (RAM), a Redundant Array of Inexpensive Disks (RAID) hard drive configuration, an Ethernet interface or other communication interface, and a server-based operating system. - The
call center server 112 may further include an optional display and an input device. The display is used to display visual components of theemergency dispatch application 144, such as at a user interface. In one example, the user interface may display a user interface of theemergency dispatch application 144. The display can be a liquid-crystal display, a light-emitting diode display, an organic light-emitting diode display, a touch screen display, an e-ink display, an e-paper display, and other displays. The input device is used to interact with theemergency dispatch application 144 and may include a mouse, a keyboard, a trackpad, and/or the like. The input device may be included within the display if the display is a touch screen display. The input device allows a user of thecall center server 112 to manipulate and interact with the user interface of theemergency dispatch application 144. - In one embodiment, a user may configure the
wearable device 104 and/or themobile computing device 106. The user may download and/or install thewearable safety application 126 inmemory 124 on thewearable device 104 and themobile safety application 132 inmemory 130 on themobile computing device 106. In an example, the user downloads and installs thewearable safety application 126 on a Pebble™ wearable device and the user downloads and installs themobile safety application 132 on an iOS™-based smart phone. Once installed, the user may configure thewearable safety application 126 and themobile safety application 132 for use. Using the user interface of themobile safety application 132, the user interface of thewearable safety application 126, or another interface (e.g., a web-based interface), the user may enter setup and/or configuration information comprising identifying information. The identifying information may include one or more of a name (first and last), one or more email addresses, one or more telephone numbers including a telephone number of themobile computing device 106 or thewearable device 104, one or more addresses, a height, a weight, an eye color, a hair color, a gender, a photograph, an alarm code for disabling an alarm notification, and a secret code for discreetly indicating that the user is in immediate need of assistance, among other information. As an example, the secret code may be automatically derived from the alarm code. If the alarm code is entered as 1234, the secret code may be automatically set by themobile safety application 132 as 1235. In addition, the user may provide information associated with one or more lifelines, e.g., a person to contact in the event of an emergency. The information associated with the one or more lifelines may include a name, one or more email addresses, and one or more telephone numbers, among other information. - The
wearable device 104, the mobile computing device 106, or another computer sends the identifying information to the alarm response server 108 via the communication network 114. The alarm response server 108 receives the identifying information and stores the identifying information in the memory 136 and/or the database 110. The alarm response server 108 associates the identifying information with a unique identifier (e.g., a member identifier) and transmits the unique identifier to the wearable device 104 and/or the mobile computing device 106. The wearable safety application 126 and/or the mobile safety application 132 receive the unique identifier and store the unique identifier in memory 124 and/or memory 130. At this point, the wearable safety application 126 and the mobile safety application 132 are configured and ready for use. - According to an example embodiment, in the event of an emergency, the user may trigger an alarm notification representing an instant emergency alarm that deploys the
drone 102 and notifies first responders (e.g., a 911 PSAP) using the wearable device 104 and/or the mobile computing device 106. - After the
mobile safety application 132 is configured and ready for use, themobile safety application 132 may operate in one of two exemplary operation modes. In a first monitoring mode, themobile safety application 132 continually determines whether the user is touching the touchscreen of themobile computing device 106. In one example, the user may keep a finger on the touchscreen while themobile computing device 106 is located in a pocket. In another example, the user may keep a finger on the touchscreen while holding themobile computing device 106 as if themobile computing device 106 is being used to place a telephone call. If the user stops touching the touchscreen of themobile computing device 106, an alarm notification may be triggered. This may occur if the user is attacked and/or the user drops themobile computing device 106. The alarm notification also may be triggered if the user enters the secret passcode. A countdown may begin after the alarm notification is triggered. During this countdown, the user may stop the countdown or disarm themobile safety application 132. However, if the user does not stop the countdown or disarm themobile safety application 132, the alarm notification is confirmed. - In a second monitoring mode, the
mobile safety application 132 may automatically trigger an alarm notification after a particular preset period of time, e.g., ten minutes. While in the second monitoring mode, the mobile safety application 132 may display a timer that indicates how much of the particular period of time is left until the alarm notification is triggered. As an example, it may take the user approximately six minutes to travel from their car or a train station to their apartment. The user may desire to use the second monitoring mode of the mobile safety application 132 while traveling from their car or the train station to their apartment. The user may disarm the mobile safety application 132 upon arrival at the apartment. However, after the particular period of time ends, the alarm notification is triggered. The alarm notification also may be triggered if the user enters the secret passcode. A countdown may begin after the alarm notification is triggered. During this countdown, the user may stop the countdown or disarm the mobile safety application 132. However, if the user does not stop the countdown or disarm the mobile safety application 132, the alarm notification is confirmed.
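The timer-mode flow just described can be summarized in a short sketch. The passcodes, durations, and function names below are illustrative assumptions, not the internals of the mobile safety application 132.

```python
# Minimal sketch of the timer-mode flow described above; all values and the
# read_input() hook are assumptions made for illustration.
import time

ALARM_CODE = "1234"    # disarms the application
SECRET_CODE = "1235"   # discreetly confirms the alarm

def timer_mode(preset_minutes, read_input, countdown_seconds=30):
    """Trigger an alarm when the preset time expires or the secret code is entered.

    read_input() is assumed to return an entered passcode string, or None.
    Returns 'disarmed' or 'alarm_confirmed'.
    """
    deadline = time.time() + preset_minutes * 60
    while time.time() < deadline:
        code = read_input()
        if code == ALARM_CODE:
            return "disarmed"          # user arrived safely and disarmed
        if code == SECRET_CODE:
            break                      # discreet request for help; trigger now
        time.sleep(1)

    # Alarm triggered: give the user a short countdown to stop it.
    stop_at = time.time() + countdown_seconds
    while time.time() < stop_at:
        if read_input() == ALARM_CODE:
            return "disarmed"
        time.sleep(1)
    return "alarm_confirmed"           # forwarded to the alarm response server
```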
- The wearable safety application 126 also may trigger the alarm notification. In one example, the user may press and hold two hardware buttons on the wearable device 104 for a particular period of time, e.g., four seconds. In another example, the user may press and hold one hardware button on the wearable device 104 for the particular period of time. In a further example, the user may press a hardware button on the wearable device 104 a particular number of times consecutively in a particular period of time, e.g., three to ten times in twenty seconds. In another example, the user may press the touch screen of the wearable device 104 a particular number of times consecutively in a particular period of time. - The
wearable device 104 may include a radio frequency (RF) transceiver or another transceiver for transmitting the alarm notification to themobile computing device 106 and/or thealarm response server 108. Thewearable device 104 may include a microphone for receiving a voice activated alarm notification, an accelerometer for detecting an acceleration greater than a particular threshold to generate an alarm notification (e.g., a hard fall), a gyroscope for detecting rotation greater than a particular threshold to generate an alarm notification (e.g., a hard fall), and a biometric device to receive an alarm notification. In one aspect, the biometric device may be a fingerprint recognition device to determine unique patterns in one or more fingers of the user or a retina scanner to determine unique patterns associated with a retina of the user. In another aspect, the biometric device may be a heart rate monitor to measure and/or record a heart rate of a user. The biometric device also may detect a heart attack and/or an abnormal heart rate. The biometric device may store information associated with the heart rate inmemory 124 andmemory 130 to provide historic contextual data for a normal and an abnormal heart rate. If the heart rate is lower than a particular threshold or higher than a particular threshold, the heart rate monitor may detect distressed health conditions, a heart attack and/or conditions indicative of a heart attack and generate an alarm notification that may be sent to one or more PSAPs and first responders. Of course, this is just one example of user health monitoring that may be executed using the systems and methods taught herein. There are numerous monitored conditions that may be used to generate an alarm notification, including temperature, breathing rate, etc. - After the
wearable device 104 triggers the alarm, thewearable device 104 sends an alarm notification message to themobile computing device 106. The alarm notification message may be sent by thewearable device 104 using a Bluetooth network or another short-range wireless network. Themobile computing device 106 reverse geocodes a current location of themobile computing device 106 using the global positioning system (GPS) hardware. The GPS hardware communicates with a GPS satellite-based positioning system. The GPS hardware may be an assisted GPS system, e.g., A-GPS or aGPS, or may be a standalone GPS. Standalone GPS only uses radio signals from the satellite-based positioning system. An assisted GPS system uses network resources available to themobile computing device 106 and/or thewearable device 104 to locate and use the satellite-based positioning system in poor signal conditions, such as in a city where signals bounce off of buildings or pass through walls or tree cover. - The
mobile computing device 106 sends or forwards the alarm notification message with the current location information and the unique identifier to the alarm response server 108 via the communication network 114. The alarm response server 108 receives the alarm notification message, transmits a unique alarm identifier to the mobile computing device 106 that corresponds with this particular alarm notification, and determines one or more PSAPs based on the current location information. In one example, the alarm response application 138 of the alarm response server 108 determines three PSAPs that are closest to the current location of the mobile computing device 106 by querying the one or more databases 110 using the current location information, e.g., a latitude value and a longitude value. In another example, the alarm response application 138 of the alarm response server 108 determines the three PSAPs that have the highest safety scores. The safety score may be based on the current location of the mobile computing device 106, a historical response time of the PSAP, a PSAP service rating (e.g., one to five stars), and other service-level agreement based factors.
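A minimal sketch of the PSAP selection just described is given below, assuming the PSAP records of FIG. 2 are available as dictionaries containing latitude, longitude, a historical response time, and a one-to-five star rating. The scoring weights are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative ranking of PSAPs by distance and a weighted safety score.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def safety_score(psap, alert_lat, alert_lon):
    """Higher is better: nearby, historically fast, and well rated."""
    distance = haversine_km(alert_lat, alert_lon, psap["latitude"], psap["longitude"])
    return (-1.0 * distance
            - 0.5 * psap["avg_response_minutes"]
            + 2.0 * psap["rating"])            # 1-5 stars

def pick_psaps(psaps, alert_lat, alert_lon, count=3):
    """Return the top-scoring PSAPs for this alarm location."""
    return sorted(psaps,
                  key=lambda p: safety_score(p, alert_lat, alert_lon),
                  reverse=True)[:count]
```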
- The alarm response application 138 may generate a user interface on the display of the alarm response server 108. The user interface may include information associated with the one or more PSAPs, the identifying information, a map showing the current location of the user, and the monitoring information from the drone 102, among other information. The user interface may include a button or other user interface element for indicating that the alarm notification is a false alarm, one or more buttons or other user interface elements to control and monitor the one or more drones 102, and another button or other user interface element for forwarding the alarm notification to the call center server 112. - After or concurrently with the determination of the one or more PSAPs, the
alarm response application 138 determines a telephone number and/or email address in the one ormore databases 110 associated with the unique identifier. Thealarm response application 138 of thealarm response server 108 initiates one or more automated telephone calls, sends an email, and/or sends a text message (SMS/MMS) to themobile computing device 106 or thewearable device 104 to verify a condition of the instant emergency alarm. - The user of the
wearable device 104 and/or themobile computing device 106 may indicate that the instant emergency alarm was a false alarm by providing the alarm passcode, e.g., one or more numbers such as 1234. The alarm passcode may be provided to a human call representative associated with thealarm response server 108. In another instance, the text message and the email may include a uniform resource locator (URL) to direct the user to a web page having a form to receive the alarm passcode. The user of themobile computing device 106 may view the web page and transmit the alarm passcode to thealarm response server 108. Using thedatabase 110, thealarm response server 108 confirms that the alarm passcode is correct, e.g., this is a false alarm, and the process may end. - The user of the
wearable device 104 and/or the mobile computing device 106 may indicate that the instant emergency alarm was not a false alarm by providing the secret passcode, e.g., one or more numbers such as 911 or 1235. The secret passcode may be provided to the human call representative associated with the alarm response server 108. In another instance, the text message and the email may include the URL that directs the user to the web page having the form to receive the secret passcode. The user of the mobile computing device 106 may view the web page and transmit the secret passcode to the alarm response server 108. Using the database 110, the alarm response server 108 confirms whether the secret passcode is correct. If the secret passcode is correct, the alarm response server 108 sends the alarm notification with the identifying information and the location information to the emergency dispatch application 144 of the call center server 112 via the communication network. Optionally, the alarm response server 108 sends the alarm notification with the identifying information and the location information to the one or more lifelines by initiating an automated telephone call, sending an email, and/or sending a text message (SMS/MMS) to the one or more lifelines. The email and text message may include a URL that provides detailed information about the alarm notification including a map showing the current location of the user. - If the
alarm response server 108 does not receive the correct alarm passcode after a particular period of time (e.g., one minute), thealarm response server 108 sends the alarm notification with the identifying information and the location information to theemergency dispatch application 144 of thecall center server 112 via the communication network. Optionally, thealarm response server 108 sends the alarm notification with the identifying information and the location information to the one or more lifelines by initiating an automated telephone call, sending an email, and/or sending a text message (SMS/MMS) to the one or more lifelines. The email and text message may include a URL that provides detailed information about the alarm notification including a map showing the current location of the user. - If the
alarm response server 108 does not receive the correct alarm passcode after the particular period of time, thealarm response server 108 sends the identifying information and the current location information to thedrone 102. Thedrone 102 receives the identifying information and the current location information and stores the identifying information and the current location information in thememory 118. Thedrone 102 determines the quickest and/or shortest route to the current location using the current location information, weather conditions, and obstacles. Thedrone 102 travels to the current location using the route and upon arrival begins monitoring activity at the current location. As an example, thedrone 102 hovers at a particular altitude and records video, photographs, and/or audio using the one or more cameras and the one or more microphones. Thedrone 102 may stream and/or transmit the video, photographs, and/or the audio to thealarm response server 108 and/or thecall center server 112. If the person, themobile computing device 106, and/or thewearable device 104 begins moving while thedrone 102 is monitoring, thedrone 102 tracks and follows the person and continues to record video, photographs, and/or audio. - In one embodiment, the
drone 102 continues to record and/or stream the at least one of video, audio, and photographic information for a particular period of time, e.g., ten minutes, or until the drone battery level reaches a critical level. The critical level may be based upon a distance that thedrone 102 is from the hangar. Upon reaching the critical level, thedrone 102 stops recording and/or streaming to have sufficient battery power to return to the hangar. In another embodiment, thedrone 102 continues to record and/or stream the at least one of video, audio, and photographic information until thedrone 102 receives a message from one of thealarm response server 108 and/or thecall center server 112 to stop recording. After thedrone 102 stops recording, thedrone 102 follows a reverse route or another route back to its hangar. The reverse route may be a route that is opposite of the route that the drone used to reach the location. In one aspect, upon arrival at the hangar and/or connecting to thecommunications network 114, thedrone 102 transmits the at least one of video, audio, and photographic information to thealarm response server 108 and/or thecall center server 112. The video, audio, and photographic information may be stored in thedatabase 110 and associated with the unique identifier and the unique alarm identifier. - According to an example embodiment, the one or
more drones 102, the one or morewearable devices 104, the one or moremobile computing devices 106, the one or morealarm response servers 108, the one ormore databases 110, and the one or morecall center servers 112 communicate using a web application programming interface (API) comprising a defined request/response message system. According to one aspect, the message system is based on Javascript Object Notation (JSON) and the web API is a RESTful web API based on Representational State Transfer (REST). - The web API includes one or more HTTP methods including alert activation, alert cancel, alert triggered, alert silent alarm, and alert location update, among other methods.
- Alert activation may be called when the user activates one of the monitoring mode and the timer mode. When the alert activation is called, a record is created in the
database 110 having a unique alert/alarm identifier. As an example, the alert activation uniform resource locator (URL) comprises http://a.llr1.com/rest/AlertActivation. The alert activation input parameters include an alert latitude, an alert longitude, a member ID (unique identifier), an alert type (monitoring or timer), and an alarm minutes value. The alarm minutes value is associated with the second timer mode. The alert activation output parameters include a status code, a status description, and an alert ID (e.g., a unique alert/alarm identifier that represents this particular alert notification). The unique alert identifier may be used to reference a particular alarm notification, e.g., 27307. - Sample alert activation header & body:
- Authorization: OAuth
- Content-Type: application/json\r\n\r\n\r\n
- {“AlertLatitud”:“41.903507”,“AlertLogitud”:“−87.987227”,“MemberId”:“1”,“AlertType”:“M”,“AlarmMinutes”:“0”}\r\n
- LIVE Response Successful:
- LIVE Response Failed: {“StatusCode”:1,“StatusDescription”:“No matching ‘Member’.”,“AlertId”:“0”}
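For illustration, a client might issue the alert activation request documented above roughly as follows. The use of the requests library, the exact OAuth header value, and the "T" timer value are assumptions; the field names mirror the sample body exactly, including their spelling.

```python
# Sketch of a client-side AlertActivation call matching the sample above; the
# OAuth token handling and the requests library are assumptions.
import requests

def activate_alert(member_id, latitude, longitude, alert_type="M",
                   alarm_minutes=0, token="YOUR_OAUTH_TOKEN"):
    """POST an alert activation and return the AlertId issued by the server."""
    body = {
        # Field names follow the sample body above (including their spelling).
        "AlertLatitud": str(latitude),
        "AlertLogitud": str(longitude),
        "MemberId": str(member_id),
        "AlertType": alert_type,          # "M" for monitoring; "T" assumed for timer
        "AlarmMinutes": str(alarm_minutes),
    }
    response = requests.post("http://a.llr1.com/rest/AlertActivation",
                             json=body,
                             headers={"Authorization": f"OAuth {token}"})
    data = response.json()
    if data.get("StatusCode") != 0:
        raise RuntimeError(data.get("StatusDescription", "activation failed"))
    return data["AlertId"]
```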
- Alert cancel may be called when the user correctly enters the alarm passcode to deactivate the alert. Alert cancel is applicable to both the monitoring mode and the timer mode. As an example, the alert cancel URL comprises http://a.llr1.com/rest/AlertCancel. The alert cancel input parameters include a unique alert identifier. The alert cancel output parameters include a status code and a status description.
- Sample alert cancel header & body:
- Authorization: OAuth
- Content-Type: application/json\r\n\r\n\r\n
- {“AlertId”:“27307”}\r\n
- LIVE Response Successful: {“StatusCode”:0,“StatusDescription”:“Success.”}
- LIVE Response Failed: {“StatusCode”:1,“StatusDescription”:“Invalid Alert Id—Alert Id not found”}
- Alert trigger may be called when the user is in monitoring mode and the user ends monitoring mode. Monitoring mode may end when the user removes a finger from a touchscreen of the
mobile computing device 106. As an example, the alert trigger URL comprises http://a.llr1.com/rest/AlertTrigger. The alert trigger input parameters include a unique alert identifier, an alert latitude, and an alert longitude. The alert trigger output parameters include a status code and a status description. - Sample alert trigger header & body:
- Authorization: OAuth
- Content-Type: application/json\r\n\r\n\r\n
- {“AlertId”:“167”,“AlertLatitud”: “45.903507”,“AlertLogitud”:“−82.987227”}\r\n
- LIVE Response Successful: {“StatusCode”:0,“StatusDescription”:“Success.”}
- LIVE Response Failed: {“StatusCode”:1,“StatusDescription”:“Invalid Alert Id—Alert Id not found”}
- Alert silent alarm may be called when the user is in monitoring mode and the user enters the secret password to trigger the alarm. As an example, the alert silent alarm URL comprises http://a.llr1.com/rest/AlertSilentAlarm. The alert silent alarm input parameters include a unique alert identifier. The alert silent alarm output parameters include a status code and a status description.
- Sample alert silent alarm header & body:
- Authorization: OAuth
- Content-Type: application/json\r\n\r\n\r\n
- {“AlertId”:“4”}\r\n
- LIVE Response Successful: {“StatusCode”:0,“StatusDescription”:“Success.”}
- LIVE Response Failed: {“StatusCode”:1,“StatusDescription”:“Invalid Alert Id—Alert Id not found”}
- Alert location update may be called to update location information associated with a particular alarm notification. As an example, the alert location update may be called at a particular interval of time after the alarm notification, e.g., every ten seconds. In another example, the alert location update may be called when the
mobile computing device 106 and/or the wearable device 104 moves a particular distance, e.g., every 37 feet of movement. Based on the alert location update, the drone 102, the alarm response server 108, and the call center server 112 may determine how fast the mobile computing device 106 and/or the wearable device 104 are moving by evaluating the difference between each alert location update. The drone 102, the alarm response server 108, and the call center server 112 may determine an instantaneous speed of the mobile computing device 106 and/or the wearable device 104 based on the distance traveled with respect to time. As an example, the alert location update URL comprises http://a.llr1.com/rest/AlertLocationUpdate. The alert location update input parameters include a unique alert identifier, an alert location latitude, and an alert location longitude. The alert location update output parameters include a status code and a status description. - Sample alert location update header & body:
- Authorization: OAuth
- Content-Type: application/json\r\n\r\n\r\n
- {“AlertId”:“27307”,“AlertLocationLatitud”:“41.90350”,“AlertLocationLogitud”:“−87.987227”}\r\n
- LIVE Response: {“StatusCode”:0,“StatusDescription”:“Success.”}
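The instantaneous-speed estimate mentioned for alert location updates can be sketched as follows. The timestamps are assumed to be recorded by the receiver when each update arrives, since the documented parameters carry only the alert identifier and coordinates.

```python
# Illustrative speed estimate from two successive alert location updates.
from math import radians, cos, sqrt

def ground_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation, adequate for the short hops between updates."""
    mean_lat = radians((lat1 + lat2) / 2.0)
    dx = radians(lon2 - lon1) * cos(mean_lat) * 6371000.0
    dy = radians(lat2 - lat1) * 6371000.0
    return sqrt(dx * dx + dy * dy)

def instantaneous_speed_mps(prev_update, curr_update):
    """prev_update/curr_update are (latitude, longitude, unix_time) tuples."""
    lat1, lon1, t1 = prev_update
    lat2, lon2, t2 = curr_update
    elapsed = t2 - t1
    if elapsed <= 0:
        return 0.0
    return ground_distance_m(lat1, lon1, lat2, lon2) / elapsed

# Example: two updates ten seconds apart.
speed = instantaneous_speed_mps((41.903507, -87.987227, 0.0),
                                (41.903600, -87.987100, 10.0))
```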
-
FIG. 2 illustrates example information in thealarm response database 110 according to an example embodiment. According to an example embodiment, the alarm response database may store PSAP information. Each PSAP in the United States and throughout the world may have database fields/attributes stored in thealarm response database 110. As shown inFIG. 2 , the database fields/attributes may include one or more of a PSAP ID, a PSAP RedID, a PSAP Segment, a PSAP First Name, a PSAP Middle Initial, a PSAP Last Name, a PSAP Department, a PSAP Mailing Address (1), a PSAP Mailing Address (2), a PSAP Mailing City, a PSAP Mailing State, a PSAP Mailing Zip Code, a PSAP Physical Address (1), a PSAP Physical Address (2), a PSAP Physical City, a PSAP Physical State, a PSAP Physical Zip Code, a PSAP Phone Number, a PSAP Phone Extension, a PSAP Fax Number, a PSAP Fax Extension, a PSAP911 Phone Number, a PSAP Longitude, a PSAP Latitude, a PSAP InvalidCount, a PSAP County, and a PSAP Region, among others. -
FIG. 3 illustrates a flowchart of a process for triggering an alarm notification and monitoring by the drone 102, according to an example embodiment. The process 300 shown in FIG. 3 begins in step 302. In step 302, the user of the wearable device 104 and/or the mobile computing device 106 provides setup information to the wearable safety application 126 and/or the mobile safety application 132. The setup information comprises the identifying information. In step 304, the wearable safety application 126 of the wearable device 104 and/or the mobile safety application 132 of the mobile computing device 106 send the setup information including the identifying information to the alarm response server 108 via the communication network 114. The alarm response server 108 stores the identifying information in the one or more databases 110 and sends a unique identifier that represents the identifying information to the wearable device 104 and/or the mobile computing device 106. The wearable device 104 and/or the mobile computing device 106 receive the unique identifier and store the unique identifier in memory 124 and/or memory 130. - In
step 306, in the event of an emergency, the user triggers the wearable device 104 and/or the mobile computing device 106. In one embodiment, the wearable safety application 126 receives the trigger and sends an alarm notification message to the mobile computing device 106 via Bluetooth or another short-range wireless protocol. In an additional embodiment, the mobile safety application 132 receives the trigger via the monitoring mode or the timer mode. The mobile computing device 106 reverse geocodes a current location of the mobile computing device 106. In another embodiment, the wearable device 104 reverse geocodes a current location of the wearable device 104 and provides this current location with the alarm notification message. The mobile computing device 106 sends the alarm notification message including current location information and the unique identifier to the alarm response server 108. - In
step 308, thealarm response server 108 receives the alarm notification message having the current location information and based on the current location information and the PSAP information in thedatabase 110 determines one or more PSAPs. In response to the alarm notification message, thealarm response server 108 may send themobile computing device 106 and/or the wearable device 104 a unique alarm identifier that represents the alarm notification. - In
step 310, thealarm response server 108 notifies the user to determine whether the alarm notification is a false alarm. Thealarm response server 108 may send one or more of a telephone call, an email, and a message to themobile computing device 106 and/or thewearable device 104. If the user provides a correct alarm code, the process may end. However, if the alarm notification is not a false alarm and if the user does not provide a correct alarm code or provides a secret code, instep 312, thealarm response server 108 sends the alarm notification message including the identifying information and the current location information to thecall center server 112. In addition, thealarm response server 108 may send the identifying information and the current location information to the one or more lifelines. - In
step 314, thealarm response server 108 sends the identifying information and the current location information to thedrone 102. Instep 316, thedrone 102 receives the identifying information and the current location information and stores the identifying information and the current location information in thememory 118. Thedrone 102 determines a shortest and/or quickest route from its hangar to the current location of themobile computing device 106 and/or thewearable device 104. The route may be based on weather conditions and obstacles. - In
step 318, the drone follows the route to the current location of the mobile computing device 106 and/or the wearable device 104. Upon arrival, the drone 102 records at least one of video, audio, and photographic information using the one or more cameras and the one or more microphones. In one embodiment, the drone 102 streams the at least one of video, audio, and photographic information to the alarm response server 108 and/or the call center server 112. In one embodiment, the drone 102 continues to record and/or stream the at least one of video, audio, and photographic information for a particular period of time, e.g., ten minutes, or until the drone battery level reaches a critical level. In another embodiment, the drone 102 continues to record and/or stream the at least one of video, audio, and photographic information until the drone 102 receives a message from the alarm response server 108, the call center server 112, a remote control, or another computing device to stop recording. After the drone 102 stops recording, the drone 102 follows a reverse route or another route back to its hangar. In one aspect, upon arrival at the hangar, the drone 102 transmits the at least one of video, audio, and photographic information to the alarm response server 108 and/or the call center server 112. The video, audio, and photographic information may be stored in the database 110 and associated with the unique identifier and/or the unique alarm identifier.
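A compact sketch of the monitoring behavior in step 318 appears below. The drone interface (battery level, distance to hangar, stop message, streaming) is an assumed abstraction, and the reserve figures are illustrative rather than values from the disclosure.

```python
# Illustrative monitoring loop: record/stream until the time limit, a stop
# message, or a critical battery level, then return to the hangar.
import time

def battery_is_critical(battery_pct, distance_to_hangar_km,
                        pct_per_km=2.0, reserve_pct=10.0):
    """True when remaining charge barely covers the trip back plus a margin."""
    return battery_pct <= distance_to_hangar_km * pct_per_km + reserve_pct

def monitor_location(drone, max_seconds=600):
    """Stream until stopped; 'drone' is an assumed interface, not a real API."""
    started = time.time()
    drone.start_recording()
    while time.time() - started < max_seconds:
        if drone.received_stop_message():       # from alarm response or call center server
            break
        if battery_is_critical(drone.battery_pct(), drone.distance_to_hangar_km()):
            break
        drone.stream_frame()                    # video/photo/audio to the servers
        time.sleep(1)
    drone.stop_recording()
    drone.return_to_hangar()                    # reverse route or another route back
```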
- Although the embodiment described above indicates that the mobile computing device 106 sends the alarm notification message to the alarm response server 108, according to another embodiment, the wearable device 104 may directly send the alarm notification message to the alarm response server 108. -
FIG. 4 illustrates anexample computing system 400 that may implement portions of the various systems described herein, such as thedrone 102, thewearable device 104, themobile computing device 106, thealarm response server 108, thecall center server 112, and methods discussed herein, such asprocess 300. A general-purpose computer system 400 is capable of executing a computer program product to execute a computer process. Data and program files may be input to thecomputer system 400, which reads the files and executes the programs therein such as themonitoring safety application 120, thewearable safety application 126, themobile safety application 132, thealarm response application 138, and theemergency dispatch application 144. Some of the elements of a general-purpose computer system 400 are shown inFIG. 4 wherein aprocessor 402 is shown having an input/output (I/O)section 404, a central processing unit (CPU) 406, and amemory section 408. There may be one ormore processors 402, such that theprocessor 402 of thecomputer system 400 comprises a single central-processing unit 406, or a plurality of processing units, commonly referred to as a parallel processing environment. Thecomputer system 400 may be a conventional computer, a server, a distributed computer, or any other type of computer, such as one or more external computers made available via a cloud computing architecture. The presently described technology is optionally implemented in software devices loaded inmemory 408, stored on a configured DVD/CD-ROM 410 orstorage unit 412, and/or communicated via a wired orwireless network link 414, thereby transforming thecomputer system 400 inFIG. 4 to a special purpose machine for implementing the described operations. - The
memory section 408 may be volatile media, nonvolatile media, removable media, non-removable media, and/or other media or mediums that can be accessed by a general purpose or special purpose computing device. For example, thememory section 408 may include non-transitory computer storage media and communication media. Non-transitory computer storage media further may include volatile, nonvolatile, removable, and/or non-removable media implemented in a method or technology for the storage (and retrieval) of information, such as computer/machine-readable/executable instructions, data and data structures, engines, program modules, and/or other data. Communication media may, for example, embody computer/machine-readable/executable, data structures, program modules, algorithms, and/or other data. The communication media may also include an information delivery technology. The communication media may include wired and/or wireless connections and technologies and be used to transmit and/or receive wired and/or wireless communications. - The I/
O section 404 is connected to one or more user-interface devices (e.g., akeyboard 416 and a display unit 418), adisc storage unit 412, and adisc drive unit 420. Generally, thedisc drive unit 420 is a DVD/CD-ROM drive unit capable of reading the DVD/CD-ROM medium 410, which typically contains programs anddata 422. Computer program products containing mechanisms to effectuate the systems and methods in accordance with the presently described technology may reside in thememory section 404, on adisc storage unit 412, on the DVD/CD-ROM medium 410 of thecomputer system 400, or on external storage devices made available via a cloud computing architecture with such computer program products, including one or more database management products, web server products, application server products, and/or other additional software components. Alternatively, adisc drive unit 420 may be replaced or supplemented by a floppy drive unit, a tape drive unit, or other storage medium drive unit. Thenetwork adapter 424 is capable of connecting thecomputer system 400 to a network via thenetwork link 414, through which the computer system can receive instructions and data. Examples of such systems include personal computers, Intel or PowerPC-based computing systems, AMD-based computing systems and other systems running a Windows-based, a UNIX-based, or other operating system. It should be understood that computing systems may also embody devices such as Personal Digital Assistants (PDAs), mobile phones, tablets or slates, multimedia consoles, gaming consoles, set top boxes, etc. - When used in a LAN-networking environment, the
computer system 400 is connected (by wired connection and/or wirelessly) to a local network through the network interface oradapter 424, which is one type of communications device. When used in a WAN-networking environment, thecomputer system 400 typically includes a modem, a network adapter, or any other type of communications device for establishing communications over the wide area network. In a networked environment, program modules depicted relative to thecomputer system 400 or portions thereof, may be stored in a remote memory storage device. It is appreciated that the network connections shown are examples of communications devices for and other means of establishing a communications link between the computers may be used. - In an example implementation, source code executed by the
drone 102, thewearable device 104, themobile computing device 106, thealarm response server 108, and thecall center server 112, a plurality of internal and external databases including thedatabase 110, source databases, and/or cached data on servers are stored inmemory 118 of thedrone 102,memory 124 of thewearable device 104,memory 130 of themobile computing device 106,memory 136 of thealarm response server 108,memory 142 of thecall center server 112, or other storage systems, such as thedisk storage unit 412 or the DVD/CD-ROM medium 410, and/or other external storage devices made available and accessible via a network architecture. The source code executed by thedrone 102, thewearable device 104, themobile computing device 106, thealarm response server 108, and thecall center server 112 may be embodied by instructions stored on such storage systems and executed by theprocessor 402. - The
processor 402, which is hardware, may perform some or all of the operations described herein. Further, local computing systems, remote data sources and/or services, and other associated logic represent firmware, hardware, and/or software configured to control operations of the drone safetyalert monitoring system 100 and/or other components. Such services may be implemented using a general-purpose computer and specialized software (such as a server executing service software), a special purpose computing system and specialized software (such as a mobile device or network appliance executing service software), or other computing configurations. In addition, one or more functionalities disclosed herein may be generated by theprocessor 402 and a user may interact with a Graphical User Interface (GUI) using one or more user-interface devices (e.g., thekeyboard 416, thedisplay unit 418, and the user devices 404) with some of the data in use directly coming from online sources and data stores. The system set forth inFIG. 4 is but one possible example of a computer system that may employ or be configured in accordance with aspects of the present disclosure. -
FIG. 5 illustrates anexample screenshot 500 of themobile safety application 132 executed by themobile computing device 106 according to an example embodiment. As shown inFIG. 5 , themobile safety application 132 may operate in the first monitoring mode (e.g., thumb mode) or the second timer mode. If the user selects the thumb mode user interface button, themobile safety application 132 enters the first monitoring mode. If the user selects the timer mode user interface button, themobile safety application 132 enters the second timer mode. -
FIG. 6 illustrates anotherexample screenshot 600 of themobile safety application 132 executed by themobile computing device 106 according to an example embodiment. As shown inFIG. 6 , themobile safety application 132 is operating in the first monitoring mode. In the first monitoring mode, themobile safety application 132 continually determines whether the user is touching the touchscreen of themobile computing device 106. If the user stops touching the touchscreen of themobile computing device 106, an alarm notification may be triggered. This may occur if the user is attacked and/or the user drops themobile computing device 106. The alarm notification also may be triggered if the user enters the secret passcode. A countdown may begin after the alarm notification is triggered. During this countdown, the user may stop the countdown or disarm themobile safety application 132. However, if the user does not stop the countdown or disarm themobile safety application 132, the alarm notification is confirmed. -
FIG. 7 illustrates anotherexample screenshot 700 of themobile safety application 132 executed by themobile computing device 106 according to an example embodiment. As shown inFIG. 7 , themobile safety application 132 is operating in the second timer mode. As shown in thescreenshot 700, the user interface of themobile safety application 132 includes a user interface element for selecting an amount of time to wait before triggering the alarm notification (e.g., a distress alert). -
FIG. 8 illustrates an example of a drone 102 according to an example embodiment. As shown, the drone 102 includes a camera system 103, a microphone system 105, an output system 107, and an input system 109. -
FIG. 9 illustrates a keychain including an example wearable device 900 according to an example embodiment. This example wearable device 900 is a VALRT™ wearable device. FIG. 10 illustrates another view of the example wearable device 1000 on a wristband according to an example embodiment. -
FIG. 11 illustrates a command center graphical user interface (GUI) 1100 based on an alert notification that includes one or moreaerial video streams 1102 according to an example embodiment. As an example, thealarm response server 108 may display the command center GUI using thealarm response application 138 and/or thecall center server 112 may display the command center GUI using theemergency dispatch application 144. - In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed are instances of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
- The described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon executable instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The non-transitory machine-readable medium may include, but is not limited to, magnetic storage medium (e.g., floppy diskette), optical storage medium (e.g., CD-ROM); magneto-optical storage medium, read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or other types of medium suitable for storing electronic executable instructions.
- The description above includes example systems, methods, techniques, instruction sequences, and/or computer program products that embody techniques of the present disclosure. However, it is understood that the described disclosure may be practiced without these specific details.
- It is believed that the present disclosure and many of its attendant advantages will be understood by the foregoing description, and it will be apparent that various changes may be made in the form, construction and arrangement of the components without departing from the disclosed subject matter or without sacrificing all of its material advantages. The form described is merely explanatory, and it is the intention of the following claims to encompass and include such changes.
- While the present disclosure has been described with reference to various embodiments, it will be understood that these embodiments are illustrative and that the scope of the disclosure is not limited to them. Many variations, modifications, additions, and improvements are possible. More generally, embodiments in accordance with the present disclosure have been described in the context of particular implementations. Functionality may be separated or combined in blocks differently in various embodiments of the disclosure or described with different terminology. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosure as defined in the claims that follow.
Claims (12)
1. A system comprising:
at least one processor to:
receive identifying information and transmit the identifying information to an alarm response server from a mobile computing device;
receive, by the mobile computing device, a unique identifier that identifies the identifying information in a database associated with the alarm response server;
receive a trigger of an alarm notification by one of a wearable device and the mobile computing device;
determine a current location of the mobile computing device;
transmit an alarm notification message to the alarm response server, the alarm notification message including the current location of the mobile computing device and the unique identifier; and
transmit the current location of the mobile computing device to at least one drone.
2. The system of claim 1 , the at least one processor further to:
receive monitoring information from the at least one drone, the monitoring information comprising at least one of video, audio, and photographic information; and
store the monitoring information in the database and associate the monitoring information with the unique identifier.
3. The system of claim 1 , wherein the alarm response server determines at least one public safety answering point (PSAP) based on at least one of the current location of the mobile computing device and a safety score.
4. The system of claim 1 , wherein the alarm response server notifies the mobile computing device to determine whether the alarm notification is a false alarm.
5. The system of claim 1 , wherein the alarm response server sends the identifying information and the current location of the mobile computing device to a call center server and to at least one lifeline.
6. The system of claim 1 , wherein the alarm notification is triggered by one of a hardware button of the wearable device, a touch screen of the wearable device, a microphone of the wearable device, an accelerometer of the wearable device, a gyroscope of the wearable device, a fingerprint recognition device of the wearable device, a retina scanner of the wearable device, and a heart rate monitor of the wearable device.
7. The system of claim 1 , the at least one processor further to pair the wearable device with the mobile computing device.
8. A system comprising:
a mobile device including a processor and a communication interface;
an alarm response server including a processor and a communication interface; and
a drone including a processor, a communication interface, and a camera system;
wherein,
in response to an alarm condition, the alarm response server receives an alarm notification message from the mobile device including location information;
in response to receipt of the location information, the alarm response server communicates the location information to the drone; and
in response to receipt of the location information, the drone follows a route to the location, and records monitoring information at the location.
9. The system of claim 8 wherein the alarm response server sends the mobile device a unique identifier in response to receiving setup information from the mobile device.
10. The system of claim 9 wherein, in response to receiving the monitoring information from the drone, the alarm response server stores the monitoring information in an associated database and associates the monitoring information with the unique identifier.
11. The system of claim 9 wherein, in response to receiving the unique identifier and location information, the alarm response server communicates the unique identifier and location information to a call center server.
12. The system of claim 11 wherein, in response to receiving the unique identifier and location information, the call center server communicates the location information to one or more lifelines.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/824,011 US20160042637A1 (en) | 2014-08-11 | 2015-08-11 | Drone Safety Alert Monitoring System and Method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462035762P | 2014-08-11 | 2014-08-11 | |
US14/824,011 US20160042637A1 (en) | 2014-08-11 | 2015-08-11 | Drone Safety Alert Monitoring System and Method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160042637A1 (en) | 2016-02-11 |
Family
ID=55267836
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/824,011 US20160042637A1 (en) (Abandoned) | 2014-08-11 | 2015-08-11 | Drone Safety Alert Monitoring System and Method |
Country Status (1)
Country | Link |
---|---|
US (1) | US20160042637A1 (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040193334A1 (en) * | 2003-03-27 | 2004-09-30 | Carl-Olof Carlsson | Waypoint navigation |
US20060167597A1 (en) * | 2005-01-24 | 2006-07-27 | Bodin William K | Enabling services on a UAV |
US8078162B2 (en) * | 2007-10-10 | 2011-12-13 | Battelle Energy Alliance, Llc | Airborne wireless communication systems, airborne communication methods, and communication methods |
US20150339912A1 (en) * | 2014-05-20 | 2015-11-26 | Ooma, Inc. | Security Monitoring and Control |
Cited By (247)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10559193B2 (en) | 2002-02-01 | 2020-02-11 | Comcast Cable Communications, Llc | Premises management systems |
US10890881B2 (en) | 2004-03-16 | 2021-01-12 | Icontrol Networks, Inc. | Premises management networking |
US11368429B2 (en) | 2004-03-16 | 2022-06-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11175793B2 (en) | 2004-03-16 | 2021-11-16 | Icontrol Networks, Inc. | User interface in a premises network |
US11184322B2 (en) | 2004-03-16 | 2021-11-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US12253833B2 (en) | 2004-03-16 | 2025-03-18 | Icontrol Networks, Inc. | Automation system with mobile interface |
US12063220B2 (en) | 2004-03-16 | 2024-08-13 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11991306B2 (en) | 2004-03-16 | 2024-05-21 | Icontrol Networks, Inc. | Premises system automation |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11182060B2 (en) | 2004-03-16 | 2021-11-23 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11449012B2 (en) | 2004-03-16 | 2022-09-20 | Icontrol Networks, Inc. | Premises management networking |
US11893874B2 (en) | 2004-03-16 | 2024-02-06 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11201755B2 (en) | 2004-03-16 | 2021-12-14 | Icontrol Networks, Inc. | Premises system management using status signal |
US11410531B2 (en) | 2004-03-16 | 2022-08-09 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US11082395B2 (en) | 2004-03-16 | 2021-08-03 | Icontrol Networks, Inc. | Premises management configuration and control |
US11043112B2 (en) | 2004-03-16 | 2021-06-22 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11037433B2 (en) | 2004-03-16 | 2021-06-15 | Icontrol Networks, Inc. | Management of a security system at a premises |
US11489812B2 (en) | 2004-03-16 | 2022-11-01 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US10692356B2 (en) | 2004-03-16 | 2020-06-23 | Icontrol Networks, Inc. | Control system user interface |
US11782394B2 (en) | 2004-03-16 | 2023-10-10 | Icontrol Networks, Inc. | Automation system with mobile interface |
US11757834B2 (en) | 2004-03-16 | 2023-09-12 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10447491B2 (en) | 2004-03-16 | 2019-10-15 | Icontrol Networks, Inc. | Premises system management using status signal |
US10691295B2 (en) | 2004-03-16 | 2020-06-23 | Icontrol Networks, Inc. | User interface in a premises network |
US10992784B2 (en) | 2004-03-16 | 2021-04-27 | Control Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11537186B2 (en) | 2004-03-16 | 2022-12-27 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11378922B2 (en) | 2004-03-16 | 2022-07-05 | Icontrol Networks, Inc. | Automation system with mobile interface |
US10142166B2 (en) | 2004-03-16 | 2018-11-27 | Icontrol Networks, Inc. | Takeover of security network |
US10979389B2 (en) | 2004-03-16 | 2021-04-13 | Icontrol Networks, Inc. | Premises management configuration and control |
US11153266B2 (en) | 2004-03-16 | 2021-10-19 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US10735249B2 (en) | 2004-03-16 | 2020-08-04 | Icontrol Networks, Inc. | Management of a security system at a premises |
US10156831B2 (en) | 2004-03-16 | 2018-12-18 | Icontrol Networks, Inc. | Automation system with mobile interface |
US11677577B2 (en) | 2004-03-16 | 2023-06-13 | Icontrol Networks, Inc. | Premises system management using status signal |
US11159484B2 (en) | 2004-03-16 | 2021-10-26 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US11588787B2 (en) | 2004-03-16 | 2023-02-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11810445B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11656667B2 (en) | 2004-03-16 | 2023-05-23 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11244545B2 (en) | 2004-03-16 | 2022-02-08 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11277465B2 (en) | 2004-03-16 | 2022-03-15 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11310199B2 (en) | 2004-03-16 | 2022-04-19 | Icontrol Networks, Inc. | Premises management configuration and control |
US11625008B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Premises management networking |
US10796557B2 (en) | 2004-03-16 | 2020-10-06 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US11626006B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Management of a security system at a premises |
US10754304B2 (en) | 2004-03-16 | 2020-08-25 | Icontrol Networks, Inc. | Automation system with mobile interface |
US11343380B2 (en) | 2004-03-16 | 2022-05-24 | Icontrol Networks, Inc. | Premises system automation |
US11601397B2 (en) | 2004-03-16 | 2023-03-07 | Icontrol Networks, Inc. | Premises management configuration and control |
US10380871B2 (en) | 2005-03-16 | 2019-08-13 | Icontrol Networks, Inc. | Control system user interface |
US10930136B2 (en) | 2005-03-16 | 2021-02-23 | Icontrol Networks, Inc. | Premise management systems and methods |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US10841381B2 (en) | 2005-03-16 | 2020-11-17 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US11367340B2 (en) | 2005-03-16 | 2022-06-21 | Icontrol Networks, Inc. | Premise management systems and methods |
US10156959B2 (en) | 2005-03-16 | 2018-12-18 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11706045B2 (en) | 2005-03-16 | 2023-07-18 | Icontrol Networks, Inc. | Modular electronic display platform |
US10721087B2 (en) | 2005-03-16 | 2020-07-21 | Icontrol Networks, Inc. | Method for networked touchscreen with integrated interfaces |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US10999254B2 (en) | 2005-03-16 | 2021-05-04 | Icontrol Networks, Inc. | System for data routing in networks |
US11595364B2 (en) | 2005-03-16 | 2023-02-28 | Icontrol Networks, Inc. | System for data routing in networks |
US10127801B2 (en) | 2005-03-16 | 2018-11-13 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11424980B2 (en) | 2005-03-16 | 2022-08-23 | Icontrol Networks, Inc. | Forming a security network including integrated security system components |
US11792330B2 (en) | 2005-03-16 | 2023-10-17 | Icontrol Networks, Inc. | Communication and automation in a premises management system |
US11496568B2 (en) | 2005-03-16 | 2022-11-08 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US11113950B2 (en) | 2005-03-16 | 2021-09-07 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
US11451409B2 (en) | 2005-03-16 | 2022-09-20 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11824675B2 (en) | 2005-03-16 | 2023-11-21 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11418518B2 (en) | 2006-06-12 | 2022-08-16 | Icontrol Networks, Inc. | Activation of gateway device |
US10616244B2 (en) | 2006-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Activation of gateway device |
US12063221B2 (en) | 2006-06-12 | 2024-08-13 | Icontrol Networks, Inc. | Activation of gateway device |
US10785319B2 (en) | 2006-06-12 | 2020-09-22 | Icontrol Networks, Inc. | IP device discovery systems and methods |
US10142392B2 (en) | 2007-01-24 | 2018-11-27 | Icontrol Networks, Inc. | Methods and systems for improved system performance |
US12120171B2 (en) | 2007-01-24 | 2024-10-15 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11412027B2 (en) | 2007-01-24 | 2022-08-09 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11418572B2 (en) | 2007-01-24 | 2022-08-16 | Icontrol Networks, Inc. | Methods and systems for improved system performance |
US10747216B2 (en) | 2007-02-28 | 2020-08-18 | Icontrol Networks, Inc. | Method and system for communicating with and controlling an alarm system from a remote server |
US10657794B1 (en) | 2007-02-28 | 2020-05-19 | Icontrol Networks, Inc. | Security, monitoring and automation controller access and use of legacy security control panel information |
US11194320B2 (en) | 2007-02-28 | 2021-12-07 | Icontrol Networks, Inc. | Method and system for managing communication connectivity |
US11809174B2 (en) | 2007-02-28 | 2023-11-07 | Icontrol Networks, Inc. | Method and system for managing communication connectivity |
US11132888B2 (en) | 2007-04-23 | 2021-09-28 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US10672254B2 (en) | 2007-04-23 | 2020-06-02 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US10140840B2 (en) | 2007-04-23 | 2018-11-27 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US11663902B2 (en) | 2007-04-23 | 2023-05-30 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US10389736B2 (en) | 2007-06-12 | 2019-08-20 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10142394B2 (en) | 2007-06-12 | 2018-11-27 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US10339791B2 (en) | 2007-06-12 | 2019-07-02 | Icontrol Networks, Inc. | Security network integrated with premise security system |
US10365810B2 (en) | 2007-06-12 | 2019-07-30 | Icontrol Networks, Inc. | Control system user interface |
US10313303B2 (en) | 2007-06-12 | 2019-06-04 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11611568B2 (en) | 2007-06-12 | 2023-03-21 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11625161B2 (en) | 2007-06-12 | 2023-04-11 | Icontrol Networks, Inc. | Control system user interface |
US11316753B2 (en) | 2007-06-12 | 2022-04-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11632308B2 (en) | 2007-06-12 | 2023-04-18 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10237237B2 (en) | 2007-06-12 | 2019-03-19 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10423309B2 (en) | 2007-06-12 | 2019-09-24 | Icontrol Networks, Inc. | Device integration framework |
US11237714B2 (en) | 2007-06-12 | 2022-02-01 | Control Networks, Inc. | Control system user interface |
US11218878B2 (en) | 2007-06-12 | 2022-01-04 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10200504B2 (en) | 2007-06-12 | 2019-02-05 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US10444964B2 (en) | 2007-06-12 | 2019-10-15 | Icontrol Networks, Inc. | Control system user interface |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US10498830B2 (en) | 2007-06-12 | 2019-12-03 | Icontrol Networks, Inc. | Wi-Fi-to-serial encapsulation in systems |
US11722896B2 (en) | 2007-06-12 | 2023-08-08 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11212192B2 (en) | 2007-06-12 | 2021-12-28 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10523689B2 (en) | 2007-06-12 | 2019-12-31 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US12250547B2 (en) | 2007-06-12 | 2025-03-11 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US12184443B2 (en) | 2007-06-12 | 2024-12-31 | Icontrol Networks, Inc. | Controlling data routing among networks |
US11089122B2 (en) | 2007-06-12 | 2021-08-10 | Icontrol Networks, Inc. | Controlling data routing among networks |
US10616075B2 (en) | 2007-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11894986B2 (en) | 2007-06-12 | 2024-02-06 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10666523B2 (en) | 2007-06-12 | 2020-05-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11423756B2 (en) | 2007-06-12 | 2022-08-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11815969B2 (en) | 2007-08-10 | 2023-11-14 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11816323B2 (en) | 2008-06-25 | 2023-11-14 | Icontrol Networks, Inc. | Automation system user interface |
US11641391B2 (en) | 2008-08-11 | 2023-05-02 | Icontrol Networks Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11258625B2 (en) * | 2008-08-11 | 2022-02-22 | Icontrol Networks, Inc. | Mobile premises automation platform |
US12267385B2 (en) | 2008-08-11 | 2025-04-01 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11616659B2 (en) | 2008-08-11 | 2023-03-28 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US12244663B2 (en) | 2008-08-11 | 2025-03-04 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US10522026B2 (en) | 2008-08-11 | 2019-12-31 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US11190578B2 (en) | 2008-08-11 | 2021-11-30 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US10530839B2 (en) | 2008-08-11 | 2020-01-07 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11711234B2 (en) | 2008-08-11 | 2023-07-25 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11368327B2 (en) | 2008-08-11 | 2022-06-21 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11792036B2 (en) * | 2008-08-11 | 2023-10-17 | Icontrol Networks, Inc. | Mobile premises automation platform |
US20220173934A1 (en) * | 2008-08-11 | 2022-06-02 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11316958B2 (en) | 2008-08-11 | 2022-04-26 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11962672B2 (en) | 2008-08-11 | 2024-04-16 | Icontrol Networks, Inc. | Virtual device systems and methods |
US20170227965A1 (en) * | 2008-08-11 | 2017-08-10 | Chris DeCenzo | Mobile premises automation platform |
US10674428B2 (en) | 2009-04-30 | 2020-06-02 | Icontrol Networks, Inc. | Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces |
US10813034B2 (en) | 2009-04-30 | 2020-10-20 | Icontrol Networks, Inc. | Method, system and apparatus for management of applications for an SMA controller |
US11284331B2 (en) | 2009-04-30 | 2022-03-22 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US10332363B2 (en) | 2009-04-30 | 2019-06-25 | Icontrol Networks, Inc. | Controller and interface for home security, monitoring and automation having customizable audio alerts for SMA events |
US11601865B2 (en) | 2009-04-30 | 2023-03-07 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US12245131B2 (en) | 2009-04-30 | 2025-03-04 | Icontrol Networks, Inc. | Security, monitoring and automation controller access and use of legacy security control panel information |
US10275999B2 (en) | 2009-04-30 | 2019-04-30 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US10237806B2 (en) | 2009-04-30 | 2019-03-19 | Icontrol Networks, Inc. | Activation of a home automation controller |
US11129084B2 (en) | 2009-04-30 | 2021-09-21 | Icontrol Networks, Inc. | Notification of event subsequent to communication failure with security system |
US12127095B2 (en) | 2009-04-30 | 2024-10-22 | Icontrol Networks, Inc. | Custom content for premises management |
US11997584B2 (en) | 2009-04-30 | 2024-05-28 | Icontrol Networks, Inc. | Activation of a home automation controller |
US11356926B2 (en) | 2009-04-30 | 2022-06-07 | Icontrol Networks, Inc. | Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces |
US11223998B2 (en) | 2009-04-30 | 2022-01-11 | Icontrol Networks, Inc. | Security, monitoring and automation controller access and use of legacy security control panel information |
US11665617B2 (en) | 2009-04-30 | 2023-05-30 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US11553399B2 (en) | 2009-04-30 | 2023-01-10 | Icontrol Networks, Inc. | Custom content for premises management |
US11778534B2 (en) | 2009-04-30 | 2023-10-03 | Icontrol Networks, Inc. | Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces |
US11398147B2 (en) | 2010-09-28 | 2022-07-26 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US10127802B2 (en) | 2010-09-28 | 2018-11-13 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11900790B2 (en) | 2010-09-28 | 2024-02-13 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US10223903B2 (en) | 2010-09-28 | 2019-03-05 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11750414B2 (en) | 2010-12-16 | 2023-09-05 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US12088425B2 (en) | 2010-12-16 | 2024-09-10 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US10741057B2 (en) | 2010-12-17 | 2020-08-11 | Icontrol Networks, Inc. | Method and system for processing security event data |
US12100287B2 (en) | 2010-12-17 | 2024-09-24 | Icontrol Networks, Inc. | Method and system for processing security event data |
US11341840B2 (en) | 2010-12-17 | 2022-05-24 | Icontrol Networks, Inc. | Method and system for processing security event data |
US12021649B2 (en) | 2010-12-20 | 2024-06-25 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US11240059B2 (en) | 2010-12-20 | 2022-02-01 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US12003387B2 (en) | 2012-06-27 | 2024-06-04 | Comcast Cable Communications, Llc | Control system user interface |
US11296950B2 (en) | 2013-06-27 | 2022-04-05 | Icontrol Networks, Inc. | Control system user interface |
US10348575B2 (en) | 2013-06-27 | 2019-07-09 | Icontrol Networks, Inc. | Control system user interface |
US11146637B2 (en) | 2014-03-03 | 2021-10-12 | Icontrol Networks, Inc. | Media content management |
US11405463B2 (en) | 2014-03-03 | 2022-08-02 | Icontrol Networks, Inc. | Media content management |
US11943301B2 (en) | 2014-03-03 | 2024-03-26 | Icontrol Networks, Inc. | Media content management |
US9875588B2 (en) * | 2014-04-15 | 2018-01-23 | Disney Enterprises, Inc. | System and method for identification triggered by beacons |
US20150294514A1 (en) * | 2014-04-15 | 2015-10-15 | Disney Enterprises, Inc. | System and Method for Identification Triggered By Beacons |
US9868526B2 (en) * | 2014-10-15 | 2018-01-16 | W. Morrison Consulting Group, Inc. | Airborne drone delivery network and method of operating same |
US20160107750A1 (en) * | 2014-10-15 | 2016-04-21 | Flight of the Century, Inc. | Airborne drone delivery network and method of operating same |
US11753162B2 (en) | 2014-10-17 | 2023-09-12 | Johnson Controls Tyco IP Holdings LLP | Fixed drone visualization in security systems |
US11157021B2 (en) * | 2014-10-17 | 2021-10-26 | Tyco Fire & Security Gmbh | Drone tours in security systems |
US20160107749A1 (en) * | 2014-10-17 | 2016-04-21 | Tyco Fire & Security Gmbh | Fixed Drone Visualization In Security Systems |
US11414188B2 (en) | 2014-10-17 | 2022-08-16 | Johnson Controls Tyco IP Holdings LLP | Fixed drone visualization in security systems |
US20160116914A1 (en) * | 2014-10-17 | 2016-04-28 | Tyco Fire & Security Gmbh | Drone Tours In Security Systems |
US12071238B2 (en) | 2014-10-17 | 2024-08-27 | Tyco Fire & Security Gmbh | Fixed drone visualization in security systems |
US10301018B2 (en) * | 2014-10-17 | 2019-05-28 | Tyco Fire & Security Gmbh | Fixed drone visualization in security systems |
US10795354B2 (en) * | 2014-12-25 | 2020-10-06 | SZ DJI Technology Co., Ltd. | Flight aiding method and system for unmanned aerial vehicle, unmanned aerial vehicle, and mobile terminal |
US20170293298A1 (en) * | 2014-12-25 | 2017-10-12 | SZ DJI Technology Co., Ltd. | Flight aiding method and system for unmanned aerial vehicle, unmanned aerial vehicle, and mobile terminal |
US11474516B2 (en) * | 2014-12-25 | 2022-10-18 | SZ DJI Technology Co., Ltd. | Flight aiding method and system for unmanned aerial vehicle, unmanned aerial vehicle, and mobile terminal |
US9635248B2 (en) * | 2014-12-29 | 2017-04-25 | Lg Electronics Inc. | Mobile device and method for controlling the same |
US20160191793A1 (en) * | 2014-12-29 | 2016-06-30 | Lg Electronics Inc. | Mobile device and method for controlling the same |
US10303415B1 (en) * | 2015-03-26 | 2019-05-28 | Amazon Technologies, Inc. | Mobile display array |
US10278050B2 (en) | 2015-05-26 | 2019-04-30 | Noonlight, Inc. | Systems and methods for providing assistance in an emergency |
US10814978B2 (en) | 2015-07-16 | 2020-10-27 | Physio-Control, Inc. | Unmanned aerial vehicles in medical applications |
US9994315B2 (en) * | 2015-07-16 | 2018-06-12 | Physio-Control, Inc. | Unmanned aerial vehicles in medical applications |
US20220415048A1 (en) * | 2015-10-05 | 2022-12-29 | Pillar Vision, Inc. | Systems and methods for monitoring objects at sporting events |
US11450106B2 (en) * | 2015-10-05 | 2022-09-20 | Pillar Vision, Inc. | Systems and methods for monitoring objects at sporting events |
US10283000B2 (en) * | 2015-10-23 | 2019-05-07 | Vigilair Limited | Unmanned aerial vehicle deployment system |
US20180290731A1 (en) * | 2015-12-25 | 2018-10-11 | Sharp Kabushiki Kaisha | Mobile body, communication terminal, and control method for mobile body |
US10198954B2 (en) * | 2015-12-30 | 2019-02-05 | Motorola Solutions, Inc. | Method and apparatus for positioning an unmanned robotic vehicle |
US20170201568A1 (en) * | 2016-01-08 | 2017-07-13 | Universal Research Solutions, Llc | Processing of Portable Device Data |
US10103947B2 (en) * | 2016-01-08 | 2018-10-16 | Universal Research Solutions, Llc | Processing of portable device data |
US20230322381A1 (en) * | 2016-02-23 | 2023-10-12 | State Farm Mutual Automobile Insurance Company | Systems and methods for operating drones in response to an incident |
CN107205111A (en) * | 2016-03-18 | 2017-09-26 | 奥林巴斯株式会社 | Camera device, mobile device, camera system, image capture method and recording medium |
US20170280412A1 (en) * | 2016-03-24 | 2017-09-28 | Chiun Mai Communication Systems, Inc. | Interactive communication system, method and wearable device therefor |
US10741053B2 (en) * | 2016-06-09 | 2020-08-11 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US11632661B2 (en) * | 2016-06-09 | 2023-04-18 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US11470461B2 (en) * | 2016-06-09 | 2022-10-11 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US20230328498A1 (en) * | 2016-06-09 | 2023-10-12 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US12101706B2 (en) * | 2016-06-09 | 2024-09-24 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US10304315B2 (en) * | 2016-06-09 | 2019-05-28 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US20210314758A1 (en) * | 2016-06-09 | 2021-10-07 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US20190281433A1 (en) * | 2016-06-09 | 2019-09-12 | Amp Llc | Systems and methods for health monitoring and providing emergency support |
US20180000397A1 (en) * | 2016-06-30 | 2018-01-04 | Wellen Sham | Safety driving system |
US10610145B2 (en) * | 2016-06-30 | 2020-04-07 | Wellen Sham | Safety driving system |
US10891959B1 (en) | 2016-07-01 | 2021-01-12 | Google Llc | Voice message capturing system |
US11527251B1 (en) | 2016-07-01 | 2022-12-13 | Google Llc | Voice message capturing system |
US12183349B1 (en) | 2016-07-01 | 2024-12-31 | Google Llc | Voice message capturing system |
US11006263B2 (en) * | 2016-07-07 | 2021-05-11 | Ford Global Technologies, Llc | Vehicle-integrated drone |
US10609542B2 (en) | 2016-08-26 | 2020-03-31 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
US11259165B2 (en) | 2016-08-26 | 2022-02-22 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
US10869181B2 (en) | 2016-08-26 | 2020-12-15 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
US10516983B2 (en) | 2016-08-26 | 2019-12-24 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
US10306449B2 (en) | 2016-08-26 | 2019-05-28 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
US10531265B2 (en) | 2016-08-26 | 2020-01-07 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
CN106482729A (en) * | 2016-10-25 | 2017-03-08 | 北京小米移动软件有限公司 | The method and device of unmanned plane pursuit movement target |
US20240233453A1 (en) * | 2016-12-14 | 2024-07-11 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods and entities for alerting about failure of an unmanned aerial vehicle |
US10252701B2 (en) * | 2017-01-04 | 2019-04-09 | Industrial Technology Research Institute | Object tracking system and method therewith |
WO2018139952A1 (en) * | 2017-01-27 | 2018-08-02 | Леонид Михайлович БЕРЕЩАНСКИЙ | System for alerting to a threat to personal safety |
WO2018151621A1 (en) * | 2017-02-20 | 2018-08-23 | Леонид Михайлович БЕРЕЩАНСКИЙ | System for alerting to a threat to safety |
WO2018190748A1 (en) * | 2017-04-11 | 2018-10-18 | Леонид Михайлович БЕРЕЩАНСКИЙ | System for alerting to a threat to personal safety or to health (variants) |
WO2018194671A1 (en) * | 2017-04-21 | 2018-10-25 | Hewlett-Packard Development Company, L.P. | Assistance notifications in response to assistance events |
US20210266437A1 (en) * | 2017-04-23 | 2021-08-26 | Orcam Technologies Ltd. | Remotely identifying a location of a wearable apparatus |
US10600295B2 (en) * | 2017-05-05 | 2020-03-24 | Tg-17, Inc. | System and method for threat monitoring, detection, and response |
WO2018204807A1 (en) * | 2017-05-05 | 2018-11-08 | Tg-17, Llc | System and method for threat monitoring, detection, and response |
US20180322749A1 (en) * | 2017-05-05 | 2018-11-08 | Doron KEMPEL | System and method for threat monitoring, detection, and response |
US10293768B2 (en) | 2017-08-11 | 2019-05-21 | Wellen Sham | Automatic in-vehicle component adjustment |
US10085683B1 (en) | 2017-08-11 | 2018-10-02 | Wellen Sham | Vehicle fatigue monitoring system |
US11780578B2 (en) | 2017-08-16 | 2023-10-10 | Cainiao Smart Logistics Holding Limited | Control channel allocation method, take-off method and remote control method for flight apparatus |
US10506413B2 (en) | 2017-08-28 | 2019-12-10 | Intrinsic Value, Llc | Systems, devices, and methods for emergency responses and safety |
US20190072985A1 (en) * | 2017-09-01 | 2019-03-07 | Qualcomm Incorporated | Personal Security Robotic Vehicle |
US10725483B2 (en) * | 2017-09-01 | 2020-07-28 | Qualcomm Incorporated | Personal security robotic vehicle |
US11294398B2 (en) | 2017-09-01 | 2022-04-05 | Qualcomm Incorporated | Personal security robotic vehicle |
US11386384B2 (en) * | 2017-12-22 | 2022-07-12 | Wing Aviation Llc | Delivery-location recharging during aerial transport tasks |
US20230034563A1 (en) * | 2017-12-22 | 2023-02-02 | Wing Aviation Llc | Recipient-Assisted Vehicle Recharging |
US11803804B2 (en) * | 2017-12-22 | 2023-10-31 | Wing Aviation Llc | Recipient-assisted vehicle recharging |
US11216954B2 (en) * | 2018-04-18 | 2022-01-04 | Tg-17, Inc. | Systems and methods for real-time adjustment of neural networks for autonomous tracking and localization of moving subject |
US11490239B2 (en) * | 2018-06-15 | 2022-11-01 | Manmeetsingh Sethi | Emergency reporting application |
US20200137212A1 (en) * | 2018-10-28 | 2020-04-30 | International Business Machines Corporation | Automated individual security through a wearable aerial device |
US10798237B2 (en) * | 2018-10-28 | 2020-10-06 | International Business Machines Corporation | Automated individual security |
US11119510B2 (en) * | 2019-07-29 | 2021-09-14 | Aurora Flight Sciences Corporation, a subsidiary of The Boeing Company | Systems and methods for generating flight paths for navigating an aircraft |
US20210303006A1 (en) * | 2020-03-25 | 2021-09-30 | Tencent America LLC | Systems and methods for unmanned aerial system communication |
CN111711722A (en) * | 2020-05-27 | 2020-09-25 | 维沃移动通信有限公司 | Information reminding method and device and electronic equipment |
US12077314B1 (en) | 2021-04-08 | 2024-09-03 | Onstation Corporation | Transforming aircraft using low-cost attritable aircraft modified with adaptive suites |
US12077313B1 (en) | 2021-05-28 | 2024-09-03 | Onstation Corporation | Low-cost attritable aircraft modified with adaptive suites |
US11955020B2 (en) | 2021-06-09 | 2024-04-09 | Ford Global Technologies, Llc | Systems and methods for operating drone flights over public roadways |
US12277853B2 (en) | 2021-07-30 | 2025-04-15 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
US20230156163A1 (en) * | 2021-11-12 | 2023-05-18 | Raymond Anthony Joao | Personal monitoring apparatus and methods |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US20160042637A1 (en) | Drone Safety Alert Monitoring System and Method | |
US11570607B2 (en) | Systems and methods for identifying and activating emergency response assets | |
US12184803B2 (en) | Unmanned aerial vehicle emergency dispatch and diagnostics data apparatus, systems and methods | |
US11785458B2 (en) | Security and public safety application for a mobile device | |
US11665523B2 (en) | Systems and methods for emergency communications amongst groups of devices based on shared data | |
US20230114663A1 (en) | Systems and methods for emergency data integration | |
AU2018391963B2 (en) | Method, device, and system for adaptive training of machine learning models via detected in-field contextual sensor events and associated located and retrieved digital audio and/or video imaging | |
AU2018391964B2 (en) | Training a machine learning model with digital audio and/or video | |
US9454889B2 (en) | Security and public safety application for a mobile device | |
US9883370B2 (en) | Security and public safety application for a mobile device with enhanced incident reporting capabilities | |
US10037667B2 (en) | Wristband and application to allow one person to monitor another | |
US11234111B2 (en) | Systems and methods for improving alert messaging using device to device communication | |
US20220014895A1 (en) | Spatiotemporal analysis for emergency response | |
US10489649B2 (en) | Drone data locker system | |
US10332376B2 (en) | Workplace management system and wearable device therefor | |
US10212570B1 (en) | System for providing notifications to user devices | |
CA3162430A1 (en) | Systems and methods for delivering and supporting digital requests for emergency service | |
US20200074839A1 (en) | Situational awareness platform, methods, and devices | |
US10510240B2 (en) | Methods and systems for evaluating compliance of communication of a dispatcher | |
Gunasundari et al. | Gesture Controlled Drone Swarm System for Violence Detection Using Machine Learning for Women Safety | |
US12101699B2 (en) | Security ecosystem, device and method for communicating with communication devices based on workflow interactions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: CLANDESTINE DEVELOPMENT, LLC, ILLINOIS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: CAHILL, PETER; REEL/FRAME: 036309/0874; Effective date: 20150810 |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |