US20080022575A1 - Spotter scope - Google Patents
Spotter scope
- Publication number
- US20080022575A1 (application US11/696,050; US69605007A)
- Authority
- US
- United States
- Prior art keywords
- bullet
- video
- target
- point
- intended
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
  - F41—WEAPONS
    - F41G—WEAPON SIGHTS; AIMING
      - F41G1/00—Sighting devices
        - F41G1/38—Telescopic sights specially adapted for smallarms or ordnance; Supports or mountings therefor
        - F41G1/46—Sighting devices for particular applications
      - F41G3/00—Aiming or laying means
        - F41G3/02—Aiming or laying means using an independent line of sight
        - F41G3/14—Indirect aiming means
          - F41G3/142—Indirect aiming means based on observation of a first shoot; using a simulated shoot
Abstract
Systems and methods for automatically generating an aim point correction for sniper operations. The present invention reduces spotter/sniper workload and improves trace spotting analysis. An example system includes a scope, a video capture component, an output device, and a processor in signal communication with the video capture component and the output device. The video capture component captures video of a bullet from when the bullet left a weapon to at least when the bullet crossed a previously determined target range. The processor determines from the captured video where the bullet was located relative to an intended target when the bullet was at the target range, generates a new aim point if the bullet was determined to have missed an intended hit point, and outputs the generated new aim point to the output device.
Description
- This application claims priority to provisional patent application Ser. No. 60/746,736, filed on May 8, 2006, which is incorporated herein by reference.
- U.S. military sniper teams generally consist of a shooter and an observer (or spotter). The observer uses a non-electronic glass optics-based spotting scope to observe a target, determine distance, and estimate wind speed and direction before a shot is fired. The spotter conveys this information to the shooter for point of aim adjustments prior to shooting. Distance is estimated manually.
- After the shooter fires, the spotter tries to observe the actual path of the bullet (trace) to the intended target (point of impact) through the spotting scope. The spotter then attempts to determine if the target was hit based on the observed trace trajectory. If the target was not hit, the spotter determines where the bullet crossed the plane of the target and suggests an aiming correction to the shooter. Observing the target can only be performed during daylight, and the trace is extremely difficult to observe even under ideal daylight conditions. Trace observations are also subject to very large errors. Also, if no spotter is present, then observation of the trace trajectory is not possible.
- Therefore, there exists a need for an improved spotter scope.
- The present invention provides systems and methods for automatically generating an aim point correction for sniper operations. The present invention reduces spotter/sniper workload and improves trace spotting analysis.
- An example system includes a scope, a video capture component, an output device, and a processor in signal communication with the video capture component and the output device. The video capture component captures video of a bullet from when the bullet left a weapon to at least when the bullet crossed a previously determined target range. The processor determines from the captured video where the bullet was located relative to an intended target when the bullet was at the target range, generates a new aim point if the bullet was determined to have missed an intended hit point, and outputs the generated new aim point to the output device.
- In one aspect of the invention, the intended hit point is the intended target.
- In another aspect of the invention, the video capture component includes a digital video camera and/or an infrared video camera.
- Preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings:
- FIG. 1 illustrates a perspective view of an example spotter scope formed in accordance with an embodiment of the present invention;
- FIG. 2 illustrates a block diagram of components of the scope shown in FIG. 1;
- FIG. 3 is a flow diagram of an example process performed by the scope of FIGS. 1 and 2;
- FIG. 4 is an example image viewable by a user of the scope; and
- FIG. 5 is a perspective view of a sniper's gun-mounted scope.
- FIG. 1 shows an example spotter scope 20 formed in accordance with an embodiment of the present invention. The scope 20 may be hand-held or mounted to a support device, such as a tripod 40. The scope 20 includes a housing 24 with a scope lens 34, a video lens 36, and an infrared lens 38 located at a first end of the housing 24. At a second end of the housing 24 are eye pieces 28 that correspond to the lenses 34-38, user interface controls 30, and a display device 32.
- As shown in FIG. 2, the scope 20 includes a processor 60 that is in data communication with user interface controls 30, the display device 32, and an output device 42. An example of the output device 42 is a digital micro mirror device (DMD) that is controlled by a Digital Signal Processing (DSP) chip for presenting images in the field of view through the scope lens 34 and via an associated eye piece.
- In one embodiment, the processor 60 includes video capture components 80, video processing components 82, and a targeting component 88. The video capture components 80 include a digital video camera associated with the video lens 36 and an infrared video capture component associated with the infrared lens 38. The video capture components 80 capture video images of the trajectory of a bullet expelled by a nearby weapon. The captured video is sent to the video processing components 82 for analysis. In a daytime situation, the video captured by the digital video camera is processed to determine the trajectory of the bullet; at night, the video captured by the infrared camera is used to determine bullet trajectory. Daytime video capture with the digital video camera can be augmented by the infrared camera where conditions warrant. Once the trajectory has been determined from one or both of the generated video images, the processing component 82 determines where the bullet was most likely to have crossed the plane of the intended target. If the processing component 82 determines that the trajectory of the bullet shows that the bullet did not hit the intended target, then the targeting component 88 determines an aiming correction location. The processing component 82 and the targeting component 88 include a display component for generating an image of the location where the bullet crossed the target plane (processing component 82) and an image for a new aiming point (targeting component 88). The images are sent to the display device 32 and/or the output device 42 for presentation within the field of view of the scope. Other video capture devices may be used.
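The trace-extraction and plane-crossing step described above is not spelled out algorithmically in the disclosure. The Python sketch below is a minimal, hypothetical illustration only: it assumes frame differencing to pick the strongest-changing pixel in each frame as the trace, and a least-squares quadratic fit to extrapolate where the trace crosses the image column of the target plane. All function names, the quadratic model, and the synthetic data are assumptions, not part of the patent.

```python
import numpy as np

def detect_trace_points(frames):
    """Hypothetical trace detector: difference consecutive frames and keep the
    (row, col) of the strongest change, assumed here to be the bullet trace."""
    points = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        diff = np.abs(curr.astype(np.int32) - prev.astype(np.int32))
        row, col = np.unravel_index(np.argmax(diff), diff.shape)
        points.append((row, col))
    return np.array(points, dtype=float)

def fit_trajectory(points):
    """Fit image row as a quadratic in image column (drop and drift look
    roughly parabolic in the image plane)."""
    cols, rows = points[:, 1], points[:, 0]
    return np.polyfit(cols, rows, deg=2)

def crossing_point(coeffs, target_col):
    """Evaluate the fitted trace at the image column of the target plane."""
    return np.polyval(coeffs, target_col), target_col

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 30, (120, 160), dtype=np.uint8) for _ in range(8)]
    for i, f in enumerate(frames):          # paint a synthetic moving streak
        f[40 + i, 20 * i:20 * i + 3] = 255
    coeffs = fit_trajectory(detect_trace_points(frames))
    row, col = crossing_point(coeffs, target_col=150.0)
    print(f"estimated trace at target plane: row={row:.1f}, col={col:.1f}")
```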
- The processor 60 may output the captured video to the display device 32. Also, the display device 32 may present scope status information, activatable user controls (e.g., touch-screen control buttons), previously stored information, or information received (wirelessly or via wire) from another system.
- FIG. 3 is a flow diagram of an example process 120 performed by the components of the scope 20. First, at a block 126, one of the video capture components 80 records video at some point prior to firing of the weapon that is in close proximity to the scope 20. The video capture components 80 may be activated manually by the user interacting with the user interface controls 30 or the display device 32, or by activation of a remote control that is in wired or wireless signal communication with the processor 60. In one embodiment, the remote control device may be a voice-capturing device, and the processor 60 includes a voice processing component (not shown) that interprets voice signals sent to it via the remote control. Activation or deactivation of the capturing of video images can be performed automatically, for example, by sensing activation of the weapon and by deactivating after a predefined period of time from when the weapon was activated. Next, at a block 128, image analysis of the captured video is automatically performed in order to determine the trajectory of the bullet. At a block 132, the processor 60 automatically determines the point where the bullet crossed the intended target based on the determined trajectory, the frame rate of the captured video, a predicted range of the intended target, and a determination of when the bullet left the weapon or when the trigger was pulled. The determination of when the bullet left the weapon, or of trigger activation, may be based on a sensed event, such as sound or shock as sensed by a sensing device (not shown).
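Block 132 combines the fitted trajectory, the video frame rate, the predicted target range, and the sensed muzzle-exit (or trigger) time. A minimal sketch of that bookkeeping follows; the constant average bullet speed and all names are illustrative assumptions, since the patent does not specify how these inputs are combined.

```python
def frame_at_target_range(range_m, muzzle_exit_time_s, frame_rate_hz,
                          avg_bullet_speed_mps=800.0):
    """Estimate which video frame shows the bullet at the target range,
    assuming a constant average bullet speed (an illustrative simplification)."""
    time_of_flight_s = range_m / avg_bullet_speed_mps
    crossing_time_s = muzzle_exit_time_s + time_of_flight_s
    return int(round(crossing_time_s * frame_rate_hz))

def point_at_target_range(trace_by_frame, frame_index):
    """Look up (clamping if needed) the detected trace point for that frame."""
    frame_index = max(0, min(frame_index, len(trace_by_frame) - 1))
    return trace_by_frame[frame_index]

# Example: 600 m target, trigger sensed 0.20 s into the recording, 240 fps video.
trace = [(100 + 0.5 * i, 40 + 0.02 * i) for i in range(300)]  # (col, row) per frame
idx = frame_at_target_range(range_m=600.0, muzzle_exit_time_s=0.20, frame_rate_hz=240.0)
print("bullet expected at target range around frame", idx)
print("trace point there:", point_at_target_range(trace, idx))
```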
- At block 134, the processor 60 outputs a dot, such as a red dot, to represent the determined point where the bullet crossed the intended target. The outputted dot is presented on the output device 42. If, at the decision block 136, it is determined that the bullet did hit the target, then the process is done; see block 138. However, if the bullet did not hit the target as determined at the decision block 136, the processor 60, at a block 140, determines an aiming correction point based on the point determined at the block 132 and the previous aiming point. At a block 142, a corrected pipper location or aim point location is generated, displayed, and outputted by the output device 42 or the display device 32. The determination by the processor 60 of whether the bullet hit the target is based on comparing the point determined at the block 132 to a stored image that is sized according to the determined predicted range of the target.
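Blocks 136-142 reduce to two small computations: a hit test that compares the block-132 crossing point against a target silhouette scaled for the predicted range, and an aim correction that shifts the previous aim point opposite to the observed miss. The sketch below illustrates one plausible reading of those steps; the rectangular silhouette test, the mirror-the-miss correction, and all names are assumptions rather than the patent's stated method.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float  # image columns (pixels)
    y: float  # image rows (pixels)

def target_hit(crossing, target_center, silhouette_w_px, silhouette_h_px):
    """Hit test: is the crossing point inside a target silhouette whose pixel
    size has already been scaled for the predicted range?"""
    return (abs(crossing.x - target_center.x) <= silhouette_w_px / 2 and
            abs(crossing.y - target_center.y) <= silhouette_h_px / 2)

def corrected_aim_point(previous_aim, crossing, intended_hit):
    """Shift the previous aim point opposite to the observed miss so the next
    shot's crossing point lands on the intended hit point."""
    return Point(previous_aim.x - (crossing.x - intended_hit.x),
                 previous_aim.y - (crossing.y - intended_hit.y))

center = Point(320, 240)     # intended hit point (center pipper)
crossing = Point(335, 228)   # where block 132 says the bullet crossed the plane
if not target_hit(crossing, center, silhouette_w_px=20, silhouette_h_px=40):
    new_aim = corrected_aim_point(previous_aim=center, crossing=crossing,
                                  intended_hit=center)
    print(f"new pipper location: ({new_aim.x:.0f}, {new_aim.y:.0f})")
```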
- FIG. 4 illustrates an image 160 that a viewer sees through the scope 20. A center pipper 166 in this example is located at the center of the intended target. After the weapon has been fired and the analysis has been performed at the preceding blocks, the point 168 is displayed to one viewing the image 160 in order to show where the point is that was determined at the block 132. After the correction determination is made at the block 140, a new pipper 170 is generated and outputted according to the block 142. The point 168 and the pipper 170 are presented within the scope by a DMD and DSP chip.
- The corrected pipper location, such as the pipper 170 of FIG. 4, is conveyed to the sniper. The sniper viewing the target through the gun-mounted scope 180 adjusts their targeting in order to match the new aim location; see aim point 188. If it is determined that the new aim location is outside of the MILDOT settings of a typical scope, then the sniper will activate a dial 190 in order to adjust the targeting aim point according to the new aim point.
- In one embodiment, the range of the target is predicted manually by the spotter or shooter, or automatically by the processor 60. The spotter or shooter determines range by known techniques and enters the determined range into the processor 60 using the user interface controls 30 or the display device 32. The processor 60 automatically determines range by using image analysis of a center portion of an image recorded by one of the video capture components 80 after the user has placed the crosshair on the intended target and instructed the processor 60 to calculate range. The processor 60 performs image matching that matches a prestored target object (an upper-body human form) to a similar object in the captured image. After a match has been determined, range is determined by determining width and/or height dimensions of the matched object in the captured image and comparing them to predefined width and height dimensions for a typical or predefined target (an illustrative calculation appears after this description).
- While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.
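As a complement to the range-estimation paragraph above, the pinhole-camera relation below shows one common way to turn a matched object's pixel size into a distance. The 0.9 m upper-body height, the focal length in pixels, and the function name are illustrative assumptions; the patent only states that matched image dimensions are compared against predefined dimensions for a typical target.

```python
def estimate_range_m(matched_height_px, assumed_target_height_m=0.9,
                     focal_length_px=4000.0):
    """Pinhole-camera range estimate: range = real size * focal length / pixel size."""
    return assumed_target_height_m * focal_length_px / matched_height_px

# A matched upper-body silhouette spanning 6 pixels would imply roughly 600 m.
print(f"estimated range: {estimate_range_m(matched_height_px=6.0):.0f} m")
```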
Claims (12)
1. A method for automatically generating an aim point correction, the method comprising:
capturing video of a bullet from when the bullet left a weapon to at least when the bullet crossed a previously determined target range;
automatically determining from the captured video where the bullet was located relative to an intended target when the bullet was at the target range;
automatically generating a new aim point if the bullet was determined to have missed an intended hit point; and
outputting the generated new aim point.
2. The method of claim 1 , wherein the intended hit point is the intended target.
3. The method of claim 1 , wherein capturing includes capturing daytime video images.
4. The method of claim 1 , wherein capturing includes capturing infrared video images.
5. The method of claim 1 , further comprising automatically determining range of the target.
6. The method of claim 1 , wherein outputting includes displaying the generated new aim point in a field of view of a scope.
7. A system for automatically generating an aim point correction, the system comprising:
a scope;
a video capture component configured to capture video of a bullet from when the bullet left a weapon to at least when the bullet crossed a previously determined target range;
an output device;
a processor in signal communication with the video capture component and the output device, the processor comprising:
a first component configured to determine from the captured video where the bullet was located relative to an intended target when the bullet was at the target range;
a second component configured to generate a new aim point if the bullet was determined to have missed an intended hit point; and
a third component configured to output the generated new aim point to the output device.
8. The system of claim 7 , wherein the intended hit point is the intended target.
9. The system of claim 7 , wherein the video capture component includes a digital video camera.
10. The system of claim 7 , wherein the video capture component includes an infrared video camera.
11. The system of claim 7 , wherein the processor comprises a fourth component configured to determine range of the target.
12. The system of claim 7 , wherein the output device includes a component for outputting the generated new aim point in a field of view of the scope.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/696,050 US20080022575A1 (en) | 2006-05-08 | 2007-04-03 | Spotter scope |
EP07107673A EP1860395A1 (en) | 2006-05-08 | 2007-05-08 | Spotter scope |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US74673606P | 2006-05-08 | 2006-05-08 | |
US11/696,050 US20080022575A1 (en) | 2006-05-08 | 2007-04-03 | Spotter scope |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080022575A1 (en) | 2008-01-31 |
Family
ID=38537683
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/696,050 Abandoned US20080022575A1 (en) | 2006-05-08 | 2007-04-03 | Spotter scope |
Country Status (2)
Country | Link |
---|---|
US (1) | US20080022575A1 (en) |
EP (1) | EP1860395A1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110315767A1 (en) * | 2010-06-28 | 2011-12-29 | Lowrance John L | Automatically adjustable gun sight |
US20120090216A1 (en) * | 2010-10-19 | 2012-04-19 | Danyun Li | Electronic Firearm Sight and method for adjusting the reticle thereof |
US8285305B2 (en) | 2010-09-13 | 2012-10-09 | Honeywell International Inc. | Notifying a user of an event |
US8457179B2 (en) | 2010-09-13 | 2013-06-04 | Honeywell International Inc. | Devices, methods, and systems for building monitoring |
US20150287224A1 (en) * | 2013-10-01 | 2015-10-08 | Technology Service Corporation | Virtual tracer methods and systems |
US20160069643A1 (en) * | 2014-09-06 | 2016-03-10 | Philip Lyren | Weapon Targeting System |
US20160084617A1 (en) * | 2014-09-19 | 2016-03-24 | Philip Lyren | Weapon Targeting System |
DE102015120205A1 (en) * | 2015-09-18 | 2017-03-23 | Rheinmetall Defence Electronics Gmbh | Remote weapon station and method of operating a remote weapon station |
DE102015120036A1 (en) * | 2015-11-19 | 2017-05-24 | Rheinmetall Defence Electronics Gmbh | Remote weapon station and method of operating a remote weapon station |
DE102016007624A1 (en) * | 2016-06-23 | 2018-01-11 | Diehl Defence Gmbh & Co. Kg | 1Procedure for file correction of a weapon system |
US11287638B2 (en) | 2019-08-20 | 2022-03-29 | Francesco E. DeAngelis | Reflex sight with superluminescent micro-display, dynamic reticle, and metadata overlay |
US11421961B2 (en) * | 2009-05-15 | 2022-08-23 | Hvrt Corp. | Apparatus and method for calculating aiming point information |
WO2022259241A1 (en) * | 2021-06-07 | 2022-12-15 | Smart Shooter Ltd. | System and method for zeroing of smart aiming device |
US20230003485A1 (en) * | 2021-07-01 | 2023-01-05 | Raytheon Canada Limited | Digital booster for sights |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4494198A (en) * | 1981-03-12 | 1985-01-15 | Barr & Stroud Limited | Gun fire control systems |
US5026158A (en) * | 1988-07-15 | 1991-06-25 | Golubic Victor G | Apparatus and method for displaying and storing impact points of firearm projectiles on a sight field of view |
US5954507A (en) * | 1996-09-18 | 1999-09-21 | Bristlecone Corporation | Method and apparatus for training a shooter of a firearm |
US5991043A (en) * | 1996-01-08 | 1999-11-23 | Tommy Anderson | Impact position marker for ordinary or simulated shooting |
US6247259B1 (en) * | 1997-10-09 | 2001-06-19 | The State Of Israel, Atomic Energy Commission, Soreq Nuclear Research Center | Method and apparatus for fire control |
US6252706B1 (en) * | 1997-03-12 | 2001-06-26 | Gabriel Guary | Telescopic sight for individual weapon with automatic aiming and adjustment |
US20050021282A1 (en) * | 1997-12-08 | 2005-01-27 | Sammut Dennis J. | Apparatus and method for calculating aiming point information |
US20050268521A1 (en) * | 2004-06-07 | 2005-12-08 | Raytheon Company | Electronic sight for firearm, and method of operating same |
US20060005447A1 (en) * | 2003-09-12 | 2006-01-12 | Vitronics Inc. | Processor aided firing of small arms |
US7158167B1 (en) * | 1997-08-05 | 2007-01-02 | Mitsubishi Electric Research Laboratories, Inc. | Video recording device for a targetable weapon |
US20070044364A1 (en) * | 1997-12-08 | 2007-03-01 | Horus Vision | Apparatus and method for calculating aiming point information |
US7210262B2 (en) * | 2004-12-23 | 2007-05-01 | Raytheon Company | Method and apparatus for safe operation of an electronic firearm sight depending upon detected ambient illumination |
US20070097351A1 (en) * | 2005-11-01 | 2007-05-03 | Leupold & Stevens, Inc. | Rotary menu display and targeting reticles for laser rangefinders and the like |
US7269920B2 (en) * | 2004-03-10 | 2007-09-18 | Raytheon Company | Weapon sight with ballistics information persistence |
US20080039962A1 (en) * | 2006-05-23 | 2008-02-14 | Mcrae Michael W | Firearm system for data acquisition and control |
US20080163536A1 (en) * | 2005-03-18 | 2008-07-10 | Rudolf Koch | Sighting Mechansim For Fire Arms |
US7404268B1 (en) * | 2004-12-09 | 2008-07-29 | Bae Systems Information And Electronic Systems Integration Inc. | Precision targeting system for firearms |
US20090235570A1 (en) * | 1997-12-08 | 2009-09-24 | Horus Vision | Apparatus and method for calculating aiming point information |
US7603804B2 (en) * | 2003-11-04 | 2009-10-20 | Leupold & Stevens, Inc. | Ballistic reticle for projectile weapon aiming systems and method of aiming |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU1232300A (en) * | 1998-10-23 | 2000-05-15 | Precision Remotes, Inc. | Rapid aiming telepresent system |
-
2007
- 2007-04-03 US US11/696,050 patent/US20080022575A1/en not_active Abandoned
- 2007-05-08 EP EP07107673A patent/EP1860395A1/en not_active Withdrawn
Patent Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4494198A (en) * | 1981-03-12 | 1985-01-15 | Barr & Stroud Limited | Gun fire control systems |
US5026158A (en) * | 1988-07-15 | 1991-06-25 | Golubic Victor G | Apparatus and method for displaying and storing impact points of firearm projectiles on a sight field of view |
US5991043A (en) * | 1996-01-08 | 1999-11-23 | Tommy Anderson | Impact position marker for ordinary or simulated shooting |
US5954507A (en) * | 1996-09-18 | 1999-09-21 | Bristlecone Corporation | Method and apparatus for training a shooter of a firearm |
US6252706B1 (en) * | 1997-03-12 | 2001-06-26 | Gabriel Guary | Telescopic sight for individual weapon with automatic aiming and adjustment |
US7158167B1 (en) * | 1997-08-05 | 2007-01-02 | Mitsubishi Electric Research Laboratories, Inc. | Video recording device for a targetable weapon |
US6247259B1 (en) * | 1997-10-09 | 2001-06-19 | The State Of Israel, Atomic Energy Commission, Soreq Nuclear Research Center | Method and apparatus for fire control |
US20070044364A1 (en) * | 1997-12-08 | 2007-03-01 | Horus Vision | Apparatus and method for calculating aiming point information |
US20050021282A1 (en) * | 1997-12-08 | 2005-01-27 | Sammut Dennis J. | Apparatus and method for calculating aiming point information |
US20090235570A1 (en) * | 1997-12-08 | 2009-09-24 | Horus Vision | Apparatus and method for calculating aiming point information |
US20060005447A1 (en) * | 2003-09-12 | 2006-01-12 | Vitronics Inc. | Processor aided firing of small arms |
US7603804B2 (en) * | 2003-11-04 | 2009-10-20 | Leupold & Stevens, Inc. | Ballistic reticle for projectile weapon aiming systems and method of aiming |
US7269920B2 (en) * | 2004-03-10 | 2007-09-18 | Raytheon Company | Weapon sight with ballistics information persistence |
US20050268521A1 (en) * | 2004-06-07 | 2005-12-08 | Raytheon Company | Electronic sight for firearm, and method of operating same |
US7404268B1 (en) * | 2004-12-09 | 2008-07-29 | Bae Systems Information And Electronic Systems Integration Inc. | Precision targeting system for firearms |
US20080190007A1 (en) * | 2004-12-09 | 2008-08-14 | Page Edward A | Precision targeting system for firearms |
US7210262B2 (en) * | 2004-12-23 | 2007-05-01 | Raytheon Company | Method and apparatus for safe operation of an electronic firearm sight depending upon detected ambient illumination |
US20080163536A1 (en) * | 2005-03-18 | 2008-07-10 | Rudolf Koch | Sighting Mechansim For Fire Arms |
US20070097351A1 (en) * | 2005-11-01 | 2007-05-03 | Leupold & Stevens, Inc. | Rotary menu display and targeting reticles for laser rangefinders and the like |
US20080039962A1 (en) * | 2006-05-23 | 2008-02-14 | Mcrae Michael W | Firearm system for data acquisition and control |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11421961B2 (en) * | 2009-05-15 | 2022-08-23 | Hvrt Corp. | Apparatus and method for calculating aiming point information |
US20110315767A1 (en) * | 2010-06-28 | 2011-12-29 | Lowrance John L | Automatically adjustable gun sight |
US8285305B2 (en) | 2010-09-13 | 2012-10-09 | Honeywell International Inc. | Notifying a user of an event |
US8457179B2 (en) | 2010-09-13 | 2013-06-04 | Honeywell International Inc. | Devices, methods, and systems for building monitoring |
US8588820B2 (en) | 2010-09-13 | 2013-11-19 | Honeywell International Inc. | Notifying a user of an event |
US9008697B2 (en) | 2010-09-13 | 2015-04-14 | Honeywell International Inc. | Notifying a user of an event |
EP2631590A4 (en) * | 2010-10-19 | 2017-02-22 | Danyun Li | Electronic sighting device and method of regulating and determining graduation thereof |
US20120090216A1 (en) * | 2010-10-19 | 2012-04-19 | Danyun Li | Electronic Firearm Sight and method for adjusting the reticle thereof |
US20150287224A1 (en) * | 2013-10-01 | 2015-10-08 | Technology Service Corporation | Virtual tracer methods and systems |
US20160069643A1 (en) * | 2014-09-06 | 2016-03-10 | Philip Lyren | Weapon Targeting System |
US20160084617A1 (en) * | 2014-09-19 | 2016-03-24 | Philip Lyren | Weapon Targeting System |
US10184758B2 (en) * | 2014-09-19 | 2019-01-22 | Philip Lyren | Weapon targeting system |
US10746507B2 (en) * | 2014-09-19 | 2020-08-18 | Philip Lyren | Weapon targeting system |
DE102015120205A1 (en) * | 2015-09-18 | 2017-03-23 | Rheinmetall Defence Electronics Gmbh | Remote weapon station and method of operating a remote weapon station |
DE102015120036A1 (en) * | 2015-11-19 | 2017-05-24 | Rheinmetall Defence Electronics Gmbh | Remote weapon station and method of operating a remote weapon station |
DE102016007624A1 (en) * | 2016-06-23 | 2018-01-11 | Diehl Defence Gmbh & Co. Kg | Procedure for fire correction of a weapon system |
US11287638B2 (en) | 2019-08-20 | 2022-03-29 | Francesco E. DeAngelis | Reflex sight with superluminescent micro-display, dynamic reticle, and metadata overlay |
WO2022259241A1 (en) * | 2021-06-07 | 2022-12-15 | Smart Shooter Ltd. | System and method for zeroing of smart aiming device |
US20240271908A1 (en) * | 2021-06-07 | 2024-08-15 | Smart Shooter Ltd. | System and method for zeroing of smart aiming device |
US20230003485A1 (en) * | 2021-07-01 | 2023-01-05 | Raytheon Canada Limited | Digital booster for sights |
US11644277B2 (en) * | 2021-07-01 | 2023-05-09 | Raytheon Canada Limited | Digital booster for sights |
Also Published As
Publication number | Publication date |
---|---|
EP1860395A1 (en) | 2007-11-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080022575A1 (en) | Spotter scope | |
JP4874248B2 (en) | Electronic sight for small firearms and operation method thereof | |
US8908045B2 (en) | Camera device to capture and generate target lead and shooting technique data and images | |
KR101252929B1 (en) | Method and apparatus for safe operation of an electronic firearm sight depending upon the detection of a selected color | |
CN104567543B (en) | Sighting system and operational approach thereof | |
US8336777B1 (en) | Covert aiming and imaging devices | |
US7926219B2 (en) | Digital scope with horizontally compressed sidefields | |
US5834676A (en) | Weapon-mounted location-monitoring apparatus | |
EP3034987A1 (en) | System for identifying a position of impact of a weapon shot on a target | |
US20110315767A1 (en) | Automatically adjustable gun sight | |
EA030649B1 (en) | Firearm aiming system with range finder, and method of acquiring a target | |
EA031066B1 (en) | Firearm aiming system (embodiments) and method of operating the firearm | |
WO2012068423A2 (en) | Firearm sight having uhd video camera | |
US10480903B2 (en) | Rifle scope and method of providing embedded training | |
EP2111612A1 (en) | Image orientation correction method and system | |
CA2718150A1 (en) | Weapons control systems | |
US20180202775A1 (en) | Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors | |
JP2016166731A (en) | Shooting system, gun, and data processing device | |
CN109425261A (en) | A kind of novel video riflescope dual-purpose round the clock | |
KR101912754B1 (en) | Shooting and display system for shooting target | |
KR101779199B1 (en) | Apparatus for recording security video | |
KR102485302B1 (en) | Portable image display apparatus and image display method | |
EP1350132A1 (en) | A device for viewing objects at a distance from a user of the device | |
KR101402758B1 (en) | System and method for shooting game | |
KR101977234B1 (en) | Assembled shooting simulation system using of fish-eye lens camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HONEYWELL INTERNATIONAL INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DREXLER, JEROME P.;CORNETT, ALAN G.;BECKER, ROBERT C.;REEL/FRAME:019106/0836;SIGNING DATES FROM 20070329 TO 20070402 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |