
WO1998028706A1 - Video security system with low false alarm rate using object classification - Google Patents

Video security system with low false alarm rate using object classification Download PDF

Info

Publication number
WO1998028706A1
WO1998028706A1 PCT/US1997/024163
Authority
WO
WIPO (PCT)
Prior art keywords
scene
security system
intruder
human
image
Prior art date
Application number
PCT/US1997/024163
Other languages
English (en)
Other versions
WO1998028706B1 (fr)
Inventor
John R. Wootton
Gary S. Waldman
Gregory L. Hobson
Original Assignee
Esco Electronics Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/772,595 external-priority patent/US5937092A/en
Priority claimed from US08/772,731 external-priority patent/US5956424A/en
Application filed by Esco Electronics Corporation filed Critical Esco Electronics Corporation
Priority to CA002275893A priority Critical patent/CA2275893C/fr
Priority to EP97954298A priority patent/EP1010130A4/fr
Priority to AU58109/98A priority patent/AU5810998A/en
Publication of WO1998028706A1 publication Critical patent/WO1998028706A1/fr
Publication of WO1998028706B1 publication Critical patent/WO1998028706B1/fr

Links

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19604Image analysis to detect motion of the intruder, e.g. by frame subtraction involving reference image or background adaptation with time to compensate for changing conditions, e.g. reference image update on detection of light level change
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19606Discriminating between target movement or movement in an area of interest and other non-signicative movements, e.g. target movements induced by camera shake or movements of pets, falling leaves, rotating fan
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/1961Movement detection not involving frame subtraction, e.g. motion detection on the basis of luminance changes in the image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This invention relates to video security systems and a method for detecting the presence of an intruder into an area being monitored by the system; and more particularly, to i) the rejection of false alarms which might otherwise occur because of global or local, natural or manmade, lighting changes which occur within a scene observed by the system, and ii) the discernment of whether a detected intrusion is caused by a human or a non-human (animal) life form.
  • a security system of the invention uses a video camera as the principal sensor and processes a resulting image to determine the presence or non-presence of an intruder.
  • the fundamental process is to establish a reference scene known, or assumed, to have no intruder(s) present.
  • An image of the present scene, as provided by the video camera, is compared with an image of the reference scene.
  • the system and method operate to first eliminate possible sources of false alarms, and to then classify any remaining
  • the video security system and image processing methodology as described herein recognizes anomalies resulting from these other causes so these, too, can be accounted for.
  • the method includes comparing, on a pixel by pixel basis, the current image with the reference image to obtain a difference image.
  • any nonzero pixel in the difference image indicates the possible presence of an intrusion, after image artifacts such as noise, aliasing of the video, and movement within the scene not attributable to a life form (animal or human) such as the hands of a clock, screen savers on computers, oscillating fans, etc., have been accounted for.
  • because the system and method use an absolute difference technique with pixel by pixel subtraction, the process is sensitive to surface differences between the reference and current scenes but insensitive to light-on-dark or dark-on-light changes, and thus is very sensitive to any intrusion within the scene.
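The absolute-difference comparison described above can be sketched as follows. This is a minimal illustration using NumPy; the intensity threshold of 25 is an assumed value for the sketch, not one taken from the patent.

```python
import numpy as np

def difference_image(current, reference, threshold=25):
    """Absolute pixel-by-pixel difference between the current and reference
    frames; values below the intensity threshold are zeroed out."""
    diff = np.abs(current.astype(np.int16) - reference.astype(np.int16))
    diff[diff < threshold] = 0
    return diff.astype(np.uint8)

# A bright object on a dark background and a dark object on a bright
# background produce the same magnitude of difference.
ref = np.full((4, 4), 100, dtype=np.uint8)
cur = ref.copy()
cur[1, 1] = 200   # light-on-dark change
cur[2, 2] = 0     # dark-on-light change
d = difference_image(cur, ref)
```

Because the subtraction is taken in absolute value, both kinds of change register with equal weight, which is what makes the process insensitive to the direction of the intensity change.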
  • each pixel represents a gray level measure of the scene intensity that is reflected from that part of the scene. The gray level intensity can alter for a variety of reasons, the most relevant of these being that there is a
  • Two important features of the video security system are to inform an operator/verifier of the presence of a human intruder, and to not generate false alarms.
  • the system must operate to eliminate as many false alarms as possible without impacting the overall probability of detecting an intruder's presence.
  • a fundamental cause of false alarms stems from the sensor and methodology used to ascertain if an intrusion has occurred. By use of the processing methodology described herein, various effects which could otherwise trigger false alarms are accounted for so that only a life form intruding into the scene will produce an alarm.
  • Gray level intensity can change for a variety of reasons, the most important being a new physical presence within a particular part of the scene. Additionally, the intensity will change at that location if the overall lighting of the total scene changes (a global change), or the lighting at this particular part of the scene changes (a local change), or the AGC (automatic gain control) of the camera changes, or the ALC (automatic light level) of the camera changes. With respect to global or local lighting changes, these can result from natural lighting changes or manmade lighting changes. Finally, there will be a difference of gray level intensity at a pixel level if there is noise present in the video. Only the situation of a physical presence in the scene is a true alarm; the remainder all comprise false alarms within the system.
  • For a security system to be economically viable and to avoid an unduly high load on an operator who has to verify each alarm, the system must process images in a manner which eliminates as many false alarms as possible without impacting the overall probability of detecting the presence of an intruder.
  • U.S. patent 5,289,275 to Ishii et al. is directed to a surveillance monitoring system using image processing for monitoring fires and thefts.
  • the patent teaches use of a color camera for monitoring fires and a method of comparing the color ratio at each pixel in an image to estimate the radiant energy represented by each pixel. A resulting ratio is compared to a threshold with the presence of a fire being indicated if the threshold is surpassed.
  • a similar technique for detecting the presence of humans is also described.
  • the patent teaches the use of image processing together with a camera to detect the presence of fires and abnormal objects.
  • U.S. patent 4,697,097 to Yausa et al. also teaches use of a camera to detect the presence of an object.
  • the system automatically dials and sends a difference image, provided the differences are large enough, to a remote site over a telephone line.
  • the image is viewed by a human. While teaching some aspects of detection, Yausa et al. does not go beyond the detection process to attempt to use image processing to recognize that the anomaly is caused by a human presence.
  • U.S. patent 4,257,063 which is directed to a video monitoring system and method, teaches that a video line from a camera can be compared to the same video line viewed at an earlier time to detect the presence of a human.
  • the detection device is not a whole image device, nor does it make any compensation for light changes, nor does it teach attempting to automatically recognize the contents of an image as being derived from a human.
  • U.S. patent 4,161,750 teaches that changes in the average value of a video line can be used to detect the presence of an anomalous object. Whereas the implementation is different from the '063 patent, the teaching is basically the same.
  • Among the objects of the invention are: the provision of a video security system and method for visually monitoring a scene and detecting the presence of an intruder within the scene; the provision of such a system and method whose operation is based upon the premise that only the presence of a human intruder is of consequence to the security system, with everything else constituting a false alarm; the provision of such a system and method to readily distinguish between changes within the scene caused by the presence of a person entering the scene as opposed to changes within the scene resulting from lighting changes (whether global or local, natural or manmade) and other anomalies which occur within the scene; the provision of such a system and method to employ a recognition process, rather than an abnormality process such as used in other systems, to differentiate between human and non-human objects, so as to reduce or substantially eliminate false alarms; and the provision of such a system and method to provide a high probability of detection of the presence of a human, while having a low probability of false alarms.
  • a video detection system detects the presence of an intruder in a scene from video provided by a camera observing the scene.
  • a recognition process differentiates between human and non-human (animal) life forms. The presence of a human is determined with a high degree of confidence so there is a very low probability of false alarms. Possible false alarms resulting from the effects of noise, aliasing, non-intruder motion occurring within the scene, and global or local lighting changes are first identified, and only then is object recognition performed.
  • Performing object recognition includes determining which regions within the image may be an intruder, outlining and growing those regions so the result encompasses all of what may be the intruder, determining a set of shape features from the region and eliminating possible shadow effects, normalizing the set of features, and comparing the resulting set with sets of features for humans and non-human (animal) life forms.
  • the result of the comparison produces a confidence level as to whether or not the intruder is a human. If the confidence level is sufficiently high, an alarm is given.
  • Fig. 1 is a simplified block diagram of a video security system of the present invention for viewing a scene and determining the presence of an intruder in the scene;
  • Fig. 2 is a representation of an actual scene viewed by a camera of the system
  • Fig. 3 is the same scene as Fig. 2 but with the presence of an intruder
  • Fig. 4 is a representation of another actual scene under one lighting condition
  • Fig. 5 is a representation of the same scene under different lighting conditions and with no intruder in the scene;
  • Fig. 6A is a representation of the object in Fig. 3 including its shadow
  • Fig. 6B illustrates outlining and segmentation of the object
  • Fig. 6C illustrates the object with its shadow removed and as resampled for determining a set of features for the object
  • Figs. 7A-7C represent non-human (animal) life forms with which features of the object are compared to determine if the object represents a human or non-human life form and wherein Fig. 7A represents a cat, Fig. 7B a dog, and Fig. 7C a bird;
  • Fig. 8 is a simplified time line indicating intervals at which images of the scene are viewed by the camera system;
  • Fig. 9 represents a pixel array such as forms a portion of an image; and
  • Fig. 10 illustrates masking of an image for those areas within a scene where fixed objects having an associated movement or lighting change are located.
  • a video security system of the invention is indicated generally at 10 in Fig. 1.
  • the system employs one or more cameras C1-Cn, each of which continually views a respective scene and produces a signal representative of the scene.
  • the cameras may operate in the visual or infrared portions of the light spectrum and a video output signal of each camera is supplied to a processor means 12.
  • Means 12 processes each received signal from a camera to produce an image represented by the signal and compares the image representing the scene at one point in time with a similar image of the scene at a previous point in time.
  • the signal from the imaging means represented by the cameras may be either an analog or digital signal, and processing means 12 may be an analog, digital, or hybrid processor.
  • In Fig. 2, an image of a scene is shown, the representation being the actual image produced by a camera C.
  • Fig. 2 represents, for example, a reference image of the scene.
  • Fig. 3 is an image exactly the same as that in Fig. 2 except that now a person (human intruder) has been introduced into the scene.
  • Fig. 3 is again an actual image produced by a camera C.
  • Fig. 4 represents a reference image of a scene
  • Fig. 5 a later image in which there is a lighting change but not an intrusion.
  • the system and method of the invention operate to identify the presence of such a human intruder and provide an appropriate alarm. However, it is also a principal feature of the invention to not produce false alarms.
  • a single processor can handle several cameras positioned at different locations within a protected site. In use, the processor cycles through the different cameras, visiting each at a predetermined interval. At system power-up, the processor cycles through all of the cameras doing a self-test on each. One important test at this time is to record a reference frame against which later frames will be compared. A histogram of pixel values is formed from this reference frame.
  • a reference frame fl is created. Throughout the monitoring operation, this reference frame is continuously updated if there is no perceived motion within the latest image against which a reference image is compared. At each subsequent visit to the camera a new frame f2 is produced and subtracted from the reference. If the difference is not significant, the system goes on to the next camera. However, if there is a difference, frame f2 is stored and a third frame f3 is created on the next visit and compared to both frames fl and f2. Only if there is a significant difference between frames f3 and f2 and also frames f3 and fl, is further processing done.
  • This three frame procedure eliminates false alarms resulting from sudden, global light changes such as caused by lightning flashes or interior lights going on or off.
  • a lightning flash occurring during frame f2 will be gone by frame f3, so there will be no significant difference between frames f3 and f1.
  • if the interior lights have simply gone on or off between frames f1 and f2, there will be no significant changes between frames f2 and f3. In either instance, the system proceeds on to the next camera with no more processing.
  • Significant differences between frames fl and f2, frames f3 and f2, and frames f3 and fl indicate a possible intrusion requiring more processing.
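The three-frame procedure above can be sketched as follows. The threshold values here are illustrative assumptions, and "significant difference" is reduced to a simple hit-pixel count for the sketch.

```python
import numpy as np

def significant(a, b, intensity_thresh=25, count_thresh=50):
    """True when enough pixels differ significantly between frames a and b."""
    diff = np.abs(a.astype(np.int16) - b.astype(np.int16))
    return int(np.count_nonzero(diff > intensity_thresh)) > count_thresh

def three_frame_check(f1, f2, f3):
    """Further processing is warranted only when f2 differs from the
    reference f1, and f3 differs from both f2 and f1."""
    return significant(f2, f1) and significant(f3, f2) and significant(f3, f1)

# Lightning flash: bright only during f2, gone by f3 -> no further processing.
f1 = np.zeros((32, 32), dtype=np.uint8)
f2 = np.full((32, 32), 200, dtype=np.uint8)   # global flash
f3 = f1.copy()
flash_alert = three_frame_check(f1, f2, f3)

# Moving intruder: a bright region present in f2 and shifted in f3.
f2b = f1.copy(); f2b[8:16, 8:16] = 200
f3b = f1.copy(); f3b[12:20, 12:20] = 200
intruder_alert = three_frame_check(f1, f2b, f3b)
```

The flash case fails the f3-versus-f1 test, while the moving intruder passes all three, matching the behavior described above.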
  • non- intruder motion occurring within the scene is also identified so as not to trigger processing or cause false alarms.
  • movement of the fan blades would also appear as a change from one image to another.
  • the fan is an oscillating fan, its sweeping movement would also be detected as a difference from one image to another.
  • because an object having an associated movement (such as a fan) is generally fixed in place within the scene and its movement is spatially constrained, the area where this movement occurs can be identified and masked so that, in most instances, motion effects resulting from operation of the object are disregarded.
  • Any video alert system which uses frame-to-frame changes in the video to detect intrusions into a secured area is also vulnerable to false alarms from the inadvertent (passing automobile lights, etc.) or deliberate (police or security guard flashlights) introduction of light into the area, even though no one has physically entered the area.
  • the system and method of the invention differentiate between a change in a video frame due to a change in the irradiation of the surfaces in the FOV (field of view) as in Fig. 5, and a change due to the introduction of a new reflecting surface in the FOV as in Fig. 3.
  • the former is then rejected as a light "intrusion" requiring no alarm, whereas the latter is identified as a human intruder for which an alarm is given.
  • aliasing is caused by sampling at or near the intrinsic resolution of the system. Because the scene is sampled at or near the Nyquist frequency, the video, on a frame by frame basis, appears to scintillate, and certain areas will produce Moire-like effects. Subtraction on a frame by frame basis would cause multiple detections on scenes that are unchanging. In many applications where this occurs it is not economically possible to oversample. Elimination of aliasing effects is accomplished by convolving the image with an equivalent two-dimensional (2D) smoothing filter. Whether this is a 3 x 3 or 5 x 5 filter, or a higher-order filter, is a matter of preference, as are the weights of the filter.

DETECTION PROCESS
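The anti-aliasing convolution just described can be sketched as a uniform 3 x 3 smoothing filter. Uniform weights and reflective border handling are choices made for this sketch; the patent leaves the weights to preference.

```python
import numpy as np

def smooth(image, ksize=3):
    """Convolve with a uniform ksize x ksize averaging kernel to suppress
    aliasing scintillation; image borders are handled by reflection."""
    pad = ksize // 2
    padded = np.pad(image.astype(np.float64), pad, mode="reflect")
    out = np.zeros(image.shape, dtype=np.float64)
    # Sum the ksize*ksize shifted views, then divide: a box filter.
    for dy in range(ksize):
        for dx in range(ksize):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (ksize * ksize)

img = np.zeros((5, 5))
img[2, 2] = 9.0            # a single scintillating pixel
sm = smooth(img)
```

A one-pixel scintillation is spread over its neighborhood and attenuated, so it no longer produces a strong spurious difference from frame to frame.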
  • the detection process consists of comparing the current image to a reference image. To initialize the system it is assumed that the operator has control over the scene and, therefore, will select a single frame for the reference when there is nothing present. (If necessary, up to 60 successive frames can be selected and integrated together to obtain an averaged reference image.) As shown in Fig. 1, apparatus 10 employs multiple cameras C1-Cn, but the methodology with respect to one camera is applicable for all cameras. For each camera, an image is periodically selected and the absolute difference between the current image (suitably convolved with the antialiasing filter) and the reference is determined. The difference image is then thresholded (an intensity threshold) and all of the pixels exceeding the threshold are accumulated.
  • This step eliminates a significant number of pixels that otherwise would result in a non-zero result simply by differencing the two images.
  • Making this threshold value adaptive within a given range of threshold values ensures consistent performance. If the count of the pixels exceeding the intensity threshold exceeds a pixel count threshold, then a potential detection has occurred. At this time, all connected hit pixels (pixels that exceed the intensity threshold) are segmented, and a pixel count of each segmented object is taken. If the pixel count of any object exceeds another pixel count threshold, then a detection is declared. Accordingly, a detection is declared when the total number of hit pixels in the absolute difference image is large and there is a large connected object in that image.
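The two-stage test above (many hit pixels overall, plus one large connected object) can be sketched as follows. The threshold values are illustrative assumptions, and connected components are found with a simple 8-neighbor flood fill.

```python
import numpy as np

def detect(diff, intensity_t=25, total_count_t=30, object_count_t=20):
    """Detection = many hit pixels overall AND a large connected object."""
    hits = diff > intensity_t
    if np.count_nonzero(hits) <= total_count_t:
        return False
    visited = np.zeros(hits.shape, dtype=bool)
    h, w = hits.shape
    for y in range(h):
        for x in range(w):
            if hits[y, x] and not visited[y, x]:
                # Flood-fill one 8-connected component and measure its size.
                stack, size = [(y, x)], 0
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny in range(cy - 1, cy + 2):
                        for nx in range(cx - 1, cx + 2):
                            if (0 <= ny < h and 0 <= nx < w
                                    and hits[ny, nx] and not visited[ny, nx]):
                                visited[ny, nx] = True
                                stack.append((ny, nx))
                if size > object_count_t:
                    return True
    return False

diff_obj = np.zeros((16, 16), dtype=np.uint8)
diff_obj[4:12, 4:12] = 100       # one large connected change
diff_noise = np.zeros((16, 16), dtype=np.uint8)
diff_noise[::3, ::3] = 100       # many isolated noise hits
```

The large block passes both tests; the scattered hits exceed the overall count but never form a large connected object, so no detection is declared.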
  • Noise induced detections are generally spatially small and distributed randomly throughout the image.
  • the basis for removing these events is to ascertain the size (area) of connected pixels that exceed the threshold set for detection. To achieve this, the regions where the detected pixels occur are grown into connected "blobs". After region growing, those blobs that are smaller than a given size threshold are removed as false alarms.
  • a region growing algorithm starts with a search for the first object pixel as the outlining algorithm does. Since searching and outlining has already been performed, and since the outline pixels are part of the segmented object, these do not need to be region grown again.
  • Outline pixel arrays are now placed on a stack, and the outline pixels are zeroed out in the absolute difference image. A pixel is then selected (removed from the stack).
  • the selected pixel P and all of its eight neighbors P1-P8 are examined to see if hit points occur (i.e. they are non- zero). If a neighbor pixel is non-zero, then it is added to the stack and zeroed out in the absolute difference image.
  • In region growing, all eight neighboring pixels are examined, whereas in outlining, the examination of neighboring pixels stops as soon as an edge pixel is found. Thus, in outlining, as few as one neighbor may be investigated. The region growing segmentation process stops once the stack is empty.
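The stack-based region growing described above can be sketched as follows. For simplicity this sketch seeds from a single hit pixel rather than a precomputed outline array.

```python
import numpy as np

def grow_region(diff, seed):
    """Grow one blob from a seed hit pixel: pop a pixel, examine all eight
    neighbors, push the non-zero ones and zero them out in the difference
    image, until the stack is empty. `diff` is modified in place; the
    blob's pixel coordinates are returned."""
    h, w = diff.shape
    stack = [seed]
    diff[seed] = 0
    blob = []
    while stack:
        y, x = stack.pop()
        blob.append((y, x))
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy or dx) and 0 <= ny < h and 0 <= nx < w and diff[ny, nx]:
                    diff[ny, nx] = 0   # claim the pixel so it is visited once
                    stack.append((ny, nx))
    return blob

d = np.zeros((8, 8), dtype=np.uint8)
d[2:5, 2:5] = 1           # a 3 x 3 blob
d[7, 7] = 1               # an isolated noise pixel
blob = grow_region(d, (3, 3))
```

The grown blob contains all nine connected pixels, while the isolated noise pixel is untouched; a size threshold applied afterwards would discard it as a false alarm.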
  • Land's theory was introduced to explain why human observers are readily able to identify differences in surface lightness despite greatly varying illumination across a scene.
  • Land's theory is also applicable to viewing systems which function in place of a human viewer. According to the theory, even if the amount of energy reflected (incident energy times surface reflectance) from two different surfaces is the same, an observer can detect a difference in the two surfaces' lightnesses if such a difference exists. In other words, the human visual system has a remarkable ability to see surface differences and ignore lighting differences.
  • a video signal (gray level) for any pixel is given by

    g ∝ ∫ E(λ) r(λ) S(λ) dλ     (1)

    where E(λ) ≡ scene spectral irradiance at the pixel in question, r(λ) ≡ scene spectral reflectance at the pixel in question, and S(λ) ≡ sensor spectral response. The constant of proportionality in (1) depends on geometry and camera characteristics, but is basically the same for all pixels in the frame.
  • ratios of adjacent pixel values satisfy the requirement of being determined by scene reflectances only and are independent of scene illumination. It remains to consider the practicality of the approximations used to arrive at (3).
  • a basic assumption in the retinex process is that of only gradual spatial variations in the scene irradiance; that is, we must have nearly the same irradiance of adjacent pixel areas in the scene. This assumption is generally true for diffuse lighting, but for directional sources it may not be. For example, the intrusion of a light beam into the area being viewed can introduce rather sharp shadows, or change the amount of light striking a vertical surface without similarly changing the amount of light striking an adjacent tilted surface.
  • ratios between pixels straddling the shadow line in the first instance, or the surfaces in the second instance will change even though no object has been introduced into the scene.
  • the pixel- to-pixel change is often less than it appears to the eye, and the changes only appear at the boundaries, not within the interiors of the shadows or surfaces.
  • Another method, based on edge mapping, is also possible. As in the previous situation, the edge mapping process would be employed after an initial detection stage is triggered by pixel value changes from one frame to the next. Within each detected "blob" area, an edge map is made for both the initial (unchanged) frame and the changed frame that triggered the alert. Such an edge map can be constructed by running an edge enhancement filter (such as a Sobel filter) and then thresholding. If the intrusion is just a light change, then the edges within the blob should be in basically the same places in both frames. However, if the intrusion is an object, then some edges from the initial frame will be obscured in the changed frame and some new edges, internal to the intruding object, will be introduced.
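The edge-mapping comparison can be sketched as follows. The Sobel kernels are standard; the gradient threshold of 100 and the test scenes are assumptions made for this illustration.

```python
import numpy as np

def edge_map(img, thresh=100.0):
    """Binary edge map: Sobel gradient magnitude followed by a threshold."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    f = img.astype(np.float64)
    mag = np.zeros_like(f)
    for y in range(1, f.shape[0] - 1):
        for x in range(1, f.shape[1] - 1):
            win = f[y - 1:y + 2, x - 1:x + 2]
            mag[y, x] = np.hypot((win * kx).sum(), (win * ky).sum())
    return mag > thresh

base = np.full((10, 10), 50.0)
base[:, 5:] = 200.0                      # scene with one vertical edge
lit = np.clip(base * 1.5, 0.0, 255.0)    # pure lighting change
obj = base.copy()
obj[2:5, 0:3] = 200.0                    # new surface introduces new edges

same_edges = np.array_equal(edge_map(base), edge_map(lit))
new_edges = not np.array_equal(edge_map(base), edge_map(obj))
```

A brightness change moves gradient magnitudes but leaves the edge positions intact, whereas a new surface adds edge pixels that were absent from the initial frame.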
  • the basic premise of the variable light rejection algorithm used in the method of the invention is to compare ratios of adjacent pixels from a segmented area in frame f1 with ratios from corresponding pixels in frame f2, but to restrict the ratios to those across significant edges. Restricting the processing to ratios of pixels tends to reject illumination changes, and using only edge pixels eliminates the dilution of information caused by large uniform areas.
  • a) Ratios R of adjacent pixels (both horizontally and vertically) in frame f1 are tested to determine if they differ significantly from unity: R − 1 > T1? or (1/R) − 1 > T1?, where T1 is a predetermined threshold value. Every time such a significant edge pair is found, an edge count value is incremented.
  • b) Those pixel pairs that pass either of the tests in a) have their corresponding ratios R' for frame f2 calculated.
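Steps a) and b) above can be sketched as follows. The final scoring step (how the R and R' comparisons are combined into a decision) is not fully specified in this excerpt, so the fraction-of-changed-edges score and both thresholds here are assumptions made for the sketch.

```python
import numpy as np

def edge_ratio_score(f1, f2, t1=0.2, t2=0.2):
    """For each adjacent pixel pair in f1 whose ratio differs significantly
    from unity (a 'significant edge'), compare that ratio with the
    corresponding ratio in f2. Returns the fraction of edge pairs whose
    ratio changed; a value near 0 suggests a pure lighting change."""
    f1 = f1.astype(np.float64) + 1e-6   # guard against division by zero
    f2 = f2.astype(np.float64) + 1e-6
    edges = changed = 0
    h, w = f1.shape
    for y in range(h):
        for x in range(w):
            for ny, nx in ((y, x + 1), (y + 1, x)):   # right and down pairs
                if ny >= h or nx >= w:
                    continue
                r1 = f1[y, x] / f1[ny, nx]
                if abs(r1 - 1) > t1 or abs(1 / r1 - 1) > t1:
                    edges += 1
                    r2 = f2[y, x] / f2[ny, nx]
                    if abs(r2 / r1 - 1) > t2:
                        changed += 1
    return changed / max(edges, 1)

base = np.full((8, 8), 50.0)
base[:, 4:] = 200.0              # one strong vertical edge
lit = base * 2.0                 # global lighting change: ratios preserved
obj = base.copy()
obj[2:6, 3:5] = 150.0            # new reflecting surface across the edge

score_lit = edge_ratio_score(base, lit)
score_obj = edge_ratio_score(base, obj)
```

A uniform illumination change scales both members of every pair equally, so the ratios across edges are preserved; a new surface straddling an edge alters them.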
SHAPE FEATURES

  • Having outlined and region grown an object to be recognized, a series of linear shape features and Fourier descriptors are extracted for each segmented region. Values for shape features are numerically derived from the image of the object based upon the x, y pixel coordinates obtained during outlining and segmentation of the object. These features include, for example, values representing the height of the object (ymax − ymin), its width (xmax − xmin), horizontal and vertical edge counts, and degree of circularity.
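A few of the linear shape features named above can be computed directly from the segmented pixel coordinates. The aspect-ratio and fill features in this sketch are illustrative stand-ins for the patent's full feature list.

```python
def shape_features(coords):
    """Simple linear shape features from the (y, x) pixel coordinates of a
    segmented object: height, width, aspect ratio, and fill ratio (area
    over bounding-box area, a crude compactness cue)."""
    ys = [c[0] for c in coords]
    xs = [c[1] for c in coords]
    height = max(ys) - min(ys) + 1
    width = max(xs) - min(xs) + 1
    area = float(len(coords))
    return {
        "height": height,
        "width": width,
        "aspect": height / width,
        "fill": area / (height * width),
    }

# A standing human silhouette is tall and narrow; a dog is wide and low.
human = [(y, x) for y in range(10) for x in range(3)]
dog = [(y, x) for y in range(3) for x in range(8)]
fh = shape_features(human)
fd = shape_features(dog)
```

Even these crude features already separate an upright human silhouette from a quadruped by aspect ratio alone, which is the intuition behind the feature set.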
  • Fourier descriptors represent a set of features used to recognize a silhouette or contour of an object. As shown in Fig. 6C, the outline of an object is resampled into equally spaced points located about the edge of the object. The Fourier descriptors are computed by treating these points as complex points and computing a complex FFT (Fast Fourier Transform) for the sequence. The resulting coefficients are a function of the position, size, orientation, and starting point P of the outline. Using these coefficients, Fourier descriptors are extracted which are invariant to these variables. As a result of performing the feature extractions, what remains is a set of features which describes the segmented object.

FEATURE SET NORMALIZATION
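The Fourier descriptor computation just described can be sketched as follows. The particular normalization (drop the DC term for position invariance, divide by the first harmonic's magnitude for scale invariance, keep magnitudes only for orientation and starting-point invariance) is one standard choice, assumed here rather than taken from the patent.

```python
import numpy as np

def fourier_descriptors(points, n_keep=8):
    """Invariant Fourier descriptors of a closed, resampled contour.
    The outline points are treated as complex numbers z = x + iy and
    transformed with an FFT; coefficients 1..n_keep are kept, as
    magnitudes, normalized by the first harmonic."""
    z = np.array([complex(x, y) for x, y in points])
    coeffs = np.fft.fft(z)
    mags = np.abs(coeffs[1:n_keep + 1])   # skip DC -> position invariant
    return mags / mags[0]                 # scale invariant

# Descriptors are unchanged when the contour is translated and scaled.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = list(zip(np.cos(t), np.sin(t)))
moved = [(3 * x + 10, 3 * y - 4) for x, y in circle]
d1 = fourier_descriptors(circle)
d2 = fourier_descriptors(moved)
```

Keeping only coefficient magnitudes also removes the dependence on orientation and on which contour point the resampling starts from, since both only rotate the coefficients' phases.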
  • the set of features may be rescaled if the range of values for one feature of the object is larger or smaller than the ranges of the object's other features.
  • a test data base is established and when the feature data is tested on this data base, a feature may be found to be skewed.
  • a mathematical function such as a logarithmic function is applied to the feature value.
  • alternatively, each feature value may be passed through a linear function; that is, for example, a constant value is added to the feature value, and the result is then multiplied by another constant value. It will be understood that other consistent descriptors such as wavelet coefficients and fractal dimensions can be used instead of Fourier descriptors.
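The normalization steps above (a log for skewed features, then a linear rescaling) can be sketched as follows. The offsets and scales are illustrative, not values from the patent.

```python
import math

def normalize_features(raw, offsets, scales, log_features=()):
    """Pass each raw feature through a linear rescaling
    (value + offset) * scale; features found to be skewed on the test
    database are first passed through a logarithm."""
    out = {}
    for name, value in raw.items():
        if name in log_features:
            value = math.log(value)
        out[name] = (value + offsets[name]) * scales[name]
    return out

raw = {"height": 120.0, "area": 1000.0}
norm = normalize_features(
    raw,
    offsets={"height": -100.0, "area": 0.0},
    scales={"height": 0.05, "area": 0.1},
    log_features=("area",),   # area is heavily skewed, so compress it
)
```

After normalization, all features occupy comparable numeric ranges, so no single feature dominates the classifier's distance or weight computations.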
  • An object classifier portion of the processor means is provided, as an input, the normalized feature set for the object to be classified.
  • the object classifier has already been provided feature set information for humans as well as for a variety of animals (cat, dog, bird) such as shown in Figs. 7A-7C. These figures show the presence of each animal in an actual scene as viewed by the camera of the system.
  • the classifier can determine a confidence value for each of three classes: human, animal and unknown. Operation of the classifier includes implementation of a linear or non-linear classifier.
  • a linear classifier may, for example, implement a Bayes technique, as is well known in the art.
  • a non-linear classifier may employ, for example, a neural net, which is also well known in the art, or its equivalent. Regardless of the object classifier used, its operation produces a "hard" decision as to whether the object is human, non-human, or unknown. Further, the method examines a series of consecutive frames in which the object appears, performs the above-described sequence of steps for each individual frame, and integrates the results of the separate classifications to further verify the result.
  • the processing means, in response to the results of the object classification, provides an indication of an intrusion if the object is classified as human; it provides no indication if the object is classified as an animal. This prevents false alarms. It will be understood that, because the image of a scene provided by a camera C is evaluated on a continual basis (every one-half second, for example), a failure of the classification process to identify a human at one instant does not mean the intrusion will be missed; it means only that the human was not recognized as such at that instant.
  • An alarm, when given, is transmitted to a remote site such as a central monitoring location which is staffed by security personnel and from which a number of locations can be monitored simultaneously.
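The Fourier descriptor computation described above (resample the outline into equally spaced points, treat them as complex numbers, take an FFT, then remove the dependence on position, size, orientation, and starting point) can be sketched as follows. This is a minimal illustration, not the patented implementation; the resampling density and number of retained coefficients are arbitrary choices:

```python
import numpy as np

def fourier_descriptors(contour_xy, n_points=64, n_coeffs=10):
    """Invariant Fourier descriptors of a closed contour.

    contour_xy: (N, 2) array of (x, y) boundary points in traversal order.
    Returns n_coeffs descriptors invariant to position, size,
    orientation, and starting point.
    """
    pts = np.asarray(contour_xy, dtype=float)
    # Close the contour and resample it into equally spaced points
    # along its arc length.
    closed = np.vstack([pts, pts[:1]])
    seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
    arclen = np.concatenate([[0.0], np.cumsum(seg)])
    target = np.linspace(0.0, arclen[-1], n_points, endpoint=False)
    x = np.interp(target, arclen, closed[:, 0])
    y = np.interp(target, arclen, closed[:, 1])

    # Treat each resampled point as a complex number and take the FFT.
    z = x + 1j * y
    F = np.fft.fft(z)

    # Remove the variable dependence: F[0] carries position (dropped),
    # |F[1]| carries size (divided out), and the coefficient phases carry
    # orientation and starting point (discarded by taking magnitudes).
    mags = np.abs(F)
    return mags[2:2 + n_coeffs] / mags[1]
```

A quick check of the invariance: a circle and a translated, scaled copy of it yield the same descriptor vector.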
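The feature set normalization steps (logarithmic compression of skewed features, followed by a linear function of each feature value) might look like the sketch below. The per-feature offsets, scales, and skew flags are hypothetical; in practice they would come from the test database mentioned above:

```python
import numpy as np

def normalize_features(features, offsets, scales, log_mask):
    """Normalize a raw feature vector so all features span comparable ranges.

    offsets, scales: per-feature constants of the linear function
        (hypothetical values, derived from a test database in practice).
    log_mask: flags marking features found to be skewed; these are first
        compressed with a logarithmic function.
    """
    f = np.asarray(features, dtype=float)
    # Apply a logarithmic function to the skewed features.
    f = np.where(log_mask, np.log1p(np.abs(f)), f)
    # Pass every feature through a linear function: add a constant,
    # then multiply the result by another constant.
    return (f + offsets) * scales
```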
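The classification and frame-integration logic can be sketched as below. A simple distance-to-prototype rule stands in for the Bayes or neural-net classifier named in the text, and the prototype vectors are invented for illustration; a deployed system would learn them from the human and animal feature sets the classifier has already been provided:

```python
import numpy as np

# Hypothetical class prototypes (mean normalized feature vectors);
# real values would be derived from human and animal training imagery.
PROTOTYPES = {
    "human": np.array([0.9, 0.2, 0.7]),
    "animal": np.array([0.3, 0.8, 0.4]),
}

def classify(features, unknown_threshold=0.5):
    """Return per-class confidences and a hard decision
    (human / animal / unknown) for a single frame."""
    f = np.asarray(features, dtype=float)
    conf = {c: 1.0 / (1.0 + np.linalg.norm(f - p))
            for c, p in PROTOTYPES.items()}
    best = max(conf, key=conf.get)
    label = best if conf[best] >= unknown_threshold else "unknown"
    return conf, label

def classify_sequence(frame_features):
    """Classify each of a series of consecutive frames independently,
    then integrate the separate decisions by majority vote."""
    labels = [classify(f)[1] for f in frame_features]
    return max(set(labels), key=labels.count)

def intrusion_indicated(frame_features):
    """Indicate an intrusion only if the integrated decision is human;
    an animal produces no indication, preventing a false alarm."""
    return classify_sequence(frame_features) == "human"
```

Because frames are integrated, a single frame in which the human is not recognized does not suppress the intrusion indication, matching the behavior described above.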

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Burglar Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention relates to a video detection system (10) and a method for detecting the presence of an intruder from video images of a scene. The method uses a recognition process that distinguishes between humans and animals. Recognition is performed only after potential false alarms caused by noise, distortion, motion by non-intruders, and the effects of global or local lighting changes have been identified and rejected. Object recognition includes determining the locations where a potential intruder may be present, windowing and enlarging those locations to encompass the entire potential intruder, deriving a feature set from each location while eliminating any shadow effects, normalizing the set, and comparing the normalized features against human and animal characteristics. The comparison yields a confidence level for recognition of a human intruder. If the confidence level is sufficient, an alarm is given. In this way, false alarms caused by the presence of an animal or an unidentifiable object are virtually eliminated.
PCT/US1997/024163 1996-12-23 1997-12-23 Systeme de securite video a faible taux de fausses alertes utilisant la classification d'objets WO1998028706A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CA002275893A CA2275893C (fr) 1996-12-23 1997-12-23 Systeme de securite video a faible taux de fausses alertes utilisant la classification d'objets
EP97954298A EP1010130A4 (fr) 1996-12-23 1997-12-23 Systeme de securite video a faible taux de fausses alertes utilisant la classification d'objets
AU58109/98A AU5810998A (en) 1996-12-23 1997-12-23 Low false alarm rate video security system using object classification

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US77199196A 1996-12-23 1996-12-23
US08/772,595 1996-12-23
US08/772,731 1996-12-23
US08/772,595 US5937092A (en) 1996-12-23 1996-12-23 Rejection of light intrusion false alarms in a video security system
US08/772,731 US5956424A (en) 1996-12-23 1996-12-23 Low false alarm rate detection for a video image processing based security alarm system
US08/771,991 1996-12-23

Publications (2)

Publication Number Publication Date
WO1998028706A1 true WO1998028706A1 (fr) 1998-07-02
WO1998028706B1 WO1998028706B1 (fr) 1998-09-11

Family

ID=27419676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/024163 WO1998028706A1 (fr) 1996-12-23 1997-12-23 Systeme de securite video a faible taux de fausses alertes utilisant la classification d'objets

Country Status (4)

Country Link
EP (1) EP1010130A4 (fr)
AU (1) AU5810998A (fr)
CA (1) CA2275893C (fr)
WO (1) WO1998028706A1 (fr)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999065005A3 (fr) * 1998-06-08 2000-03-30 Leiv Eiriksson Nyfotek As Procede et systeme de surveillance d'une zone
EP1079350A1 (fr) * 1999-07-17 2001-02-28 Siemens Building Technologies AG Dispositif de surveillance d' un espace
WO2001048719A1 (fr) * 1999-12-23 2001-07-05 Wespot Ab Technique, systeme et module de surveillance
WO2001049033A1 (fr) * 1999-12-23 2001-07-05 Wespot Ab Traitement de donnees images
WO2002041273A1 (fr) * 2000-11-20 2002-05-23 Visual Protection Limited Système de caméra intelligente
US6774905B2 (en) 1999-12-23 2004-08-10 Wespot Ab Image data processing
US6819353B2 (en) 1999-12-23 2004-11-16 Wespot Ab Multiple backgrounds
WO2004114219A1 (fr) * 2003-06-17 2004-12-29 Mitsubishi Denki Kabushiki Kaisha Procede de detection d'un objet mobile dans une sequence temporelle d'images d'une video
US6844818B2 (en) 1998-10-20 2005-01-18 Vsd Limited Smoke detection
EP1672604A1 (fr) * 2004-12-16 2006-06-21 Siemens Schweiz AG Méthode et dispositif de détection de sabotage d'une caméra de surveillance.
WO2007126839A3 (fr) * 2006-03-29 2008-12-04 Mark Dronge système d'alerte de sécurité
US7479980B2 (en) 1999-12-23 2009-01-20 Wespot Technologies Ab Monitoring system
US7643653B2 (en) 2000-02-04 2010-01-05 Cernium Corporation System for automated screening of security cameras
US7822224B2 (en) 2005-06-22 2010-10-26 Cernium Corporation Terrain map summary elements
CN102169614A (zh) * 2011-01-14 2011-08-31 云南电力试验研究院(集团)有限公司 一种基于图像识别的电力作业安全监护方法
US8073261B2 (en) 2006-12-20 2011-12-06 Axis Ab Camera tampering detection
US9230175B2 (en) 2009-04-22 2016-01-05 Checkvideo Llc System and method for motion detection in a surveillance video
US20180167591A1 (en) * 2016-12-09 2018-06-14 Canon Europa N.V. Surveillance apparatus and a surveillance method for indicating the detection of motion
US10460456B2 (en) 2015-12-10 2019-10-29 Microsoft Technology Licensing, Llc Motion detection of object
US10535252B2 (en) 2016-08-10 2020-01-14 Comcast Cable Communications, Llc Monitoring security
EP3989196A1 (fr) * 2020-10-23 2022-04-27 Yokogawa Electric Corporation Appareil, système, procédé et programme

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110113561A (zh) * 2018-02-01 2019-08-09 广州弘度信息科技有限公司 一种人员滞留检测方法、装置、服务器及系统

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5144685A (en) * 1989-03-31 1992-09-01 Honeywell Inc. Landmark recognition for autonomous mobile robots
US5465308A (en) * 1990-06-04 1995-11-07 Datron/Transoc, Inc. Pattern recognition system
US5493273A (en) * 1993-09-28 1996-02-20 The United States Of America As Represented By The Secretary Of The Navy System for detecting perturbations in an environment using temporal sensor data

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2503613B2 (ja) * 1988-12-23 1996-06-05 松下電工株式会社 異常監視装置
JPH07192112A (ja) * 1993-12-27 1995-07-28 Oki Electric Ind Co Ltd 侵入物体認識方法

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
GONZALEZ R C, ET AL.: "IMAGE SEGMENTATION AND DESCRIPTION", DIGITAL IMAGE PROCESSING, XX, XX, 1 January 1977 (1977-01-01), XX, pages 320 - 322,345,348, XP002981665 *
KANETA M, ET AL.: "IMAGE PROCESSING METHOD FOR INTRUDER DETECTION AROUND POWER LINE TOWERS", IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS., INFORMATION & SYSTEMS SOCIETY, TOKYO., JP, vol. E76-D, no. 10, 1 October 1993 (1993-10-01), JP, pages 1153 - 1161, XP002925939, ISSN: 0916-8532 *
See also references of EP1010130A4 *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999065005A3 (fr) * 1998-06-08 2000-03-30 Leiv Eiriksson Nyfotek As Procede et systeme de surveillance d'une zone
US6844818B2 (en) 1998-10-20 2005-01-18 Vsd Limited Smoke detection
EP1079350A1 (fr) * 1999-07-17 2001-02-28 Siemens Building Technologies AG Dispositif de surveillance d' un espace
WO2001048719A1 (fr) * 1999-12-23 2001-07-05 Wespot Ab Technique, systeme et module de surveillance
WO2001049033A1 (fr) * 1999-12-23 2001-07-05 Wespot Ab Traitement de donnees images
US6774905B2 (en) 1999-12-23 2004-08-10 Wespot Ab Image data processing
US6819353B2 (en) 1999-12-23 2004-11-16 Wespot Ab Multiple backgrounds
US7479980B2 (en) 1999-12-23 2009-01-20 Wespot Technologies Ab Monitoring system
US7643653B2 (en) 2000-02-04 2010-01-05 Cernium Corporation System for automated screening of security cameras
WO2002041273A1 (fr) * 2000-11-20 2002-05-23 Visual Protection Limited Système de caméra intelligente
WO2004114219A1 (fr) * 2003-06-17 2004-12-29 Mitsubishi Denki Kabushiki Kaisha Procede de detection d'un objet mobile dans une sequence temporelle d'images d'une video
EP1672604A1 (fr) * 2004-12-16 2006-06-21 Siemens Schweiz AG Méthode et dispositif de détection de sabotage d'une caméra de surveillance.
US7822224B2 (en) 2005-06-22 2010-10-26 Cernium Corporation Terrain map summary elements
WO2007126839A3 (fr) * 2006-03-29 2008-12-04 Mark Dronge système d'alerte de sécurité
US7864983B2 (en) 2006-03-29 2011-01-04 Mark Dronge Security alarm system
US7526105B2 (en) 2006-03-29 2009-04-28 Mark Dronge Security alarm system
US8073261B2 (en) 2006-12-20 2011-12-06 Axis Ab Camera tampering detection
US9230175B2 (en) 2009-04-22 2016-01-05 Checkvideo Llc System and method for motion detection in a surveillance video
CN102169614A (zh) * 2011-01-14 2011-08-31 云南电力试验研究院(集团)有限公司 一种基于图像识别的电力作业安全监护方法
CN102169614B (zh) * 2011-01-14 2013-02-13 云南电力试验研究院(集团)有限公司 一种基于图像识别的电力作业安全监护方法
US10460456B2 (en) 2015-12-10 2019-10-29 Microsoft Technology Licensing, Llc Motion detection of object
US11367341B2 (en) 2016-08-10 2022-06-21 Comcast Cable Communications, Llc Monitoring security
US12223825B2 (en) 2016-08-10 2025-02-11 Comcast Cable Communications, Llc Monitoring security
US10535252B2 (en) 2016-08-10 2020-01-14 Comcast Cable Communications, Llc Monitoring security
US11676478B2 (en) 2016-08-10 2023-06-13 Comcast Cable Communications, Llc Monitoring security
KR20180066859A (ko) * 2016-12-09 2018-06-19 캐논 유로파 엔.브이. 모션의 검출을 표시하기 위한 감시 장치 및 감시 방법
US11310469B2 (en) * 2016-12-09 2022-04-19 Canon Kabushiki Kaisha Surveillance apparatus and a surveillance method for indicating the detection of motion
KR102279444B1 (ko) * 2016-12-09 2021-07-22 캐논 가부시끼가이샤 모션의 검출을 표시하기 위한 감시 장치 및 감시 방법
US20180167591A1 (en) * 2016-12-09 2018-06-14 Canon Europa N.V. Surveillance apparatus and a surveillance method for indicating the detection of motion
EP3989196A1 (fr) * 2020-10-23 2022-04-27 Yokogawa Electric Corporation Appareil, système, procédé et programme

Also Published As

Publication number Publication date
AU5810998A (en) 1998-07-17
CA2275893A1 (fr) 1998-07-02
EP1010130A1 (fr) 2000-06-21
EP1010130A4 (fr) 2005-08-17
CA2275893C (fr) 2005-11-29

Similar Documents

Publication Publication Date Title
CA2275893C (fr) Systeme de securite video a faible taux de fausses alertes utilisant la classification d'objets
US5937092A (en) Rejection of light intrusion false alarms in a video security system
US5956424A (en) Low false alarm rate detection for a video image processing based security alarm system
US6104831A (en) Method for rejection of flickering lights in an imaging system
KR101237089B1 (ko) 랜덤 포레스트 분류 기법을 이용한 산불연기 감지 방법
US7479980B2 (en) Monitoring system
EP1687784B1 (fr) Procede et dispositif de detection de fumee
CN101751744B (zh) 一种烟雾检测和预警方法
JP2000513848A (ja) 大域変化に感応しないビデオ動き検出器
US20060170769A1 (en) Human and object recognition in digital video
US20070188336A1 (en) Smoke detection method and apparatus
WO1998028706B1 (fr) Systeme de securite video a faible taux de fausses alertes utilisant la classification d'objets
KR20090086898A (ko) 비디오 카메라를 사용한 연기 검출
PT1628260E (pt) Processo e dispositivo para a detecção automática de fogos florestais
CN113593161A (zh) 一种周界入侵检测方法
Filippidis et al. Fusion of intelligent agents for the detection of aircraft in SAR images
Tan et al. Embedded human detection system based on thermal and infrared sensors for anti-poaching application
Mahajan et al. Detection of concealed weapons using image processing techniques: A review
Frejlichowski et al. SmartMonitor: An approach to simple, intelligent and affordable visual surveillance system
CN113239772B (zh) 自助银行或atm环境中的人员聚集预警方法与系统
WO2001048719A1 (fr) Technique, systeme et module de surveillance
JPH0620049A (ja) 侵入者識別システム
JPH09293185A (ja) 対象検知装置および対象検知方法および対象監視システム
GB2413231A (en) Surveillance apparatus identifying objects becoming stationary after moving
Frejlichowski et al. Extraction of the foreground regions by means of the adaptive background modelling based on various colour components for a visual surveillance system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM GW HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AT BE CH DE DK ES FI FR GB GR IE IT

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref document number: 2275893

Country of ref document: CA

Ref country code: CA

Ref document number: 2275893

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 1997954298

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 1997954298

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1997954298

Country of ref document: EP
