WO2006032799A1 - Surveillance video indexing system (Système d'indexation de vidéo de surveillance) - Google Patents
Surveillance video indexing system
- Publication number
- WO2006032799A1 (PCT application PCT/FR2005/002372)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- characterization
- video stream
- stream
- image
- matrix
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/7837—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content
- G06F16/784—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content the detected or recognised objects being people
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Definitions
- the present invention relates to the field of video.
- the present invention more particularly relates to a method for characterizing and indexing images of a video stream in order to facilitate searching for elements in a surveillance video stream.
- This process focuses on indexing video sequences.
- The mechanisms implemented rely on reference images, a graphical representation of the objects contained in the sequences.
- The characterizations of the scenes that are relevant to criteria such as appearance/disappearance are stored in a "frozen" manner (as opposed to a continuous stream of elements characterizing a video stream) in a database.
- This solution performs a motion segmentation of the video and creates a symbolic "meta-information" representation of the detected objects and their movement.
- This "meta-information" is stored in a database in the form of annotated directed graphs, which are then indexed. Such "meta-information" exists only for moving objects.
- Consequently, a characterization matrix is not produced for each image of the stream, and no continuous stream of characterization matrices separate from the video stream is created.
- The prior art does not propose a solution allowing a lightweight characterization (lightweight compared with the high bit rate of the video) of a video sequence by a stream of characterization matrices.
- The present invention takes the opposite approach by providing a continuous stream of matrices. It describes the images systematically, without prejudging behavior, which later makes it possible to run analyses not on the images themselves but on sufficiently simplified matrices.
- The characterization matrices thus obtained are "lightweight" in terms of the amount of information and allow:
- Also known is application WO03/067884, which proposes an image sequence processing solution for detecting and tracking objects and generating an alarm when the behavior of the tracked objects corresponds to a predefined behavior.
- A characterization of the tracked objects can be provided.
- However, this document does not propose a timestamped stream of characterization matrices distinct from the video stream.
- US6330025 is also known, in which high-quality video recording is performed upon detection of a predetermined event. The operator may optionally add annotations to the stream being viewed. It is clear that a recording is only made when a predefined event is detected, which runs counter to a characterization of each image as a stream of characterization matrices.
- Document WO03/028376 aims to capture images of people at control points.
- A keyframe and description data are associated with each captured video sequence.
- No processing of the images is performed to characterize each of them and to provide a stream of characterization matrices.
- The present invention intends to overcome the drawbacks of the prior art by proposing a method for characterizing the image elements of a video stream by characterization matrices, these matrices constituting a data stream that is timestamped and parallel to the video stream.
- The method according to the present invention responds particularly well to the search for and detection of elements inside a video stream and to the temporal constraints that they induce. Specifically, the invention applies both to "real-time" detection, through the detection of a matrix similar to the one sought, and to a posteriori detection, by traversing the stream of characterization matrices rather than the video stream, which is too heavy.
- The invention relates, in its most general sense, to a method for characterizing images of a video stream for the purpose of searching for elements, comprising:
- a step of encoding said digitized video stream by an encoding unit characterized in that
- Said characterization and encoding steps are performed in parallel and in real time;
- Said characterization step produces a characterization matrix of each of the images of said digitized video stream
- said characterization matrix comprises:
- said target characterization comprises:
- said method further comprises a step of capturing a fixed image zoomed on one of said characterized targets.
- said method further comprises a step of associating said fixed image with the characterization matrix comprising said zoomed target.
- said capture is performed by an auxiliary camera.
- The invention also relates to a system for implementing the method, comprising an image sensor, a frame memory, a digitizing unit, a computing unit, and an encoding unit comprising an encoding processor.
- the invention also relates to a method of searching for elements in a video stream, characterized in that it implements the characterization method and in that it furthermore comprises:
- The invention also relates to a method of searching for elements in a video stream, characterized in that it implements the characterization method and in that it further comprises: a step of transmitting said characterization matrices generated during the characterization step to a remote server through a communication network;
- a step of comparing, on said remote server, said characterization matrices with search criteria.
- said method further comprises:
- FIG. 1 represents the architecture of the system according to the present invention.
- FIG. 2 illustrates a simplified graphic representation of a scene analyzed by the present invention.
- The invention relates to a method and a system for characterizing, in real time, the images of a CCTV video stream. This characterization is stored as a file that is very small compared with the size of the images. These files are associated with the image streams by a time stamp and are either transmitted in parallel with the video stream or transmitted alone.
- The invention relates to a system for analyzing CCTV images in order to determine and retrieve key elements during a search, in real time or a posteriori.
- The advantage of the method of the present invention is to allow real-time analysis or a retrospective search on these files in order to look for characteristic elements quickly, efficiently and in a simplified manner.
- the present invention comprises an image sensor (1) of the digital camera or video surveillance camera type.
- the sensor delivers analog or digital information containing a series of images (video signal for example).
- a step of digitizing the flow may be necessary if the sensor is analog.
- A frame memory makes it possible to store, temporarily (for the duration of the processing of the previous frame) and in real time, the frames generated by the image sensor.
- The image thus acquired, and more precisely the frame stored in the frame memory, is transmitted to an encoding unit whose purpose is to compress the video signal and to obtain a bitstream in which the images are indexed by the timestamping system.
- The same image is transmitted to the processing unit, whose purpose is to extract the essential characteristics to which the present text refers.
- the characteristic matrices thus obtained are also individually timestamped identically to the compressed images they characterize.
- the video stream and the matrix stream are then two separate data streams whose correspondence between matrix and image is ensured by the timestamping of these same streams.
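- As an illustration of this timestamp-based correspondence, here is a minimal Python sketch (the field names, millisecond timestamps and byte offsets are assumptions made for the example, not details taken from the patent):

```python
from bisect import bisect_left

# Two independent, timestamped streams: encoded frames and characterization matrices.
frame_index = [
    {"timestamp_ms": 1000, "offset": 0},       # byte offset of the frame in the bitstream
    {"timestamp_ms": 1040, "offset": 18432},
    {"timestamp_ms": 1080, "offset": 36771},
]
matrix_stream = [
    {"timestamp_ms": 1000, "targets": []},
    {"timestamp_ms": 1040, "targets": [{"id": 1, "x": 0.42, "y": 0.61}]},
    {"timestamp_ms": 1080, "targets": [{"id": 1, "x": 0.44, "y": 0.60}]},
]

def frame_for_matrix(matrix, index):
    """Return the frame index entry whose timestamp matches the matrix."""
    timestamps = [entry["timestamp_ms"] for entry in index]
    pos = bisect_left(timestamps, matrix["timestamp_ms"])
    if pos < len(index) and timestamps[pos] == matrix["timestamp_ms"]:
        return index[pos]
    return None

print(frame_for_matrix(matrix_stream[1], frame_index))  # -> {'timestamp_ms': 1040, 'offset': 18432}
```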
- The processing operations according to the present invention are preferably carried out in real time in order to maintain the best system performance (speed).
- Real-time processing means processing that takes at most the time interval between two consecutive frames. Thus, each new frame is processed before the next one is received.
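- A hedged sketch of that real-time budget check, assuming a 25 fps source and a placeholder characterization function (both are arbitrary choices for the example, not the patent's values):

```python
import time

FRAME_INTERVAL_S = 1 / 25.0   # assumed 25 fps source

def characterize(frame):
    """Placeholder for the characterization step; returns a per-frame matrix."""
    return {"timestamp_ms": frame["timestamp_ms"], "targets": []}

def process_stream(frames):
    for frame in frames:
        start = time.monotonic()
        matrix = characterize(frame)
        elapsed = time.monotonic() - start
        # Real-time condition: processing one frame must not exceed
        # the interval separating two consecutive frames.
        if elapsed > FRAME_INTERVAL_S:
            print(f"frame at {frame['timestamp_ms']} ms: over budget ({elapsed:.3f} s)")
        yield matrix

list(process_stream([{"timestamp_ms": 0}, {"timestamp_ms": 40}]))
```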
- The compressed bitstream and the sequence of characteristic matrices are then stored in a mass memory (memory unit) whose retention period depends on the application: in general, several days to several months for video surveillance applications.
- A network interface makes it possible to give access to the compressed bitstream or to the characteristic matrices.
- External stations can also send search queries to the processing unit, these searches being performed on the characteristic matrices. Once the matrix corresponding to the search criteria (or reference image) has been found, it is possible, thanks to the timestamp information and a set of indexes, to access the corresponding image or sequence of images.
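- The following sketch illustrates the principle of such an a posteriori query: the matrices are scanned first, and only then is the timestamp index used to reach the corresponding images. The predicate, field names and index layout are illustrative assumptions, not details from the patent.

```python
def search_matrices(matrix_stream, predicate):
    """Return the timestamps of all matrices satisfying the search criteria."""
    return [m["timestamp_ms"] for m in matrix_stream if predicate(m)]

def frames_for_timestamps(timestamps, frame_index):
    """Use the timestamp/offset index to locate the corresponding images."""
    by_ts = {entry["timestamp_ms"]: entry for entry in frame_index}
    return [by_ts[ts] for ts in timestamps if ts in by_ts]

# Example: find every frame in which at least two targets are present.
matrix_stream = [
    {"timestamp_ms": 1000, "targets": [{"id": 1}]},
    {"timestamp_ms": 1040, "targets": [{"id": 1}, {"id": 2}]},
]
frame_index = [{"timestamp_ms": 1000, "offset": 0}, {"timestamp_ms": 1040, "offset": 18432}]

hits = search_matrices(matrix_stream, lambda m: len(m["targets"]) >= 2)
print(frames_for_timestamps(hits, frame_index))   # -> [{'timestamp_ms': 1040, 'offset': 18432}]
```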
- the characterization of the images is carried out in real time at the moment when they are acquired, recorded and possibly transmitted.
- The result of this characterization is a set of information, which we will call the descriptive matrix, in the form of a file that is very small compared with the size of the image, even compressed, and which is associated with it.
- This association is achieved by temporally marking the file together with the image that it characterizes.
- Certain metadata association systems of the MPEG-7 type (Moving Picture Experts Group) can be used to make this association.
- This file is saved alongside the image. It can be exploited or transmitted alone, without the image that it characterizes.
- the search for elements may be subsequent to the recording of these data (matrices and video).
- The interest of the invention is to allow searching through the characteristic matrices and not through the images.
- This considerably simplifies and accelerates the reprocessing.
- The bandwidth needs are greatly reduced given the small size of the files, which moreover facilitates searching across a large number of sources.
- Another application of the present invention relates to real-time search. The characteristics of the searched element are sent to the processing server in real time. As the matrices are created, while the server receives the video stream from the sensor, the server compares the characteristics of the searched element with the matrices created in real time. If the characteristics match, an alarm can be raised.
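- A minimal sketch of this real-time comparison, assuming the searched element is expressed with the same fields as the target descriptors in the matrices (the similarity measure and threshold below are arbitrary placeholders, not the patent's method):

```python
def matches(target, wanted, threshold=0.8):
    """Crude similarity between a target descriptor and the searched element."""
    keys = [k for k in wanted if k in target]
    if not keys:
        return False
    score = sum(1 for k in keys if target[k] == wanted[k]) / len(keys)
    return score >= threshold

def watch(matrix_stream, wanted):
    """Compare each freshly created matrix with the searched element; raise an alarm on a hit."""
    for matrix in matrix_stream:
        for target in matrix["targets"]:
            if matches(target, wanted):
                print(f"ALARM at {matrix['timestamp_ms']} ms: target {target}")

watch(
    [{"timestamp_ms": 1040, "targets": [{"kind": "human", "color_high": "red"}]}],
    {"kind": "human", "color_high": "red"},
)
```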
- The characterization matrix is specific to the elements that may need to be monitored (humans, cars, objects), which are called "targets".
- the basic structure is composed as follows:
- The essential part of the matrix concerns the characterization of the targets, since the search will be all the more refined as the characterizations have been well chosen and well computed.
- For the characterization of the targets, the following are noted (a sketch of such a target descriptor is given after this list): • the position in the image,
- • the three-zone color characteristics, which are particularly well suited to characterization and to subsequent searches, the "high" zone corresponding to the head,
- • the duration of presence of the target in the image, which makes it possible to compute presence times and, for example, to establish rules triggered by an overly long presence in a passage zone.
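- Purely as an illustration of how such per-target information might be laid out in practice, here is a hypothetical descriptor (the field names, types and units are assumptions; the patent only enumerates the kinds of information stored):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TargetCharacterization:
    target_id: int
    position: Tuple[float, float]           # position in the image (normalised coordinates)
    color_high: Tuple[int, int, int]        # "high" zone, corresponding to the head
    color_middle: Tuple[int, int, int]      # torso zone
    color_low: Tuple[int, int, int]         # legs zone
    presence_duration_s: float              # how long the target has been present in the image
    displacement: Tuple[float, float] = (0.0, 0.0)   # instantaneous motion vector

@dataclass
class CharacterizationMatrix:
    timestamp_ms: int
    targets: List[TargetCharacterization] = field(default_factory=list)
```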
- The invention also makes it possible, in real time and from the coordinates of the targets, to take still photos complementary to the overall video sequences.
- This can be done with an image sensor, for example a still camera separate from the video camera, or with the same camera.
- The photo can be taken either by electronic zoom, if the resolution of the camera sensor used for the characterization is sufficient, or by an additional mobile camera that focuses on the target with a high magnification and records the photo (the electronic-zoom case is sketched below).
- the fixed image thus recorded may also be associated with the characteristic matrix in order to allow a finer analysis.
- the "close-up" photos thus obtained are stored as objects with hypertext links allowing access to them.
- a display device makes it possible to display a graphical representation of the elements analyzed in the scene by the present invention.
- a display device can be realized by computer means (processor, memory and software for interpreting and transcribing the characteristic matrices of the targets) and a display screen.
- FIG. 2 represents the display, using simple shapes (circles, squares, etc.), of the geographic location of the targets (2) in the analyzed scene.
- An insert (3) informs of the number of targets present in the scene (four in this case) and the time of the analysis.
- The device also allows the display of the instantaneous movement of the target by means of arrows (6), as well as of the path (4) followed by the target (this historization can be done, for example, by saving the last positions of the target in the computer memory).
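- A small sketch of that historization, assuming a fixed-length history per target (the history length and the container are arbitrary choices for the example):

```python
from collections import defaultdict, deque

HISTORY_LEN = 50   # number of past positions kept per target (assumption)
paths = defaultdict(lambda: deque(maxlen=HISTORY_LEN))

def update_paths(matrix):
    """Append the current position of each target to its trajectory."""
    for target in matrix["targets"]:
        paths[target["id"]].append(target["position"])

update_paths({"timestamp_ms": 1040, "targets": [{"id": 1, "position": (0.42, 0.61)}]})
update_paths({"timestamp_ms": 1080, "targets": [{"id": 1, "position": (0.44, 0.60)}]})
print(list(paths[1]))   # -> [(0.42, 0.61), (0.44, 0.60)]
```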
- Ancillary means, such as a computer mouse, make it possible to "point" at a target on the screen and display the characteristics (5) associated with that target.
- A very useful application of the present invention relates to the disappearance of a child in a city whose surveillance is provided by digital cameras or image recording servers implementing the method of the invention. To trace the child, it is sufficient to describe the child's characteristics: • size, shape (human),
- The query can then be sent to the servers, which look in the characteristic matrices to determine whether these criteria are found and, if they are only partially met, what the percentage of correspondence is, as sketched below.
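- A hedged sketch of such a partial match, returning a percentage of correspondence (the criteria names and the scoring rule are illustrative; the patent does not specify a particular formula):

```python
def correspondence(target, criteria):
    """Percentage of the search criteria satisfied by one target descriptor."""
    if not criteria:
        return 0.0
    met = sum(1 for key, value in criteria.items() if target.get(key) == value)
    return 100.0 * met / len(criteria)

query = {"kind": "human", "size": "child", "color_high": "blue", "color_low": "white"}
target = {"kind": "human", "size": "child", "color_high": "blue", "color_low": "red"}
print(correspondence(target, query))   # -> 75.0
```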
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0452138A FR2875629B1 (fr) | 2004-09-23 | 2004-09-23 | Systeme d'indexation de video de surveillance |
FR0452138 | 2004-09-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006032799A1 true WO2006032799A1 (fr) | 2006-03-30 |
Family
ID=34948673
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2005/002372 WO2006032799A1 (fr) | 2004-09-23 | 2005-09-23 | Système d'indexation de vidéo de surveillance |
Country Status (2)
Country | Link |
---|---|
FR (1) | FR2875629B1 (fr) |
WO (1) | WO2006032799A1 (fr) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10424342B2 (en) * | 2010-07-28 | 2019-09-24 | International Business Machines Corporation | Facilitating people search in video surveillance |
US9134399B2 (en) | 2010-07-28 | 2015-09-15 | International Business Machines Corporation | Attribute-based person tracking across multiple cameras |
US8515127B2 (en) | 2010-07-28 | 2013-08-20 | International Business Machines Corporation | Multispectral detection of personal attributes for video surveillance |
US8532390B2 (en) | 2010-07-28 | 2013-09-10 | International Business Machines Corporation | Semantic parsing of objects in video |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5969755A (en) * | 1996-02-05 | 1999-10-19 | Texas Instruments Incorporated | Motion based event detection system and method |
US6330025B1 (en) * | 1999-05-10 | 2001-12-11 | Nice Systems Ltd. | Digital video logging system |
WO2003028376A1 (fr) * | 2001-09-14 | 2003-04-03 | Vislog Technology Pte Ltd | Checkpoint/customer service counter recording system with image/video capture, indexing and retrieval, and blacklist matching function |
WO2003067884A1 (fr) * | 2002-02-06 | 2003-08-14 | Nice Systems Ltd. | Method and apparatus for object tracking based on a video frame sequence |
US20040161133A1 (en) * | 2002-02-06 | 2004-08-19 | Avishai Elazar | System and method for video content analysis-based detection, surveillance and alarm management |
- 2004-09-23: FR FR0452138A patent/FR2875629B1/fr not_active Expired - Fee Related
- 2005-09-23: WO PCT/FR2005/002372 patent/WO2006032799A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
FR2875629B1 (fr) | 2007-07-13 |
FR2875629A1 (fr) | 2006-03-24 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 05807456 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 05807456 Country of ref document: EP Kind code of ref document: A1 |