
CN108051777B - Target tracking method, device and electronic device - Google Patents

Target tracking method, device and electronic device

Info

Publication number
CN108051777B
CN108051777B (application CN201711247108.0A)
Authority
CN
China
Prior art keywords
target object
information
detection area
distance
movement track
Prior art date
Legal status
Expired - Fee Related
Application number
CN201711247108.0A
Other languages
Chinese (zh)
Other versions
CN108051777A
Inventor
刘丹青
Current Assignee
Beijing Megvii Technology Co Ltd
Original Assignee
Beijing Megvii Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Megvii Technology Co Ltd
Priority to CN201711247108.0A
Publication of CN108051777A
Application granted
Publication of CN108051777B
Status: Expired - Fee Related

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/00 — Position-fixing by co-ordinating two or more direction or position line determinations; position-fixing by co-ordinating two or more distance determinations
    • G01S 5/02 — Position-fixing using radio waves
    • G01S 5/0294 — Trajectory determination or predictive filtering, e.g. target tracking or Kalman filtering
    • G01S 5/16 — Position-fixing using electromagnetic waves other than radio waves
    • G01S 5/18 — Position-fixing using ultrasonic, sonic, or infrasonic waves
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 — Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/50 — Systems of measurement based on relative movement of target
    • G01S 17/58 — Velocity or trajectory determination systems; sense-of-movement determination systems

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Analysis (AREA)

Abstract


The invention provides a target tracking method, device and electronic equipment, and relates to the technical field of intelligent identification. The method includes: acquiring distance information obtained when a plurality of distance sensors detect target objects in a detection area; acquiring the installation position of each distance sensor in the detection area; and tracking each target object in combination with the distance information and the installation positions, to determine the movement trajectory of each target object in the detection area. The invention alleviates the technical problem in the prior art that targets in a detection area cannot be tracked imperceptibly.


Description

Target tracking method and device and electronic equipment
Technical Field
The invention relates to the technical field of intelligent identification, in particular to a target tracking method and device and electronic equipment.
Background
With the rapid development of big data analysis technology, big data analysis has been applied in many fields, including brick-and-mortar retail. In a traditional physical shopping place such as a supermarket, a convenience store or a shopping mall, when a customer's purchase behavior is analyzed through big data, the analysis can only be based on the types of goods the customer buys when checking out at the cashier counter. With the development of tracking and positioning technology, the customer's movement trajectory can also be determined by existing tracking and positioning methods.
Existing tracking and positioning methods require that Bluetooth or WiFi on the customer's mobile device remain switched on at all times, which limits their use. In addition, they often fail to reach the expected tracking accuracy because the signal strength is affected by the orientation of the mobile device and by occlusion from the human body or indoor objects. Moreover, existing tracking and positioning methods need the customer's active cooperation: a dedicated APP must be opened on the mobile device and the front camera kept facing upward without being blocked. This undoubtedly inconveniences the customer and degrades the customer experience.
No effective solution has been proposed to the above problems.
Disclosure of Invention
In view of the above, an object of the present invention is to provide a target tracking method, apparatus and electronic device, so as to alleviate the technical problem in the prior art that targets in a detection area cannot be tracked imperceptibly.
In a first aspect, an embodiment of the present invention provides a method for tracking a target, including: acquiring distance information obtained when a target object is detected in a detection area by a plurality of distance sensors; acquiring the installation position of each distance sensor in the detection area; and tracking each target object by combining the distance information and the installation position to determine the moving track of each target object in the detection area.
Further, tracking each target object by combining the distance information and the installation position to determine a moving track of each target object in the detection area comprises: starting with a starting distance sensor, determining one or more continuous target distance sensors in the plurality of distance sensors, and drawing a moving track of the target object in combination with the installation positions of the target distance sensors; the starting distance sensor is the first sensor triggered in the plurality of distance sensors when the target object enters the detection area, and the target distance sensor is a distance sensor which continuously detects the same target object.
Further, the method further comprises: acquiring attribute characteristics of the target object; establishing an association relation between the attribute characteristics and the moving track according to the acquisition time of the attribute characteristics and the initial trigger time of the moving track to obtain association data; the associated data comprises attribute characteristics and a moving track of the same target object, and the starting trigger time is a trigger time corresponding to the starting point of the moving track.
Further, the obtaining of the attribute feature of the target object includes: acquiring image information which is acquired by an image acquisition device and contains the target object, wherein the image information is an image acquired when the target object enters or leaves the detection area, and the image information comprises physical information and/or clothing information of the target object; and performing attribute analysis on the image information to obtain attribute characteristics of the target object.
Further, the image information includes a plurality of target objects which appear at the same time, and the establishing of the association relationship between the attribute feature and the movement track according to the acquisition time of the attribute feature and the start trigger time of the movement track includes: acquiring position information of a plurality of target objects included in the image information; and establishing an association relation between the attribute characteristics of the target object and the moving track according to the position information of each target object, the acquisition time of the attribute characteristics of each target object and the initial trigger time of the moving track.
Further, the method further comprises: acquiring attribute characteristics of the moving track; and analyzing the associated data by combining the attribute characteristics of the movement track and/or the attribute characteristics of the target object to obtain a movement track distribution map belonging to each attribute characteristic.
Further, the method further comprises: determining label information, wherein the label information is used for distinguishing each moving track; and binding the label information and the moving track.
Further, the tag information is determined by any one of the following methods: determining the label information by using the face feature information, wherein one face feature information corresponds to one label information; determining the label information by using the generation time of the movement track; and determining the label information by using the moving track.
Further, the method further comprises: and carrying out data analysis on the moving tracks belonging to different label information to obtain a moving track distribution diagram of each label information.
Further, the plurality of distance sensors are mounted at the top end of the detection area in the form of a sensor array, wherein the number of the sensor array is one or more.
In a second aspect, an embodiment of the present invention further provides a tracking apparatus for a target, including: a first acquisition unit configured to acquire distance information obtained when a target object is detected in a detection region by a plurality of distance sensors; a second acquisition unit configured to acquire an installation position of each distance sensor in the detection area; and the track tracking unit is used for tracking each target object by combining the distance information and the installation position so as to determine the moving track of each target object in the detection area.
In a third aspect, an embodiment of the present invention further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the method when executing the computer program.
In a fourth aspect, the present invention also provides a computer-readable medium having a non-volatile program code executable by a processor, where the program code causes the processor to execute the method described above.
In the embodiment of the invention, firstly, distance information obtained when a plurality of distance sensors detect a target object in a detection area is obtained; then acquiring the installation position of each distance sensor in a detection area; and finally, tracking each target object by combining the distance information and the installation position to determine the moving track of each target object in the detection area. In the embodiment of the invention, the target object can be accurately positioned and tracked in an imperceptible manner by acquiring the distance information through the distance sensor, so that the technical problem that the target in the detection area cannot be tracked in an imperceptible manner in the prior art is solved, and the technical effect of carrying out the imperceptible tracking on the target object in the detection area is realized.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the invention;
FIG. 2 is a flow chart of a method for tracking a target according to an embodiment of the invention;
FIG. 3 is a schematic illustration of an installation of a sensor array according to an embodiment of the invention;
FIG. 4 is a flowchart of tracking each target object by combining the distance information and the installation position to determine the moving track of each target object in the detection area according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a target tracking device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one:
first, an example electronic device 100 for implementing a tracking method of an object of an embodiment of the present invention is described with reference to fig. 1.
As shown in FIG. 1, electronic device 100 includes one or more processors 102, one or more memory devices 104, an input device 106, an output device 108, and a distance sensor 110, which are interconnected via a bus system 112 and/or other form of connection mechanism (not shown). It should be noted that the components and structure of the electronic device 100 shown in fig. 1 are exemplary only, and not limiting, and the electronic device may have other components and structures as desired.
The processor 102 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 100 to perform desired functions.
The storage 104 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. On which one or more computer program instructions may be stored that may be executed by processor 102 to implement client-side functionality (implemented by the processor) and/or other desired functionality in embodiments of the invention described below. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input instructions and may include one or more of a keyboard, a mouse, a microphone, a touch screen, and the like.
The output device 108 may output various information (e.g., images or sounds) to the outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
The distance sensor 110 may collect distance information and store the collected distance information in the memory device 104 for use by other components.
Exemplarily, an exemplary electronic device for implementing the tracking method of the target according to the embodiment of the present invention may be implemented on a mobile terminal such as a smartphone, a tablet computer, or the like.
Example two:
in accordance with an embodiment of the present invention, there is provided an embodiment of a method for tracking objects, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than that described herein.
Fig. 2 is a flowchart of a method for tracking a target according to an embodiment of the present invention, as shown in fig. 2, the method includes the following steps:
step S202, obtaining distance information obtained when a plurality of distance sensors detect a target object in a detection area;
in the embodiment of the present invention, the detection area may be a store, a mall, or a food court. The target object is a person entering the detection area, e.g. a customer entering a shop, a customer entering a mall.
It should be noted that, in the embodiment of the present invention, the target object is not limited to a human being, and may be any object moving in the detection area, and may be specifically determined according to the actual needs of the user.
Step S204, acquiring the installation position of each distance sensor in the detection area;
in the embodiment of the present invention, the installation position is expressed as coordinate information of the distance sensor within the detection area. When the distance sensor detects the passage of the target object, the detected distance information will change. For example, when the distance information detected by a certain distance sensor changes from 2.5 meters to 1 meter, it indicates that the distance sensor detects that a target object passes through.
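By way of illustration only (this sketch is an assumption, not part of the patent), the following Python fragment shows one way such a reading change could be interpreted; the floor distance, the height threshold and the function name target_under_sensor are all hypothetical.

```python
# Minimal sketch: a ceiling-mounted distance sensor normally measures the
# distance to the floor; when a target passes underneath, the reading drops.
FLOOR_DISTANCE_M = 2.5      # assumed sensor-to-floor distance
MIN_OBJECT_HEIGHT_M = 0.5   # assumed minimum height treated as a target

def target_under_sensor(reading_m: float) -> bool:
    """Return True when the measured distance implies an object of
    sufficient height between the sensor and the floor."""
    implied_height = FLOOR_DISTANCE_M - reading_m
    return implied_height >= MIN_OBJECT_HEIGHT_M

# A 1.5 m-tall customer under the sensor yields a reading of about 1.0 m.
assert target_under_sensor(1.0)
assert not target_under_sensor(2.5)   # empty aisle: full floor distance
```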
Step S206, tracking each target object by combining the distance information and the installation position to determine the moving track of each target object in the detection area;
in the embodiment of the invention, the target object can be tracked through the distance information output by the distance sensors and the installation positions of the distance sensors so as to determine the moving track of each target object in the detection area. According to the above description, in the process of tracking the target object, the whole process is perceptive-free, and the tracking of the moving track can be realized without starting any terminal equipment by the target object.
In the embodiment of the present invention, the above-described steps S202 to S206 may be performed by a processor. The processor may be a processor installed in the detection area, and may also be a cloud processor. When the processor is a processor installed in the detection area, the processor acquires distance information acquired by the distance sensor and then generates a movement trajectory of the target object based on the distance information, wherein at this time, the processor and the distance sensor may be connected by wire or wirelessly. When the processor is a cloud processor, the distance sensor transmits the distance information to the cloud processor through the local area network, so that the cloud processor generates a moving track of the target object based on the distance information.
It should be noted that, in addition to the two processors, the processors capable of executing step S202 to step S206 may be applied to the embodiment of the present invention, and this is not particularly limited.
In the embodiment of the invention, firstly, distance information obtained when a plurality of distance sensors detect a target object in a detection area is obtained; then acquiring the installation position of each distance sensor in a detection area; and finally, tracking each target object by combining the distance information and the installation position to determine the moving track of each target object in the detection area. In the embodiment of the invention, the target object can be accurately positioned and tracked in an imperceptible manner by acquiring the distance information through the distance sensor, so that the technical problem that the target in the detection area cannot be tracked in an imperceptible manner in the prior art is solved, and the technical effect of carrying out the imperceptible tracking on the target object in the detection area is realized.
In an embodiment of the present invention, a plurality of distance sensors are mounted at the top end of the detection region in the form of a sensor array, wherein the number of the sensor array is one or more.
Each of the plurality of distance sensors may be, but is not limited to, a ToF sensor, an ultrasonic sensor, an infrared photoelectric switch, a microwave sensor, or the like. Alternatively, the plurality of distance sensors may be implemented as a lidar, preferably a low-line-count lidar such as a 1-8 line lidar, which is sufficient for tracking a person's trajectory.
It should be noted that when the individual distance sensors are of the ToF, ultrasonic, infrared photoelectric switch, microwave or similar type, the plurality of distance sensors are mounted at the top of the detection area in the form of a sensor array, that is, the sensor array is ceiling-mounted in the detection area.
As shown in fig. 3, a sensor array 2 is ceiling-mounted between two rows of shelves 1. Because each distance sensor in the sensor array has a corresponding detection range and detection accuracy, when the area of the detection region exceeds the coverage of a single sensor array, multiple sensor arrays can be ceiling-mounted at the top of the detection area. In a specific installation, the sensor arrays may be mounted at equal intervals across the top of the detection area; for example, as shown in fig. 3, a sensor array is mounted at the top of every aisle between two adjacent shelves. Besides equidistant mounting, the sensor arrays may also be mounted at unequal intervals.
In the embodiment of the present invention, the multiple sensor arrays may be mounted parallel to one another or non-parallel; in the non-parallel case, the arrays may cross each other perpendicularly or non-perpendicularly. The embodiment of the present invention does not specifically limit the installation manner: a user may choose the number of sensor arrays and their installation manner according to the actual aisle width in the detection area and the required data accuracy.
By mounting the sensor array at the top of the detection area, distance measurements can be taken in real time through the array. When a pedestrian (that is, a target object) appears, the system can determine that an object with a certain, stable height is continuously changing position, and the change in the target object's position can be derived from the changing readings of the different distance sensors. In this way, the system recognizes an independent shape of a certain height (i.e., the target object) moving under the sensor array, so that the target object can be tracked and its movement trajectory obtained.
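As a rough illustration of the behaviour described above (an assumption, not code from the patent), the sketch below scans one frame of readings from a ceiling-mounted array and returns the installation coordinates at which an object of sufficient height is currently present; frame_detections and the threshold constants are hypothetical names and values.

```python
from typing import Dict, List, Tuple

FLOOR_DISTANCE_M = 2.5       # assumed ceiling height
MIN_OBJECT_HEIGHT_M = 0.5    # assumed detection threshold

def frame_detections(
    readings_m: Dict[int, float],                 # sensor id -> measured distance (m)
    install_xy: Dict[int, Tuple[float, float]],   # sensor id -> installed (x, y)
) -> List[Tuple[float, float, float]]:
    """Return (x, y, implied_height) for every sensor that currently
    sees an object taller than the threshold."""
    detections = []
    for sensor_id, distance in readings_m.items():
        height = FLOOR_DISTANCE_M - distance
        if height >= MIN_OBJECT_HEIGHT_M:
            x, y = install_xy[sensor_id]
            detections.append((x, y, height))
    return detections
```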
In the embodiment of the invention, after the plurality of distance sensors are installed, the target object in the detection area can be detected by the plurality of distance sensors, so that the distance information is obtained. Then, the respective target objects are tracked based on the installation position and the distance information of each distance sensor to determine the movement trajectories of the respective target objects in the detection area.
In an alternative embodiment, as shown in fig. 4, in step S206, the tracking each target object by combining the distance information and the installation position to determine the moving track of each target object in the detection area includes the following steps:
step S2061, starting with a starting distance sensor, determining one or more continuous target distance sensors among the plurality of distance sensors, and drawing a movement trajectory of the target object in combination with the installation positions of the target distance sensors;
the starting distance sensor is the first sensor triggered in the plurality of distance sensors when the target object enters the detection area, and the target distance sensor is a distance sensor which continuously detects the same target object.
In the embodiment of the present invention, when a target object enters the detection area, one or more distance sensors (i.e., initial distance sensors) installed at the entrance of the detection area will detect that the target object enters the detection area, and at this time, the one or more distance sensors will output distance information obtained when the target object is detected. Next, one or more target distance sensors that continuously detect the target object may be determined starting from the one or more distance sensors, where the one or more target distance sensors are continuous sensors and the one or more target distance sensors are distance sensors that continuously detect the same target object. After one or more target distance sensors are determined, the moving track of the target object can be drawn by combining the installation positions of the target distance sensors in the detection area.
For example, when a target object A enters the detection area, a distance sensor B located at the entrance of the detection area detects the target object A, where the distance sensor B is the starting distance sensor. At this time, the distance sensor B outputs distance information. If the distance between the distance sensor B and the ground is 2.5 meters and the height of the target object A is 1.8 meters, the distance information output by the distance sensor B is 0.7 meters. Distance sensors in the sensor array that continuously output distance information of 0.7 meters, starting from the distance sensor B, are then determined as target distance sensors. Finally, the movement trajectory of the target object is drawn based on the installation positions of the target distance sensors in the detection area.
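A minimal sketch of this matching step is given below, under the assumption that the same person produces roughly the same reading at successive sensors; the function name draw_trajectory, the per-frame data layout and the matching tolerance are illustrative choices rather than details from the patent.

```python
from typing import Dict, List, Sequence, Tuple

HEIGHT_TOLERANCE_M = 0.1  # assumed tolerance when matching readings to one person

def draw_trajectory(
    frames: Sequence[Dict[int, float]],          # per frame: sensor id -> reading (m)
    install_xy: Dict[int, Tuple[float, float]],  # sensor id -> installed (x, y)
    start_sensor: int,                           # first sensor triggered at the entrance
) -> List[Tuple[float, float]]:
    """Follow the sensors that keep reporting approximately the same reading
    as the starting sensor and return their installation positions as the
    movement trajectory."""
    reference = frames[0][start_sensor]          # e.g. 2.5 m - 1.8 m = 0.7 m
    trajectory = [install_xy[start_sensor]]
    for readings in frames[1:]:
        candidates = [
            (abs(reading - reference), sensor_id)
            for sensor_id, reading in readings.items()
            if abs(reading - reference) <= HEIGHT_TOLERANCE_M
        ]
        if not candidates:
            break                                # target left the covered area
        _, sensor_id = min(candidates)           # sensor closest to the reference
        trajectory.append(install_xy[sensor_id])
    return trajectory
```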
In the embodiment of the invention, the target object can be accurately positioned and tracked in an imperceptible manner by acquiring the distance information through the distance sensor, so that the technical problem that the target in the detection area cannot be tracked in an imperceptible manner in the prior art is solved, and the technical effect of carrying out the imperceptible tracking on the target object in the detection area is realized.
For retail, however, what matters most in trajectory tracking is knowing the trajectories of particular groups of people, i.e., how people with different attributes (gender, age, etc.) move within the store. Therefore, an association between the attribute features of a target object and its movement trajectory still needs to be established in combination with machine vision.
In the embodiment of the present invention, when tracking the movement trajectory of each target object, it is further necessary to perform data association between the attribute features of the target object and the movement trajectory of the target object in a time axis synchronization manner.
In an optional embodiment, establishing the association relationship between the attribute feature of the target object and the movement track of the target object may be implemented by the following description process:
firstly, acquiring the attribute characteristics of the target object;
wherein the obtaining of the attribute characteristics of the target object comprises: acquiring image information which is acquired by an image acquisition device and contains the target object, wherein the image information is an image acquired when the target object enters or leaves the detection area, and the image information comprises physical information and/or clothing information of the target object; and performing attribute analysis on the image information to obtain attribute characteristics of the target object.
Then, according to the acquisition time of the attribute characteristics and the initial trigger time of the moving track, establishing an association relationship between the attribute characteristics and the moving track to obtain association data; the associated data comprises attribute characteristics and a moving track of the same target object, and the starting trigger time is a trigger time corresponding to the starting point of the moving track.
It should be noted that, if the target object is a person, the physical appearance information may include face information, hairstyle information, body-type information, posture and gait; the clothing information may include information about clothes, as well as information about hats, for example whether a hat is worn and what type of hat it is. The attribute features include gender, age, height, ethnicity, hairstyle, clothing and the like.
In the embodiment of the present invention, when a target object enters the detection area, or when a tracked target object leaves the detection area, image information of each target object entering or leaving the detection area may be acquired by an image acquisition device (for example, an RGB camera with a face recognition function). As can be seen from the above description, the acquired image information includes the physical appearance information and/or clothing information of the target object. Attribute analysis may then be performed on the physical appearance information and/or clothing information to determine the attribute features of the target object, where the attribute analysis includes analysis of the face, the body and so on, and the resulting attribute features include gender, age, height, ethnicity (e.g., Caucasian, Asian or Black), hairstyle, clothing and the like.
After the attribute features are obtained through analysis, the attribute features of the person and the moving tracks identified by the sensor array can be subjected to data association in a time axis synchronization mode to obtain associated data. For example, the attribute feature obtained by analyzing the image information captured by the RGB camera at the time T0 (i.e., the start acquisition time) is combined with the position information of the sensor array at the time T0 (i.e., the start trigger time) to establish the association information between the attribute feature and the movement track. After the association relationship is established, association data is generated.
In the embodiment of the invention, the shooting angle of the RGB camera can be adjusted appropriately and a capture range set, so that only faces within the capture range are recognized; distant faces can be filtered out using a face-size threshold. When several faces appear side by side within the capture range, their position information can be extracted, and based on the differences in position, the faces are matched with the position information obtained by the sensor array at time T0 to distinguish different people. At subsequent times, the different people are tracked separately.
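A trivial sketch of such face-size filtering (the pixel threshold and function name are assumptions, not values prescribed by the patent) might look as follows:

```python
from typing import List, Tuple

MIN_FACE_HEIGHT_PX = 80   # assumed face-size threshold for discarding distant faces

def faces_in_capture_range(
    faces: List[Tuple[int, int, int, int]],   # (x, y, width, height) face boxes
) -> List[Tuple[int, int, int, int]]:
    """Keep only faces large enough to be inside the intended capture range;
    smaller (i.e. more distant) faces are filtered out."""
    return [box for box in faces if box[3] >= MIN_FACE_HEIGHT_PX]
```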
It should be noted that the attribute analysis of a person can be performed either when the person enters the detection area or when the person leaves it; the data tend to be more accurate on leaving, because the face is then facing the camera directly. Based on the acquired RGB image of the face or body, attribute analysis covering gender, age, height, hairstyle, clothing and the like is carried out, and the person's attribute features are associated with the subsequent trajectory information.
When the attribute feature of the target object is in data association with the movement track of the target object, the acquisition time of the attribute feature of the target object and the start trigger time of the movement track can be acquired. The acquisition time is the time when the image acquisition device shoots that the target object enters the detection area. And if the two times are the same, performing data association on the attribute characteristics and the movement track with the same time.
For example, suppose an object A enters a convenience store at 13:00. At that moment, the image capture device captures an image containing object A at 13:00 (i.e., the acquisition time). At the same time, the distance sensor B detects object A entering the detection area at 13:00 (i.e., the start trigger time). After the distance sensor B detects that object A has entered the detection area, the movement trajectory of object A is tracked based on the distance information detected by each distance sensor and the installation position of each distance sensor, and the trajectory is drawn. When establishing the data association between the attribute features of the target object and its movement trajectory, the association is established based on the time when object A entered the store (i.e., 13:00) and the start trigger time of object A's trajectory (i.e., 13:00). It should be noted that the acquisition time and the start trigger time do not need to match exactly; a certain error may exist, and the tolerance may be set according to the actual situation, which is not specifically limited here.
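The sketch below illustrates one plausible way of pairing attribute features with trajectories by comparing the acquisition time with the start trigger time within a tolerance, as described above; the tolerance value and the function name associate are assumptions.

```python
from datetime import datetime, timedelta
from typing import List, Optional, Tuple

MAX_TIME_ERROR = timedelta(seconds=5)   # assumed tolerance between the two timestamps

def associate(
    attributes: List[Tuple[datetime, dict]],     # (acquisition time, attribute features)
    trajectories: List[Tuple[datetime, list]],   # (start trigger time, movement trajectory)
) -> List[Tuple[dict, list]]:
    """Pair each set of attribute features with the trajectory whose start
    trigger time is closest to the acquisition time, within the tolerance."""
    associated = []
    for t_attr, feats in attributes:
        best: Optional[Tuple[timedelta, list]] = None
        for t_start, track in trajectories:
            gap = abs(t_attr - t_start)
            if gap <= MAX_TIME_ERROR and (best is None or gap < best[0]):
                best = (gap, track)
        if best is not None:
            associated.append((feats, best[1]))
    return associated
```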
When multiple customers enter the store at the same time, multiple target objects appear simultaneously in the image information. In this case, establishing the association relationship between the attribute features and the movement trajectories according to the acquisition time of the attribute features and the start trigger time of the movement trajectories, to obtain the associated data, includes the following steps:
firstly, acquiring position information of a plurality of target objects included in the image information;
then, according to the position information of each target object, the acquisition time of the attribute feature of each target object and the starting trigger time of the movement track, establishing an association relationship between the attribute feature of the target object and the movement track.
If two customers enter a certain shop at the same time, the image acquisition device acquires image information containing the two customers at the same time. At this time, the image capturing device may transmit the position relationship between two customers in the image information to the gateway device of the sensor array, so that the sensor array establishes an association relationship between the attribute features of each customer and the movement trajectory according to the position information of each customer, the attribute features of each customer, and the start trigger time of the movement trajectory of each customer.
Specifically, when the image acquisition device acquires image information containing two customers, the distance sensors in the sensor array respectively detect the two customers and output corresponding distance information. At this time, the association relationship between the attribute features of the two customers and the corresponding movement trajectories may be established based on the position information of the distance sensors that detect the two customers and the position information of the two customers.
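One conceivable way of performing this disambiguation, sketched under the assumption that the camera and the entrance sensors see the people in the same left-to-right order, is shown below; pair_by_lateral_order and its data layout are hypothetical.

```python
from typing import List, Tuple

def pair_by_lateral_order(
    face_positions: List[Tuple[float, dict]],   # (x position in image, attribute features)
    entrance_hits: List[Tuple[float, int]],     # (lateral install coordinate, trajectory id)
) -> List[Tuple[dict, int]]:
    """Pair attribute features with trajectories by left-to-right order when
    several people enter at the same time."""
    faces = sorted(face_positions, key=lambda p: p[0])
    hits = sorted(entrance_hits, key=lambda h: h[0])
    return [(feats, traj_id) for (_, feats), (_, traj_id) in zip(faces, hits)]
```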
In the embodiment of the present invention, after the attribute features of the target object are associated with the movement trajectory, the associated data may be classified, and the classification may be specifically implemented through the following processes:
firstly, acquiring the attribute characteristics of the moving track;
and then, analyzing the associated data by combining the attribute characteristics of the movement track and/or the attribute characteristics of the target object to obtain a movement track distribution diagram belonging to each attribute characteristic.
In the embodiment of the present invention, after the movement trajectory is generated, the attribute feature of the movement trajectory may be further generated based on the generation time of the movement trajectory and the movement trajectory itself. The associated data may then be classified according to the attribute features of the target object and/or the attribute features of the movement trajectory. For example, the associated data is classified according to the generation time of the movement trajectory to determine the movement trajectory of the target object and the number of target objects in each time period. For another example, the related data with the ages of 20-35 years can be classified into one group according to the attribute characteristics of the target object, and the related data with the ages of more than 55 years can be classified into one group, so as to determine the purchasing behaviors of customers with different ages. For another example, the movement trajectories may be grouped in combination with the generation time of the movement trajectories and the age of the target object. The specific grouping manner is not particularly limited in the embodiment of the present invention.
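As a hedged example of such grouping, the sketch below sorts associated records into the age bands mentioned above; the band boundaries and the function name group_by_age_band are illustrative assumptions.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def group_by_age_band(
    associated: List[Tuple[dict, list]],   # (attribute features, movement trajectory)
) -> Dict[str, List[list]]:
    """Group trajectories by age band so that a movement-trajectory
    distribution map can be drawn for each group."""
    groups: Dict[str, List[list]] = defaultdict(list)
    for feats, track in associated:
        age = feats.get("age")
        if age is None:
            band = "unknown"
        elif 20 <= age <= 35:
            band = "20-35"
        elif age > 55:
            band = "over 55"
        else:
            band = "other"
        groups[band].append(track)
    return dict(groups)
```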
In the embodiment of the invention, the movement track distribution diagram of each label information can be obtained by analyzing the associated data. The data analysis includes thermodynamic analysis of the trajectory, analysis of the trajectory of a male or female, analysis of the trajectory of a particular age group, and the like. If the detection area is a shop or the like, the result of the big data analysis can be used for shopping guide, commodity placement position planning and the like.
In the embodiment of the present invention, after the movement tracks are generated, the movement tracks also need to be distinguished, and specifically, the movement tracks can be distinguished in the following manner:
firstly, determining label information, wherein the label information is used for distinguishing each moving track; wherein the tag information is determined by any one of the following methods: determining the label information by using the face feature information, wherein one face feature information corresponds to one label information; determining the label information by using the generation time of the movement track; determining the label information by using the moving track;
and then, binding the label information and the movement track.
In the embodiment of the present invention, each recorded trajectory carries its own tag information (i.e., an ID) used to distinguish it from other trajectories. Face feature information may be used as the tag for distinguishing different persons and their trajectories; in that case a unique ID is generated from the face features detected by the image acquisition device, without registering a face database. Alternatively, an ID may be generated from the generation time of the movement trajectory together with the trajectory itself. The purpose of the tag information is to distinguish different trajectory data.
In the embodiment of the present invention, the face information of the target object may be analyzed from the image information captured by the RGB capture camera, so as to obtain the face feature information of the target object, where the face feature information includes feature information of eyes, feature information of mouth, feature information of nose, and the like, for example, feature points of eyes, mouth, and nose, and positions thereof. As can be seen from the above description, by determining tag information for a movement trajectory, it is possible to distinguish a large number of movement trajectories.
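The sketch below shows two assumed ways of deriving such an ID, one from face feature values and one from the trajectory and its generation time; the hashing scheme is an illustrative choice, not something specified by the patent.

```python
import hashlib
from datetime import datetime
from typing import Sequence, Tuple

def label_from_face(face_features: Sequence[float]) -> str:
    """Derive a stable label (ID) from face feature values, so the same face
    yields the same label without registering a face database."""
    digest = hashlib.sha1(",".join(f"{v:.4f}" for v in face_features).encode())
    return digest.hexdigest()[:12]

def label_from_trajectory(start_time: datetime,
                          trajectory: Sequence[Tuple[float, float]]) -> str:
    """Derive a label (ID) from the generation time and the trajectory itself."""
    payload = start_time.isoformat() + repr(list(trajectory))
    return hashlib.sha1(payload.encode()).hexdigest()[:12]
```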
It should be noted that after determining the tag information for each movement track, an association relationship between the movement track and the attribute features of the target object may also be established, so as to implement big data analysis and processing on the movement track.
In the embodiment of the invention, the data output by the distance sensors is not affected by communication signal quality at all, nor by the background color or lighting intensity that would affect image data. The target object can be located and tracked simply by installing a number of distance sensors on the ceiling of the detection area. Compared with conventional tracking methods, the method provided by the embodiment of the invention greatly reduces the requirements on algorithms and computing power, improves data accuracy, and makes low-cost trajectory tracking possible.
Example three:
the embodiment of the present invention further provides a target tracking device, which is mainly used for executing the target tracking method provided by the above-mentioned embodiments of the present invention, and the following describes the target tracking device provided by the embodiments of the present invention in detail.
Fig. 5 is a schematic diagram of a target tracking apparatus according to an embodiment of the present invention, as shown in fig. 5, the target tracking apparatus mainly includes a first obtaining unit 10, a second obtaining unit 20 and a trajectory tracking unit 30, wherein:
a first acquisition unit 10 configured to acquire distance information obtained when a target object is detected in a detection region by a plurality of distance sensors;
a second acquisition unit 20 for acquiring the mounting position of each distance sensor in the detection area;
a trajectory tracking unit 30, configured to track each target object by combining the distance information and the installation position to determine a moving trajectory of each target object in the detection area.
In the embodiment of the invention, firstly, distance information obtained when a plurality of distance sensors detect a target object in a detection area is obtained; then acquiring the installation position of each distance sensor in a detection area; and finally, tracking each target object by combining the distance information and the installation position to determine the moving track of each target object in the detection area. In the embodiment of the invention, the target object can be accurately positioned and tracked in an imperceptible manner by acquiring the distance information through the distance sensor, so that the technical problem that the target in the detection area cannot be tracked in an imperceptible manner in the prior art is solved, and the technical effect of carrying out the imperceptible tracking on the target object in the detection area is realized.
Optionally, the plurality of distance sensors are mounted at the top end of the detection region in the form of a sensor array, wherein the number of the sensor array is one or more.
Optionally, the trajectory tracking unit 30 is configured to: starting with a starting distance sensor, determining one or more continuous target distance sensors in the plurality of distance sensors, and drawing a moving track of the target object in combination with the installation positions of the target distance sensors; the starting distance sensor is the first sensor triggered in the plurality of distance sensors when the target object enters the detection area, and the target distance sensor is a distance sensor which continuously detects the same target object.
Optionally, the apparatus further comprises: a third obtaining unit, configured to obtain an attribute feature of the target object; the establishing unit is used for establishing an association relation between the attribute characteristics and the moving track according to the acquisition time of the attribute characteristics and the initial trigger time of the moving track to obtain association data; the associated data comprises attribute characteristics and a moving track of the same target object, and the starting trigger time is a trigger time corresponding to the starting point of the moving track.
Optionally, the third obtaining unit is configured to: acquiring image information which is acquired by an image acquisition device and contains the target object, wherein the image information is an image acquired when the target object enters or leaves the detection area, and the image information comprises physical information and/or clothing information of the target object; and performing attribute analysis on the image information to obtain attribute characteristics of the target object.
Optionally, the establishing unit is further configured to: acquiring position information of a plurality of target objects included in the image information under the condition that the image information includes the plurality of target objects which appear at the same time; and establishing an association relation between the attribute characteristics of the target object and the moving track according to the position information of each target object, the acquisition time of the attribute characteristics of each target object and the initial trigger time of the moving track.
Optionally, the apparatus is further configured to: acquiring attribute characteristics of the moving track; and analyzing the associated data by combining the attribute characteristics of the movement track and/or the attribute characteristics of the target object to obtain a movement track distribution map belonging to each attribute characteristic.
Optionally, the apparatus is further configured to: determining label information, wherein the label information is used for distinguishing each moving track; and binding the label information and the moving track.
Optionally, the tag information is determined by any one of the following methods: determining the label information by using the face feature information, wherein one face feature information corresponds to one label information; determining the label information by using the generation time of the movement track; and determining the label information by using the moving track.
The device provided by the embodiment of the present invention has the same implementation principle and technical effect as the method embodiments, and for the sake of brief description, reference may be made to the corresponding contents in the method embodiments without reference to the device embodiments.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The method, the apparatus, and the computer program product for tracking a target provided in the embodiments of the present invention include a computer-readable storage medium storing a non-volatile program code executable by a processor, where instructions included in the program code may be used to execute the method described in the foregoing method embodiments, and specific implementation may refer to the method embodiments, and will not be described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present invention, which are used for illustrating the technical solutions of the present invention and not for limiting the same, and the protection scope of the present invention is not limited thereto, although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A target tracking method, comprising:
acquiring distance information obtained when a plurality of distance sensors detect target objects in a detection area;
acquiring the installation position of each distance sensor in the detection area; and
tracking each target object by combining the distance information and the installation positions, so as to determine the movement track of each target object in the detection area;
wherein the method further comprises:
acquiring attribute features of the target object; and
establishing, according to the acquisition time of the attribute features and the start trigger time of the movement track, an association between the attribute features and the movement track to obtain associated data;
wherein the associated data comprises the attribute features and the movement track of the same target object, and the start trigger time is the trigger moment corresponding to the starting point of the movement track;
and the method further comprises:
acquiring attribute features of the movement track; and
analyzing the associated data in combination with the attribute features of the movement track and/or the attribute features of the target object, to obtain a movement track distribution map for each attribute feature.

2. The method according to claim 1, wherein tracking each target object by combining the distance information and the installation positions to determine the movement track of each target object in the detection area comprises:
starting from a starting distance sensor, determining one or more consecutive target distance sensors among the plurality of distance sensors, and drawing the movement track of the target object in combination with the installation positions of the target distance sensors;
wherein the starting distance sensor is the first of the plurality of distance sensors triggered when the target object enters the detection area, and a target distance sensor is a distance sensor that consecutively detects the same target object.

3. The method according to claim 1, wherein acquiring the attribute features of the target object comprises:
acquiring image information containing the target object collected by an image collector, wherein the image information is an image collected when the target object enters or leaves the detection area, and the image information comprises appearance information and/or clothing information of the target object; and
performing attribute analysis on the image information to obtain the attribute features of the target object.

4. The method according to claim 3, wherein the image information comprises a plurality of target objects appearing simultaneously, and establishing the association between the attribute features and the movement track according to the acquisition time of the attribute features and the start trigger time of the movement track to obtain the associated data comprises:
acquiring position information of the plurality of target objects included in the image information; and
establishing the association between the attribute features of each target object and the movement track according to the position information of each target object, the acquisition time of the attribute features of each target object, and the start trigger time of the movement track.

5. The method according to claim 1, further comprising:
determining label information, wherein the label information is information used to distinguish between movement tracks; and
binding the label information to the movement track.

6. The method according to claim 5, wherein the label information is determined by any one of the following methods:
determining the label information using face feature information, wherein one piece of face feature information corresponds to one piece of label information;
determining the label information using the generation time of the movement track; or
determining the label information using the movement track itself.

7. The method according to claim 1, wherein the plurality of distance sensors are installed at the top of the detection area in the form of one or more sensor arrays.

8. A target tracking apparatus, comprising:
a first acquisition unit, configured to acquire distance information obtained when a plurality of distance sensors detect target objects in a detection area;
a second acquisition unit, configured to acquire the installation position of each distance sensor in the detection area; and
a track tracking unit, configured to track each target object by combining the distance information and the installation positions, so as to determine the movement track of each target object in the detection area;
wherein the apparatus is further configured to:
acquire attribute features of the target object; and
establish, according to the acquisition time of the attribute features and the start trigger time of the movement track, an association between the attribute features and the movement track to obtain associated data;
wherein the associated data comprises the attribute features and the movement track of the same target object, and the start trigger time is the trigger moment corresponding to the starting point of the movement track;
and the apparatus is further configured to:
acquire attribute features of the movement track; and
analyze the associated data in combination with the attribute features of the movement track and/or the attribute features of the target object, to obtain a movement track distribution map for each attribute feature.

9. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when executing the computer program.

10. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the method of any one of claims 1 to 7.
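The following Python sketch illustrates, for readers less used to claim language, the two core steps described in claims 1, 2 and 4: chaining consecutively triggered distance sensors into a movement track using their installation positions, and associating attribute features with a track by comparing the attribute acquisition time with the track's start trigger time. It is only an illustrative assumption of how such a system might be coded; the data structures (Trigger, AttributeRecord, Track) and the max_gap and tolerance heuristics are hypothetical and are not part of the claimed method.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

# Hypothetical record of one distance-sensor trigger (an object passed under the sensor).
@dataclass
class Trigger:
    sensor_id: str
    timestamp: float  # seconds since epoch

# Hypothetical attribute record produced by the image collector at the entrance/exit.
@dataclass
class AttributeRecord:
    capture_time: float
    features: Dict[str, str]  # e.g. {"clothing": "red coat"}

@dataclass
class Track:
    start_time: float                                   # start trigger time of the track
    points: List[Tuple[float, float]] = field(default_factory=list)
    attributes: Optional[Dict[str, str]] = None

def build_track(triggers: List[Trigger],
                positions: Dict[str, Tuple[float, float]],
                max_gap: float = 2.0) -> Track:
    """Chain consecutive triggers of one object into a movement track.

    The earliest trigger plays the role of the 'starting distance sensor';
    each later trigger is kept only if it follows the previous one within
    max_gap seconds, a stand-in for 'consecutively detects the same object'.
    """
    if not triggers:
        raise ValueError("at least one trigger is required")
    triggers = sorted(triggers, key=lambda t: t.timestamp)
    track = Track(start_time=triggers[0].timestamp)
    last_time = triggers[0].timestamp
    for trig in triggers:
        if trig.timestamp - last_time > max_gap:
            break  # gap too large: no longer the same continuous pass
        track.points.append(positions[trig.sensor_id])  # installation position of the sensor
        last_time = trig.timestamp
    return track

def associate(tracks: List[Track],
              records: List[AttributeRecord],
              tolerance: float = 1.5) -> None:
    """Attach each attribute record to the track whose start trigger time is
    closest to the image capture time, within tolerance seconds."""
    for rec in records:
        best = min(tracks, key=lambda tr: abs(tr.start_time - rec.capture_time), default=None)
        if best is not None and abs(best.start_time - rec.capture_time) <= tolerance:
            best.attributes = rec.features
```

As a usage sketch, one would call build_track once per object with the triggers grouped for that object and a map from sensor_id to its ceiling-mounted (x, y) installation position, then call associate with the attribute records captured as objects entered the detection area.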
CN201711247108.0A 2017-12-01 2017-12-01 Target tracking method, device and electronic device Expired - Fee Related CN108051777B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711247108.0A CN108051777B (en) 2017-12-01 2017-12-01 Target tracking method, device and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711247108.0A CN108051777B (en) 2017-12-01 2017-12-01 Target tracking method, device and electronic device

Publications (2)

Publication Number Publication Date
CN108051777A CN108051777A (en) 2018-05-18
CN108051777B true CN108051777B (en) 2020-06-02

Family

ID=62121104

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711247108.0A Expired - Fee Related CN108051777B (en) 2017-12-01 2017-12-01 Target tracking method, device and electronic device

Country Status (1)

Country Link
CN (1) CN108051777B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110766101B (en) * 2018-07-26 2023-10-20 杭州海康威视数字技术股份有限公司 Method and device for determining movement track
CN110969644B (en) * 2018-09-28 2023-12-01 杭州海康威视数字技术股份有限公司 Personnel track tracking method, device and system
CN109448026A (en) * 2018-11-16 2019-03-08 南京甄视智能科技有限公司 Passenger flow statistical method and system based on head and shoulder detection
CN110012266A (en) * 2019-03-14 2019-07-12 中电海康集团有限公司 A kind of system and method for specification local police station supervision of law enforcement
CN111784730B (en) * 2020-07-01 2024-05-03 杭州海康威视数字技术股份有限公司 Object tracking method and device, electronic equipment and storage medium
CN111950421A (en) * 2020-08-05 2020-11-17 广东金杭科技有限公司 Face recognition system and trajectory tracking system
CN111879315B (en) * 2020-08-14 2023-01-13 支付宝(杭州)信息技术有限公司 Multi-target tracking system and method
CN112379384B (en) * 2020-11-10 2024-04-09 浙江华消科技有限公司 Object position determining method and device
CN113688194A (en) * 2021-07-13 2021-11-23 金钱猫科技股份有限公司 Personnel movement track monitoring method and system and storage equipment
CN114353794B (en) * 2021-11-25 2024-10-11 深圳市鸿逸达科技有限公司 Target positioning method based on fusion of wearing type positioning device and distance sensor

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107104971A (en) * 2017-05-03 2017-08-29 哈尔滨工业大学 A kind of joint-monitoring method based on laser radar and video, apparatus and system
CN107782316A (en) * 2017-11-01 2018-03-09 北京旷视科技有限公司 The track of destination object determines method, apparatus and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SG11201807291PA (en) * 2016-02-29 2018-09-27 Signpost Corp Information processing system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107104971A (en) * 2017-05-03 2017-08-29 哈尔滨工业大学 A kind of joint-monitoring method based on laser radar and video, apparatus and system
CN107782316A (en) * 2017-11-01 2018-03-09 北京旷视科技有限公司 The track of destination object determines method, apparatus and system

Also Published As

Publication number Publication date
CN108051777A (en) 2018-05-18

Similar Documents

Publication Publication Date Title
CN108051777B (en) Target tracking method, device and electronic device
JP7251569B2 (en) Store device, store management method, program
CN108010008B (en) Target tracking method and device and electronic equipment
JP7260022B2 (en) Store equipment, store system, store management method, program
US11631253B2 (en) People counting and tracking systems and methods
US11295139B2 (en) Human presence detection in edge devices
US20210056498A1 (en) Method and device for identifying product purchased by user and intelligent shelf system
US8254633B1 (en) Method and system for finding correspondence between face camera views and behavior camera views
JP6800820B2 (en) People flow analysis method, people flow analyzer, and people flow analysis system
JP4972491B2 (en) Customer movement judgment system
JP6517325B2 (en) System and method for obtaining demographic information
JP6731097B2 (en) Human behavior analysis method, human behavior analysis device, device and computer-readable storage medium
WO2014050518A1 (en) Information processing device, information processing method, and information processing program
CN107782316A (en) The track of destination object determines method, apparatus and system
US11461733B2 (en) Behavior analysis device, behavior analysis system, behavior analysis method, and program
CN110648186B (en) Data analysis method, device, equipment and computer readable storage medium
CA3014365C (en) System and method for gathering data related to quality of service in a customer service environment
CN110689389A (en) Computer vision-based shopping list automatic maintenance method and device, storage medium and terminal
WO2022030558A1 (en) Person detection or tracking device, system, method, and program
KR101355206B1 (en) A count system of coming and going using image analysis and method thereof
CN109034887B (en) Method, device and system for adjusting price of article
JP2021039784A (en) Purchased product estimation device
JP2016045743A (en) Information processing apparatus and program
US20240285100A1 (en) Methods and systems for detecting and tracking objects
JP6944020B2 (en) Information processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200602