CN114071003B - Shooting method and system based on optical communication device - Google Patents

Shooting method and system based on optical communication device

Info

Publication number
CN114071003B (granted from application CN202010781849.2A; application publication CN114071003A)
Authority
CN (China)
Prior art keywords
user, photographing, image, optical communication, position information
Legal status
Active
Application number
CN202010781849.2A
Other languages
Chinese (zh)
Other versions
CN114071003A
Inventors
李江亮, 方俊
Current Assignee
Beijing Whyhow Information Technology Co Ltd
Original Assignee
Beijing Whyhow Information Technology Co Ltd
Application filed by Beijing Whyhow Information Technology Co Ltd
Priority to CN202010781849.2A
Publication of CN114071003A
Application granted
Publication of CN114071003B


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C17/00: Compasses; Devices for ascertaining true or magnetic north for navigation or surveying purposes
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/04: Navigation by terrestrial means
    • G01C21/08: Navigation by terrestrial means involving use of the magnetic field of the earth
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C22/00: Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

Provided are a photographing method and system based on an optical communication device, including: acquiring position information of a user and a photographing request, wherein the position information of the user is obtained by capturing and analyzing an image containing the optical communication device; transmitting a photographing instruction to a movable image pickup apparatus based on the position information and the photographing request; the movable image pickup apparatus adjusting its position and posture based on the photographing instruction; and the movable image pickup apparatus performing photographing based on the photographing instruction.

Description

Shooting method and system based on optical communication device
Technical Field
The present invention relates to the field of information technologies, and in particular, to a photographing method and system based on an optical communication device.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art to the present disclosure.
While traveling, people often carry self-service photographing aids (e.g., selfie sticks or tripods) to photograph themselves without troubling bystanders. However, because of limitations such as distance and crowds, the results obtained with selfie sticks and similar aids are often unsatisfactory. With the development of artificial intelligence technology, mobile devices such as unmanned aerial vehicles and driverless cars are being widely applied in many aspects of daily life; aerial photography with unmanned aerial vehicles, in particular, has become popular among photography enthusiasts. However, such mobile devices are at present mainly operated by the users themselves, and they are relatively bulky and inconvenient for users to carry around.
Therefore, there is a need for a convenient photographing method and system that meets people's need for instant, on-demand shooting.
Disclosure of Invention
It is therefore an object of the present invention to overcome the above-mentioned drawbacks of the prior art, and to provide a photographing method based on an optical communication device, comprising: acquiring position information of a user and a shooting request, wherein the position information of the user is obtained by acquiring and analyzing an image containing the optical communication device; transmitting a photographing instruction to a movable image pickup apparatus based on the position information and the photographing request; the movable image pickup device adjusts the position and the posture thereof based on the shooting instruction; and the movable image pickup apparatus performs photographing based on the photographing instruction.
Optionally, the location information of the user is location information of the user in a world coordinate system or a scene coordinate system or relative to the optical communication device.
Optionally, the location information of the user is obtained by: acquiring an image containing the optical communication device; obtaining identification information of the optical communication device based on the image containing the optical communication device, and determining position information of the user relative to the optical communication device; determining pose information of the optical communication device based on the identification information; and determining location information of the user based on the pose information of the optical communication device and the location information of the user relative to the optical communication device.
Optionally, the shooting instruction includes at least one of the following: location information of the user; shooting mode; or a control instruction to the movable image pickup apparatus.
Optionally, the method further comprises: acquiring an image containing the user; determining a user to be photographed based on the image containing the user and the image acquired by the movable image pickup device; and/or adjusting the position and the posture of the movable image pickup device based on the image containing the user and the shooting instruction.
Another aspect of the present invention further provides a photographing system based on an optical communication device, for implementing the method of any one of the above, the photographing system comprising: one or more optical communication devices; a control device for acquiring position information and a shooting request of a user, and sending a shooting instruction to a movable image pickup device based on the position information and the shooting request; and a movable image pickup apparatus for receiving the photographing instruction from the control device and performing photographing.
Another aspect of the present invention also provides a photographing method based on an optical communication apparatus, including: acquiring position information of a user and a shooting request, wherein the position information of the user is obtained by acquiring and analyzing an image containing the optical communication device; acquiring an image containing the user by using a first camera device; determining the user based at least on the user's location information and the image containing the user; tracking, by the first image capturing apparatus, position information of the user; transmitting a photographing instruction to a movable second image capturing apparatus based on the tracked position information of the user and the photographing request; and the movable second image pickup apparatus adjusts its position and posture based on the photographing instruction and performs photographing.
Optionally, the determining the user based at least on the location information of the user and the image including the user includes: the user is determined based on the position information of the user, the image including the user, and the relative pose information between the optical communication device and the first image capturing apparatus.
Optionally, the method further comprises: acquiring information of a user by using a sensor; and determining the user based on the position information of the user, the image containing the user and the information of the user acquired by using the sensor.
Another aspect of the present invention further provides a photographing system based on an optical communication device, for implementing the method of any one of the above, the photographing system comprising: one or more optical communication devices; a first image pickup apparatus for collecting an image including a user; a movable second image pickup apparatus for photographing the user; and a control device for acquiring the position information of the user and the shooting request, determining the user at least based on the position information of the user and the image containing the user, and sending a shooting instruction to the movable second shooting equipment.
Optionally, the control device is capable of learning relative pose information between the optical communication device and the first image capturing apparatus.
Another aspect of the present invention further provides a storage medium having stored therein a computer program which, when executed by a processor, can be used to implement any of the methods described above.
Another aspect of the present invention further provides an electronic device comprising a processor and a memory, the memory having stored therein a computer program which, when executed by the processor, can be used to implement any of the methods described above.
With the solution of the present invention, the movable image pickup apparatus can be automatically controlled to aim precisely at the user according to the user's position information and photographing request, so that the user can complete the shooting freely without assistance from others. Since any user can scan the optical tag with his or her own device to complete a shot, convenient and personalized photographing services can be provided to multiple users, giving the solution good applicability and flexibility.
Drawings
Embodiments of the invention are further described below with reference to the accompanying drawings, in which:
Fig. 1A illustrates an exemplary optical label;
Fig. 1B illustrates an exemplary optical label network;
Fig. 2 illustrates a photographing system based on an optical communication device according to one embodiment;
Fig. 3 illustrates a photographing method based on an optical communication device according to an embodiment;
Fig. 4 illustrates a photographing method based on an optical communication device according to another embodiment;
Fig. 5 illustrates a photographing system based on an optical communication device according to another embodiment;
Fig. 6 illustrates a photographing method based on an optical communication device according to another embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail by the following examples with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Optical communication devices are also referred to as optical labels, and the two terms are used interchangeably herein. An optical label can transmit information through different light emitting modes and has the advantages of a long recognition distance, loose requirements on visible light conditions, and strong directivity; moreover, the information transmitted by an optical label can change over time, so it can provide large information capacity and flexible configuration capability.
An optical label typically includes a controller and at least one light source, the controller being capable of driving the light source in different driving modes to convey different information to the outside. Fig. 1A shows an exemplary optical label 100 that includes three light sources (a first light source 101, a second light source 102, and a third light source 103). The optical label 100 further comprises a controller (not shown in Fig. 1A) for selecting a respective driving mode for each light source according to the information to be conveyed. For example, in different driving modes, the controller may control the light emitting manner of a light source using different driving signals, so that when the optical label 100 is photographed with a device having an image acquisition component, the image of that light source takes on different appearances (e.g., different colors, patterns, or brightness). By analyzing the imaging of the light sources in the optical label 100, the driving mode of each light source at a given moment can be resolved, and thus the information conveyed by the optical label 100 at that moment can be decoded. Fig. 1A serves only as an example; an optical label may have a shape different from that shown in Fig. 1A and may have a different number and/or differently shaped light sources.
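The decoding described above can be illustrated with a small sketch. The snippet below is a minimal, hypothetical device-side routine (the concrete driving modes, the brightness threshold, and the bit alphabet are assumptions for illustration, not specified by this document) that classifies the imaged appearance of each light source across a sequence of frames and assembles the conveyed bit stream.

```python
# Hypothetical sketch: recover the information conveyed by an optical label from a
# sequence of camera frames. The assumed driving modes map brightness to bits 0/1;
# a real optical label may instead vary color, pattern, or stripe structure.
import numpy as np

def classify_source(frame: np.ndarray, bbox) -> int:
    """Map the imaged appearance of one light source to a symbol (assumed: 0/1 by mean brightness)."""
    x, y, w, h = bbox
    patch = frame[y:y + h, x:x + w]
    return 1 if patch.mean() > 128 else 0  # assumed brightness threshold

def decode_optical_label(frames, source_bboxes) -> list:
    """Collect one symbol per light source per frame and return the resulting bit stream."""
    bits = []
    for frame in frames:
        for bbox in source_bboxes:
            bits.append(classify_source(frame, bbox))
    return bits
```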
To provide corresponding services to users based on optical labels, each optical label may be assigned identification information (ID). Typically, the light source is driven by the controller in the optical label to convey this identification information outwards; a device with an image acquisition component can capture one or more images containing the optical label, and by analyzing the imaging of the optical label (or of the individual light sources within it) in those images, the identification information conveyed by the optical label can be recognized. Other information associated with the identification information, such as the position information of the optical label corresponding to that identification information, can then be obtained.
Information associated with each optical label may be stored in a server. In reality, a large number of optical labels may also be constructed as an optical label network. Fig. 1B illustrates an exemplary optical label network that includes a plurality of optical labels and at least one server. Identification Information (ID) or other information of each optical label, such as service information related to the optical label, description information or attribute information related to the optical label, such as position information, model information, physical size information, physical shape information, posture or orientation information, etc., of the optical label may be stored on the server. The optical tag may also have uniform or default physical size information, physical shape information, and the like. The device may use the identification information of the identified optical tag to query from the server for additional information related to the optical tag. The location information of the optical tag may refer to the actual location of the optical tag in the physical world, which may be indicated by geographical coordinate information.
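For illustration, querying the server with an identified optical label ID might look like the hedged sketch below; the endpoint path and the response fields are assumptions, since the document only states that position, model, physical size, physical shape, and pose information may be stored on the server.

```python
# Hypothetical sketch of a device querying the optical-label server by tag ID.
# The URL scheme and response fields are assumptions for illustration only.
import requests

def query_tag_info(server_url: str, tag_id: str) -> dict:
    resp = requests.get(f"{server_url}/optical-tags/{tag_id}", timeout=5)
    resp.raise_for_status()
    # e.g. {"position": [...], "pose": [...], "physical_size": ..., "model": ...}
    return resp.json()
```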
The movable image pickup apparatus mentioned in this application refers to an image pickup apparatus whose position or attitude can be changed; it may be, for example, an unmanned aerial vehicle carrying a camera, an image pickup apparatus that can move along a predetermined track, or an attitude-adjustable image pickup apparatus mounted at a fixed position. There may be one or more movable image pickup apparatuses in the photographing system.
The device used by the user mentioned in the present application may be a device operated by a person, such as a mobile phone, a tablet computer, smart glasses, a wearable device, etc. The apparatus comprises an image acquisition device for acquiring an image. The device may also include a data processing system for storage, computation, output, display, etc. of data, including for example volatile or non-volatile memory, one or more processors. The apparatus may also include communication means for wired or wireless communication with an external system or other device (e.g., a server) for transmission and reception of data. The device may also be equipped with one or more sensors, such as odometers, acceleration sensors, magnetic force sensors, direction sensors, gravity sensors, gyroscopes, compasses, etc., for measuring or tracking changes in the position of the device in space.
The control device mentioned in this application may be a server. It may comprise hardware with computing capability, for example a computing device having a central processing unit (including an arithmetic unit and a controller), a memory, input devices and output devices, or a cluster formed by a plurality of such computing devices; it may also comprise a software program running on a computing device, for example a cloud server, a VPN server, or the like. In one embodiment, the control device may also be integrated into the device used by the user.
Fig. 2 shows an optical communication device-based photographing system 200 according to one embodiment, which includes an optical tag 201, a device 202, a server 203, and a movable image pickup apparatus 204. The optical tag 201 is typically installed in a specific scene (e.g., near a scenic spot) with a fixed installation position and attitude. The device 202 held by the user is located in or around that scene. One or more cameras are mounted on the device 202 for capturing images. The device 202 may also have mounted on it one or more sensors, such as odometers, acceleration sensors, magnetic sensors, direction sensors, gravity sensors, gyroscopes, compasses, etc., for measuring or tracking changes in the position and attitude of the device in space. The server 203 stores information related to the optical tag and can communicate with the device 202 and the movable image pickup apparatus 204. The movable image pickup apparatus 204 can photograph the user in accordance with a photographing instruction transmitted from the server 203. In one embodiment, the server 203 may be integrated into the device 202. In one embodiment, the server 203 may be integrated into the movable image pickup apparatus 204. In one embodiment, the movable image pickup apparatus 204 may also communicate directly with the device 202.
In one embodiment, the photographing system may not include the user's device 202 itself, but may communicate with the device 202 to perform the corresponding functions.
Fig. 3 shows a photographing method based on an optical communication device according to an embodiment, which can be applied to, for example, the photographing system shown in fig. 2, including the steps of:
s310, the server acquires the location information of the user and the photographing request.
The location information of the device (or user) acquired by the server may be position information of the device relative to the optical tag, or other position information, for example position information in a world coordinate system or a scene coordinate system. The user may scan the optical label with the device to acquire one or more images containing the optical label. By analyzing the imaging of the optical label (or of the individual light sources in the optical label) in these images, the position information of the device relative to the optical label and the identification information conveyed by the optical label can be determined. The position information and attitude information of the optical label (hereinafter collectively referred to as "pose information"), for example its pose in a world coordinate system or a scene coordinate system, can then be queried using the identification information of the optical label. Based on the position information of the device relative to the optical tag and the pose information of the optical tag in the world or scene coordinate system, the position information of the device in the world or scene coordinate system can be determined.
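The coordinate chain described above can be written compactly. The sketch below is illustrative only; representing the optical tag's pose as a rotation matrix and a translation vector is an assumption about the representation, not something mandated by this document.

```python
# Illustrative sketch: given the device's position relative to the optical tag and the
# tag's pose in the world (or scene) coordinate system, compute the device's position
# in that coordinate system as p_world = R * p_tag + t.
import numpy as np

def device_world_position(p_device_in_tag: np.ndarray,
                          R_tag_to_world: np.ndarray,
                          t_tag_in_world: np.ndarray) -> np.ndarray:
    """(R, t) is the optical tag's pose expressed in the world/scene frame."""
    return R_tag_to_world @ p_device_in_tag + t_tag_in_world
```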
On the other hand, the user can access the corresponding photographing service through the device and submit a photographing request. The user may specify a shooting mode in the photographing request; the shooting mode may be predetermined, customized, or selected by the user, for example orbiting the user while shooting, tracking the user while shooting, and so on.
In one embodiment, the image containing the optical tag may be analyzed by the device to determine identification information of the optical tag and location information of the device relative to the optical tag, and pose information of the optical tag in a spatial coordinate system may be determined by the identification information. The device can determine the position information of the device in the space coordinate system based on the pose information of the optical label in the space coordinate system and the position information of the device relative to the optical label, and send the position information to the server.
In another embodiment, the device may also send an image containing the optical tag to the server, the identification information of the optical tag is analyzed and identified by the server to determine pose information of the optical tag in a spatial coordinate system, and the server determines position information of the device relative to the optical tag by analyzing the image. The server may obtain position information of the device in the spatial coordinate system based on pose information of the optical tag in the spatial coordinate system and position information of the device relative to the optical tag.
In other embodiments, the location information of the device obtained by the server may also be location information of the device relative to the optical tag.
In one embodiment, the pose information of the device may also be determined by analyzing the image containing the optical tag. The pose information may be pose information of the device relative to the optical tag, or may be other pose information, for example, pose information in a world coordinate system or a scene coordinate system.
In one embodiment, after the device scans the optical tag, the device's location information may be continuously acquired by a sensor in the device and sent to a server.
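How the device might keep its position up to date after the initial scan is sketched below. Simple dead reckoning by integrating accelerometer samples is only one assumed possibility; practical implementations typically fuse several of the sensors listed above.

```python
# Illustrative sketch: update the device position between optical-tag scans by
# integrating acceleration samples (assumed to be expressed in the world frame).
import numpy as np

def dead_reckon(p0: np.ndarray, v0: np.ndarray, accelerations, dt: float) -> np.ndarray:
    """Return the position after applying the acceleration samples at time step dt."""
    p, v = p0.astype(float), v0.astype(float)
    for a in accelerations:
        v = v + np.asarray(a, dtype=float) * dt
        p = p + v * dt
    return p
```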
S320, the server transmits a photographing instruction to the movable image capturing apparatus based on the position information of the user and the photographing request.
The photographing instruction may include, for example, the current position information of the device, its posture information, the shooting manner specified by the user (e.g., a shooting angle, a shooting mode, etc.), a control instruction issued by the server to the movable image pickup apparatus, and the like.
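As a concrete, non-authoritative example, such an instruction could be serialized as shown below; the field names and the JSON transport are assumptions, since the document only enumerates the kinds of content the instruction may carry.

```python
# Minimal sketch of a photographing-instruction payload (field names are assumptions).
from dataclasses import dataclass, asdict, field
from typing import List, Optional, Tuple
import json

@dataclass
class ShootingInstruction:
    user_position: Tuple[float, float, float]                  # current position of the device/user
    user_pose: Optional[Tuple[float, float, float]] = None     # optional posture information
    shooting_mode: str = "default"                              # e.g. "orbit", "track", "half-body"
    control_commands: List[str] = field(default_factory=list)   # low-level commands, if any

    def to_message(self) -> str:
        return json.dumps(asdict(self))
```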
In one embodiment, when the movable image capturing apparatus has a high autonomy, the server may transmit a photographing instruction including current location information of the apparatus, a photographing mode specified by a user, and the like to the movable image capturing apparatus (e.g., a drone carrying a camera). The movable image capturing apparatus can determine how it should move to aim at the user and how to take a photograph according to the instruction.
In one embodiment, when the movable image pickup apparatus has low autonomy, the server may directly send a control instruction to the movable image pickup apparatus to control movement and photographing of the image pickup apparatus, and the like.
S330, the movable image pickup apparatus adjusts its position and posture based on the photographing instruction.
The movable image pickup apparatus may adjust its position and posture in accordance with the photographing instruction transmitted from the server, so that the device that submitted the photographing request, or its user, is located within the field of view of the movable image pickup apparatus's camera (though not necessarily at the center of that field of view).
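One simple way to achieve this, sketched below under assumed conventions (right-handed coordinates, yaw about the vertical axis), is to point the camera's optical axis at the user's reported position.

```python
# Illustrative sketch: compute yaw and pitch that aim the camera at the user's position.
import math

def aim_at_user(camera_pos, user_pos):
    dx, dy, dz = (u - c for u, c in zip(user_pos, camera_pos))
    yaw = math.degrees(math.atan2(dy, dx))                    # rotation about the vertical axis
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation towards the user
    return yaw, pitch
```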
In one embodiment, when the shooting mode specified by the user includes a specific shooting angle (for example, a profile shot), a specific framing (for example, a half-body shot), or a specific shooting mode (for example, a black-and-white, color, or beautification mode), the movable image pickup apparatus may also adjust its position, posture, and/or shooting settings according to that shooting mode so as to satisfy the user's photographing request.
S340, the movable image pickup apparatus performs photographing based on the photographing instruction.
Based on the above embodiments, it can be seen that by using the optical tag, the present invention can automatically control the movable image capturing apparatus to precisely aim at the user according to the position information and the capturing request of the user, so that the user can freely complete capturing without assistance of other people. Since any user can scan the optical tag using the device to complete the photographing, the method can provide a convenient and personalized photographing service for a plurality of users.
In some cases, when there are many interfering factors in the scene (for example, many other visitors near the user), the user may also capture an image containing himself or herself and upload it to the server or to the movable image pickup apparatus, so that the server or the movable image pickup apparatus can compare this image with the persons in the field of view of the movable image pickup apparatus to further confirm the user to be photographed.
Fig. 4 shows another embodiment of a photographing method based on an optical communication apparatus, which includes the steps of:
s410, the server acquires the location information and the photographing request of the user, and transmits a photographing instruction to the mobile image capturing apparatus based on the location information and the photographing request of the user.
This step is similar to steps S310-S320 described above and will not be described again here.
S420, the server acquires an image including the user, and transmits it to the movable image pickup apparatus.
In one embodiment, a server may receive an image containing a user from a user device.
S430, the mobile image capturing apparatus determines the user to be captured based on the position information of the user and the image containing the user.
In one embodiment, after the server receives the image containing the user, it may transmit the image to the movable image pickup apparatus. The movable image pickup apparatus can first adjust its position and posture according to the photographing instruction so that the user falls within its camera field of view, and then compare the image containing the user with the persons in that field of view (for example, using image analysis or face recognition technology) to further confirm the user to be photographed. In one embodiment, the user or the user's device may also transmit the image containing the user directly to the movable image pickup apparatus, which then confirms the user to be photographed.
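The comparison itself could, for instance, rely on embedding-based matching as in the hedged sketch below; the stand-in embedding function and the assumption that detections are equally sized crops are introduced only for illustration, since the document merely mentions image analysis or face recognition technology in general.

```python
# Hypothetical sketch: match the user-supplied image against person detections in the
# movable camera's field of view. A real system would use a face-recognition model
# instead of the flatten-and-normalise stand-in below.
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in embedding: flatten and L2-normalise the image crop."""
    v = image.astype(float).ravel()
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def find_user(uploaded_image: np.ndarray, detections) -> int:
    """Return the index of the detection most similar to the uploaded image (cosine similarity)."""
    ref = embed_face(uploaded_image)
    scores = [float(ref @ embed_face(d)) for d in detections]
    return int(np.argmax(scores))
```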
In one embodiment, the user to be photographed may also be determined by the server based on the image containing the user and the image captured by the movable image capturing apparatus. After the server receives the image containing the user, the image may not be transmitted to the movable image capturing device, and after the movable image capturing device reaches the capturing position, the user to be captured may be determined according to the image returned by the movable image capturing device and the image containing the user.
S440, the mobile image capturing apparatus adjusts its position and posture and captures an image based on the user to be captured and the capturing instruction.
In one embodiment, in a case where the user to be photographed is determined by the server, the server may further transmit a photographing instruction to the movable image pickup apparatus based on the user to be photographed, thereby controlling the movable image pickup apparatus to perform photographing.
In the above embodiment, by capturing an image of the user with the user's own device, the movable image pickup apparatus can accurately identify the user to be photographed even in a complex environment, thereby ensuring the accuracy of the shooting.
In some scenes, two image capturing apparatuses may be used to mutually cooperate to complete capturing of a user, where a first image capturing apparatus is used to capture an image or video containing a user to be captured to determine the user to be captured, and a second image capturing apparatus is movable to capture the user according to a capturing instruction.
Fig. 5 shows an optical communication device-based photographing system 500 according to another embodiment. In addition to the optical tag 501, the device 502, the server 503, and the movable image pickup apparatus 504, the system includes an image pickup apparatus 505 arranged around the optical tag 501 for capturing images containing users in order to determine the matching user. To distinguish the roles of the movable image pickup apparatus 504 and the image pickup apparatus 505 in the photographing system, the movable image pickup apparatus 504 is also referred to as the movable second image pickup apparatus 504, and the image pickup apparatus 505 is also referred to as the first image pickup apparatus 505. The first image pickup apparatus 505 may be kept in a fixed position relative to the optical tag 501, or may be freely movable (e.g., an unmanned aerial vehicle). There is a relative pose between the optical tag 501 and the first image pickup apparatus 505, and this relative pose can be known by the server 503; for example, the server 503 may know the respective pose information of the optical tag 501 and the first image pickup apparatus 505 (e.g., their poses in a world coordinate system or a scene coordinate system), or the relative pose information between the two. Using the respective pose information of the optical tag 501 and the first image pickup apparatus 505, or the relative pose information between them, operations such as conversion and comparison between positions in the optical tag coordinate system and positions in the first image pickup apparatus coordinate system can be carried out. The first image pickup apparatus 505 may communicate with the server 503 and may adjust its position and posture according to data transmitted by the server 503.
In one embodiment, the server 503 may be integrated into the device 502. In one embodiment, the server 503 may be integrated into the movable second image pickup apparatus 504. In one embodiment, the first image pickup apparatus 505 may communicate with the movable second image pickup apparatus 504 to transmit data. In one embodiment, the movable second image pickup apparatus 504 may also communicate with the device 502. In one embodiment, the photographing system may not include the device 502 itself but may communicate with the device 502 to perform the corresponding functions.
In one embodiment, the photographing system may further include one or more sensors for determining a user to be photographed in combination with the first photographing apparatus. The sensor may be, for example, a sensor in a user device, or a sensor disposed in an environment, such as a smart light pole (capable of detecting information of a nearby identification card), a gateway gate, an infrared probe, or the like.
Fig. 6 shows another embodiment of a photographing method based on an optical communication device, which can be applied to, for example, the photographing system shown in fig. 5, including the steps of:
s610, the server acquires the location information of the user and the photographing request.
This step is similar to step S310 described above and will not be described again here.
S620, an image including the user is acquired using the first image capturing apparatus and transmitted to the server.
There is a relative pose between the optical tag and the first image pickup apparatus, and this relative pose can be known by the server. In some embodiments, when the optical tag and the first image pickup apparatus are both fixed, the respective pose information of the optical tag and the first image pickup apparatus, or the relative pose information between the two, may be preset and transmitted to the server. In this case, the first image pickup apparatus simply collects images containing one or more users in the vicinity of the optical tag.
In other embodiments, the first image pickup apparatus is movable and the server can obtain its current pose information in a timely manner; for example, the current pose of the first image pickup apparatus may be set by the server, which controls its movement based on that pose information, or the movement of the first image pickup apparatus may be controlled by the apparatus itself or by another device, which then transmits the current pose information to the server. In this case, the server may transmit the position information of the device (i.e., of the user) acquired from the device to the first image pickup apparatus, and after receiving the user's position information the first image pickup apparatus adjusts its own position and posture, acquires an image containing the user, and transmits it to the server.
S630, the server determines the user to be shot according to the acquired position information of the user and the image acquired by the first camera equipment, and tracks the position of the user to be shot through the first camera equipment.
The server may obtain the position information of the user relative to the first image pickup apparatus by analyzing the image containing the user acquired by the first image pickup apparatus. Furthermore, using the relative pose between the optical tag and the first image pickup apparatus, the server may convert the position information of the user acquired from the device and the position information of the user obtained via the first image pickup apparatus into the same coordinate system for comparison. For example, the position information of the device (or user) relative to the optical tag (i.e., the position of the user in the optical tag coordinate system) may be converted into position information of the user relative to the first image pickup apparatus (i.e., the position of the user in the first image pickup apparatus coordinate system), or vice versa; or the position information of the device (or user) relative to the optical tag and the position information of the user relative to the first image pickup apparatus may both be converted into positions in a world coordinate system or a scene coordinate system.
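The conversions mentioned here amount to applying the known relative pose. A minimal sketch, assuming a rotation-matrix/translation representation of the relative pose between the optical tag and the first image pickup apparatus:

```python
# Illustrative sketch: convert a position between the optical-tag coordinate system and
# the first image pickup apparatus coordinate system using their relative pose (R, t),
# where R maps tag-frame vectors into the camera frame and t is the tag origin in the camera frame.
import numpy as np

def tag_to_camera(p_in_tag: np.ndarray, R_tag_to_cam: np.ndarray, t_tag_in_cam: np.ndarray) -> np.ndarray:
    return R_tag_to_cam @ p_in_tag + t_tag_in_cam

def camera_to_tag(p_in_cam: np.ndarray, R_tag_to_cam: np.ndarray, t_tag_in_cam: np.ndarray) -> np.ndarray:
    return R_tag_to_cam.T @ (p_in_cam - t_tag_in_cam)
```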
The server can determine the user to be photographed by comparing, in the same coordinate system, the position information of the user acquired from the device with the position information of the user obtained via the first image pickup apparatus. In one embodiment, the server may compare the position information of one or more users in the image captured by the first image pickup apparatus with the position information of the device (i.e., the user) acquired from the user's device, so as to determine the user to be photographed within the field of view of the first image pickup apparatus. In another embodiment, the server may directly extract, from the image acquired by the first image pickup apparatus, the image area containing a particular user according to the position information acquired from the user's device, and take that user as the user to be photographed.
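In the simplest case this comparison is a nearest-neighbour match in the common coordinate system, as in the sketch below; the distance threshold is an assumption introduced for illustration.

```python
# Minimal sketch: among the users detected by the first image pickup apparatus, pick the
# one whose position is closest to the position reported by the user's device.
from typing import List, Optional
import numpy as np

def match_user(device_pos: np.ndarray,
               detected_positions: List[np.ndarray],
               max_distance: float = 1.0) -> Optional[int]:
    """Return the index of the matching detection, or None if nothing is close enough."""
    if not detected_positions:
        return None
    dists = [float(np.linalg.norm(p - device_pos)) for p in detected_positions]
    best = int(np.argmin(dists))
    return best if dists[best] <= max_distance else None
```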
In one embodiment, the server may further obtain pose information of one or more users in the image relative to the first image capturing apparatus by analyzing the image including the users acquired by the first image capturing apparatus, and determine the user to be captured by comparing the pose information of the users acquired from the apparatus located under the same coordinate system with the pose information of the one or more users in the image acquired from the first image capturing apparatus.
After determining the user to be photographed, the first image capturing apparatus may track the user to continuously acquire the position information of the user.
S640, the server transmits a photographing instruction to the movable second image capturing apparatus based on the tracked position information of the user and the photographing request.
S650, the movable second image capturing apparatus adjusts its position and posture based on the photographing instruction and performs photographing.
In another embodiment, the first image pickup apparatus may also send the image containing the user that it has acquired to the movable second image pickup apparatus, either directly or via the server. The movable second image pickup apparatus may then determine the user to be photographed by analyzing that image to obtain the position information of the user relative to the first image pickup apparatus and comparing it with the position information of the user acquired from the device.
In one embodiment, a sensor may also be used in conjunction with the first image capture device to determine the user to be photographed. For example, sensors in the user's handset may be used to track the user's location movement information, or sensors deployed in the environment (e.g., smart light poles) may be used to obtain the user's identity information, etc. In one embodiment, the sensor and the first image capturing device may also be used for mutual dynamic calibration.
Through the cooperation of the two image pickup apparatuses, the target user to be photographed can be determined more accurately, thereby achieving a better shooting result.
In one embodiment of the invention, the invention may be implemented in the form of a computer program. The computer program may be stored in various storage media (e.g. hard disk, optical disk, flash memory, etc.), which, when executed by a processor, can be used to carry out the method of the invention.
In another embodiment of the invention, the invention may be implemented in the form of an electronic device. The electronic device comprises a processor and a memory, in which a computer program is stored which, when being executed by the processor, can be used to carry out the method of the invention.
Reference herein to "various embodiments," "some embodiments," "one embodiment," "an embodiment," or the like means that a particular feature, structure, or property described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments," "in some embodiments," "in one embodiment," or "in an embodiment" in various places throughout this document do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, a particular feature, structure, or characteristic described in connection with or illustrated in one embodiment may be combined, in whole or in part, with features, structures, or characteristics of one or more other embodiments without limitation, provided that the combination remains logical and operable. Expressions such as "according to A," "based on A," "by A," or "using A" are meant to be non-exclusive; that is, "according to A" may cover "according to A only" as well as "according to A and B," unless "according to A only" is specifically stated. In this application, some exemplary operation steps are described in a certain order for clarity of explanation, but it will be understood by those skilled in the art that not all of these steps are essential, and some of them may be omitted or replaced by other steps. The steps do not have to be performed in the order shown; some of them may instead be performed in a different order, or concurrently, as required, provided that the new order of execution remains logical and operable. For example, in some embodiments, the distance or depth of a virtual object relative to the electronic device may be set first, and the orientation of the virtual object relative to the electronic device may then be determined.
Having thus described several aspects of at least one embodiment of this invention, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the invention. Although the invention has been described in terms of preferred embodiments, it is not limited to the embodiments described herein, but encompasses various changes and modifications that may be made without departing from its scope.

Claims (13)

1. A photographing method based on an optical communication apparatus, comprising:
the method comprises the steps that a server obtains position information and a shooting request of a user from equipment carried by the user, wherein the position information of the user is obtained by collecting and analyzing an image containing the optical communication device through the equipment carried by the user;
transmitting, by the server, a photographing instruction to the movable image capturing apparatus based on the position information and the photographing request, the photographing instruction including at least position information of the user;
the movable image pickup apparatus adjusts its position and posture based on the photographing instruction to locate a user within a field of view of the movable image pickup apparatus; and
the movable image pickup apparatus photographs a user based on the photographing instruction.
2. The photographing method of claim 1, wherein the location information of the user is location information of the user in a world coordinate system or a scene coordinate system or with respect to the optical communication device.
3. The photographing method of claim 1, wherein the user's location information is obtained by:
acquiring an image containing the optical communication device;
obtaining identification information of the optical communication device based on the image containing the optical communication device, and determining position information of the user relative to the optical communication device;
determining pose information of the optical communication device based on the identification information; and
and determining the position information of the user based on the pose information of the optical communication device and the position information of the user relative to the optical communication device.
4. A photographing method as claimed in any one of claims 1 to 3, wherein the photographing instruction further comprises at least one of:
shooting mode; or alternatively
Control instructions for the movable image pickup apparatus.
5. A photographing method as claimed in any one of claims 1 to 3, further comprising:
acquiring an image containing the user;
determining a user to be photographed based on the image containing the user and the image acquired by the movable image pickup device; and
and adjusting the position and the posture of the movable image pickup device based on the user to be photographed and the photographing instruction.
6. A photographing system based on an optical communication device for implementing the method of any one of claims 1-5, the photographing system comprising:
one or more optical communication devices;
a control device for acquiring position information and a shooting request of a user, and sending a shooting instruction to a movable image pickup device based on the position information and the shooting request; and
and the movable image pickup equipment is used for receiving the shooting instruction from the control device and shooting.
7. A photographing method based on an optical communication apparatus, comprising:
the method comprises the steps that a server obtains position information and a shooting request of a user from equipment carried by the user, wherein the position information of the user is obtained by collecting and analyzing an image containing the optical communication device through the equipment carried by the user;
acquiring an image containing the user by using a first camera device and sending the image to a server;
determining, by a server, the user based at least on the location information of the user and the image containing the user;
tracking, by the first image capturing apparatus, position information of the user;
transmitting, by a server, a photographing instruction to a movable second image capturing apparatus based on the tracked position information of the user and the photographing request, the photographing instruction including at least the position information of the user; and
the movable second image pickup apparatus adjusts its position and posture based on the photographing instruction so that the user is located within the field of view of the second image pickup apparatus, and photographs the user.
8. The photographing method of claim 7, wherein the determining the user based at least on the location information of the user and the image including the user comprises:
the user is determined based on the position information of the user, the image including the user, and the relative pose information between the optical communication device and the first image capturing apparatus.
9. The photographing method of claim 8, further comprising:
acquiring information of a user by using a sensor;
and determining the user based on the position information of the user, the image containing the user and the information of the user acquired by using the sensor.
10. A photographing system based on an optical communication device for implementing the method of any one of claims 7-9, the photographing system comprising:
one or more optical communication devices;
a first image pickup apparatus for collecting an image including a user;
a movable second image pickup apparatus for photographing the user; and
and the control device is used for acquiring the position information and the shooting request of the user, determining the user at least based on the position information of the user and the image containing the user, and sending a shooting instruction to the movable second shooting equipment.
11. The photographing system of claim 10, wherein the control device is capable of learning relative pose information between the optical communication device and the first image capturing apparatus.
12. A storage medium having stored therein a computer program which, when executed by a processor, is operable to carry out the method of any one of claims 1-5 and/or 7-9.
13. An electronic device comprising a processor and a memory, the memory having stored therein a computer program which, when executed by the processor, is operable to carry out the method of any one of claims 1-5 and/or 7-9.
CN202010781849.2A 2020-08-06 2020-08-06 Shooting method and system based on optical communication device Active CN114071003B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010781849.2A CN114071003B (en) 2020-08-06 2020-08-06 Shooting method and system based on optical communication device

Publications (2)

Publication Number Publication Date
CN114071003A CN114071003A (en) 2022-02-18
CN114071003B true CN114071003B (en) 2024-03-12

Family

ID=80232312

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010781849.2A Active CN114071003B (en) 2020-08-06 2020-08-06 Shooting method and system based on optical communication device

Country Status (1)

Country Link
CN (1) CN114071003B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103327250A (en) * 2013-06-24 2013-09-25 深圳锐取信息技术股份有限公司 Method for controlling camera lens based on pattern recognition
JP2014075635A (en) * 2012-10-02 2014-04-24 Casio Comput Co Ltd Imaging system, imaging method, light-emitting device, imaging apparatus, and program
CN103945123A (en) * 2014-04-03 2014-07-23 北京大恒图像视觉有限公司 Method for adjusting level angle of industrial camera
CN106713659A (en) * 2017-01-20 2017-05-24 维沃移动通信有限公司 panoramic shooting method and mobile terminal
JP2017201753A (en) * 2016-05-06 2017-11-09 キヤノン株式会社 Network system and method for controlling the same
WO2018027533A1 (en) * 2016-08-09 2018-02-15 深圳市瑞立视多媒体科技有限公司 Camera configuration method and device
WO2018191091A1 (en) * 2017-04-14 2018-10-18 Microsoft Technology Licensing, Llc Identifying a position of a marker in an environment
CN108713179A (en) * 2017-09-18 2018-10-26 深圳市大疆创新科技有限公司 Mobile article body controlling means, equipment and system
AU2019100420A4 (en) * 2018-05-07 2019-05-30 Apple Inc. Creative camera
CN109862273A (en) * 2019-02-28 2019-06-07 四川爱联科技有限公司 Shooting object space automatic induction system and method based on camera array
CN111026107A (en) * 2019-11-08 2020-04-17 北京外号信息技术有限公司 Method and system for determining the position of a movable object
CN111083364A (en) * 2019-12-18 2020-04-28 华为技术有限公司 A control method, electronic device, computer-readable storage medium, and chip
CN111256701A (en) * 2020-04-26 2020-06-09 北京外号信息技术有限公司 Equipment positioning method and system
CN111479055A (en) * 2020-04-10 2020-07-31 Oppo广东移动通信有限公司 Shooting method, device, electronic device and storage medium

Also Published As

Publication number Publication date
CN114071003A (en) 2022-02-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant