
WO2018133063A1 - Measuring instrument control method based on gesture recognition, and measuring instrument - Google Patents

Measuring instrument control method based on gesture recognition, and measuring instrument

Info

Publication number
WO2018133063A1
Authority
WO
WIPO (PCT)
Prior art keywords
operation control
motion
control instruction
application
instruction
Prior art date
Application number
PCT/CN2017/072049
Other languages
English (en)
Chinese (zh)
Inventor
袁剑敏
Original Assignee
深圳华盛昌机械实业有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳华盛昌机械实业有限公司 filed Critical 深圳华盛昌机械实业有限公司
Priority to CN201780000026.8A priority Critical patent/CN107077219A/zh
Priority to PCT/CN2017/072049 priority patent/WO2018133063A1/fr
Publication of WO2018133063A1 publication Critical patent/WO2018133063A1/fr

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • The present invention belongs to the field of information recognition and relates, in particular, to a measuring instrument control method based on motion gesture recognition and to a measuring instrument.
  • The present invention provides a measuring instrument control method and a measuring instrument based on motion gesture recognition, in order to solve the problem in the prior art that controlling a measuring instrument is inconvenient for the user.
  • A measuring instrument is provided, including:
  • a command library selecting unit, configured to determine the motion trajectory category and select an operation control instruction library corresponding to the motion trajectory category, where the operation control instruction library includes a first operation control instruction library and a second operation control instruction library;
  • The present invention provides a measuring instrument control method and a measuring instrument based on motion gesture recognition, which offer a completely new way of controlling the measuring instrument.
  • The motion data of the measuring instrument is first collected; the motion trajectory is generated when the user makes a motion gesture while holding the measuring instrument, and the corresponding operation control instruction library is selected according to the motion trajectory category.
  • The final operation control instruction is then determined in conjunction with the motion trajectory, the foreground application, and the operation control instruction library, and the task operation corresponding to that operation control instruction is executed.
  • FIG. 3 is a block diagram showing a system configuration of a measuring instrument in Embodiment 3 of the present invention.
  • The present invention provides a measuring instrument control method and a measuring instrument based on motion gesture recognition. The method comprises: acquiring motion data of a measuring instrument and calculating a motion trajectory of the measuring instrument based on the motion data, the motion trajectory being generated by the user making a motion gesture while holding the measuring instrument; determining the motion trajectory category and selecting an operation control instruction library corresponding to the motion trajectory category, the operation control instruction library including a first operation control instruction library and a second operation control instruction library; when the motion trajectory category is the first type of motion trajectory, identifying the foreground application of the measuring instrument, filtering out from the first operation control instruction library the operation control instruction matching the motion trajectory and the foreground application, and performing the task operation corresponding to that operation control instruction in the foreground application; and when the motion trajectory category is the second type of motion trajectory, filtering out from the second operation control instruction library the operation control instruction matching the motion trajectory, and performing the task operation corresponding to that operation control instruction.
  • the meter includes, but is not limited to
  • Step S101: acquiring motion data of a measuring instrument, and calculating a motion trajectory of the measuring instrument based on the motion data, where the motion trajectory is generated by a user making a motion gesture while holding the meter.
  • the motion data refers to the motion data of the measuring instrument when the user operates the measuring instrument through the motion gesture
  • the motion trajectory is the trajectory corresponding to the motion gesture.
  • Common simple motion trajectories include, for example, an upward swing, a downward swing, a left swing, and a right swing.
  • Common complex motion trajectories include, for example, an upper-left swing, circling in the clockwise direction, and circling in the counter-clockwise direction. Every motion trajectory corresponds one-to-one to the trajectory of the user's motion gesture.
  • The research and development personnel add a motion sensor module to the measuring instrument, so that the measuring instrument can calculate the motion trajectory by reading the motion data collected by the motion sensor module, thereby allowing the user to control the measuring instrument through motion gestures.
  • Motion sensors are sensors that have the ability to measure motion data, including but not limited to gravity sensors, accelerometers, gyroscopes, geomagnetic sensors, and magnetic sensors.
  • After the motion data of the measuring instrument is acquired, the motion data is analyzed and identified to determine the corresponding specific motion trajectory.
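  • As an illustration only, the following minimal Python sketch shows one way buffered accelerometer samples could be reduced to a simple swing trajectory. The sample format, the threshold, and the function name are assumptions made for the sketch, not the recognition algorithm of the patent.

    ```python
    import math
    from typing import List, Tuple

    # Hypothetical sensor sample: (ax, ay, az) acceleration in m/s^2 with gravity removed.
    Sample = Tuple[float, float, float]

    SWING_THRESHOLD = 4.0  # assumed minimum peak acceleration for a deliberate swing

    def classify_simple_trajectory(samples: List[Sample]) -> str:
        """Map a buffered gesture window to one of the simple swing trajectories
        using the dominant axis of the strongest sample (threshold classifier)."""
        if not samples:
            return "none"
        peak = max(samples, key=lambda s: math.sqrt(s[0] ** 2 + s[1] ** 2 + s[2] ** 2))
        ax, ay, az = peak
        if max(abs(ax), abs(ay), abs(az)) < SWING_THRESHOLD:
            return "none"                      # movement too small to be a gesture
        if abs(ax) >= abs(ay):                 # horizontal axis dominates
            return "right_swing" if ax > 0 else "left_swing"
        return "up_swing" if ay > 0 else "down_swing"
    ```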
  • One example is the falling of the measuring instrument when it is dropped (in the present invention, the dropping of the measuring instrument is also regarded as a kind of motion gesture of the user, and the corresponding motion trajectory is named falling).
  • These motion trajectories cannot control the task operation of the measuring instrument in normal operation, but are closely related to the normal measurement and use of the measuring instrument.
  • Each second-type motion trajectory corresponds to a fixed abnormal measurement use scenario; the falling trajectory mentioned above, for example, corresponds to the measuring instrument being dropped.
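  • For example, a drop can be recognized because the raw acceleration magnitude collapses toward zero while the instrument is in free fall. The sketch below illustrates that common detection idea; it is not taken from the patent, and all thresholds are assumptions.

    ```python
    GRAVITY = 9.81           # m/s^2
    FREE_FALL_RATIO = 0.3    # assumed: below 30 % of gravity counts as free fall
    MIN_FALL_SAMPLES = 20    # assumed: about 0.2 s at a 100 Hz sample rate

    def detect_fall(acceleration_magnitudes: list) -> bool:
        """Return True if a sustained free-fall phase appears in the raw
        (gravity-included) acceleration magnitudes."""
        run = 0
        for magnitude in acceleration_magnitudes:
            run = run + 1 if magnitude < FREE_FALL_RATIO * GRAVITY else 0
            if run >= MIN_FALL_SAMPLES:
                return True
        return False
    ```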
  • Otherwise, the task operation settings corresponding to the same motion trajectory may conflict.
  • the foreground application that the user is operating is also recognized.
  • the present invention is applied to an infrared camera as an example for explanation.
  • the infrared camera is a measuring instrument that converts the invisible infrared energy emitted by an object into a visible thermal image. It also has the function of capturing infrared images and visible images.
  • The infrared camera captures an infrared image and a visible image with two separate cameras. In practice, the two images need to be merged into one thermal image for analysis; a fusion distance parameter therefore needs to be set for merging the two images.
  • the operation steps of setting the fusion distance parameter in the prior art are generally as follows:
  • With the present invention, when step 1 is performed to view the thermal image that has already been initially fused, step 2 can be skipped and the fusion distance parameter can be adjusted directly by a simple motion gesture operation; at this point, however, a scenario may arise in which the task operations corresponding to a motion trajectory conflict.
  • For example, suppose the task of adjusting the fusion distance parameter is to be triggered directly by left and right swings. In the image viewing application, however, the left and right swings have already been assigned to the task operations of viewing the previous and next images, so the task operation settings corresponding to these two motion trajectories conflict.
  • Controlling the measuring instrument through motion gestures is more user-friendly. In this embodiment, after the motion trajectory is recognized, the foreground application that the user is operating is also recognized, so that the task operation corresponding to the final motion trajectory can subsequently be determined in conjunction with the specific application being operated.
  • When the motion trajectory is a left swing, the image fusion degree of the captured image is adjusted to the level below the current image fusion degree.
  • That is, the image fusion degree of the captured image is adjusted downwards. For example, if the current image fusion degree is the default level 2 and the motion trajectory is detected as a left swing, the image fusion degree is adjusted to the -50% image fusion level.
  • When the motion trajectory is a right swing, the image fusion degree of the captured image is adjusted to the level above the current image fusion degree.
  • That is, the image fusion degree of the captured image is adjusted upwards. For example, if the current image fusion degree is the default level 2 and the motion trajectory is detected as a right swing, the image fusion degree is adjusted to level 1, i.e. 0% image fusion.
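  • A minimal sketch of this adjustment is shown below. The three-level scale (level 1 = 0 %, default level 2, level 3 = -50 %) and the clamping behaviour are assumptions drawn from the examples above, not a specification from the patent.

    ```python
    MIN_LEVEL, MAX_LEVEL, DEFAULT_LEVEL = 1, 3, 2   # assumed three-level fusion scale

    def adjust_fusion_level(current_level: int, trajectory: str) -> int:
        """Move the image fusion degree one level down on a left swing and one
        level up on a right swing, clamped to the assumed valid range."""
        if trajectory == "left_swing":
            return min(current_level + 1, MAX_LEVEL)   # "next level": larger level number
        if trajectory == "right_swing":
            return max(current_level - 1, MIN_LEVEL)   # "upper level": smaller level number
        return current_level

    # Usage: starting from the default level, a right swing moves to level 1 (0 % fusion).
    print(adjust_fusion_level(DEFAULT_LEVEL, "right_swing"))  # -> 1
    ```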
  • Step S102: Determine the motion trajectory category, and select an operation control instruction library corresponding to the motion trajectory category, where the operation control instruction library includes a first operation control instruction library and a second operation control instruction library.
  • The operation control instruction corresponding to each second-type motion trajectory does not change when the foreground application changes, and the list of relationships between these motion trajectories and their operation control instructions is stored in the second operation control instruction library.
  • the corresponding operation control instruction library can be determined according to the motion track.
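  • A minimal sketch of this category test and library selection follows. Representing the categories as two hard-coded sets and the libraries as dictionaries is an assumption made for illustration; the patent does not specify a storage format.

    ```python
    FIRST_TYPE = {"up_swing", "down_swing", "left_swing", "right_swing",
                  "upper_left_swing", "clockwise_circle", "counter_clockwise_circle"}
    SECOND_TYPE = {"put_down", "pick_up", "fall"}

    def select_library(trajectory: str, first_library: dict, second_library: dict) -> dict:
        """Return the operation control instruction library matching the trajectory category."""
        if trajectory in FIRST_TYPE:
            return first_library
        if trajectory in SECOND_TYPE:
            return second_library
        raise ValueError(f"unrecognised motion trajectory: {trajectory}")
    ```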
  • Step S104: When the motion trajectory category is the second type of motion trajectory, the operation control instruction matching the motion trajectory is filtered out from the second operation control instruction library, and the task operation corresponding to that operation control instruction is performed.
  • The motion gesture control function is only one of the control modes supported by the measuring instrument, and this control mode is generally not turned on by default. When the user wants to turn on motion gesture recognition control, the user only needs to trigger the corresponding motion gesture control function; the measuring instrument then enters the motion gesture control mode, and the user can control the measuring instrument through motion gestures.
  • the motion gesture control button can exist either in the form of a physical button or as a function module in the operating system of the meter.
  • the method further includes:
  • Step S201: receiving an application selection instruction input by the user, selecting the application to be set according to the application selection instruction, and displaying the motion trajectory setting interface corresponding to that application.
  • the user can modify and set the relationship between the first type of motion track and the operation control instruction for each application.
  • The user first selects the application to be modified, and the operating system of the measuring instrument pops up the motion trajectory setting interface corresponding to that application; on this interface, the user selects the first-type motion trajectory whose association is to be modified.
  • After the user selects the first-type motion trajectory to be modified in the motion trajectory setting interface, the operating system of the measuring instrument automatically pops up the operation control instruction setting interface corresponding to that first-type motion trajectory.
  • Step S203: receiving an operation control setting instruction input by the user on the operation control instruction setting interface, and setting, according to the operation control setting instruction, the association relationship between the motion trajectory and the operation control instruction in that application.
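  • The association written in steps S201 to S203 can be pictured as a single entry added to the first operation control instruction library. The dictionary layout and the names below are assumptions made for illustration only.

    ```python
    def set_association(first_library: dict, app: str, trajectory: str, instruction: str) -> None:
        """Record that, in the given application, the given first-type motion
        trajectory should trigger the given operation control instruction."""
        first_library[(app, trajectory)] = instruction

    # Usage: remap the left swing in a hypothetical "image_viewer" application.
    library = {}
    set_association(library, "image_viewer", "left_swing", "switch_to_previous_image")
    ```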
  • Each second-type motion trajectory corresponds to a fixed abnormal measurement use scenario, and the operation control instruction associated with that abnormal measurement use scenario is fixed. For example, the falling trajectory generated when the meter is dropped is fixedly associated with the shutdown instruction. Therefore, under normal circumstances the user does not need to set anything in the second operation control instruction library, and this embodiment only provides the function of setting association relationships in the first operation control instruction library.
  • the meter activates a motion gesture control mode.
  • the meter monitors the foreground application of the user operation in real time, and automatically activates the motion gesture control mode and collects the motion data when the foreground application is a data measurement application.
  • the motion gesture can be used to control the measuring instrument without any mode switching or opening operation, which provides great convenience for the user.
  • Although the data measurement application is selected here as the application for which the motion gesture control mode is turned on automatically, this does not mean that the mode can only be turned on automatically for the data measurement application.
  • In practice, the technician can choose, according to actual needs, which applications automatically enable the motion gesture control mode. For example, both the data measurement application and the image viewing application can be selected as applications for which the motion gesture control mode is turned on automatically; the meter then turns on the motion gesture control mode whenever it detects that the foreground application operated by the user is the data measurement application or the image viewing application, and controls task operations according to the motion data collected by the motion sensor, as in the sketch below.
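  • A minimal sketch of this automatic activation policy, assuming hypothetical application identifiers:

    ```python
    AUTO_ENABLE_APPS = {"data_measurement", "image_viewer"}   # configurable, per the text above

    def gesture_mode_should_be_on(foreground_app: str) -> bool:
        """The motion gesture control mode is active exactly while the foreground
        application is one of the auto-enable applications."""
        return foreground_app in AUTO_ENABLE_APPS
    ```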
  • The first type of motion trajectory includes an upward swing, a downward swing, a left swing, a right swing, an upper-left swing, circling in the clockwise direction, and circling in the counter-clockwise direction.
  • When the current foreground application is the image viewing application and the motion trajectory is the left swing, the switch-to-previous-image instruction in the first operation control instruction library is selected as the operation control instruction.
  • When the current foreground application is the image fusion application and the motion trajectory is the left swing, the instruction for reducing the fusion distance parameter in the first operation control instruction library is selected as the operation control instruction.
  • When the current foreground application is the image fusion application and the motion trajectory is the right swing, the instruction for increasing the fusion distance parameter in the first operation control instruction library is selected as the operation control instruction.
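  • The selections above can be collected into a lookup keyed by the foreground application together with the motion trajectory. The identifiers are illustrative, and a dictionary is only one of many ways the first operation control instruction library could be stored.

    ```python
    from typing import Optional

    FIRST_LIBRARY = {
        ("image_viewer", "left_swing"):  "switch_to_previous_image",
        ("image_viewer", "right_swing"): "switch_to_next_image",
        ("image_fusion", "left_swing"):  "decrease_fusion_distance",
        ("image_fusion", "right_swing"): "increase_fusion_distance",
    }

    def select_first_type_instruction(foreground_app: str, trajectory: str) -> Optional[str]:
        """Filter out the operation control instruction matching both the foreground
        application and the motion trajectory (step S103); None means no match."""
        return FIRST_LIBRARY.get((foreground_app, trajectory))
    ```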
  • After step S103, the method further includes:
  • The second type of motion trajectory includes putting down, picking up, and falling.
  • When the motion trajectory is putting down, the standby instruction in the second operation control instruction library is selected as the operation control instruction.
  • The scenario corresponding to putting down is that the measuring instrument has been set down and the user is temporarily not using it.
  • Putting down is therefore preferably fixedly associated with the standby instruction, so that the measuring instrument can enter the standby state as soon as the user sets it down.
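  • Because these second-type associations are fixed and independent of the foreground application, the second operation control instruction library can be pictured as a flat mapping. The identifiers below are assumptions for illustration; the pairings themselves (putting down to standby, picking up to wake-up, falling to shutdown) follow the description in this document.

    ```python
    SECOND_LIBRARY = {
        "put_down": "standby",
        "pick_up":  "wake_up",
        "fall":     "shutdown",
    }

    def select_second_type_instruction(trajectory: str) -> str:
        """Step S104: the instruction depends only on the trajectory, never on the app."""
        return SECOND_LIBRARY[trajectory]
    ```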
  • the meter includes:
  • an obtaining unit 31, configured to acquire motion data of the measuring instrument and calculate a motion trajectory of the measuring instrument based on the motion data, where the motion trajectory is produced by the user making a motion gesture while holding the measuring instrument.
  • a command library selecting unit 32, configured to determine the motion trajectory category and select an operation control instruction library corresponding to the motion trajectory category, where the operation control instruction library includes a first operation control instruction library and a second operation control instruction library.
  • The second task execution unit 34 is configured to: when the motion trajectory category is the second type of motion trajectory, filter out from the second operation control instruction library the operation control instruction matching the motion trajectory, and execute the task operation corresponding to that operation control instruction.
  • The measuring instrument further includes:
  • the first receiving unit is configured to receive an application selection instruction input by the user, select an application to be set according to the application selection instruction, and display a motion track setting interface corresponding to the application.
  • a second receiving unit, configured to receive a motion trajectory setting instruction input by the user on the motion trajectory setting interface, determine the first-type motion trajectory to be set according to the motion trajectory setting instruction, and display the operation control instruction setting interface corresponding to that first-type motion trajectory.
  • an association setting unit, configured to receive an operation control setting instruction input by the user on the operation control instruction setting interface, and set, according to the operation control setting instruction, the association relationship between the motion trajectory and the operation control instruction in the application.
  • The measuring instrument further includes:
  • the application determining unit is configured to identify whether the foreground application is a data measurement application.
  • a mode enabling unit, configured to start the motion gesture control mode of the meter when the foreground application is the data measurement application.
  • When the measuring instrument is an infrared camera, the first task execution unit 33 comprises the following. The first type of motion trajectory includes an upward swing, a downward swing, a left swing, a right swing, an upper-left swing, circling in the clockwise direction, and circling in the counter-clockwise direction.
  • a first instruction selecting unit, configured to: when the current foreground application is the image viewing application and the motion trajectory is the left swing, select the switch-to-previous-image instruction in the first operation control instruction library as the operation control instruction.
  • a second instruction selecting unit, configured to: when the current foreground application is the image viewing application and the motion trajectory is the right swing, select the switch-to-next-image instruction in the first operation control instruction library as the operation control instruction.
  • a third instruction selecting unit, configured to: when the current foreground application is the image fusion application and the motion trajectory is the left swing, select the instruction for reducing the fusion distance parameter in the first operation control instruction library as the operation control instruction.
  • a fourth instruction selecting unit, configured to: when the current foreground application is the image fusion application and the motion trajectory is the right swing, select the instruction for increasing the fusion distance parameter in the first operation control instruction library as the operation control instruction.
  • The second task execution unit 34 further includes:
  • The second type of motion trajectory includes putting down, picking up, and falling.
  • These special motion trajectories include putting down, picking up, and falling.
  • a fifth instruction selecting unit, configured to: when the motion trajectory is putting down, select the standby instruction in the second operation control instruction library as the operation control instruction.
  • a sixth instruction selecting unit, configured to: when the motion trajectory is picking up, select the wake-up instruction in the second operation control instruction library as the operation control instruction.
  • a seventh instruction selecting unit, configured to: when the motion trajectory is falling, select the shutdown instruction in the second operation control instruction library as the operation control instruction.
  • The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium.
  • The technical solution of the present invention, in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a measuring instrument control method based on gesture recognition and to a measuring instrument, used in the field of information recognition. The method comprises: acquiring motion data of a measuring instrument and calculating the motion trajectory on the basis of the motion data (S101); determining the category of the motion trajectory and selecting a corresponding operation control instruction library (S102); when the motion trajectory category is a first type of motion trajectory, identifying the foreground application of the measuring instrument, filtering out, from a first operation control instruction library, an operation control instruction matching the motion trajectory and the foreground application, and executing the corresponding task operation in the foreground application (S103); when the motion trajectory category is a second type of motion trajectory, filtering out, from a second operation control instruction library, an operation control instruction matching the motion trajectory, and executing the corresponding task operation (S104). During control, the user can control the measuring instrument simply by making certain gestures, with no need for key presses or touch-screen operations, so that the user can control the measuring instrument easily and conveniently.
PCT/CN2017/072049 2017-01-22 2017-01-22 Procédé de commande d'instrument de mesure à base de reconnaissance de geste et instrument de mesure WO2018133063A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780000026.8A CN107077219A (zh) 2017-01-22 2017-01-22 一种基于运动手势识别的测量仪控制方法及测量仪
PCT/CN2017/072049 WO2018133063A1 (fr) 2017-01-22 2017-01-22 Procédé de commande d'instrument de mesure à base de reconnaissance de geste et instrument de mesure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/072049 WO2018133063A1 (fr) 2017-01-22 2017-01-22 Procédé de commande d'instrument de mesure à base de reconnaissance de geste et instrument de mesure

Publications (1)

Publication Number Publication Date
WO2018133063A1 true WO2018133063A1 (fr) 2018-07-26

Family

ID=59613509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/072049 WO2018133063A1 (fr) 2017-01-22 2017-01-22 Procédé de commande d'instrument de mesure à base de reconnaissance de geste et instrument de mesure

Country Status (2)

Country Link
CN (1) CN107077219A (fr)
WO (1) WO2018133063A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107688432A (zh) * 2017-09-04 2018-02-13 中国电子科技集团公司第四十研究所 一种基于触控技术的电子测量仪器人机交互系统架构
CN110477451A (zh) * 2018-05-15 2019-11-22 深圳市艾维普思科技有限公司 电子烟的控制方法、计算机存储介质、供电组件及电子烟

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101341544A (zh) * 2005-12-20 2009-01-07 索尼爱立信移动通讯有限公司 具有随机播放操作的电子设备
CN102984369A (zh) * 2012-11-20 2013-03-20 广东欧珀移动通信有限公司 一种移动终端查看相册的方法及系统
CN103067630A (zh) * 2012-12-26 2013-04-24 刘义柏 一种通过手机的手势动作产生无线控制指令的方法
CN103699220A (zh) * 2013-12-09 2014-04-02 乐视致新电子科技(天津)有限公司 一种根据手势运动轨迹进行操作的方法及装置
US20160077597A1 (en) * 2013-06-18 2016-03-17 Panasonic Intellectual Property Corporation Of America Input device and method for inputting operational request

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202956192U (zh) * 2012-03-16 2013-05-29 东莞华仪仪表科技有限公司 智能型红外热像仪的装置和系统
CN102880287B (zh) * 2012-08-16 2017-02-15 深圳Tcl新技术有限公司 手势识别方法及手势识别装置
CN103701995A (zh) * 2013-12-31 2014-04-02 上海华勤通讯技术有限公司 电子设备的控制方法及电子设备
CN104941203A (zh) * 2015-06-03 2015-09-30 赵旭 一种基于手势轨迹识别的玩具及其识别、控制方法
CN105759961A (zh) * 2016-02-03 2016-07-13 林勇 智能设备以及智能设备控制方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101341544A (zh) * 2005-12-20 2009-01-07 索尼爱立信移动通讯有限公司 具有随机播放操作的电子设备
CN102984369A (zh) * 2012-11-20 2013-03-20 广东欧珀移动通信有限公司 一种移动终端查看相册的方法及系统
CN103067630A (zh) * 2012-12-26 2013-04-24 刘义柏 一种通过手机的手势动作产生无线控制指令的方法
US20160077597A1 (en) * 2013-06-18 2016-03-17 Panasonic Intellectual Property Corporation Of America Input device and method for inputting operational request
CN103699220A (zh) * 2013-12-09 2014-04-02 乐视致新电子科技(天津)有限公司 一种根据手势运动轨迹进行操作的方法及装置

Also Published As

Publication number Publication date
CN107077219A (zh) 2017-08-18

Similar Documents

Publication Publication Date Title
US11861069B2 (en) Gesture operated wrist mounted camera system
EP3051463B1 (fr) Procédé de traitement d'images et dispositif électronique supportant celui-ci
EP3276950B1 (fr) Dispositif électronique pour la fourniture de contenu vidéo au ralenti
CN106642578B (zh) 空调器的控制方法及装置
CN106843786B (zh) 显示屏的开启方法、显示屏的开启装置和终端
KR102432620B1 (ko) 외부 객체의 근접에 따른 동작을 수행하는 전자 장치 및 그 방법
RU2628558C2 (ru) Способ и устройство для управления интеллектуальным терминалом
CN105138123B (zh) 设备控制方法及装置
CN105120160B (zh) 拍摄装置和拍摄方法
CN105259765B (zh) 生成控制界面的方法及装置
CN105760102B (zh) 终端交互控制方法、装置及应用程序交互控制方法
RU2617325C2 (ru) Способ и аппарат для отображения данных о состоянии здоровья
JP2015526927A (ja) カメラ・パラメータのコンテキスト駆動型調整
WO2014130966A1 (fr) Application mobile pour des dispositifs de surveillance et commande
EP3104304B1 (fr) Appareil électronique et procédé d'extraction d'images fixes
WO2016177200A1 (fr) Procédé et terminal de mise en œuvre de commande d'écran
WO2020220154A1 (fr) Procédé de commutation d'affichage d'écran, dispositif d'affichage et plateforme mobile
KR20180020374A (ko) 이벤트 검색 시스템, 장치 및 방법
WO2018133063A1 (fr) Procédé de commande d'instrument de mesure à base de reconnaissance de geste et instrument de mesure
CN109240759A (zh) 应用程序启动方法、装置、终端设备和可读存储介质
WO2015081485A1 (fr) Procédé et dispositif permettant à un dispositif terminal d'identifier les gestes d'un utilisateur
CN106648040B (zh) 一种终端控制方法及装置
JP6756103B2 (ja) 電子機器、表示システム、表示装置、撮像装置、表示制御方法及びプログラム
US20150048173A1 (en) Method of processing at least one object in image in computing device, and computing device
CN110730222B (zh) 一种远程摄像的呈现方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17892904

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17892904

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载