WO2018123229A1 - Terminal device - Google Patents

Terminal device

Info

Publication number
WO2018123229A1
Authority
WO
WIPO (PCT)
Prior art keywords
recognition
character string
terminal device
recognized
character
Application number
PCT/JP2017/037990
Other languages
English (en)
Japanese (ja)
Inventor
出野 徹
北村 隆
Original Assignee
オムロンヘルスケア株式会社 (OMRON HEALTHCARE Co., Ltd.)
オムロン株式会社 (OMRON Corporation)
Application filed by オムロンヘルスケア株式会社 (OMRON HEALTHCARE Co., Ltd.) and オムロン株式会社 (OMRON Corporation)
Priority to RU2019118662A (RU2727464C1)
Priority to MX2019007047A
Priority to CN201780076261.3A (CN110088770B)
Publication of WO2018123229A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/98Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns

Definitions

  • The present disclosure relates to a terminal device, and more particularly to a terminal device that can capture a measurement result of a health device by imaging the health device.
  • Various measuring devices for measuring various biological information related to the health condition of the human body such as blood pressure and pulse have been developed.
  • The user's health condition is confirmed by taking the measurement data of such a measuring instrument into a terminal device for collecting the measurement data.
  • Patent Document 1 discloses a data input device.
  • The data input device includes image acquisition means for obtaining an image of data displayed on the data display unit of the measuring instrument, number reading means for reading the numbers in the obtained image, and display means for displaying the read numbers.
  • In the display data analysis apparatus, the use of markers attached to the measurement apparatus, the color of the display unit on which the measurement data is displayed, reflectance information, and the like has been considered in order to facilitate detection of the measurement data from the captured image.
  • However, since the accuracy of the actually detected measurement data is not considered at all, the display data analysis apparatus may output measurement data different from the measurement data displayed on the display unit of the measurement apparatus.
  • The present disclosure has been made in view of the above, and an object in one aspect is to provide a terminal device capable of more accurately acquiring a measurement result displayed on a display unit of a health device.
  • A terminal device according to one aspect includes an imaging unit that images a health device as a subject, and a recognition unit that executes a process of recognizing, from an image of the health device included in a captured image generated by the imaging unit, a character string indicating a measurement value displayed on a display of the health device.
  • The terminal device also includes a determination unit that determines, for each of a plurality of recognized characters included in a recognized character string that is the recognition result of the character string indicating the displayed measurement value, whether or not an accuracy representing the recognition accuracy of the recognized character is less than a first threshold value.
  • When the accuracy of at least one of the plurality of recognized characters is less than the first threshold, the recognition unit further executes the process of recognizing the character string indicating the displayed measurement value from each of N (N is an integer of 1 or more) captured images further generated by the imaging unit.
  • Based on the N + 1 recognition results of the character string indicating the displayed measurement value, the terminal device identifies, for each character included in the character string, the recognized character recognized the largest number of times among the at least one recognized character obtained by recognizing that character; it includes a determination unit that determines a recognized character string composed of the identified recognized characters as the measurement value of the health device, and an output control unit that outputs the determination result of the determination unit.
  • In one aspect, when the accuracy of each of the plurality of recognized characters is equal to or greater than the first threshold value, the determination unit determines a recognized character string composed of the plurality of recognized characters as the measurement value of the health device.
  • The output control unit displays, on the display of the terminal device, each of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the displayed measurement value, in a predetermined color corresponding to the accuracy of the recognized character.
  • The output control unit displays, on the display of the terminal device, information indicating the accuracy of each of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the displayed measurement value.
  • The determination unit further determines, for each of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the displayed measurement value, whether or not the accuracy of the recognized character is less than a second threshold value that is lower than the first threshold value.
  • When the accuracy of at least one of the plurality of recognized characters is less than the second threshold value, the terminal device further includes a generation unit that generates, from the captured image generated by the imaging unit, a partial image obtained by cutting out a region including each character corresponding to each remaining recognized character other than the at least one recognized character, and a combining unit that combines a plurality of partial images generated by the generation unit.
  • The recognition unit executes the process of recognizing the character string indicating the displayed measurement value from the image combined by the combining unit.
  • the output control unit outputs abnormality information when the numerical value represented by the recognized character string determined as the measurement value of the health device is out of the predetermined range.
  • the measurement result displayed on the display unit of the health device can be acquired more accurately.
  • FIG. 1 is a diagram illustrating a schematic configuration of a system 1 including a terminal device and a health device.
  • system 1 includes a terminal device 10 that is a user terminal, blood pressure monitors 20 and 21 that are examples of health equipment for measuring the user's biological information, and a weight / body composition meter 23.
  • The health equipment is not limited to the sphygmomanometers 20 and 21 and the body weight/body composition meter 23, and may be a sleep meter, a pedometer, a thermometer, or any other device capable of displaying measurement results.
  • the sphygmomanometer 20 is a stationary upper arm sphygmomanometer in which the main body and the cuff (armband) are separate bodies.
  • the sphygmomanometer 21 is a wrist watch type sphygmomanometer in which a main body and a cuff are integrated.
  • The sphygmomanometer 21 can be worn on the wrist for a long time like a wristwatch and can continuously measure the pulsation of every beat for 24 hours, and it can also take a measurement when the user simply presses the measurement start button while wearing it. Thereby, the sphygmomanometer 21 can measure the user's blood pressure at any time.
  • the body weight / body composition meter 23 can measure body weight, body fat percentage, visceral fat level, subcutaneous fat percentage, and the like, and displays the measurement results on a display.
  • the blood pressure monitor 20 will be described as a representative example of “health equipment”.
  • the terminal device 10 is, for example, a smartphone having a camera and a touch panel.
  • the terminal device 10 has a function of detecting (recognizing) a measurement result displayed on the display of each health device from captured images generated by capturing each health device using a camera.
  • a smartphone that is a portable device will be described as a representative example of a “terminal device”.
  • the terminal device may be another terminal device having an imaging function such as a foldable mobile phone, a tablet terminal device, or a PDA (Personal Data Assistance).
  • the terminal device 10 images (photographs) the health device in a state where the measurement result is displayed, using a camera.
  • the terminal device 10 recognizes the measurement results from the generated captured images using various recognition methods, and outputs the recognized measurement results. For example, the terminal device 10 displays the recognized measurement value (maximum blood pressure value or the like) on the display of the own device.
  • FIG. 2 is a block diagram illustrating an example of a hardware configuration of the terminal device 10.
  • The terminal device 10 includes, as main components, a processor 152, a memory 154, an input device 156, a display 158, a wireless communication unit 160, a communication antenna 162, a camera 163, a memory interface (I/F) 164, a communication interface (I/F) 166, a speaker 168, a microphone 170, and a direction sensor 172.
  • the terminal device 10 further includes a timer unit for measuring time.
  • the processor 152 is typically an arithmetic processing unit such as a CPU (Central Processing Unit) or an MPU (Multi Processing Unit).
  • the processor 152 functions as a control unit that controls the operation of each unit of the terminal device 10 by reading and executing the program stored in the memory 154.
  • the processor 152 implements each process (step) of the terminal device 10 to be described later by executing the program.
  • the memory 154 is realized by a RAM (Random Access Memory), a ROM (Read-Only Memory), a flash memory, or the like.
  • the memory 154 stores a program executed by the processor 152, data used by the processor 152, and the like.
  • the input device 156 accepts an operation input to the terminal device 10.
  • the input device 156 is realized by a touch panel.
  • the touch panel is provided on a display 158 having a function as a display unit, and is, for example, a capacitance type.
  • the touch panel detects a touch operation on the touch panel by an external object every predetermined time, and inputs touch coordinates to the processor 152.
  • the input device 156 may include a button or the like.
  • the wireless communication unit 160 connects to the mobile communication network via the communication antenna 162 and transmits and receives signals for wireless communication. Thereby, the terminal device 10 can communicate with other communication devices via a mobile communication network such as LTE (Long Term Evolution).
  • The camera 163 is realized by, for example, a CCD (Charge Coupled Device) method, a CMOS (Complementary Metal Oxide Semiconductor) method, or another method.
  • Memory interface 164 reads data from external storage medium 165.
  • the processor 152 reads data stored in the storage medium 165 via the memory interface 164 and stores the data in the memory 154.
  • the processor 152 reads data from the memory 154 and stores the data in the external storage medium 165 via the memory interface 164.
  • The storage medium 165 includes non-volatile media that store programs, such as a CD (Compact Disc), a DVD (Digital Versatile Disc), a BD (Blu-ray (registered trademark) Disc), a USB (Universal Serial Bus) memory, and an SD (Secure Digital) memory card.
  • the communication interface (I / F) 166 is a communication interface for exchanging various data with other devices, and is realized by an adapter or a connector.
  • In one embodiment, the communication interface 166 communicates with other devices by BLE (Bluetooth (registered trademark) Low Energy).
  • the communication method may be a wireless communication method using a wireless LAN or the like, or a wired communication method using USB (Universal Serial Bus) or the like.
  • the speaker 168 converts the audio signal given from the processor 152 into audio and outputs it to the outside of the terminal device 10.
  • the microphone 170 receives an audio input to the terminal device 10 and gives an audio signal corresponding to the audio input to the processor 152.
  • the direction sensor 172 detects the direction in which the terminal device 10 is facing. Specifically, the direction sensor 172 detects in which direction the display surface of the display 158 is directed.
  • the direction sensor 172 includes at least one of a three-axis acceleration sensor, a three-axis gyro sensor, and a three-axis geomagnetic sensor.
  • The terminal device 10 can acquire its own posture information while the user is holding the terminal device 10 by detecting the inclination of the device along three axes using the direction sensor 172.
  • the terminal device 10 may acquire azimuth information by detecting a geomagnetic vector with three axes using a three-axis geomagnetic sensor.
  • the health device only needs to provide information processing as described below as a whole, and a well-known hardware configuration can be adopted.
  • the health device includes a processor for executing various processes, a memory for storing programs and data, a display for displaying measurement results, and an input interface for receiving instructions from the user.
  • a time measuring unit for measuring time and various hardware used for measuring biological information are included.
  • As the various hardware, the sphygmomanometer 20 includes a cuff that can be wound around the user's upper arm (or wrist), a cuff pressure adjusting mechanism such as a piezoelectric pump and an exhaust valve, a pressure sensor for detecting the cuff pressure, and a temperature sensor for detecting the outside air temperature.
  • FIG. 3 is a diagram showing an example of the appearance of the main body of the sphygmomanometer 20.
  • main body 22 of sphygmomanometer 20 includes a display 24, a forward / back button 26, a lamp 27, a user number switch 28, and a blood pressure measurement start / stop button 29.
  • On the outer surface of the main body 22, brand information 41 (for example, the brand “AAA”) and model information 43 (for example, the model “XYZ-ABC”) are provided.
  • the user number switch 28 is used when storing measurement results for two persons separately. In the example of FIG. 3, the user with the number “1” is selected.
  • the lamp 27 is a lamp indicating whether the cuff is wound properly, and is shown to be appropriate in the example of FIG.
  • the measurement result is displayed on the display 24.
  • The measurement result includes a measurement value composed of a maximum blood pressure value (for example, 118 mmHg), a minimum blood pressure value (for example, 78 mmHg), and a pulse rate (for example, 70 beats/minute) displayed in the region 50, the measurement date and time (for example, 7:30 on July 1) displayed in the region 52, the user number (for example, “1”) displayed in the region 54, the current temperature (for example, 24 degrees) displayed in the region 56, and various messages (for example, a message indicating the completion of measurement) displayed in the region 58.
  • the measurement result may include information indicating the pulse wave state, information indicating the suitability of the cuff winding method, and the like.
  • the terminal device 10 has various functions used for recognizing measurement results displayed on a health device. Hereinafter, various functions will be described.
  • The “character” in the present embodiment means a symbol that combines lines, dots, and the like in order to convey and record a word (language), and includes not only alphanumeric characters, kanji, and kana characters but also pictograms and arbitrary figures such as stamps to which an identification code is attached.
  • FIG. 4 is a diagram for explaining a guide display method (part 1).
  • the terminal device 10 receives an instruction to shift to the character recognition mode from the user via the input device 156, the terminal device 10 shifts the operation mode of the terminal device 10 to the recognition mode and recognizes the character string included in the captured image.
  • the terminal device 10 automatically repeats generation of a captured image, display of the generated captured image, and character recognition processing at a predetermined period during the character recognition mode.
  • the screen 500 in FIG. 4 is a screen in which the captured image of the sphygmomanometer 20 and the frame 510 are superimposed.
  • the frame 510 is displayed as a frame indicating the four outer corners of the recognition range (analysis range) in which the character string can be recognized. Therefore, the inside of the frame 510 is a recognition range.
  • the terminal device 10 executes processing for recognizing a measurement result that is captured so as to be within the recognition range. Specifically, the terminal device 10 recognizes various character strings included in the measurement result.
  • The terminal device 10 stores, in the memory 154, screen information indicating what information is displayed in which region of the display 24 of the sphygmomanometer 20. Therefore, for example, when the image is captured so that the display 24 fits within the frame 510 that is the recognition range, the terminal device 10 can use the screen information to identify what information each recognized character string represents. For example, the terminal device 10 can identify that the plurality of character strings “118”, “78”, and “70” recognized in the region 50 of the display 24 within the frame 510 are the maximum blood pressure value, the minimum blood pressure value, and the pulse rate, respectively.
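  • As a minimal illustrative sketch (not part of the patent disclosure), the Python snippet below shows how stored screen information could be used to label each recognized character string with the measurement field of the region it was read from; the region coordinates, field names, and function names are assumptions for illustration.

```python
# Hypothetical screen information for the sphygmomanometer 20: which field is
# displayed in which region of the display 24 (coordinates normalized to the
# recognition range; the numbers are illustrative assumptions, not from the patent).
SCREEN_INFO_SPHYGMOMANOMETER_20 = {
    "max_blood_pressure": (0.10, 0.05, 0.60, 0.30),  # (x, y, width, height)
    "min_blood_pressure": (0.10, 0.38, 0.60, 0.30),
    "pulse_rate":         (0.10, 0.71, 0.60, 0.25),
}

def label_recognized_strings(recognized_regions):
    """recognized_regions maps a recognized region (x, y, w, h) to its text.
    Returns a dict mapping field name -> recognized character string."""
    labeled = {}
    for field, (fx, fy, fw, fh) in SCREEN_INFO_SPHYGMOMANOMETER_20.items():
        for (x, y, w, h), text in recognized_regions.items():
            cx, cy = x + w / 2, y + h / 2  # center of the recognized region
            if fx <= cx <= fx + fw and fy <= cy <= fy + fh:
                labeled[field] = text
    return labeled

# Example: "118", "78", "70" recognized in the three sub-regions of region 50 are
# identified as the maximum blood pressure, minimum blood pressure, and pulse rate.
```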
  • the terminal device 10 receives an instruction from the user and specifies that the health device that is the subject is the blood pressure monitor 20.
  • Alternatively, the terminal device 10 extracts feature information (shape, model, brand, and the like) of the health device from the captured image and compares the extracted feature information with the feature information of each health device stored in advance, thereby specifying that the health device as the subject is the sphygmomanometer 20.
  • Since the user can grasp that the inside of the frame 510 is the recognition range, the user can image the sphygmomanometer 20 at a position where the terminal device 10 can easily recognize the measurement value.
  • FIG. 5 is a diagram for explaining a guide display method (part 2).
  • a screen 530 including an object 520 representing the contour of the shape of sphygmomanometer 20 is displayed on display 158 during the character recognition mode.
  • the terminal device 10 may further display an object indicating a character (brand information 41 or the like) provided on the outer surface of the sphygmomanometer 20.
  • the captured image of the sphygmomanometer 20 is not shown for easy viewing of the object 520.
  • the user takes an image so that the sphygmomanometer 20 overlaps the object 520 by moving the terminal device 10.
  • the terminal device 10 recognizes various character strings included in the measurement result.
  • With the display method (part 2), it is possible to guide the user to image the sphygmomanometer 20 at a position where the terminal device 10 can easily recognize the measurement result.
  • FIG. 6 is a diagram for explaining a guide display method (part 3).
  • During the character recognition mode, a screen 550 including a rectangular translucent object 540 and translucent objects 542, 544, and 546, each indicating the individual segments of a 7-segment display, is displayed on the display 158.
  • The character string indicating each measurement value (maximum blood pressure value, minimum blood pressure value, pulse rate) of the sphygmomanometer 20 is composed of characters of a 7-segment display system. Therefore, as in the example of FIG. 6, a translucent object 542 indicating each segment for the maximum blood pressure value, a translucent object 544 indicating each segment for the minimum blood pressure value, and a translucent object 546 indicating each segment for the pulse rate are displayed.
  • The translucent object 540 indicates the recognition range and corresponds to the inner portion of the frame 510 in FIG. 4. In the example of FIG. 6, the captured image of the sphygmomanometer 20 is not shown so that each translucent object can be seen easily.
  • the user moves the terminal device 10 to capture images so that the measured values displayed on the display 24 of the sphygmomanometer 20 overlap each object 542, 544, 546.
  • the terminal device 10 recognizes various character strings included in the measurement result.
  • The terminal device 10 can grasp, by using the screen information of the sphygmomanometer 20, in which region of the display 24 each measurement value is displayed. Therefore, the terminal device 10 can set the position, size, and the like of each translucent object so that, when the image is captured with each measurement value overlapping the corresponding translucent object 542, 544, 546, the entire display 24 is included in the recognition range indicated by the translucent object 540 in the captured image.
  • the user can be guided to image the sphygmomanometer 20 at a position where the terminal device 10 can easily recognize the measurement result.
  • With the 7-segment display method, since each segment is easy to see, there is an advantage that the user can easily align each measured value with the corresponding translucent object 542, 544, 546.
  • When the character string indicating the measurement value of the sphygmomanometer 20 is displayed in a display method other than the 7-segment display method (for example, a dot matrix display method), each of the translucent objects 542, 544, and 546 is also displayed in that dot matrix display mode.
  • the terminal device 10 may use a display method in which at least two of the above display methods (part 1) to (part 3) are combined.
  • The terminal device 10 recognizes, from the captured image, a character string indicating a measurement value displayed on the display 24 of the sphygmomanometer 20 (hereinafter also referred to as a “display measurement value”) and regards the recognized character string as the measurement value of the sphygmomanometer 20. At this time, in order to increase as much as possible the accuracy of the measurement value displayed on the display 158 (that is, the degree of coincidence between the numerical value indicated by the display measurement value and the numerical value indicated by the recognized character string), the terminal device 10 uses an accuracy indicating the recognition accuracy of each recognized character.
  • the terminal device 10 executes a process of recognizing a display measurement value from a captured image. Thereby, the terminal device 10 acquires the recognized character string recognized from the captured image and the accuracy of each recognized character included in the recognized character string.
  • the accuracy may be calculated using a known technique. For example, the accuracy is calculated by comparing the feature amount of the character image with a dictionary pattern registered in advance in the memory 154.
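  • The following Python sketch illustrates one possible way (an assumption, since the patent only refers to a known technique) to compute such an accuracy by comparing a character image with dictionary patterns registered in advance, using normalized correlation as the similarity measure.

```python
import numpy as np

def recognize_character(char_image, dictionary):
    """char_image: 2-D grayscale array of a single character, fixed size.
    dictionary: mapping from character label to a template array of the same size.
    Returns (best_label, accuracy) where accuracy is a 0-100 confidence value."""
    best_label, best_score = None, -1.0
    a = (char_image.astype(float) - char_image.mean()).ravel()
    for label, template in dictionary.items():
        b = (template.astype(float) - template.mean()).ravel()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        score = float(a @ b) / denom if denom else 0.0  # normalized correlation in [-1, 1]
        if score > best_score:
            best_label, best_score = label, score
    accuracy = max(0.0, best_score) * 100.0  # map the best similarity to a 0-100 "accuracy"
    return best_label, accuracy
```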
  • For each of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the display measurement value, the terminal device 10 determines whether or not the accuracy of the recognized character is less than a threshold Th1 (for example, 90%). For example, as illustrated in FIG. 4, assume that a character string “118” indicating the maximum blood pressure value is included in the captured image and that a recognized character string “118” is obtained by character recognition. In this case, the accuracies for the recognized characters “1”, “1”, and “8” are acquired as, for example, “85”, “90”, and “95”, respectively. The terminal device 10 determines whether or not each accuracy “85”, “90”, “95” is less than the threshold Th1. The terminal device 10 makes the same determination for the character string indicating the minimum blood pressure value and the character string indicating the pulse rate.
  • When the accuracy of each of the plurality of recognized characters is equal to or higher than the threshold Th1, the terminal device 10 displays on the display 158 the recognized character string composed of these recognized characters as the measurement value of the sphygmomanometer 20. For example, when the accuracies “85”, “90”, and “95” for the plurality of recognized characters corresponding to the maximum blood pressure value are all equal to or higher than the threshold Th1, the terminal device 10 determines (deems) that the maximum blood pressure value measured by the sphygmomanometer 20 is “118” and displays the determination result (for example, the maximum blood pressure value “118”) on the display 158.
  • On the other hand, when the accuracy of at least one recognized character is less than the threshold Th1, the terminal device 10 executes the process of recognizing the character string indicating the display measurement value from each of a predetermined number (for example, two) of further captured images.
  • For example, assume that a character string “118” indicating the maximum blood pressure value is included in the captured image, that a recognized character string “110” is obtained by the recognition process, and that the accuracies of the recognized characters “1”, “1”, and “0” are “85”, “90”, and “60”, respectively. In this case, since the accuracy of at least one recognized character is less than the threshold Th1, the terminal device 10 executes the process of recognizing the character string indicating the display measurement value from each of the predetermined number of further captured images.
  • The terminal device 10 then determines the recognized character string to be adopted as the measurement value of the sphygmomanometer 20 based on the recognition result of the character string indicating the display measurement value included in the first captured image and the recognition results of the character strings included in each of the two additional captured images.
  • Assume that the characters of the character string included in the first captured image are recognized as “1”, “1”, and “0”, respectively, that the characters in the second captured image are recognized as “1”, “1”, and “8”, respectively, and that the characters in the third captured image are recognized as “1”, “1”, and “8”, respectively.
  • The recognized character obtained for the hundreds-place character is only “1”, which is recognized three times; the same applies to the tens-place character.
  • Among the recognized characters “0” and “8”, which are the recognition results of the ones-place character, the terminal device 10 specifies “8” as the recognized character recognized the largest number of times; for the hundreds-place and tens-place characters, the recognized character “1” is specified. The terminal device 10 determines the recognized character string “118” composed of the recognized characters “1”, “1”, and “8” thus specified as the maximum blood pressure value and displays the determination result on the display 158.
  • In this way, when the terminal device 10 determines that the accuracy of a recognized character is low, it executes the recognition process a predetermined number of additional times and determines the recognized character string composed of the recognized characters adopted by majority vote as the measurement value of the sphygmomanometer 20. Therefore, a recognized character string composed of recognized characters with low accuracy is not adopted as the measurement value of the sphygmomanometer 20, and the measurement value displayed on the display 24 of the sphygmomanometer 20 can be acquired more accurately.
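  • A minimal sketch of this per-character majority vote over the N + 1 recognition results might look as follows (illustrative only; the helper name and input format are assumptions).

```python
from collections import Counter

def decide_by_majority_vote(recognized_strings):
    """recognized_strings: the N + 1 recognized character strings obtained for
    the same displayed value, e.g. ["110", "118", "118"]. All strings are
    assumed to have the same length. Returns the adopted character string."""
    decided = []
    for position_chars in zip(*recognized_strings):
        # For each character position, adopt the recognized character that was
        # obtained the largest number of times.
        char, _count = Counter(position_chars).most_common(1)[0]
        decided.append(char)
    return "".join(decided)

# Example from the description: "110", "118", "118" -> "118"
assert decide_by_majority_vote(["110", "118", "118"]) == "118"
```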
  • The terminal device 10 may, however, erroneously recognize the character string indicating the display measurement value included in the captured image. In that case, the terminal device 10 determines the erroneously recognized character string as the measurement value of the sphygmomanometer 20, and the erroneous determination result is displayed.
  • The user grasps that the terminal device 10 has erroneously recognized the measurement value by comparing the measurement value displayed on the display 24 of the sphygmomanometer 20 with the measurement value displayed on the display 158 of the terminal device 10. The user then needs to perform an operation for causing the terminal device 10 to recognize the measurement value displayed on the display 24 of the sphygmomanometer 20 again.
  • FIG. 7 is a diagram showing an example of a screen displaying the recognition result in real time.
  • the terminal device 10 repeats generation of a captured image, display of the captured image, and recognition of a character string included in the captured image at a predetermined period during the character recognition mode. It is assumed that the display 24 of the sphygmomanometer 20 displays the maximum blood pressure value “133”, the minimum blood pressure value “106”, and the pulse rate “82”.
  • a region 570 in FIG. 7 indicates a region where the screen brightness is high due to reflection of external light or the like.
  • terminal apparatus 10 recognizes character strings “133”, “106”, and “82” of the maximum blood pressure value, the minimum blood pressure value, and the pulse rate from the captured image.
  • the terminal device 10 displays a captured image, a frame 510, and a screen 600 including objects 562, 564, and 566 that are recognition results.
  • the screen 600 shows only a captured image of the display 24 portion of the sphygmomanometer 20. Further, a character string other than the measurement value displayed on the display 24 is also omitted.
  • The object 562 indicates the recognition result of the character string indicating the systolic blood pressure value (that is, the recognized character string “133”), the object 564 indicates the recognition result of the character string indicating the diastolic blood pressure value (that is, the recognized character string “106”), and the object 566 indicates the recognition result of the character string indicating the pulse rate (that is, the recognized character string “82”).
  • the user can grasp that the measured value displayed on the display 24 of the sphygmomanometer 20 is correctly recognized by the terminal device 10. Thereafter, the user can determine the recognized character string as the measurement value of the sphygmomanometer 20, for example, by pressing a determination button or the like.
  • the terminal device 10 may display each of the plurality of recognized characters in a predetermined color according to the accuracy of the recognized character.
  • For example, assume that the accuracy for the hundreds digit “1” is 70% or more and less than 90%, and that the accuracy for each digit “3” is 90% or more. In this case, the terminal device 10 displays the hundreds digit “1” of the object 562 and the tens and ones digits “3” in different colors. Thereby, the user can perform operations such as changing the direction of the camera 163 of the terminal device 10 while checking the recognized character string and the colors of the recognized characters.
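  • As an illustration only (the patent states “a predetermined color” without specifying the bands or colors), the color selection could be sketched as follows.

```python
def color_for_accuracy(accuracy):
    """Return a display color for a recognized character based on its accuracy (%).
    The bands and colors below are hypothetical examples."""
    if accuracy >= 90:
        return "green"   # high confidence
    if accuracy >= 70:
        return "yellow"  # medium confidence, e.g. the hundreds digit "1" above
    return "red"         # low confidence

# Example: accuracies 85, 92, 95 -> ["yellow", "green", "green"]
colors = [color_for_accuracy(a) for a in (85, 92, 95)]
```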
  • FIG. 8 is a diagram for explaining a method of combining a plurality of images.
  • a method of synthesizing a captured image generated in a certain cycle p and a captured image generated in a cycle p + 1 next to the cycle p will be described.
  • the captured images 722 and 726 indicate captured images of the display 24 portion of the sphygmomanometer 20. Further, a character string other than the measurement value displayed on the display 24 is also omitted.
  • In the captured image 722, the regions indicating the ones place of the maximum blood pressure value, the ones place of the minimum blood pressure value, and the pulse rate overlap a region 752 where the screen brightness is high due to reflection of external light or the like. Therefore, the accuracy of each recognized character obtained by recognizing the characters in those regions is low, for example, less than a threshold Th2 (for example, 40%).
  • In this case, the terminal device 10 generates a partial image 724 from the captured image 722 by cutting out a region including each character corresponding to each remaining recognized character other than the recognized characters whose accuracy is less than the threshold Th2 (in this case, the hundreds and tens characters of the maximum blood pressure value and the tens digit of the minimum blood pressure value).
  • the terminal device 10 stores the partial image 724 in the memory 154.
  • Similarly, the terminal device 10 generates a partial image 728 from the captured image 726 by cutting out a region including each character corresponding to each remaining recognized character other than the recognized character whose accuracy is less than the threshold Th2 (in this case, each character other than the hundreds character of the maximum blood pressure value). The terminal device 10 stores the partial image 728 in the memory 154.
  • the terminal device 10 generates a composite image 730 by combining the partial image 724 and the partial image 728 stored in the memory 154.
  • the terminal device 10 executes again the process of recognizing the character string indicating the display measurement value from the composite image. Since the composite image includes only the characters corresponding to the recognized characters having relatively high accuracy, the recognition accuracy of the character string indicating the display measurement value can be improved.
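  • A minimal sketch of this compositing step is shown below, assuming the successive captured images are already aligned with each other; the data structures and the threshold constant are assumptions for illustration.

```python
import numpy as np

TH2 = 40  # second (lower) accuracy threshold, e.g. 40%

def composite_reliable_regions(frames):
    """frames: list of (image, results) pairs, where image is a 2-D numpy array
    and results is a list of ((x, y, w, h), accuracy) for each character region.
    Regions whose recognized characters reached accuracy >= TH2 are pasted onto
    a single composite image, which is then re-recognized."""
    composite = np.zeros_like(frames[0][0])
    for image, results in frames:
        for (x, y, w, h), accuracy in results:
            if accuracy >= TH2:
                composite[y:y + h, x:x + w] = image[y:y + h, x:x + w]
    return composite
```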
  • FIG. 9 is a diagram for explaining the imaging direction notification function.
  • FIG. 9 shows only the captured image of the display 24 portion of the sphygmomanometer 20 among the captured images displayed on the screen of the display 158. Note that character strings other than the measured values displayed on the display 24 are also omitted.
  • the terminal device 10 compares the captured image 702 included in the screen 810 displayed on the display 158 with the reference image 700, and detects the inclination of the captured image 702 with respect to the reference image 700.
  • the inclination is detected by comparing the shape and size of the display 24 included in the reference image 700 and the captured image 702.
  • the depth direction, the short side direction, and the long side direction of the display 24 are defined as an X axis direction, a Y axis direction, and a Z axis direction, respectively.
  • the terminal device 10 detects that the captured image 702 is tilted around the X axis with respect to the reference image 700. In order to reduce the inclination with respect to the reference image 700, the terminal device 10 displays an object 772 that prompts the terminal device 10 to rotate around the X axis on the screen 810. When the terminal device 10 is rotated, a screen 840 including the captured image 708 is obtained. Note that the terminal device 10 may use the three-axis posture information of the own device detected via the direction sensor 172 when calculating the rotation direction for reducing the inclination with respect to the reference image 700.
  • the terminal device 10 compares the captured image 704 included in the screen 820 with the reference image 700, and detects the inclination of the captured image 704 with respect to the reference image 700. Subsequently, the terminal device 10 detects that the captured image 704 is tilted around the Y axis with respect to the reference image 700. In order to reduce the inclination with respect to the reference image 700, the terminal device 10 displays an object 774 that prompts the terminal device 10 to rotate around the Y axis on the screen 820. When the terminal device 10 is rotated, a screen 840 including the captured image 708 is obtained.
  • the terminal device 10 compares the captured image 706 included in the screen 830 with the reference image 700, and detects the inclination of the captured image 706 with respect to the reference image 700. Subsequently, the terminal device 10 detects that the captured image 706 is tilted around the Z axis with respect to the reference image 700. In order to reduce the inclination with respect to the reference image 700, the terminal device 10 displays an object 776 that prompts the terminal device 10 to rotate around the Z axis on the screen 830. When the terminal device 10 is rotated, a screen 840 including the captured image 708 is obtained. Thereby, the recognition accuracy of the character string which shows a display measurement value can be improved.
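  • The patent describes the inclination detection only as a comparison of the shape and size of the display 24 between the reference image and the captured image; the heuristic below is one possible sketch of such a comparison (the corner detection, the thresholds, and the mapping of the hints to the X/Y/Z axes of FIG. 9 are all assumptions).

```python
def guess_needed_rotation(corners):
    """corners: dict with keys 'tl', 'tr', 'br', 'bl' giving (x, y) corner points
    of the display 24 quadrilateral detected in the captured image.
    Returns a rough hint about which rotation prompt to show, or None."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    tl, tr, br, bl = corners["tl"], corners["tr"], corners["br"], corners["bl"]
    top, bottom = dist(tl, tr), dist(bl, br)
    left, right = dist(tl, bl), dist(tr, br)

    if abs(top - bottom) / max(top, bottom) > 0.1:
        return "top/bottom edges differ: tilted about the display's horizontal axis"
    if abs(left - right) / max(left, right) > 0.1:
        return "left/right edges differ: tilted about the display's vertical axis"
    if abs(tr[1] - tl[1]) / max(abs(tr[0] - tl[0]), 1e-6) > 0.05:
        return "top edge is slanted: in-plane rotation"
    return None  # no rotation prompt needed
```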
  • the terminal device 10 may generate an image using an HDR (high dynamic range imaging) function.
  • FIG. 10 is a diagram for explaining a method of generating a captured image using the HDR function.
  • When imaging the display 24 of the sphygmomanometer 20, the terminal device 10 adjusts the exposure amount to generate a captured image 742 with less whiteout and a captured image 744 with less blackout. Then, the terminal device 10 generates a high dynamic range composite image 746 by combining the captured image 742 and the captured image 744.
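  • One way to implement this combination (an assumption; the patent does not name an algorithm) is exposure fusion, for example with OpenCV's MergeMertens, as sketched below.

```python
import cv2
import numpy as np

def fuse_exposures(img_less_whiteout, img_less_blackout):
    """Fuse a short-exposure image (less whiteout, e.g. image 742) and a
    long-exposure image (less blackout, e.g. image 744) into one image with a
    wider usable dynamic range (e.g. image 746). Inputs: 8-bit BGR images of
    the same size."""
    merge = cv2.createMergeMertens()
    fused = merge.process([img_less_whiteout, img_less_blackout])  # float32, roughly in [0, 1]
    return np.clip(fused * 255, 0, 255).astype(np.uint8)
```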
  • Terminal apparatus 10 according to the present embodiment further has the following functions.
  • the terminal device 10 includes identification information (device ID) of the health device, characteristic information such as the shape of the housing of the health device, character information provided on the housing, the position of the display, the position of the button or switch, and the like. Are stored in the memory 154 in association with each other.
  • The terminal device 10 compares the feature information extracted from the captured image of the health device with the feature information of the various health devices stored in the memory 154, so that the type, model, brand, and the like of the health device included in the captured image can be specified.
  • If any of the maximum blood pressure value, the minimum blood pressure value, and the pulse rate acquired by recognizing the character string of the display measurement value included in the captured image is an abnormal value, the terminal device 10 may notify the abnormality. Specifically, when the acquired maximum blood pressure value is not included in a predetermined range Ra (for example, 40 to 260 mmHg), the terminal device 10 notifies that the maximum blood pressure value is abnormal. When the acquired minimum blood pressure value is not included in a predetermined range Rb (for example, 40 to 260 mmHg), the terminal device 10 notifies that the minimum blood pressure value is abnormal.
  • When the acquired pulse rate is not included in a predetermined range Rc (for example, 40 to 180 beats/minute), the terminal device 10 notifies that the pulse rate is abnormal. In addition, when the minimum blood pressure value is higher than the maximum blood pressure value, the terminal device 10 notifies that these values are abnormal.
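  • A minimal sketch of these range checks, using the example ranges Ra, Rb, and Rc given above, could look as follows (the function and field names are assumptions).

```python
RANGE_RA = (40, 260)  # maximum blood pressure value, mmHg
RANGE_RB = (40, 260)  # minimum blood pressure value, mmHg
RANGE_RC = (40, 180)  # pulse rate, beats/minute

def check_abnormalities(max_bp, min_bp, pulse):
    """Return a list of abnormality messages to notify (empty if all values are normal)."""
    errors = []
    if not RANGE_RA[0] <= max_bp <= RANGE_RA[1]:
        errors.append("maximum blood pressure value is abnormal")
    if not RANGE_RB[0] <= min_bp <= RANGE_RB[1]:
        errors.append("minimum blood pressure value is abnormal")
    if not RANGE_RC[0] <= pulse <= RANGE_RC[1]:
        errors.append("pulse rate is abnormal")
    if min_bp > max_bp:
        errors.append("minimum blood pressure value exceeds maximum blood pressure value")
    return errors
```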
  • The terminal device 10 is configured not to store the same measurement result repeatedly in order to ensure the reliability of the measurement results and to use the memory 154 effectively. Specifically, the terminal device 10 compares the acquired measurement result with the measurement results already stored in the memory 154 and determines whether or not they match. For example, the terminal device 10 compares, between the acquired measurement result and each stored measurement result, the model specified by image recognition and the user number, the measurement date and time, and the measurement values included in each measurement result. When every piece of information of the acquired measurement result completely matches that of a stored measurement result, the terminal device 10 does not store the acquired measurement result in the memory 154. Thereby, the terminal device 10 can prevent measurement results from being stored redundantly.
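  • The duplicate check could be sketched as below (illustrative only; the record structure and key names are assumptions, while the compared fields follow the description).

```python
def should_store(new_result, stored_results):
    """new_result and each item of stored_results: dicts with keys such as
    'model', 'user_no', 'measured_at', 'max_bp', 'min_bp', 'pulse'.
    Returns False when an identical measurement result is already stored."""
    keys = ("model", "user_no", "measured_at", "max_bp", "min_bp", "pulse")
    for old in stored_results:
        if all(new_result.get(k) == old.get(k) for k in keys):
            return False  # completely matching result already stored; skip storing
    return True
```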
  • FIG. 11 is a block diagram illustrating a functional configuration of the terminal device 10.
  • the terminal device 10 includes an imaging unit 202, an output control unit 204, a recognition unit 206, a determination unit 208, a determination unit 210, an abnormality determination unit 212, an image generation unit 214, A combining unit 216 and an inclination detecting unit 218 are included.
  • the imaging unit 202 captures a subject and generates a captured image at predetermined intervals.
  • the imaging unit 202 generates a captured image including an image of the sphygmomanometer 20 as a subject.
  • the imaging unit 202 is a function that is realized mainly by the cooperation of the processor 152 and the camera 163.
  • the imaging unit 202 sequentially outputs the generated captured images to the output control unit 204, the recognition unit 206, and the image generation unit 214.
  • the output control unit 204 causes the display 158 to display captured images generated by sequentially capturing images by the image capturing unit 202. In one aspect, the output control unit 204 causes the display 158 to display a captured image generated by the imaging unit 202 and a guide for assisting the user's operation.
  • The guide is displayed in order to assist the user's imaging operation so that the recognition process is facilitated when the recognition unit 206 recognizes the character string indicating the measurement result displayed on the display 24 of the sphygmomanometer 20.
  • the guide includes an object (for example, a frame 510 and a translucent object 540) indicating the recognition range of the recognition unit 206.
  • the guide further includes objects (for example, semi-transparent objects 542, 544, and 546) that indicate each of the seven segments that exist in the object that indicates the recognition range.
  • the guide may include an object representing the shape of the sphygmomanometer 20 (for example, the object 520).
  • the recognition unit 206 recognizes a character string indicating the measurement result displayed on the display 24 of the sphygmomanometer 20 from the image of the sphygmomanometer 20 included in the captured image generated by the imaging unit 202.
  • the measurement result includes, for example, a measurement value (maximum blood pressure value, minimum blood pressure value, pulse rate), measurement date and time, temperature, user number, various messages, information (symbol) indicating appropriateness of how to wind the cuff, and a pulse wave state. Contains information (symbols) to indicate.
  • The recognition unit 206 also calculates, for each of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the measurement value (display measurement value) displayed on the display 24 (for example, the character string indicating the maximum blood pressure value), an accuracy representing the recognition accuracy of the recognized character.
  • the recognition unit 206 outputs a recognition result (recognized characters and accuracy) to the determination unit 208 and the determination unit 210.
  • the determining unit 208 determines, for each of the plurality of recognized characters, whether or not the accuracy of the recognized character is less than a threshold Th1.
  • When the determination unit 208 determines that the accuracy of at least one of the plurality of recognized characters is less than the threshold Th1, the recognition unit 206 further executes the process of recognizing the character string indicating the display measurement value from each of N (N is an integer of 1 or more) captured images further generated by the imaging unit 202.
  • the determination unit 210 determines a recognition character string that is a recognition result of the character string indicating the display measurement value as a measurement value of the sphygmomanometer 20 based on a predetermined condition.
  • Based on the N + 1 recognition results (that is, N + 1 recognized character strings) of the character string indicating the display measurement value (for example, the character string “118” representing the maximum blood pressure value), the determination unit 210 identifies, for each character included in the character string (the characters “1”, “1”, and “8”), the recognized character recognized the largest number of times among the at least one recognized character obtained as a recognition result of that character (for example, the two recognized characters “8” and “0” obtained as recognition results of the character “8”).
  • The determination unit 210 then determines the recognized character string composed of the identified recognized characters (for example, the recognized character string “118” composed of the recognized character “1” identified for the hundreds character “1”, the recognized character “1” identified for the tens character “1”, and the recognized character “8” identified for the ones character “8”) as the measurement value of the sphygmomanometer 20.
  • When the accuracy of each of the plurality of recognized characters is equal to or greater than the threshold Th1, the determination unit 210 determines the recognized character string composed of those recognized characters as the measurement value of the sphygmomanometer 20. In other words, when the accuracy of each recognized character included in the recognized character string first recognized by the recognition unit 206 is equal to or greater than the threshold Th1, the determination unit 210 determines that first recognized character string as the measurement value of the sphygmomanometer 20.
  • the determination unit 210 stores the determination result in the memory 154 and outputs it to the output control unit 204 and the abnormality determination unit 212.
  • the output control unit 204 outputs the determination result of the determination unit 210.
  • the output control unit 204 causes the display 158 to display the determination result.
  • The output control unit 204 may output the determination result to an external device via the communication interface 166 (or the wireless communication unit 160), or may output the determination result by voice via the speaker 168.
  • the abnormality determination unit 212 determines the presence or absence of an abnormality based on the numerical value represented by the recognized character string determined by the determination unit 210 as the measurement value of the sphygmomanometer 20. For example, the abnormality determination unit 212 determines that there is no abnormality when the numerical value is within a predetermined range, and determines that there is an abnormality when the numerical value is outside the predetermined range. The abnormality determination unit 212 outputs the determination result to the output control unit 204. The output control unit 204 outputs abnormality information when the abnormality determination unit 212 determines that there is an abnormality. For example, the output control unit 204 causes the display 158 to display an error notification indicating that the numerical value is outside a predetermined range.
  • The determination unit 208 may further determine, for each of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the display measurement value, whether or not the accuracy of the recognized character is less than a threshold Th2 (< threshold Th1). The determination unit 208 outputs the determination result to the image generation unit 214 or the inclination detection unit 218.
  • When the determination unit 208 determines that the accuracy of at least one recognized character (for example, the recognized character “0”) among the plurality of recognized characters (for example, the recognized characters “1”, “2”, “0”) is less than the threshold Th2, the image generation unit 214 generates, from the captured image generated by the imaging unit 202, a partial image (for example, the partial images 724 and 728) obtained by cutting out a region including each character (for example, the characters “1” and “2”) corresponding to each remaining recognized character (for example, the recognized characters “1” and “2”) other than the at least one recognized character.
  • the image generation unit 214 outputs the partial image to the synthesis unit 216.
  • the synthesizing unit 216 synthesizes a plurality of partial images generated by the image generating unit 214.
  • the composition unit 216 outputs the composite image to the recognition unit 206.
  • the recognition unit 206 executes processing for recognizing a character string indicating a display measurement value from the composite image (for example, the composite image 730) combined by the combining unit 216.
  • The inclination detection unit 218 detects the inclination, with respect to the reference image, of the captured image generated by the imaging unit 202.
  • the output control unit 204 causes the display 158 to display a rotation direction for reducing the inclination of the captured image with respect to the reference image.
  • the output control unit 204 displays each of the plurality of recognized characters included in the recognized character string, which is the recognition result of the character string indicating the display measurement value, on the display 158 with a predetermined color according to the accuracy of the recognized character. It may be displayed. In addition, the output control unit 204 may display information indicating the accuracy of the recognized character on the display 158. For example, information indicating the accuracy is displayed at a predetermined position in the frame 510. Further, the output control unit 204 may display information indicating the accuracy of the recognized character in the vicinity of the recognized character.
  • FIG. 12 is a flowchart illustrating an example of a processing procedure of the terminal device 10.
  • processor 152 of terminal device 10 generates a captured image of sphygmomanometer 20 captured using camera 163, and displays the captured image on display 158 (step S10).
  • the processor 152 displays a guide for assisting the user's operation on the display 158 (step S12).
  • the processor 152 recognizes a character string indicating the measurement value (display measurement value) displayed on the display 24 of the sphygmomanometer 20 from the captured image (step S14).
  • The processor 152 determines whether or not the accuracy of each of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the display measurement value is equal to or greater than the threshold Th1 (step S16).
  • When the accuracy of each recognized character is equal to or greater than the threshold Th1 (YES in step S16), the processor 152 determines the recognized character string composed of the plurality of recognized characters as the measurement value of the sphygmomanometer 20 (step S24), displays the determination result (the numerical value indicated by the recognized character string) on the display 158 (step S26), and ends the process.
  • When the accuracy of at least one recognized character is less than the threshold Th1 (NO in step S16), the processor 152 re-executes the process of recognizing the character string indicating the display measurement value from a captured image different from the captured image that was the target of the previous recognition process (step S18).
  • the processor 152 determines whether or not the character string recognition process indicating the display measurement value has been performed a predetermined number of times (step S20).
  • If the recognition process has not been executed the predetermined number of times (NO in step S20), the processor 152 repeats the process of step S18.
  • If the recognition process has been executed the predetermined number of times (YES in step S20), the processor 152 determines the recognized character string composed of the recognized characters specified based on the recognition results for the predetermined number of times as the measurement value of the sphygmomanometer 20 (step S22). Specifically, for each character included in the character string indicating the display measurement value, the processor 152 specifies the recognized character recognized the largest number of times among the at least one recognized character obtained by recognizing that character. Then, the processor 152 displays the determination result on the display 158 (step S26) and ends the process.
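  • For orientation, the procedure of FIG. 12 can be summarized as the following sketch (not the actual implementation; capture_image, recognize, and display are hypothetical stand-ins for the camera, the recognition processing, and the display, and the constants are the example values from the description).

```python
from collections import Counter

TH1 = 90         # first accuracy threshold (e.g. 90%)
EXTRA_TRIES = 2  # predetermined number of additional recognitions

def acquire_measurement(capture_image, recognize, display):
    image = capture_image()                     # step S10: capture and show the image
    chars, accuracies = recognize(image)        # step S14: recognize the display measurement value

    if all(a >= TH1 for a in accuracies):       # step S16: every accuracy at or above Th1?
        value = "".join(chars)                  # step S24: adopt the recognized string
        display(value)                          # step S26: show the determination result
        return value

    attempts = [chars]
    for _ in range(EXTRA_TRIES):                # steps S18, S20: re-recognize from new images
        more_chars, _ = recognize(capture_image())
        attempts.append(more_chars)

    # step S22: per-position majority vote over all recognition results
    value = "".join(Counter(col).most_common(1)[0][0] for col in zip(*attempts))
    display(value)                              # step S26
    return value
```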
  • FIG. 13 is a flowchart illustrating a modification of the processing procedure of the terminal device 10.
  • steps S50 to S56 are the same as the processes in steps S10 to S16 in FIG. 12, respectively, and therefore detailed description thereof will not be repeated.
  • When none of the accuracies of the plurality of recognized characters included in the recognized character string that is the recognition result of the character string indicating the display measurement value is less than the threshold Th1 (for example, 90%) (NO in step S56), the processes of steps S66 and S68 are executed. Since the processes in steps S66 and S68 are the same as the processes in steps S24 and S26 in FIG. 12, respectively, detailed description thereof will not be repeated.
  • When the accuracy of at least one recognized character is less than the threshold Th1 (YES in step S56), the processor 152 determines whether or not the accuracy of each of the plurality of recognized characters is equal to or greater than a threshold Th2 (for example, 40%) (step S58).
  • When the accuracy of each of the plurality of recognized characters is equal to or greater than the threshold Th2 (YES in step S58), the processes of steps S60 to S64 and S68 are executed. Since the processes in steps S60 to S64 and S68 are the same as the processes in steps S18 to S22 and S26 in FIG. 12, detailed description thereof will not be repeated.
  • When the accuracy of at least one recognized character is less than the threshold Th2 (NO in step S58), the processor 152 generates, from the generated captured image, a partial image by cutting out a region including each character corresponding to each remaining recognized character other than the at least one recognized character (that is, the recognized characters having a relatively high accuracy equal to or higher than the threshold Th2) (step S70).
  • the processor 152 stores the generated partial image in the memory 154.
  • the processor 152 determines whether or not a predetermined number of partial images are stored in the memory 154 (step S72). If the predetermined number of partial images is not stored (NO in step S72), processor 152 repeats the processing from step S50. If a predetermined number of partial images are stored (YES in step S72), processor 152 combines the predetermined number of partial images (step S74). The processor 152 executes processing for recognizing a character string indicating a display measurement value from the composite image (step S76).
  • the processor 152 determines whether or not the accuracy of each of the plurality of recognized characters included in the recognized character string recognized from the composite image is greater than or equal to the threshold Th1 (step S78). When the accuracy for each of the plurality of recognized characters is equal to or greater than threshold value Th1 (YES in step S78), processor 152 executes the process of step S66. When the accuracy of at least one recognized character among the plurality of recognized characters is less than the threshold Th1 (NO in step S78), the processor 152 notifies an error (step S80) and ends the process.
  • As described above, when the terminal device 10 determines that the accuracy of a recognized character is low, it executes the recognition process again a predetermined number of times, determines the recognized character string composed of the recognized characters adopted by majority vote as the measured value, and displays the determined recognized character string on the display 158. Therefore, a recognized character with low accuracy is not displayed as it is on the display 158, and the accuracy of the measurement value displayed on the display 158 can be improved.
  • A guide for assisting the user's operation is displayed. Moreover, since objects indicating a frame, the recognition range, the segments of a seven-segment display, and the like are displayed as guides, the user can be guided to image the sphygmomanometer 20 from an optimal position.
  • The terminal device 10 may evaluate the reliability of a measured value using various information contained in the measurement result.
  • The processor 152 (evaluation unit) of the terminal device 10 calculates the period from the measurement date to the imaging date and determines whether the calculated period is within a predetermined period. When the calculated period is not within the predetermined period, the processor 152 evaluates the reliability of the measurement result as low. In this case, the processor 152 may output an error notification indicating that the calculated period is not within the predetermined period.
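A minimal sketch of such a date-based check, assuming the measurement date read from the display and the imaging date are already available as datetime.date values and that the predetermined period is, say, seven days (both assumptions are illustrative, not taken from the patent):

```python
from datetime import date, timedelta

def is_measurement_recent(measurement_date: date, imaging_date: date,
                          allowed: timedelta = timedelta(days=7)) -> bool:
    """Return True if the period from measurement to imaging is within the
    predetermined period; otherwise the measurement result is treated as low reliability."""
    period = imaging_date - measurement_date
    return timedelta(0) <= period <= allowed

# A reading measured on Dec 20 and photographed on Dec 27 is still within the period.
print(is_measurement_recent(date(2016, 12, 20), date(2016, 12, 27)))  # True
```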
  • The terminal device 10 detects the inclination of the captured image 702 with respect to the reference image 700 (step A). Based on the detected inclination, the terminal device 10 corrects the captured image 702 so as to reduce the inclination with respect to the reference image 700 (step B). The terminal device 10 then executes processing for recognizing the character string indicating the displayed measurement value from the corrected captured image 702 (step C). The terminal device 10 determines whether or not the accuracy of every recognized character corresponding to the displayed measurement value is equal to or greater than a predetermined threshold (for example, the threshold Th1) (step D).
  • When the accuracy of every recognized character is equal to or greater than the predetermined threshold (YES in step D), the terminal device 10 displays the recognized character string formed of these recognized characters on the display 158 as the measured value of the sphygmomanometer 20 (step E).
  • Otherwise (NO in step D), the terminal device 10 displays on the display 158 an object (for example, objects 772, 774, and 776) that prompts the user to rotate the terminal device 10 in an appropriate direction so as to reduce the inclination with respect to the reference image 700 (step F), and the processing from step A is repeated.
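As one way to picture the correction in steps A and B, the rotation could be performed with a standard image-processing library. The sketch below uses OpenCV's affine rotation and assumes the inclination angle relative to the reference image 700 has already been estimated in degrees by some means such as feature matching; it is an illustrative assumption, not the device's prescribed method.

```python
import cv2
import numpy as np

def correct_inclination(captured: np.ndarray, inclination_deg: float) -> np.ndarray:
    """Rotate the captured image about its center to reduce the estimated
    inclination with respect to the reference image (steps A and B)."""
    h, w = captured.shape[:2]
    center = (w / 2.0, h / 2.0)
    # getRotationMatrix2D rotates counterclockwise for positive angles,
    # so pass the negated inclination to cancel it out.
    rotation = cv2.getRotationMatrix2D(center, -inclination_deg, 1.0)
    return cv2.warpAffine(captured, rotation, (w, h))
```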
  • A program that causes a computer to function and execute the control described in the above flowcharts can be provided as a program product recorded on a non-transitory computer-readable recording medium attached to the computer, such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), a secondary storage device, a main storage device, or a memory card. Alternatively, the program can be provided by being recorded on a recording medium such as a hard disk built into the computer. The program can also be provided by downloading via a network.
  • The program may be a program module that is provided as a part of the operating system (OS) of the computer and that calls necessary modules in a predetermined arrangement at a predetermined timing to execute processing. In that case, the program itself does not include those modules, and the processing is executed in cooperation with the OS. A program that does not include such modules can also be included in the program according to the present embodiment.
  • The program according to the present embodiment may also be provided by being incorporated into a part of another program. Even in this case, the program itself does not include the modules included in the other program, and the processing is executed in cooperation with the other program. A program incorporated into such another program can also be included in the program according to the present embodiment.
  • The configuration exemplified in the above-described embodiment is one example of the configuration of the present invention; it can be combined with other known techniques, and a part of it can be modified or omitted without departing from the gist of the present invention. In each of the above-described embodiments, the processing and configuration described in the other embodiments may be adopted as appropriate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Character Input (AREA)
  • Character Discrimination (AREA)

Abstract

According to the present invention, this terminal device (10) comprises: a recognition unit (206) that performs processing for recognizing, from an image of a health device included in a captured image, a character string indicating a measurement value displayed on a display of the health device; and a determination unit (208) that determines, for each of a plurality of recognized characters included in the recognized character string, whether or not the certainty, which represents the recognition accuracy of the recognized character, is below a first threshold value. If the certainty is below the first threshold value, the recognition unit (206) further performs processing for recognizing the character string indicating the displayed measurement value from N captured images (N being an integer greater than or equal to 1) additionally generated by the imaging unit (202). The terminal device (10) further comprises: a determination unit (210) that, on the basis of the N+1 recognition results of the character string indicating the displayed measurement value, determines the recognized character string composed of the specified recognized characters to be the measurement value of the health device; and an output control unit (204) that outputs the determination result.
PCT/JP2017/037990 2016-12-28 2017-10-20 Dispositif terminal WO2018123229A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
RU2019118662A RU2727464C1 (ru) 2016-12-28 2017-10-20 Оконечное устройство
MX2019007047A MX2019007047A (es) 2016-12-28 2017-10-20 Dispositivo terminal.
CN201780076261.3A CN110088770B (zh) 2016-12-28 2017-10-20 终端装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-256224 2016-12-28
JP2016256224A JP6804292B2 (ja) 2016-12-28 2016-12-28 端末装置

Publications (1)

Publication Number Publication Date
WO2018123229A1 true WO2018123229A1 (fr) 2018-07-05

Family

ID=62708025

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/037990 WO2018123229A1 (fr) 2016-12-28 2017-10-20 Dispositif terminal

Country Status (5)

Country Link
JP (1) JP6804292B2 (fr)
CN (1) CN110088770B (fr)
MX (1) MX2019007047A (fr)
RU (1) RU2727464C1 (fr)
WO (1) WO2018123229A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112001392A (zh) * 2019-05-27 2020-11-27 株式会社东芝 读取系统、读取方法、存储介质以及移动体

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7248483B2 (ja) * 2019-03-29 2023-03-29 株式会社日立システムズ メータ読み取り作業装置および方法ならびにメータ読み取り作業支援システム
JP7253957B2 (ja) * 2019-03-29 2023-04-07 株式会社日立システムズ メータ読み取り作業装置および方法ならびにメータ読み取り作業システム
JP7290498B2 (ja) * 2019-07-26 2023-06-13 株式会社ミツトヨ デジタル表示読取装置及びプログラム

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008099664A1 (fr) * 2007-02-15 2008-08-21 Mitsubishi Heavy Industries, Ltd. Dispositif de reconnaissance du numéro d'immatriculation d'un véhicule
JP2010218061A (ja) * 2009-03-13 2010-09-30 Toshiba Corp 画像処理装置
JP2013125391A (ja) * 2011-12-14 2013-06-24 Asutemu:Kk テレビジョン装置、情報処理方法、およびプログラム
JP2014137713A (ja) * 2013-01-17 2014-07-28 Hidetoshi Konishi データ識別方式

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3587775B2 (ja) * 1999-09-09 2004-11-10 松下電器産業株式会社 表示データ解析装置および記録媒体
JP4655335B2 (ja) * 2000-06-20 2011-03-23 コニカミノルタビジネステクノロジーズ株式会社 画像認識装置、画像認識方法および画像認識プログラムを記録したコンピュータ読取可能な記録媒体
JP2005352705A (ja) * 2004-06-09 2005-12-22 Omron Corp パターン認識装置、パターン認識方法及び文字認識方法。
US7587308B2 (en) * 2005-11-21 2009-09-08 Hewlett-Packard Development Company, L.P. Word recognition using ontologies
JP5146190B2 (ja) * 2008-08-11 2013-02-20 オムロン株式会社 文字認識装置、文字認識プログラム、および文字認識方法
JP2011081763A (ja) * 2009-09-09 2011-04-21 Sony Corp 情報処理装置、情報処理方法及び情報処理プログラム
JP5075291B2 (ja) * 2010-02-26 2012-11-21 楽天株式会社 情報処理装置、情報処理方法、情報処理プログラムを記録した記録媒体
JP5647919B2 (ja) * 2011-03-07 2015-01-07 株式会社Nttドコモ 文字認識装置、文字認識方法、文字認識システム、および文字認識プログラム
JP5982844B2 (ja) * 2012-02-06 2016-08-31 オムロン株式会社 文字読取用のプログラムおよび文字読取装置
JP5831420B2 (ja) * 2012-09-28 2015-12-09 オムロン株式会社 画像処理装置および画像処理方法
CN103902993A (zh) * 2012-12-28 2014-07-02 佳能株式会社 文档图像识别方法和设备
CN104008384B (zh) * 2013-02-26 2017-11-14 山东新北洋信息技术股份有限公司 字符识别方法和字符识别装置
CN104217203B (zh) * 2013-06-03 2019-08-23 支付宝(中国)网络技术有限公司 复杂背景卡面信息识别方法及系统
CN103729636B (zh) * 2013-12-18 2017-07-25 小米科技有限责任公司 字符切割方法、装置及电子设备
JP6268023B2 (ja) * 2014-03-31 2018-01-24 日本電産サンキョー株式会社 文字認識装置およびその文字切り出し方法
CA2885880C (fr) * 2014-04-04 2018-07-31 Image Searcher, Inc. Traitement d'image comprenant la selection d'objets
CN105675626B (zh) * 2016-02-26 2018-08-07 广东工业大学 一种轮胎模具的字符缺陷检测方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008099664A1 (fr) * 2007-02-15 2008-08-21 Mitsubishi Heavy Industries, Ltd. Dispositif de reconnaissance du numéro d'immatriculation d'un véhicule
JP2010218061A (ja) * 2009-03-13 2010-09-30 Toshiba Corp 画像処理装置
JP2013125391A (ja) * 2011-12-14 2013-06-24 Asutemu:Kk テレビジョン装置、情報処理方法、およびプログラム
JP2014137713A (ja) * 2013-01-17 2014-07-28 Hidetoshi Konishi データ識別方式

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112001392A (zh) * 2019-05-27 2020-11-27 株式会社东芝 读取系统、读取方法、存储介质以及移动体
JP2020194281A (ja) * 2019-05-27 2020-12-03 株式会社東芝 読取システム、読取方法、プログラム、記憶媒体、及び移動体
US11694451B2 (en) 2019-05-27 2023-07-04 Kabushiki Kaisha Toshiba Reading system, reading method, storage medium, and moving body
US12073635B2 (en) 2019-05-27 2024-08-27 Kabushiki Kaisha Toshiba Reading system, reading method, storage medium, and moving body
CN112001392B (zh) * 2019-05-27 2024-12-24 株式会社东芝 读取系统、读取方法、存储介质以及移动体

Also Published As

Publication number Publication date
RU2727464C1 (ru) 2020-07-21
JP2018109801A (ja) 2018-07-12
JP6804292B2 (ja) 2020-12-23
MX2019007047A (es) 2019-09-06
CN110088770A (zh) 2019-08-02
CN110088770B (zh) 2023-07-07

Similar Documents

Publication Publication Date Title
WO2018123227A1 (fr) Dispositif terminal
US11589825B2 (en) Systems and methods for blood pressure estimation using smart offset calibration
WO2018123229A1 (fr) Dispositif terminal
KR20190101456A (ko) 혈압 센서의 배치 및 접촉 품질의 사용자 피드백을 제공하기 위한 시스템 및 방법
CN109906052B (zh) 包括血压传感器的设备以及用于控制该设备的方法
US10959646B2 (en) Image detection method and image detection device for determining position of user
KR101978548B1 (ko) 안구 움직임 측정을 통한 어지럼 진단 서버, 방법, 및 이를 기록한 기록매체
CN109997178A (zh) 用于提醒紧急服务的计算机系统
JP6593176B2 (ja) 血圧補正情報生成装置、血圧測定装置、血圧補正情報生成方法、血圧補正情報生成プログラム
JP6726093B2 (ja) 情報処理システム
US11832925B2 (en) Electronic device for measuring biometric information and method for operating the same
KR20210060246A (ko) 생체 데이터를 획득하는 장치 및 그 방법
WO2022089101A1 (fr) Procédé et appareil de mesure de vitesse d'onde de pouls (vop) basés sur un dispositif électronique portable
US11406330B1 (en) System to optically determine blood pressure
US20200074199A1 (en) IMAGE DETECTION METHOD AND IMAGE DETECTION DEVICE utilizing dual analysis
US20190365243A1 (en) Blood-pressure-related information display apparatus and method
US11257246B2 (en) Image detection method and image detection device for selecting representative image of user
US20240081785A1 (en) Key frame identification for intravascular ultrasound based on plaque burden
US20240245385A1 (en) Click-to-correct for automatic vessel lumen border tracing
US20240081781A1 (en) Graphical user interface for intravascular ultrasound stent display
US20240081666A1 (en) Trend lines for sequential physiological measurements of vessels
US20240086025A1 (en) Graphical user interface for intravascular ultrasound automated lesion assessment system
US20240081782A1 (en) Graphical user interface for intravascular ultrasound calcium display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17886674

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2019118662

Country of ref document: RU

122 Ep: pct application non-entry in european phase

Ref document number: 17886674

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载