US20130343659A1 - Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program
- Publication number: US20130343659A1 (application No. US 13/971,704)
- Authority: United States
- Prior art keywords: subject, image, imaged, images, unit
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N1/21 — Intermediate information storage (scanning, transmission or reproduction of documents, e.g. facsimile transmission)
- G06K9/6267
- G06F16/5866 — Retrieval of still image data characterised by using metadata generated manually, e.g. tags, keywords, comments, manually generated location and time information
- G06F18/24 — Pattern recognition: classification techniques
- G06Q50/01 — ICT specially adapted for specific business sectors: social networking
- G06V30/224 — Character recognition characterised by the type of writing of printed characters having additional code marks or containing code marks
- G06V40/172 — Recognition of human faces, e.g. facial parts, sketches or expressions: classification, e.g. identification
- G06V40/179 — Recognition of human faces: metadata assisted face recognition
- H04N1/00336 — Connection or combination of a still picture apparatus with an apparatus performing pattern recognition, e.g. of a face or a geographic feature
- H04N23/61 — Control of cameras or camera modules based on recognised objects
- H04N9/8233 — Transformation of the television signal for recording, the additional multiplexed signal being a character code signal
- H04N2201/3214 — Display, printing, storage or transmission of additional information relating to a job, e.g. communication, capture or filing of an image: of a date
- H04N2201/3215 — Display, printing, storage or transmission of additional information relating to a job: of a time or duration
- H04N2201/3274 — Storage or retrieval of prestored additional information
Description
- 1. Technical Field
- The present invention relates to an electronic device, an electronic device control method, and a computer-readable recording medium having stored thereon an electronic device control program.
- 2. Related Art
- Conventionally, an album creating apparatus for creating an album has been proposed. Such an apparatus uses supplementary information, such as dates and times of imaging and sites of imaging, that is added to images that have been imaged (for example, Patent Literature 1).
- Patent Literature 1: Japanese Patent No. 4208113
- In conventional techniques, an album is created based on supplementary information about a single person, and use of the supplementary information is limited. It is therefore an object of an aspect of the innovations herein to provide an electronic device, an electronic device control method, and a computer-readable recording medium having stored thereon an electronic device control program that are capable of overcoming the above drawbacks accompanying the related art.
- The above and other objects can be achieved by the combinations described in the independent claims. The dependent claims define advantageous, specific examples of the present invention.
- a first aspect of the present invention provides an electronic device including: a storage unit that stores therein a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject who is different from the first subject is imaged, and stores therein supplementary information about the first subject and the second subject; and an extracting unit that extracts images whose dates and times of imaging are different from each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
- a second aspect of the present invention provides an electronic device including: a storage unit that stores therein a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject who is different from the first subject is imaged, and stores therein supplementary information about the first subject and the second subject; and an extracting unit that extracts images such that time axes of the first subject and the second subject relatively align with each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
- a third aspect of the present invention provides a program that controls a computer to perform: a storage function of storing a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject who is different from the first subject is imaged, and storing supplementary information about the first subject and the second subject; and an extraction function of extracting images such that time axes of the first subject and the second subject relatively align with each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
- The summary clause does not necessarily describe all necessary features of the embodiments of the present invention. The present invention may also be a sub-combination of the features described above.
- FIG. 1 is a diagram that shows an album creation system 1 according to an embodiment.
- FIG. 2 is a flowchart of image input (S 10) performed by the server-side control unit 35.
- FIG. 3 shows an example of user supplementary information.
- FIG. 4 shows an example of image supplementary information.
- FIG. 5 shows an example of theme setting information.
- FIG. 6 is a flowchart of pair album creation (S 20) performed by the server-side control unit 35.
- FIG. 7 shows an example of templates.
- FIG. 8 is a conceptual diagram for explaining time axes of images extracted with the theme shown in FIG. 5.
- FIG. 9 shows another example of theme setting information.
- FIG. 10 is a conceptual diagram for explaining time axes of images extracted with the theme shown in FIG. 9.
- Hereinafter, embodiments of the present invention will be described. The embodiments do not limit the invention according to the claims, and not all of the combinations of the features described in the embodiments are necessarily essential to the means provided by aspects of the invention.
- FIG. 1 is a diagram that shows an album creation system 1 according to an embodiment.
- the album creation system 1 has a plurality of communication devices 2-1, 2-2, . . . , 2-N that transfer images and make requests for album creation, and a server 3 that stores therein images from the plurality of communication devices 2 and creates an album.
- the communication devices 2 may be devices such as a personal computer, a cellular phone, a digital camera, and a PDA.
- the explanation of the present embodiment takes a digital camera as an example.
- the communication device 2 has: an imaging unit 20 that performs imaging; a nonvolatile flash memory 21 that stores therein images imaged by the imaging unit 20 and various types of data; a face recognizing unit 22 ; a display unit 23 that displays images imaged by the imaging unit 20 ; a GPS (global positioning system) module 24 that detects the position (absolute position) of the communication device 2 ; a calendar unit 25 that stores therein dates and times; a communication unit 26 that communicates with the server 3 ; and a control unit 27 that controls the communication device 2 .
- An example of the display unit 23 is a liquid crystal display, and a touch panel for inputting various types of information may be provided to the display unit.
- Examples of the communication unit 26 include a wireless communication unit that accesses an electric communication line such as the Internet, and a Felica (registered trademark) chip.
- the imaging unit 20 has: an imaging lens that includes a focus lens; a focus detection system that detects a focus position of the focus lens; an imaging device that images a light flux from the imaging lens; and a meta information creating unit that adds, to images, imaging-related information such as focus information detected by the focus detection system, dates and times of imaging acquired from the calendar unit 25 , and sites of imaging acquired from the GPS module 24 .
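- As a rough illustration only, the imaging-related information described above could be represented as a small record like the following Python sketch; the class and field names (ImagingInfo, focus_position, captured_at, site, subjects) are assumptions for illustration, not the patent's actual data format.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class ImagingInfo:
    """Imaging-related information attached to each image by the meta
    information creating unit; all field names are illustrative."""
    focus_position: Optional[float]        # from the focus detection system
    captured_at: Optional[datetime]        # date and time from the calendar unit 25
    site: Optional[Tuple[float, float]]    # (latitude, longitude) from the GPS module 24
    subjects: Tuple[str, ...] = ()         # filled in later by the face recognizing unit 22

photo = ImagingInfo(
    focus_position=1.8,
    captured_at=datetime(2009, 4, 6, 10, 30),
    site=(35.6586, 139.7454),
    subjects=("father",),
)
```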
- the flash memory 21 stores therein images imaged by the imaging unit 20 as described above. Furthermore, the flash memory 21 stores therein facial images and attribute information corresponding to the facial images that are used for facial recognition in images imaged by the imaging unit 20 .
- the attribute information is input to the communication device 2 by a user using, for example, a touch panel provided to the display unit 23 .
- the attribute information includes names, birth dates, addresses, and group information such as the relationship (for example, a family relationship) with the user of the communication device 2. Note that even if any piece of the attribute information is lacking on the side of the communication device 2, imaging is not adversely affected.
- the face recognizing unit 22 detects a face in an image imaged by the imaging unit 20 , and recognizes whose face the face is based on facial images stored in the flash memory 21 .
- the face recognizing unit 22 may recognize, in addition to human faces, faces of pets such as dogs and cats.
- the recognition result of the face recognizing unit 22 is added, by the meta information creating unit, to the imaging-related information as subject-related information.
- facial recognition of pets may be performed by the technique disclosed, for example, in Japanese Patent Application Publication No. 2011-19013.
- the control unit 27 has a CPU, and transmits images to the server 3 via the communication unit 26 . At this time, the control unit 27 may transmit imaging-related information created by the meta information creating unit of the imaging unit 20 in association with each image.
- images transmitted to the server 3 by the communication device 2 may be either still images or motion images, and if the communication device 2 has a keyboard or a microphone, text data or audio data may be transmitted to the server 3 .
- images and audio data may be transmitted from a digital camera to the server 3
- text data may be transmitted from a personal computer to the server 3 .
- control unit 27 transmits, to the server 3 , attribute information stored in the flash memory 21 .
- the control unit 27 may transmit the attribute information in association with images, or may transmit the attribute information as separate information without associating it with images.
- control unit 27 transmits, to the server 3 , attribute information of new members, and updated attribute information if there is any update.
- the server 3 has a server-side communication unit 30 , an analyzing unit 31 , an information extracting unit 32 , a server-side flash memory 33 , an album creating unit 34 and a server-side control unit 35 .
- the server 3 may be a personal computer used by the user of the communication device 2 , or an information processing device that is managed by an administrator who is not the user.
- the server-side communication unit 30 communicates with the communication unit 26 of the communication device 2 . Although it is assumed in the present embodiment that communication is established via an electric communication line, wired communication or wireless communication may be employed.
- the analyzing unit 31 analyzes data received by the server-side communication unit 30 .
- the analyzing unit 31 is provided with an image analyzing unit 31 a and a text data analyzing unit 31 b. Note that an audio data analyzing unit may be additionally provided.
- the image analyzing unit 31 a has an OCR (optical character reader) unit, a metadata analyzing unit and a face recognizing unit.
- the OCR unit reads out characters in an image.
- the OCR unit converts characters in an image, such as “XX Elementary School, Graduation Ceremony, 2009” and “YY Zoo”, into text data.
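- The patent does not name an OCR implementation; as one hedged illustration, an off-the-shelf engine such as Tesseract (via the pytesseract package) could fill the OCR unit's role of converting characters in an image into text data:

```python
from PIL import Image   # pip install pillow
import pytesseract      # pip install pytesseract (requires the Tesseract engine)

def read_characters(image_path: str) -> str:
    """Convert characters appearing in an image, such as a banner reading
    'XX Elementary School, Graduation Ceremony, 2009', into text data."""
    return pytesseract.image_to_string(Image.open(image_path))
```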
- the metadata analyzing unit analyzes a focus position, a subject-related information, a date and time of imaging, and a site of imaging that are included in imaging-related information associated with an image, and generates text data.
- the face recognizing unit detects faces in a transmitted image when subject-related information is not included in imaging-related information associated with the image for reasons such as that the communication device 2 does not have a facial recognition function.
- the face recognizing unit further identifies a face detected based on a facial image stored in the server-side flash memory 33 described below.
- the text data analyzing unit 31 b compares text data input by the server-side communication unit 30 with text data created as a result of conversion by the OCR unit, and compares a date and time of imaging and a site of imaging included in imaging-related information with that text data. For example, the text data analyzing unit 31 b checks whether a site of imaging (latitude/longitude) detected by the GPS module 24 matches the text data created by the OCR unit (whether there is inconsistency between them), or whether a date and time of imaging acquired from the calendar unit 25 matches that text data.
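- A minimal sketch of such a consistency check follows; the gazetteer mapping site names to coordinates is hypothetical (a real system would consult a geocoding service), and the tolerance is an arbitrary assumption:

```python
from math import hypot

# Hypothetical gazetteer of site names to (latitude, longitude).
KNOWN_SITES = {
    "YY Zoo": (35.67, 139.77),
    "XX Elementary School": (35.61, 139.63),
}

def site_consistent(gps_site, ocr_text, tolerance_deg=0.05):
    """Check that a site named in OCR text does not contradict the
    GPS-detected site of imaging."""
    for name, (lat, lon) in KNOWN_SITES.items():
        if name in ocr_text:
            return hypot(gps_site[0] - lat, gps_site[1] - lon) <= tolerance_deg
    return True  # no known site name found, so nothing to contradict

print(site_consistent((35.67, 139.77), "YY Zoo, family trip"))  # True
```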
- the information extracting unit 32 extracts information acquired as a result of analysis by the analyzing unit 31 , inquires of the communication device 2 via the server-side communication unit 30 when there is inconsistency or unclarity in the information acquired as a result of analysis by the analyzing unit 31 , and extracts further information via the server-side communication unit 30 .
- Examples of situations where the information extracting unit 32 extracts further information include events, such as festivals and fireworks shows, that are determined to be held at the site based on a site of imaging and a date and time of imaging.
- the server-side flash memory 33 is a nonvolatile memory that stores therein images transmitted from the communication device 2 and image supplementary information in association with the images.
- the server-side flash memory 33 further stores therein user supplementary information about the user, theme setting information about themes for creating pair albums, and templates for arranging images based on the themes. The information is further described below.
- the album creating unit 34 creates an album by associating a plurality of images whose dates and times of imaging are different. For example, when creation of an album for enrollment ceremonies of junior high schools for family members is requested, the album creating unit 34 creates an album based on images of the enrollment ceremony of a junior high school for each of the father, the mother, the eldest son, and the eldest daughter stored in the server-side flash memory 33 .
- the album creating unit 34 may create an album based on images of or near a certain site that are imaged before a groom and a bride get to know each other.
- the album creating unit 34 has an image extracting unit 36 , an image comparing unit 37 and an image processing unit 38 .
- the image extracting unit 36 extracts images, from the server-side flash memory 33 , according to a theme that is designated based on control of the server-side control unit 35 .
- the image comparing unit 37 compares a plurality of images with each other. For example, when the image comparing unit 37 extracts images for a certain theme and there is a plurality of pictures of the father that are suited for the theme, the image comparing unit 37 extracts pictures that are determined to be the best among them such as images that show smiles of the father based on smile detection. Furthermore, the image comparing unit 37 compares pictures extracted for each family member, and switches from one image to another. For example, if pictures of other family members than the father each show only one corresponding person, a picture that shows only the father is selected, and if, in several pictures, there is cherry blossom in the background, other pictures are switched to those with cherry blossom in the background. The overall balance is sought in this way. At this time, for example, a picture in which the mother and the eldest daughter face each other (the mother is positioned on the right, and the eldest daughter is positioned on the left) may be extracted.
- the image processing unit 38 processes images extracted based on the result of comparison by the image comparing unit 37. At this time, the image processing unit 38 performs processing such as resizing, trimming, and scaling of images, and processing for displaying, together with the images, titles and subtitles such as dates and times, and sites of imaging.
- the server-side control unit 35 controls entire operation of the server 3 , and in the present embodiment, performs control of album creation based on themes. Note that each of the functions of the server 3 may be implemented by a software program.
- FIG. 2 is a flowchart about image input (S 10 ) performed by the server-side control unit 35 , and hereinafter, operation up to the image input is explained with reference to the flowchart.
- the operation of the step S 10 starts when the communication device 2 makes a request, to the server 3 , for input of an image.
- the server-side control unit 35 performs user registration with the communication device 2 (S 110 ). At this time, the server-side control unit 35 may acquire attribute information from the communication device 2 via the communication unit 26 and the server-side communication unit 30 . Furthermore, the server-side control unit 35 assigns two user IDs, like the ones shown below, to the user.
- “FAM” in an ID such as No. 123456-FAM-1 indicates FAMILY, and the server-side control unit 35 registers the family as group information.
- all images registered in the server 3 with the ID No. 123456-FAM-1 may be allowed to be used.
- “-1” at the end indicates the first generation of the family. For example, if the children are to go out on their own in the future, they may have an ID No. 123456-FAM-2 in addition to ID No. 123456-FAM-1.
- “FRI” in an ID such as No. 123456-FRI-1 indicates FRIEND, and the server-side control unit 35 registers a friend as group information.
- all images registered in the server 3 with the ID No. 123456-FRI-1 may be allowed to be used, or alternatively approval may be required for using each of the images.
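- Reading the ID format off the examples above (a user number, a group tag, and a generation number separated by hyphens), a parser might look like this sketch; the format is inferred from the text, not specified by the patent:

```python
def parse_user_id(user_id):
    """Split an ID such as '123456-FAM-1' into a user number, a group tag
    ('FAM' for family, 'FRI' for friends), and a generation number."""
    number, group, generation = user_id.split("-")
    return {"number": number, "group": group, "generation": int(generation)}

print(parse_user_id("123456-FAM-1"))
# {'number': '123456', 'group': 'FAM', 'generation': 1}
```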
- the server-side control unit 35 generates user supplementary information based on a user ID and attribute information, and stores the user supplementary information in the server-side flash memory 33 .
- FIG. 3 shows an example of user supplementary information.
- the user supplementary information includes the birth date and the address of the user associated with the user ID, and group information of the family members.
- the group information of the family members includes relationship with the user for identifying a family member, and the birth dates of the family members.
- the user supplementary information further includes group information of friends associated with the user ID.
- the group information of friends includes names of the friends. As in the case of the information of the family members, the birth dates of the friends may be added as the group information.
- the server-side control unit 35 displays, on the communication device 2 , a screen for prompting the user to input the information and acquires the information.
- a purpose of registering birth dates of the user and his/her family members at the step S 110 is to determine the ages of the user and his/her family members at the time of imaging by comparing the birth dates with dates and times of imaging of input images. For example, when a pair album with a theme of images that are imaged when a family member is 10 years old is to be created, the server-side control unit 35 uses the image extracting unit 36 to extract images, from the server-side flash memory 33, that were imaged 10 years after the birth date. Similarly, when a pair album at the time of enrollment in an elementary school is to be created, the server-side control unit 35 extracts images, from the server-side flash memory 33, that were imaged in April and six years after the birth date.
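- The age comparison just described amounts to the following sketch, assuming each image carries a 'captured_at' date (the dict keys are illustrative):

```python
from datetime import date

def imaged_at_age(images, birth_date, age_years, month=None):
    """Select images imaged a given number of years after the birth date,
    optionally restricted to a month such as April (4) for enrollment themes."""
    result = []
    for img in images:
        d = img["captured_at"]
        age = d.year - birth_date.year - ((d.month, d.day) < (birth_date.month, birth_date.day))
        if age == age_years and (month is None or d.month == month):
            result.append(img)
    return result

images = [{"file": "entry.jpg", "captured_at": date(2011, 4, 6)}]
print(imaged_at_age(images, date(1999, 2, 1), 12, month=4))  # keeps entry.jpg
```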
- the server-side control unit 35 confirms a user ID prior to input of images (step S 112 ).
- authentication is performed with the ID No. 123456-FAM-1.
- the server-side control unit 35 inputs, via the communication unit 26 and the server-side communication unit 30 , images and imaging-related information created by the meta information creating unit of the imaging unit 20 (S 114 ). At this time, the server-side control unit 35 may acquire, from the communication device 2 , images one by one or multiple images collectively.
- the server-side control unit 35 creates image supplementary information based on the images and the imaging-related information, and stores the image supplementary information in the server-side flash memory 33 .
- FIG. 4 shows an example of image supplementary information.
- the image supplementary information includes focus information, subject-related information, a date and time of imaging, and a site of imaging associated with a file name of an image.
- the subject-related information includes information for identifying a subject, and information for identifying orientation of the subject.
- the date and time of imaging includes a date and year, and an event name of an event at which the image is imaged.
- the site of imaging includes latitude/longitude, and a name of a site where the image is imaged. Note that “*” in the example shown in FIG. 4 indicates that applicable information is not available.
- the server-side control unit 35 analyzes an input image by using the analyzing unit 31, and extracts additional information for the image supplementary information by using the information extracting unit 32 to update the image supplementary information stored in the server-side flash memory 33.
- facial recognition by the image analyzing unit 31 a and collection of event information by the information extracting unit 32 are performed. Note that when each piece of the image supplementary information has already been input, analysis of the piece by the analyzing unit 31 may be omitted, or if inconsistency is found as a result of analysis of the piece by the analyzing unit 31 , such inconsistency may be confirmed with the user.
- the server-side control unit 35 confirms whether there are matters to be confirmed with the user about an input image (S 116). At this time, the server-side control unit 35 may judge whether there are matters to be confirmed based on whether the image supplementary information meets predetermined conditions at the time when the update of the image supplementary information at the step S 114 has completed. For example, the server-side control unit 35 judges that there is a matter to be confirmed under the condition that the fields of the date and time of imaging, that is, the date and year and the event name, both indicate “*”. Also, the server-side control unit 35 may judge that there is a matter to be confirmed under the condition that the fields of the site of imaging, that is, the latitude/longitude and the name of a site, both indicate “*”.
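- The condition just stated can be sketched as a small predicate over the image supplementary information, using the patent's "*" marker for missing values (the field names are illustrative):

```python
MISSING = "*"  # marker meaning 'applicable information is not available'

def needs_confirmation(info):
    """Confirmation is needed when both fields of the date and time of
    imaging, or both fields of the site of imaging, are missing."""
    date_unknown = info["date_year"] == MISSING and info["event_name"] == MISSING
    site_unknown = info["lat_lon"] == MISSING and info["site_name"] == MISSING
    return date_unknown or site_unknown

info = {"date_year": "*", "event_name": "*", "lat_lon": "35.6/139.7", "site_name": "YY Zoo"}
print(needs_confirmation(info))  # True: both date fields are unknown
```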
- when there are matters to be confirmed, the process proceeds to the step S 118.
- at the step S 118, the server-side control unit 35 inquires about information lacking from the image supplementary information.
- the server-side control unit 35 ends the inquiry once the image supplementary information has met the conditions as a result of the inquiry.
- for example, the server-side control unit 35 inquires about a site of imaging when the GPS module 24 is not provided to the communication device 2 and thus the site of imaging cannot be identified. Also, when an unregistered facial image is included, for example, in a plurality of images, the server-side control unit 35 inquires whether the facial image should be registered as a facial image of a user.
- an inquiry may also be made about an event, or, if the image was imaged in March, about whether the image should be identified as an image imaged at a graduation ceremony.
- note that the server-side control unit 35 may acquire text data created by the user from a blog or Twitter (registered trademark) and analyze the text data by using the text data analyzing unit 31 b; thereby, inquiries to the user can be omitted or their frequency can be reduced. Specifically, the user may be prompted to input his or her blog or Twitter (registered trademark) accounts as part of the user supplementary information shown in FIG. 3.
- the analysis of an image by using the analyzing unit 31 and the extraction of additional information by using the information extracting unit 32 may be performed at timing that is different from the time of image input, and accordingly, the inquiries at the step S 118 may be performed after the image input.
- the server-side control unit 35 confirms with the user whether there is further image input (S 120).
- the server-side control unit 35 confirms whether the user wishes to input any image after changing the user ID to No. 123456-FRI-1, and if the user wishes to do so, the process returns to the step S 112 , and if not, the process proceeds to the step S 122 .
- the server-side control unit 35 asks the user whether to create a pair album (S 122).
- the album creation in this case is of two types. One is creation proposed by the server 3.
- the server-side control unit 35 proposes creation of a pair album by extracting pictures of family members with a theme of graduation ceremonies of elementary schools.
- the other type of the album creation is creation of a pair album according to a request by the user.
- a theme may be input as text by the user by using the communication device 2 , or alternatively the server 3 may present a list of themes to the communication device 2 to prompt the user to make a selection.
- the server-side control unit 35 proceeds to the flowchart S 20 shown in FIG. 6 if the result of judgment by the user at the step S 122 is Yes, and ends the process of the flowchart if the result of judgment by the user is No.
- FIG. 5 shows an example of theme setting information.
- the theme setting information includes keywords associated with a name of a theme, a template, a date and time of imaging, and a site of imaging.
- the date and time of imaging includes the number of years from a birth date, and timing when imaging is performed.
- the site of imaging includes latitude/longitude, and a name of a site where the imaging is performed. Note that “*” in the example shown in FIG. 5 indicates that applicable information is not to be used for extraction of images.
- the theme setting information is input in advance and is stored in the server-side flash memory 33 .
- FIG. 6 is a flowchart about pair album creation (S 20 ) performed by the server-side control unit 35 , and hereinafter, operation of pair album creation is explained with reference to the flowchart.
- the server-side control unit 35 confirms a theme of a pair album to be created (S 210 ). At this time, for example, the server-side control unit 35 receives input of free format text from the communication device 2 , calculates the degrees of coincidence between keywords included in the text, and keywords included in the theme setting information, and extracts a theme name with a high degree of coincidence. In place of receiving input of free format text, the server-side control unit 35 may display a list of theme names in the theme setting information, and receive a selection therefrom.
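- The degree-of-coincidence calculation is not specified further; one minimal stand-in is plain keyword overlap between the free-format text and each theme's keyword set, as in this sketch:

```python
def best_theme(free_text, theme_keywords):
    """Pick the theme whose keyword set coincides most with the user's
    free-format text."""
    words = set(free_text.lower().split())
    return max(theme_keywords, key=lambda name: len(words & theme_keywords[name]))

theme_keywords = {
    "enrollment ceremony of a junior high school":
        {"enrollment", "ceremony", "junior", "high", "school"},
    "premiere of piano concerts": {"piano", "concert", "premiere", "first"},
}
print(best_theme("pictures from the enrollment ceremony of a junior high school",
                 theme_keywords))
```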
- pair album creation with a theme of an enrollment ceremony of a junior high school is explained.
- the server-side control unit 35 confirms subjects for the album creation (S 212 ). It is assumed here that all the four family members (father, mother, eldest son and eldest daughter) registered in the user supplementary information of the user are album-creation subjects. Note that this step S 212 and the next step S 214 may be omitted when subjects for the pair album creation are family members.
- the server-side control unit 35 judges whether permission of the album-creation subjects is necessary (S 214).
- the judgment at the step S 214 is necessary when the group information indicates friends and use of all images is not approved. When permission of the album-creation subjects is necessary for using images, the process returns to the step S 212, and the permission is obtained from the album-creation subjects. Also, when the permission cannot be obtained from a certain album-creation subject, for example, within a preset length of time, that subject may be excluded from the group of album-creation subjects.
- the server-side control unit 35, based on the theme and in cooperation with the image extracting unit 36 of the album creating unit 34, extracts images stored in the server-side flash memory 33 (S 216). At this time, the image extracting unit 36 compares information about the date and time of imaging and the site of imaging included in the theme setting information of the theme with information about the date and time of imaging and the site of imaging included in the image supplementary information of each image to extract images.
- the image extracting unit 36 refers to the theme setting information shown in FIG. 5 , and extracts images whose date and time of imaging shows that they were imaged in April and 12 years after the birth date, that is, images imaged in April when the subject is 12 years old.
- the image extracting unit 36 also refers to the name of a site of imaging in the theme setting information. If a junior high school is included in the name of a site of imaging in the image supplementary information, that is, if the imaging-related information created by the meta information creating unit of the imaging unit 20 includes information about a junior high school, or if the OCR unit identifies characters like “XX Junior High School, Enrollment Ceremony”, extraction may be performed by giving weight to pictures with such information.
- the server-side control unit 35 may extract images by correcting timing of enrollment based on information such as a site of imaging and latitude/longitude.
- the server-side control unit 35 uses the image extracting unit 36 to extract, from the server-side flash memory 33 , pictures of the mother imaged in September when she was 11 years old.
- the image extracting unit 36 extracts, from the server-side flash memory 33 , images indicating the latitude/longitude of Hokkaido. That is, in theme setting information whose theme name is Hokkaido, the latitude/longitude of Hokkaido is set as the latitude/longitude of a site of imaging, and the image extracting unit 36 refers to the latitude/longitude, and compares the latitude/longitude with latitude/longitude of a site of imaging in the image attribute information of each image to extract images indicating the latitude/longitude of Hokkaido.
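- A latitude/longitude comparison of this kind reduces to a bounding-box test; the Hokkaido bounds below are rough approximations added for illustration:

```python
def in_region(images, lat_range, lon_range):
    """Keep images whose site of imaging falls inside a latitude/longitude window."""
    return [img for img in images
            if lat_range[0] <= img["lat"] <= lat_range[1]
            and lon_range[0] <= img["lon"] <= lon_range[1]]

images = [{"file": "sapporo.jpg", "lat": 43.06, "lon": 141.35},
          {"file": "tokyo.jpg", "lat": 35.68, "lon": 139.77}]
print(in_region(images, (41.3, 45.6), (139.3, 148.9)))  # keeps only sapporo.jpg
```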
- the server-side control unit 35, in cooperation with the image comparing unit 37 of the album creating unit 34, decides a layout of the album (S 218).
- the image comparing unit 37 refers to a layout included in the theme setting information to decide a layout.
- FIG. 7 shows an example of templates. Pictures 1 to 3 are to be arranged on the template shown in FIG. 7. Furthermore, a preferred orientation of subjects is set in association with the position of each picture, and the orientation is indicated with arrows. Note that “•” in FIG. 7 indicates an orientation toward the front side.
- the server-side control unit 35 may select a layout in which a picture of a child who has just enrolled in a junior high school is made larger than pictures of the other family members, or alternatively may select a layout in which pictures of all the family members are made almost equal in size.
- the image comparing unit 37 may determine the order of images of each album-creation subject based on judgment about smiles, blurs and closed eyes, the sizes of images, and the result of the above-described weighting. Furthermore, the image comparing unit 37 decides a layout such that, for example, older images are positioned on the left and newer images are positioned on the right. Also, when the template shown in FIG. 7 is used, the image comparing unit 37 extracts images such that, for example, the father and the mother face each other, by using the information about orientation of subjects included in the template. Parameters for deciding a layout may also include whether subjects in an image are arranged horizontally or vertically.
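- One possible reading of this selection-and-ordering step, sketched with illustrative 'smile' and 'captured_at' keys (the patent does not define a concrete scoring API):

```python
def order_for_layout(candidates):
    """Keep the best-scored image per subject, then arrange subjects with
    older images on the left and newer images on the right."""
    best = {}
    for img in candidates:
        current = best.get(img["subject"])
        if current is None or img["smile"] > current["smile"]:
            best[img["subject"]] = img
    return sorted(best.values(), key=lambda img: img["captured_at"])

candidates = [
    {"subject": "father", "smile": 0.9, "captured_at": "1975-04-07"},
    {"subject": "father", "smile": 0.4, "captured_at": "1975-04-08"},
    {"subject": "eldest son", "smile": 0.8, "captured_at": "2011-04-06"},
]
print([img["subject"] for img in order_for_layout(candidates)])  # ['father', 'eldest son']
```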
- the server-side control unit 35 may prompt the user to make a selection from, for example, two combinations of layouts and images.
- the server-side control unit 35, in cooperation with the image processing unit 38 of the album creating unit 34, performs processing such as resizing, trimming, and scaling of images, and processing for displaying, together with the images, titles and subtitles (dates and times, and sites of imaging) (step S 222).
- the server-side control unit 35 displays a created album on the display unit 23 of the communication device 2 that the user uses, via the server-side communication unit 30 (S 224 ). Then, the server-side control unit 35 confirms with the user whether there is necessity for correction about the page number of the album, the order of images to be shown, and process that has been performed on the images (S 226 ).
- the step S 220 and the step S 226 may be omitted.
- FIG. 8 is a conceptual diagram for explaining time axes of images extracted with the theme shown in FIG. 5 .
- images with a date and time of imaging, “birth year+12”, are extracted by the image extracting unit 36 .
- the theme, an enrollment ceremony of a junior high school, is an event that is commonly held when a child is 12 years old. Accordingly, the life time axes are relatively aligned with reference to the birth dates of both the child who has just enrolled in a junior high school and the family member 1, and images of corresponding events are extracted by the image extracting unit 36.
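- The relative alignment of life time axes comes down to offsetting each person's events by his or her own birth date, roughly as follows:

```python
from datetime import date

def corresponding_event_year(birth_date, age_at_event):
    """The same life event falls the same number of years after each
    person's own birth date."""
    return birth_date.year + age_at_event

# Different calendar years, but the same point (age 12) on each time axis.
print(corresponding_event_year(date(1999, 2, 1), 12))   # 2011
print(corresponding_event_year(date(1970, 6, 15), 12))  # 1982
```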
- the server 3 may create albums with themes such as an album for a tenth wedding anniversary and an album for a one-year-old child by relatively aligning life time axes based on character recognition by the OCR unit of the image analyzing unit 31 a and on sites of imaging detected by the GPS module 24 (the sites of wedding venues and maternity hospitals). Also, the server 3 may relatively align life time axes based on first images related to a certain event, as explained below with reference to FIG. 9.
- pair albums can be created in accordance with a theme of a group of people who share a hobby or a liking.
- the face portion may be masked, and the resolution of images may be made gradually higher as the number of times of pair album creation increases.
- scenery, which is not related to facial images, may be selected as a theme.
- FIG. 9 shows another example of theme setting information.
- the theme setting information shown in FIG. 9 includes, similar to FIG. 5 , keywords associated with a name of a theme, a template, a date and time of imaging, and a site of imaging.
- the theme is the premiere of piano concerts. The timing of a first piano concert differs for each person, and the timing cannot be judged based on birth dates.
- because the theme is a “premiere”, when a plurality of images which were imaged at events of piano concerts, whose subjects are the same single person, and whose dates and times of imaging are different are stored in the server-side flash memory 33, the one whose date of imaging is the earliest is reasonably estimated to be the image of the “premiere”. Accordingly, “first” is set in the field of the date and time of imaging of the theme setting information of FIG. 9.
- images may be extracted by making the OCR unit of the image analyzing unit 31 a read out the characters “Concerts” in images, or by detecting the sound of a piano by using the audio data analyzing unit.
- a template image of a piano may be stored in the server-side flash memory 33, and images of piano performance may be extracted by the server-side control unit 35 by performing pattern matching of images stored in the server-side flash memory 33 against the template image.
- the server-side control unit 35 may also extract, from the above-described information on a blog or Twitter (registered trademark), information related to piano concerts.
- the image extracting unit 36 extracts the image whose date of imaging is the earliest among images which were imaged at events of piano concerts, whose subjects are the same single person, and whose dates and times of imaging are different. Note that detection of first events like the one described above can be applied not only to piano performance but also to performance of other instruments and to participation in sports events.
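- The first-event detection described above is, in essence, a minimum-by-date per subject, as in this sketch (the dict keys are illustrative):

```python
def premiere_images(concert_images):
    """For each subject, keep the earliest-dated image among those imaged
    at the event: the reasonable estimate of that person's premiere."""
    earliest = {}
    for img in concert_images:
        current = earliest.get(img["subject"])
        if current is None or img["captured_at"] < current["captured_at"]:
            earliest[img["subject"]] = img
    return list(earliest.values())

concert_images = [
    {"subject": "user", "captured_at": "2005-03-21"},
    {"subject": "user", "captured_at": "2008-11-02"},
    {"subject": "friend 1", "captured_at": "2010-07-18"},
]
print(premiere_images(concert_images))  # one earliest image per person
```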
- FIG. 10 is a conceptual diagram for explaining time axes of images extracted with the theme shown in FIG. 9 .
- performance images 1 whose dates and times of imaging are the earliest among images of the user and a friend 1 , respectively, are extracted by the image extracting unit 36 .
- the length of the period from the birth date of the user to the date and time when the user's performance image 1 was imaged differs from the length of the period from the birth date of the friend 1 to the date and time when the friend 1's performance image 1 was imaged, and the dates and times of the two images are different from each other.
- nevertheless, the life time axes are relatively aligned, and images of corresponding events are extracted.
- theme setting information may be automatically generated by the server 3 .
- theme setting information may be generated by using, as a new theme name, an event name included in image supplementary information of the image.
- as described above, when an image in which a first person is imaged and an image in which a second person is imaged are extracted, images with different dates and times of imaging are extracted based on the supplementary information; image extraction across a plurality of people can thus be performed by using the supplementary information.
Abstract
An electronic device includes: a storage unit that stores therein a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject which is different from the first subject is imaged, and stores therein supplementary information about the first subject and the second subject including information about dates and times of imaging, and sites of imaging; and an extracting unit that extracts images whose dates and times of imaging are different from each other, and images imaged at a common site of imaging based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged. In the electronic device, when the first subject and the second subject are humans, the storage unit stores therein, as the supplementary information, birth dates of the first and second people.
Description
- 1. Technical Field
- The present invention relates to an electronic device, an electronic device control method, and a computer-readable recording medium having stored thereon an electronic device control program.
- 2. Related Art
- Conventionally, an album creating apparatus for creating an album has been proposed. Such an album creating apparatus uses supplementary information, such as dates and times of imaging and sites of imaging, that is added to images that have been imaged (for example, Patent Literature 1).
- Patent Literature 1: Japanese Patent No. 4208113
- As there is not a prior art literature that is currently recognized, description about prior art literatures is omitted.
- In conventional techniques, an album is created based on supplementary information about a single person, and use of the supplementary information is limited. Therefore, it is an object of an aspect of the innovations herein to provide an electronic device, an electronic device control method and a computer-readable recording medium having stored thereon an electronic device control program, which are capable of overcoming the above drawbacks accompanying the related art. The above and other objects can be achieved by combinations described in the independent claims. Also, the dependent claims define advantageous, specific examples of the present invention.
- A first aspect of the present invention provides an electronic device including: a storage unit that stores therein a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject who is different from the first subject is imaged, and stores therein supplementary information about the first subject and the second subject; and an extracting unit that extracts images whose dates and times of imaging are different from each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
- A second aspect of the present invention provides an electronic device including: a storage unit that stores therein a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject who is different from the first subject is imaged, and stores therein supplementary information about the first subject and the second subject; and an extracting unit that extracts images such that time axes of the first subject and the second subject relatively align with each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
- A third aspect of the present invention provides a program that controls a computer to perform: a storage function of storing a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject who is different from the first subject is imaged, and storing supplementary information about the first subject and the second subject; and an extraction function of extracting images such that time axes of the first subject and the second subject relatively align with each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
- The summary clause does not necessarily describe all necessary features of the embodiments of the present invention. The present invention may also be a sub-combination of the features described above.
-
FIG. 1 is a diagram that shows analbum creation system 1 according to an embodiment. -
FIG. 2 is a flowchart about image input (S10) performed by a server-side control unit 35. -
FIG. 3 shows an example of user supplementary information. -
FIG. 4 shows an example of image supplementary information. -
FIG. 5 shows an example of theme setting information. -
FIG. 6 is a flowchart about pair album creation (S20) performed by a server-side control unit 35. -
FIG. 7 shows an example of templates. -
FIG. 8 is a conceptual diagram for explaining time axes of images extracted with the theme shown inFIG. 5 . -
FIG. 9 shows another example of theme setting information. -
FIG. 10 is a conceptual diagram for explaining time axes of images extracted with the theme shown inFIG. 9 . - Hereinafter, (some) embodiment(s) of the present invention will be described. The embodiment(s) do(es) not limit the invention according to the claims, and all the combinations of the features described in the embodiment(s) are not necessarily essential to means provided by aspects of the invention.
-
FIG. 1 is a diagram that shows analbum creation system 1 according to an embodiment. Thealbum creation system 1 has a plurality of communication devices 2-1, 2-2 . . . 2-N that transfers images and make requests for album creation, and aserver 3 that stores therein images from the plurality ofcommunication devices 2 and creates an album. - The
communication devices 2 may be devices such as a personal computer, a cellular phone, a digital camera, and a PDA. The explanation of the present embodiment takes a digital camera as an example. - The
communication device 2 has: animaging unit 20 that performs imaging; anonvolatile flash memory 21 that stores therein images imaged by theimaging unit 20 and various types of data; aface recognizing unit 22; adisplay unit 23 that displays images imaged by theimaging unit 20; a GPS (global positioning system)module 24 that detects the position (absolute position) of thecommunication device 2; acalendar unit 25 that stores therein dates and times; a communication unit 26 that communicates with theserver 3; and acontrol unit 27 that controls thecommunication device 2. An example of thedisplay unit 23 is a liquid crystal display, and a touch panel for inputting various types of information may be provided to the display unit. Examples of the communication unit 26 include a wireless communication unit that accesses an electric communication line such as the Internet, and a Felica (registered trademark) chip. - The
imaging unit 20 has: an imaging lens that includes a focus lens; a focus detection system that detects a focus position of the focus lens; an imaging device that images a light flux from the imaging lens; and a meta information creating unit that adds, to images, imaging-related information such as focus information detected by the focus detection system, dates and times of imaging acquired from thecalendar unit 25, and sites of imaging acquired from theGPS module 24. - The
flash memory 21 stores therein images imaged by theimaging unit 20 as described above. Furthermore, theflash memory 21 stores therein facial images and attribute information corresponding to the facial images that are used for facial recognition in images imaged by theimaging unit 20. The attribute information is input to thecommunication device 2 by a user using, for example, a touch panel provided to thedisplay unit 23. The attribute information includes names, birth dates, addresses, and group information such as relationship with the user of thecommunication device 2 like family relationship. Note that even if any piece of the attribute information lacks on the side of thecommunication device 2, it does not affect imaging negatively. - The
The face recognizing unit 22 detects a face in an image imaged by the imaging unit 20, and recognizes whose face it is based on the facial images stored in the flash memory 21. Note that the face recognizing unit 22 may recognize, in addition to human faces, the faces of pets such as dogs and cats. The recognition result of the face recognizing unit 22 is added, by the meta information creating unit, to the imaging-related information as subject-related information. Note that facial recognition of pets may be performed by the technique disclosed, for example, in Japanese Patent Application Publication No. 2011-19013.
The control unit 27 has a CPU, and transmits images to the server 3 via the communication unit 26. At this time, the control unit 27 may transmit the imaging-related information created by the meta information creating unit of the imaging unit 20 in association with each image. Note that images transmitted to the server 3 by the communication device 2 may be either still images or motion images, and if the communication device 2 has a keyboard or a microphone, text data or audio data may be transmitted to the server 3. At this time, for example, images and audio data may be transmitted from a digital camera to the server 3, and text data may be transmitted from a personal computer to the server 3.
Also, the control unit 27 transmits, to the server 3, the attribute information stored in the flash memory 21. At this time, the control unit 27 may transmit the attribute information in association with images, or may transmit the attribute information as separate information without associating it with images. Note that the control unit 27 transmits, to the server 3, attribute information of new members, and updated attribute information if there is any update.
The server 3 has a server-side communication unit 30, an analyzing unit 31, an information extracting unit 32, a server-side flash memory 33, an album creating unit 34 and a server-side control unit 35. The server 3 may be a personal computer used by the user of the communication device 2, or an information processing device that is managed by an administrator who is not the user.
The server-side communication unit 30 communicates with the communication unit 26 of the communication device 2. Although it is assumed in the present embodiment that communication is established via an electric communication line, wired communication or wireless communication may be employed.
The analyzing unit 31 analyzes data received by the server-side communication unit 30. The analyzing unit 31 is provided with an image analyzing unit 31a and a text data analyzing unit 31b. Note that an audio data analyzing unit may be additionally provided.
The image analyzing unit 31a has an OCR (optical character reader) unit, a metadata analyzing unit and a face recognizing unit.
The OCR unit reads out characters in an image. The OCR unit converts characters in an image, such as "XX Elementary School, Graduation Ceremony, 2009" and "YY Zoo", into text data.
The metadata analyzing unit analyzes a focus position, subject-related information, a date and time of imaging, and a site of imaging that are included in the imaging-related information associated with an image, and generates text data.
The face recognizing unit detects faces in a transmitted image when subject-related information is not included in the imaging-related information associated with the image, for reasons such as that the communication device 2 does not have a facial recognition function. The face recognizing unit further identifies a detected face based on a facial image stored in the server-side flash memory 33 described below.
The text data analyzing unit 31b compares text data input by the server-side communication unit 30 with the text data created as a result of conversion by the OCR unit, and also compares a date and time of imaging and a site of imaging included in imaging-related information with the text data created by the OCR unit. For example, the text data analyzing unit 31b checks whether a site of imaging (latitude/longitude) detected by the GPS module 24 matches the text data created by the OCR unit (whether there is inconsistency therebetween), or whether a date and time of imaging acquired from the calendar unit 25 matches the text data created by the OCR unit (whether there is inconsistency therebetween).
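A minimal sketch of this kind of consistency check follows; the place-name table, tolerance, and matching rules are assumptions made for illustration and do not come from the embodiment:

    from datetime import datetime

    # Hypothetical lookup from place names to rough latitude/longitude, standing
    # in for whatever geographic data the server would actually consult.
    PLACES = {"YY Zoo": (35.67, 139.70)}

    def site_consistent(ocr_text, gps_latlon, tol_deg=0.05):
        """True if a place name read by OCR is plausibly at the GPS position."""
        for name, (lat, lon) in PLACES.items():
            if name in ocr_text:
                return (abs(lat - gps_latlon[0]) <= tol_deg
                        and abs(lon - gps_latlon[1]) <= tol_deg)
        return True  # no recognizable place name, so nothing to contradict

    def date_consistent(ocr_text, shot_at):
        """True if a year written in the image matches the recorded date."""
        has_digits = any(c.isdigit() for c in ocr_text)
        return (not has_digits) or str(shot_at.year) in ocr_text

    # "XX Elementary School, Graduation Ceremony, 2009" against a 2009 timestamp
    print(date_consistent("Graduation Ceremony, 2009", datetime(2009, 3, 20)))  # True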
The information extracting unit 32 extracts information acquired as a result of analysis by the analyzing unit 31, inquires of the communication device 2 via the server-side communication unit 30 when there is inconsistency or ambiguity in the information acquired as a result of analysis by the analyzing unit 31, and extracts further information via the server-side communication unit 30. Examples of situations where the information extracting unit 32 extracts further information include events, such as festivals and fireworks shows, that are determined to be held at the site based on a site of imaging and a date and time of imaging.
The server-side flash memory 33 is a nonvolatile memory that stores therein images transmitted from the communication device 2, and image supplementary information in association with the images. The server-side flash memory 33 further stores therein user supplementary information about the user, theme setting information about themes for creating pair albums, and templates for arranging images based on the themes. This information is further described below.
In the present embodiment, the album creating unit 34 creates an album by associating a plurality of images whose dates and times of imaging are different. For example, when creation of an album of enrollment ceremonies of junior high schools for family members is requested, the album creating unit 34 creates an album based on images of the enrollment ceremony of a junior high school for each of the father, the mother, the eldest son, and the eldest daughter stored in the server-side flash memory 33.
Also, the album creating unit 34 may create an album based on images of or near a certain site that were imaged before a groom and a bride got to know each other.
The album creating unit 34 has an image extracting unit 36, an image comparing unit 37 and an image processing unit 38.
The image extracting unit 36 extracts images, from the server-side flash memory 33, according to a theme that is designated based on control of the server-side control unit 35.
The image comparing unit 37 compares a plurality of images with each other. For example, when images are extracted for a certain theme and there is a plurality of pictures of the father that are suited for the theme, the image comparing unit 37 extracts the pictures that are determined to be the best among them, such as images that show the father smiling, based on smile detection. Furthermore, the image comparing unit 37 compares the pictures extracted for each family member, and switches from one image to another. For example, if the pictures of the family members other than the father each show only the corresponding single person, a picture that shows only the father is selected, and if several pictures have cherry blossoms in the background, the other pictures are switched to ones with cherry blossoms in the background. Overall balance is sought in this way. At this time, for example, a picture in which the mother and the eldest daughter face each other (the mother positioned on the right, and the eldest daughter positioned on the left) may be extracted.
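As a sketch of how such a best-shot choice could be scored (the individual scores are assumed to come from smile, blur, and closed-eye detectors that are not shown here):

    # Pick the best candidate picture of one family member by a simple
    # illustrative score: more smile, less blur, no closed eyes.
    def best_shot(candidates):
        def score(img):
            return img["smile"] - img["blur"] - (1.0 if img["eyes_closed"] else 0.0)
        return max(candidates, key=score)

    father_shots = [
        {"file": "f1.jpg", "smile": 0.9, "blur": 0.1, "eyes_closed": False},
        {"file": "f2.jpg", "smile": 0.4, "blur": 0.0, "eyes_closed": False},
    ]
    print(best_shot(father_shots)["file"])  # f1.jpg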
The image processing unit 38 processes images extracted based on the result of comparison by the image comparing unit 37. At this time, the image processing unit 38 performs processes of resizing, trimming, and sizing of images, and a process for displaying, together with the images, titles and subtitles such as dates and times, and sites of imaging.
The server-side control unit 35 controls the entire operation of the server 3, and in the present embodiment, performs control of album creation based on themes. Note that each of the functions of the server 3 may be implemented by a software program.
Operation of the album creation system 1 according to the present embodiment constituted in this way is explained hereinafter.
-
FIG. 2 is a flowchart about image input (S10) performed by the server-side control unit 35, and hereinafter, operation up to the image input is explained with reference to the flowchart. The operation of the step S10 starts when the communication device 2 makes a request, to the server 3, for input of an image.
The server-side control unit 35 performs user registration with the communication device 2 (S110). At this time, the server-side control unit 35 may acquire attribute information from the communication device 2 via the communication unit 26 and the server-side communication unit 30. Furthermore, the server-side control unit 35 assigns two user IDs, like the ones shown below, to the user.
- No. 123456-FRI-1
- Here, in the first user ID, “FAM” indicates FAMILY, and the server-
Here, in the first user ID, "FAM" indicates FAMILY, and the server-side control unit 35 registers the family as group information. At this time, when a pair album of images of the family members is to be created, all images registered in the server 3 with the ID No. 123456-FAM-1 may be allowed to be used. Note that "-1" at the end indicates the first generation of the family. For example, if the children are to go out on their own in the future, they may have an ID No. 123456-FAM-2, in addition to ID No. 123456-FAM-1.
In the second user ID, "FRI" indicates FRIEND, and the server-side control unit 35 registers a friend as group information. At this time, when a pair album of images of the friends is to be created, all images registered in the server 3 with the ID No. 123456-FRI-1 may be allowed to be used, or alternatively approval may be required for using each of the images.
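The two user IDs follow a visible pattern (serial number, group tag, generation). Assuming the format is exactly as shown above, a sketch parser might read:

    import re

    # Assumes the exact format shown above, e.g. "No. 123456-FAM-1".
    ID_PATTERN = re.compile(r"No\. (\d+)-(FAM|FRI)-(\d+)")

    def parse_user_id(user_id):
        m = ID_PATTERN.fullmatch(user_id)
        if m is None:
            raise ValueError("unrecognized user ID: " + user_id)
        serial, group, generation = m.groups()
        return {"serial": serial,
                "group": "family" if group == "FAM" else "friend",
                "generation": int(generation)}

    print(parse_user_id("No. 123456-FAM-1"))
    # {'serial': '123456', 'group': 'family', 'generation': 1}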
The server-side control unit 35 generates user supplementary information based on a user ID and attribute information, and stores the user supplementary information in the server-side flash memory 33.
FIG. 3 shows an example of user supplementary information. In the example shown in FIG. 3, the user supplementary information includes the birth date and the address of the user associated with the user ID, and group information of the family members. The group information of the family members includes the relationship with the user for identifying a family member, and the birth dates of the family members.
- When there is user supplementary information not included in attribute information acquired from the
When there is user supplementary information not included in the attribute information acquired from the communication device 2, the server-side control unit 35 displays, on the communication device 2, a screen for prompting the user to input the information, and acquires the information.
Note that a purpose of registering the birth dates of the user and his/her family members at the step S110 is to determine the ages of the user and his/her family members at the time of imaging by comparing the birth dates with the dates and times of imaging of input images. For example, when a pair album with a theme of images that are imaged when a family member is 10 years old is to be created, the server-side control unit 35 uses the image extracting unit 36 to extract images, from the server-side flash memory 33, that were imaged 10 years after the birth date. Similarly, when a pair album at the time of enrollment in an elementary school is to be created, the server-side control unit 35 extracts images, from the server-side flash memory 33, that were imaged in April and six years after the birth date.
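The birth-date arithmetic here is simple enough to sketch. The following illustration (our own, with assumed data) extracts images imaged in April of the year in which a subject is 12 years old:

    from datetime import date

    def age_at(birth, shot):
        """Age in whole years on the date of imaging."""
        return shot.year - birth.year - ((shot.month, shot.day) < (birth.month, birth.day))

    def enrollment_shots(images, birth, age=12, month=4):
        """Images imaged in `month` of the year the subject is `age` years old."""
        return [img for img in images
                if img["shot"].month == month and age_at(birth, img["shot"]) == age]

    images = [{"file": "a.jpg", "shot": date(2010, 4, 8)},
              {"file": "b.jpg", "shot": date(2008, 7, 1)}]
    print(enrollment_shots(images, birth=date(1998, 2, 1)))  # only a.jpg qualifies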
The server-side control unit 35 confirms a user ID prior to input of images (step S112). Here, in the following explanation, it is assumed that authentication is performed with the ID No. 123456-FAM-1.
The server-side control unit 35 inputs, via the communication unit 26 and the server-side communication unit 30, images and the imaging-related information created by the meta information creating unit of the imaging unit 20 (S114). At this time, the server-side control unit 35 may acquire, from the communication device 2, images one by one or multiple images collectively.
The server-side control unit 35 creates image supplementary information based on the images and the imaging-related information, and stores the image supplementary information in the server-side flash memory 33.
FIG. 4 shows an example of image supplementary information. In the example shown in FIG. 4, the image supplementary information includes focus information, subject-related information, a date and time of imaging, and a site of imaging associated with a file name of an image. The subject-related information includes information for identifying a subject, and information for identifying the orientation of the subject. The date and time of imaging includes a date and year, and an event name of an event at which the image is imaged. Also, the site of imaging includes latitude/longitude, and a name of a site where the image is imaged. Note that "*" in the example shown in FIG. 4 indicates that the applicable information is not available.
Also, in the step S114, the server-side control unit 35 analyzes an input image by using the analyzing unit 31, and extracts additional information for the image supplementary information by using the information extracting unit 32 to update the image supplementary information stored in the server-side flash memory 33. Here, facial recognition by the image analyzing unit 31a and collection of event information by the information extracting unit 32 are performed. Note that when a piece of the image supplementary information has already been input, analysis of that piece by the analyzing unit 31 may be omitted, or if inconsistency is found as a result of analysis of the piece by the analyzing unit 31, such inconsistency may be confirmed with the user.
The server-side control unit 35 confirms whether there are matters to be confirmed with the user about an input image (S116). At this time, the server-side control unit 35 may judge whether there are matters to be confirmed based on whether the image supplementary information meets predetermined conditions at the time when the update of the image supplementary information at the step S114 has completed. For example, the server-side control unit 35 judges that there is a matter to be confirmed under a condition that the fields of the date and time of imaging, that is, the date and year and the event name, both indicate "*". Also, the server-side control unit 35 may judge that there is a matter to be confirmed under a condition that the fields of the site of imaging, that is, the latitude/longitude and the name of a site, both indicate "*".
When the result of judgment at the step S116 is Yes, the process proceeds to the step S118. At the step S118, the server-side control unit 35 inquires for the lacking information among the image supplementary information. The server-side control unit 35 ends the inquiry at the time when the image supplementary information has met the conditions as a result of the inquiry.
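The judgment at the step S116 above reduces to a simple test over the image supplementary information. A sketch, using "*" as the not-available marker of FIG. 4 (the dictionary keys are our own):

    MISSING = "*"

    def needs_confirmation(info):
        """True when both halves of a field pair are unavailable (step S116)."""
        date_unknown = info["date"] == MISSING and info["event"] == MISSING
        site_unknown = info["latlon"] == MISSING and info["site_name"] == MISSING
        return date_unknown or site_unknown

    info = {"date": MISSING, "event": MISSING,
            "latlon": "35.67/139.70", "site_name": "YY Zoo"}
    print(needs_confirmation(info))  # True: the date-and-event pair is unknown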
For example, the server-side control unit 35 inquires for a site of imaging when the GPS module 24 is not provided to the communication device 2, and thus the site of imaging cannot be identified. Also, when an unregistered facial image is included, for example, in a plurality of images, the server-side control unit 35 inquires whether the facial image should be registered as a facial image of a user.
- These inquiries to the user at the step S118 place a burden on the user. Accordingly, for example, if the user often posts messages on a blog or Twitter (registered trademark), the server-
These inquiries to the user at the step S118 place a burden on the user. Accordingly, for example, if the user often posts messages on a blog or Twitter (registered trademark), the server-side control unit 35 acquires text data created by the user from the blog or Twitter (registered trademark), and analyzes the text data by using the text data analyzing unit 31b; thereby, the inquiries to the user can be omitted or the frequency of the inquiries can be reduced. Specifically, the user may be prompted to input the accounts of the blog or Twitter (registered trademark) as the user supplementary information shown in FIG. 3. At this time, the analysis of an image by using the analyzing unit 31 and the extraction of additional information by using the information extracting unit 32 may be performed at a timing that is different from the time of image input, and accordingly, the inquiries at the step S118 may be performed after the image input.
The server-side control unit 35 confirms with the user whether there is further image input (S120). Here, the server-side control unit 35 confirms whether the user wishes to input any image after changing the user ID to No. 123456-FRI-1; if the user wishes to do so, the process returns to the step S112, and if not, the process proceeds to the step S122.
The server-side control unit 35 inquires of the user whether to create a pair album (S122).
The album creation in this case includes two types. One is album creation proposed by the server 3. In one example, when it is judged that the input images include pictures of a graduation ceremony at the elementary school of the user's child, the server-side control unit 35 proposes creation of a pair album by extracting pictures of family members with a theme of graduation ceremonies of elementary schools.
The other type of album creation is creation of a pair album according to a request by the user. At this time, a theme may be input as text by the user by using the communication device 2, or alternatively the server 3 may present a list of themes to the communication device 2 to prompt the user to make a selection.
In any case, the server-side control unit 35 proceeds to the flowchart S20 shown in FIG. 6 if the result of judgment by the user at the step S122 is Yes, and ends the process of the flowchart if the result of judgment by the user is No.
FIG. 5 shows an example of theme setting information. In the example shown in FIG. 5, the theme setting information includes keywords associated with a name of a theme, a template, a date and time of imaging, and a site of imaging. The date and time of imaging includes the number of years from a birth date, and the timing when imaging is performed. Also, the site of imaging includes latitude/longitude, and a name of a site where the imaging is performed. Note that "*" in the example shown in FIG. 5 indicates that the applicable information is not to be used for extraction of images. The theme setting information is input in advance and is stored in the server-side flash memory 33.
-
FIG. 6 is a flowchart about pair album creation (S20) performed by the server-side control unit 35, and hereinafter, operation of pair album creation is explained with reference to the flowchart.
The server-side control unit 35 confirms a theme of a pair album to be created (S210). At this time, for example, the server-side control unit 35 receives input of free-format text from the communication device 2, calculates the degrees of coincidence between keywords included in the text and keywords included in the theme setting information, and extracts a theme name with a high degree of coincidence. In place of receiving input of free-format text, the server-side control unit 35 may display a list of the theme names in the theme setting information, and receive a selection therefrom.
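A minimal sketch of such a degree-of-coincidence calculation, with plain token overlap standing in for whatever matching the server actually performs (the keyword lists are assumed examples):

    def coincidence(request_text, theme_keywords):
        """Fraction of a theme's keywords that appear in the user's request."""
        words = set(request_text.lower().split())
        hits = sum(1 for kw in theme_keywords if kw.lower() in words)
        return hits / len(theme_keywords)

    themes = {
        "enrollment ceremony of a junior high school":
            ["enrollment", "ceremony", "junior", "high", "school"],
        "Hokkaido": ["hokkaido", "trip"],
    }
    request = "pair album for the enrollment ceremony of a junior high school"
    print(max(themes, key=lambda name: coincidence(request, themes[name])))
    # enrollment ceremony of a junior high school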
- The server-
The server-side control unit 35 confirms the subjects for the album creation (S212). It is assumed here that all four family members (father, mother, eldest son and eldest daughter) registered in the user supplementary information of the user are album-creation subjects. Note that this step S212 and the next step S214 may be omitted when the subjects for the pair album creation are family members.
When creating a pair album, the server-side control unit 35 judges whether permission of the album-creation subjects is necessary (S214). Here, because the family members are the album-creation subjects as described above, such judgment is not necessary. The judgment at the step S214 is necessary when the group information indicates friends and use of all images is not approved. In this case, when permission of the album-creation subjects is necessary for using images, the process returns to the step S212, and the permission is obtained from the album-creation subjects. Also, when the permission cannot be obtained from a certain album-creation subject, for example, within a preset length of time, that subject may be excluded from the group of album-creation subjects.
The server-side control unit 35, based on the theme and in cooperation with the image extracting unit 36 of the album creating unit 34, extracts images stored in the server-side flash memory 33 (S216). At this time, the image extracting unit 36 compares the information about the date and time of imaging and the site of imaging included in the theme setting information of the theme with the information about the date and time of imaging and the site of imaging included in the image supplementary information of each image to extract images.
For example, when the theme is an enrollment ceremony of a junior high school, the image extracting unit 36 refers to the theme setting information shown in FIG. 5, and extracts images whose date and time of imaging shows that they were imaged in April, 12 years after the birth date, that is, images imaged in April when the subject is 12 years old. Also, the image extracting unit 36 refers to the name of the site of imaging in the theme setting information, and if a junior high school is included in the name of the site of imaging in the image supplementary information, that is, if the imaging-related information created by the meta information creating unit of the imaging unit 20 includes information about a junior high school or if the OCR unit identifies characters like "XX Junior High School, Enrollment Ceremony", extraction may be performed by giving weight to pictures with such information. Note that different countries adopt different schooling systems and different timing for enrollment and graduation. Accordingly, the server-side control unit 35 may extract images by correcting the timing of enrollment based on information such as the site of imaging and latitude/longitude. For example, if the mother enrolled in a junior high school in September when she was 11 years old in a certain country, the server-side control unit 35 uses the image extracting unit 36 to extract, from the server-side flash memory 33, pictures of the mother imaged in September when she was 11 years old.
Note that, for example, if the theme is Hokkaido, the image extracting unit 36 extracts, from the server-side flash memory 33, images indicating the latitude/longitude of Hokkaido. That is, in the theme setting information whose theme name is Hokkaido, the latitude/longitude of Hokkaido is set as the latitude/longitude of the site of imaging, and the image extracting unit 36 refers to that latitude/longitude and compares it with the latitude/longitude of the site of imaging in the image supplementary information of each image to extract the images indicating the latitude/longitude of Hokkaido.
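Concretely, the comparison can be as simple as a bounding-box test; the coordinates below are rough, assumed values for Hokkaido used only for illustration:

    # Assumed bounding box (lat_min, lat_max, lon_min, lon_max) for Hokkaido.
    HOKKAIDO = (41.3, 45.6, 139.3, 146.0)

    def in_region(latlon, box):
        lat, lon = latlon
        lat_min, lat_max, lon_min, lon_max = box
        return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

    shots = [("sapporo.jpg", (43.06, 141.35)), ("tokyo.jpg", (35.68, 139.69))]
    print([name for name, pos in shots if in_region(pos, HOKKAIDO)])
    # ['sapporo.jpg']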
The server-side control unit 35, in cooperation with the image comparing unit 37 of the album creating unit 34, decides a layout of the album (S218). At this time, the image comparing unit 37 refers to a layout included in the theme setting information to decide the layout.
FIG. 7 shows an example of templates. Pictures 1 to 3 are to be arranged on the template shown in FIG. 7. Furthermore, a preferred orientation of subjects is set in association with the position of each picture, and the orientation is indicated with arrows. Note that "•" for the orientation in FIG. 7 indicates that the orientation is toward the front side. At this time, the server-side control unit 35 may select a layout in which a picture of the child who has just enrolled in a junior high school is made larger than the pictures of the other family members, or alternatively may select a layout in which the pictures of all the family members are made almost equal in size.
When there is a plurality of pictures of the subjects at the step S218, the image comparing unit 37 may determine the order of the images of each album-creation subject based on judgment about smiles, blurs and closed eyes, the sizes of the images, and a result of the above-described weighting. Furthermore, the image comparing unit 37 decides a layout such that, for example, older images are positioned on the left and newer images are positioned on the right. Also, when the template shown in FIG. 7 is used, the image comparing unit 37 extracts images such that, for example, the father and the mother face each other, by using the information about the orientation of subjects included in the template. Also, parameters for deciding a layout may include whether the subjects in an image are arranged horizontally or vertically.
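As an illustration of how the orientation arrows of FIG. 7 could drive the assignment (the slot names and the "left"/"right"/"front" encoding are our own simplifications):

    # Simplified template: each slot prefers a subject facing a given direction.
    TEMPLATE = [("picture1", "right"), ("picture2", "left"), ("picture3", "front")]

    def assign(images, template):
        """Greedily assign images to slots by matching subject orientation."""
        layout, pool = {}, list(images)
        for slot, wanted in template:
            match = next((i for i in pool if i["facing"] == wanted), None)
            if match is None and pool:   # fall back to any remaining image
                match = pool[0]
            if match is not None:
                pool.remove(match)
                layout[slot] = match["file"]
        return layout

    imgs = [{"file": "mother.jpg", "facing": "right"},
            {"file": "daughter.jpg", "facing": "left"},
            {"file": "father.jpg", "facing": "front"}]
    print(assign(imgs, TEMPLATE))
    # {'picture1': 'mother.jpg', 'picture2': 'daughter.jpg', 'picture3': 'father.jpg'}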
After a layout is decided, the user is asked whether the images to be used and the layout need to be corrected (S220). For example, the server-side control unit 35 may prompt the user to make a selection from about two combinations of layouts and images. Here, in the following explanation, it is assumed that no correction is necessary.
As described above, the server-side control unit 35, in cooperation with the image processing unit 38 of the album creating unit 34, performs processes of resizing, trimming, and sizing of the images, and a process for displaying, together with the images, titles and subtitles (dates and times, and sites of imaging) (step S222).
Next, the server-side control unit 35 displays the created album on the display unit 23 of the communication device 2 that the user uses, via the server-side communication unit 30 (S224). Then, the server-side control unit 35 confirms with the user whether there is any necessity for correction of the number of pages of the album, the order of the images to be shown, or the processing that has been performed on the images (S226).
-
FIG. 8 is a conceptual diagram for explaining the time axes of images extracted with the theme shown in FIG. 5. In album creation by using the theme setting information shown in FIG. 5, images with a date and time of imaging of "birth year+12" are extracted by the image extracting unit 36. Here, although there is an age difference between the child who has just enrolled in a junior high school and a family member 1, such as his/her mother, by the difference in their birth dates, the theme, an enrollment ceremony of a junior high school, is an event that is commonly held when they are 12 years old. Accordingly, the life time axes are relatively aligned with reference to the birth dates of both the child who has just enrolled in a junior high school and the family member 1, and images of corresponding events are extracted by the image extracting unit 36.
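In effect, the alignment re-indexes each person's images by age instead of by calendar date; a sketch with assumed birth dates and file names:

    from datetime import date

    def by_age(images, birth):
        """Re-index a subject's images by age so that different subjects'
        life time axes can be compared directly."""
        def age(shot):
            return shot.year - birth.year - ((shot.month, shot.day) < (birth.month, birth.day))
        return {age(img["shot"]): img["file"] for img in images}

    child  = by_age([{"file": "child.jpg",  "shot": date(2010, 4, 8)}], date(1998, 2, 1))
    mother = by_age([{"file": "mother.jpg", "shot": date(1983, 4, 6)}], date(1970, 5, 1))
    print(child.get(12), mother.get(12))  # child.jpg mother.jpg -> paired at age 12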
Although in the example explained above a pair album is created mainly for family members, album creation is not limited thereto; alumni may be registered, and pair albums may be created with a theme of marriage or the birth of children. At this time, the timing of marriage and the timing of children's births differ person by person. In such cases, the server 3 may create albums with themes such as an album for a tenth wedding anniversary and an album for a one-year-old child by relatively aligning life time axes based on character recognition by the OCR unit of the image analyzing unit 31a, and on sites of imaging detected by the GPS module 24 (sites of wedding venues and (maternity) hospitals). Also, the server 3 may relatively align life time axes based on the first images related to a certain event, as explained below with reference to FIG. 9.
-
FIG. 9 shows another example of theme setting information. The theme setting information shown in FIG. 9 includes, similar to FIG. 5, keywords associated with a name of a theme, a template, a date and time of imaging, and a site of imaging.
In FIG. 9, the theme is a premiere of piano concerts. The timing of premieres of piano concerts differs for each person, and judgment of the timing cannot be made based on birth dates. However, because the theme is a "premiere", if a plurality of images which were imaged at events of piano concerts, whose subjects are the same single person, and whose dates and times of imaging are different are stored in the server-side flash memory 33, the one whose date of imaging is the earliest is reasonably estimated to be the image of the "premiere". Accordingly, "first" is set in the field of the date and time of imaging of the theme setting information of FIG. 9.
image analyzing unit 31 a read out the characters “Concerts”, in images, or by detecting sounds of plano by using the audio data analyzing unit. Alternatively, a template image of a plano may be stored in the server-side flash memory 33, and images of plano performance may be extracted by the server-side control unit 35 by performing pattern matching of images stored in the server-side flash memory 33 with the template image. In addition, the server-side control unit 35 may extract, from the above-described information on a blog or Twitter (registered trademark), information related to plano concerts. In this way, theimage extracting unit 36 extracts an image whose date of imaging is the earliest among images which were imaged at events of plano concerts and whose subjects are the same single person and dates and times of imaging are different. Note that, detection of first events like the one described above can be applied to, other than plano performance, performance of other instruments, and participation in sports events. -
FIG. 10 is a conceptual diagram for explaining the time axes of images extracted with the theme shown in FIG. 9. In album creation by using the theme setting information shown in FIG. 9, the performance images 1 whose dates and times of imaging are the earliest among the images of the user and of a friend 1, respectively, are extracted by the image extracting unit 36. Here, the length of the period from the birth date of the user to the date and time when his/her performance image 1 was imaged is different from the length of the period from the birth date of the friend 1 to the date and time when his/her performance image 1 was imaged, and the dates and times of the images are different from each other. However, with reference to the respective orders of the dates and times of imaging for the user and the friend 1, the life time axes are relatively aligned, and images of corresponding events are extracted.
Note that although in the embodiment the theme setting information is input in advance, theme setting information may be automatically generated by the server 3. At this time, when a new image is input, theme setting information may be generated by using, as a new theme name, an event name included in the image supplementary information of the image.
As explained above, according to the present embodiment, image extraction can be performed by using the supplementary information: when an image including an image of a first person and an image including an image of a second person are extracted, images with different dates and times of imaging are extracted based on the supplementary information.
While the embodiment(s) of the present invention has (have) been described, the technical scope of the invention is not limited to the above described embodiment(s). It is apparent to persons skilled in the art that various alterations and improvements can be added to the above-described embodiment(s). It is also apparent from the scope of the claims that the embodiments added with such alterations or improvements can be included in the technical scope of the invention.
The operations, procedures, steps, and stages of each process performed by an apparatus, system, program, and method shown in the claims, embodiments, or diagrams can be performed in any order as long as the order is not indicated by "prior to," "before," or the like and as long as the output from a previous process is not used in a later process. Even if the process flow is described using phrases such as "first" or "next" in the claims, embodiments, or diagrams, it does not necessarily mean that the process must be performed in this order.
Claims (21)
1. An electronic device comprising:
a storage unit that
stores therein a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject which is different from the first subject is imaged, and
stores therein supplementary information about the first subject and the second subject including information about dates and times of imaging, and sites of imaging; and
an extracting unit that extracts images whose dates and times of imaging are different from each other, and images imaged at a common site of imaging based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
2. The electronic device according to claim 1, wherein the extracting unit extracts, as the images imaged at a common site of imaging, images imaged at a common event.
3. The electronic device according to claim 1, wherein, when the first subject and the second subject are humans, the storage unit stores therein, as the supplementary information, birth dates of the first person and the second person.
4. The electronic device according to claim 3, further comprising a detecting unit that detects ages of the first person and the second person at times of imaging based on the birth dates and the dates and times of imaging.
5. The electronic device according to claim 1, further comprising a communication unit that communicates with an external device.
6. The electronic device according to claim 5, further comprising an information extracting unit that extracts, via the communication unit, information that is related to at least either one of the image and the supplementary information.
7. The electronic device according to claim 1, further comprising a face recognizing unit that recognizes a face included in the image.
8. The electronic device according to claim 1, further comprising a character recognizing unit that recognizes a character included in the image.
9. The electronic device according to claim 1, further comprising a comparing unit that compares a first image in which the first subject is imaged with a second image in which the second subject is imaged.
10. The electronic device according to claim 9, wherein the comparing unit compares the supplementary information of the first image with the supplementary information of the second image.
11. The electronic device according to claim 9, wherein the extracting unit changes at least either one of the first image and the second image based on a result of comparison performed by the comparing unit.
12. The electronic device according to claim 9, further comprising a processing unit that processes at least either one of the first image and the second image based on a result of comparison performed by the comparing unit.
13. The electronic device according to claim 9, further comprising a deciding unit that decides arrangement of the first image and the second image based on a result of comparison performed by the comparing unit.
14. The electronic device according to claim 13, wherein the deciding unit decides the arrangement of the first image and the second image based on the dates and times of imaging of the first image and the second image.
15. An electronic device comprising:
a storage unit that
stores therein a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject which is different from the first subject is imaged, and
stores therein supplementary information about the first subject and the second subject; and
an extracting unit that extracts images such that time axes of the first subject and the second subject relatively align with each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
16. The electronic device according to claim 15, wherein
when the first subject and the second subject are humans, the storage unit stores therein, as the supplementary information, birth dates of the first person and the second person, and
the time axes are relatively aligned with each other based on a difference between the birth date of the first person and the birth date of the second person.
17. The electronic device according to claim 15, wherein
the storage unit stores therein, as the supplementary information, information about dates and times of imaging, and
the time axes are relatively aligned with each other based on a temporal order of the dates and times of imaging.
18. An electronic device control method comprising:
storing, in a storage unit, a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject which is different from the first subject is imaged, and storing, in the storage unit, supplementary information about the first subject and the second subject; and
extracting images whose dates and times of imaging are different from each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
19. An electronic device control method comprising:
storing, in a storage unit, a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject which is different from the first subject is imaged, and storing, in the storage unit, supplementary information about the first subject and the second subject; and
extracting images such that time axes of the first subject and the second subject relatively align with each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
20. A computer-readable recording medium having stored thereon a program that causes a computer to realize:
a storage function of storing a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject which is different from the first subject is imaged, and storing supplementary information about the first subject and the second subject including information about dates and times of imaging, and sites of imaging; and
an extraction function of extracting images whose dates and times of imaging are different from each other, and images imaged at a common site of imaging based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
21. A computer-readable recording medium having stored thereon a program that causes a computer to realize:
a storage function of storing a plurality of images in which a first subject is imaged, and a plurality of images in which a second subject which is different from the first subject is imaged, and storing supplementary information about the first subject and the second subject; and
an extraction function of extracting images such that time axes of the first subject and the second subject relatively align with each other based on the supplementary information when extracting an image in which the first subject is imaged and an image in which the second subject is imaged.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/818,083 US10275643B2 (en) | 2011-03-14 | 2015-08-04 | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011055786 | 2011-03-14 | ||
JP-2011-055786 | 2011-03-14 | ||
PCT/JP2012/000939 WO2012124252A1 (en) | 2011-03-14 | 2012-02-13 | Electronic device, and method and program for controlling electronic device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/000939 Continuation WO2012124252A1 (en) | 2011-03-14 | 2012-02-13 | Electronic device, and method and program for controlling electronic device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/818,083 Continuation US10275643B2 (en) | 2011-03-14 | 2015-08-04 | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130343659A1 true US20130343659A1 (en) | 2013-12-26 |
Family
ID=46830347
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/971,704 Abandoned US20130343659A1 (en) | 2011-03-14 | 2013-08-20 | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program |
US14/818,083 Active US10275643B2 (en) | 2011-03-14 | 2015-08-04 | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program |
US16/298,495 Abandoned US20190205624A1 (en) | 2011-03-14 | 2019-03-11 | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/818,083 Active US10275643B2 (en) | 2011-03-14 | 2015-08-04 | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program |
US16/298,495 Abandoned US20190205624A1 (en) | 2011-03-14 | 2019-03-11 | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program |
Country Status (3)
Country | Link |
---|---|
US (3) | US20130343659A1 (en) |
JP (3) | JPWO2012124252A1 (en) |
WO (1) | WO2012124252A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190065614A1 (en) * | 2017-08-28 | 2019-02-28 | Go Daddy Operating Company, LLC | Customer requested website from digital image metadata |
US10630639B2 (en) | 2017-08-28 | 2020-04-21 | Go Daddy Operating Company, LLC | Suggesting a domain name from digital image metadata |
CN112148910A (en) * | 2019-06-28 | 2020-12-29 | 富士胶片株式会社 | Image processing device, image processing method, and recording medium storing image processing program |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6221487B2 (en) * | 2013-08-09 | 2017-11-01 | 株式会社バッファロー | Information processing apparatus, image display system, image display method and program in information processing apparatus |
JP2017058996A (en) * | 2015-09-17 | 2017-03-23 | 富士フイルム株式会社 | Electronic album template selection system, electronic album template selection method, electronic album template selection program, and storage medium storing electronic album template selection program |
JP6919260B2 (en) * | 2017-03-23 | 2021-08-18 | 富士フイルムビジネスイノベーション株式会社 | Information converter and program |
US10565252B2 (en) * | 2017-09-06 | 2020-02-18 | Facebook, Inc. | Systems and methods for connecting to digital social groups using machine-readable code |
EP3757203A4 (en) | 2018-02-21 | 2022-01-12 | University Public Corporation Osaka | CELL CULTURE CONTAINERS, CELL CULTURE CONTAINERS MANUFACTURING PROCESS, CELL COLLECTION SYSTEM AND CELL DETECTION METHODS |
JP7040120B2 (en) * | 2018-02-27 | 2022-03-23 | 大日本印刷株式会社 | Photobook creation system, photobook creation method and program |
JP6988607B2 (en) * | 2018-03-16 | 2022-01-05 | 大日本印刷株式会社 | Photobook production system and server equipment |
JP7224774B2 (en) * | 2018-04-23 | 2023-02-20 | キヤノン株式会社 | Image processing device, image processing method, and program |
JP7187986B2 (en) * | 2018-10-31 | 2022-12-13 | 京セラドキュメントソリューションズ株式会社 | Information processing equipment |
JP7470279B2 (en) * | 2020-01-18 | 2024-04-18 | 株式会社Mixi | Information processing device, image output program, and image output method |
JP2021197615A (en) * | 2020-06-12 | 2021-12-27 | 株式会社エクサウィザーズ | Image selection method, information processing device, program, and information processing system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050105806A1 (en) * | 2003-11-14 | 2005-05-19 | Yasuhiko Nagaoka | Method and apparatus for organizing digital media based on face recognition |
US6922489B2 (en) * | 1997-10-29 | 2005-07-26 | Canon Kabushiki Kaisha | Image interpretation method and apparatus |
US20060182346A1 (en) * | 2001-09-17 | 2006-08-17 | National Inst. Of Adv. Industrial Science & Tech. | Interface apparatus |
US20070165968A1 (en) * | 2006-01-19 | 2007-07-19 | Fujifilm Corporation | Image editing system and image editing program |
US7440595B2 (en) * | 2002-11-21 | 2008-10-21 | Canon Kabushiki Kaisha | Method and apparatus for processing images |
US20120158700A1 (en) * | 2010-12-20 | 2012-06-21 | Microsoft Corporation | Face recognition using social data |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6636648B2 (en) * | 1999-07-02 | 2003-10-21 | Eastman Kodak Company | Albuming method with automatic page layout |
US6628808B1 (en) | 1999-07-28 | 2003-09-30 | Datacard Corporation | Apparatus and method for verifying a scanned image |
JP4208113B2 (en) | 2000-04-19 | 2009-01-14 | 富士フイルム株式会社 | Album creating method and apparatus, and recording medium |
JP3945974B2 (en) * | 2000-11-13 | 2007-07-18 | 富士フイルム株式会社 | Image transmitting apparatus and image transmitting method |
US8377100B2 (en) | 2000-12-08 | 2013-02-19 | Roger P. Jackson | Closure for open-headed medical implant |
JP4697913B2 (en) | 2000-12-20 | 2011-06-08 | キヤノン株式会社 | Data retrieval apparatus and method |
US6826316B2 (en) | 2001-01-24 | 2004-11-30 | Eastman Kodak Company | System and method for determining image similarity |
JP2002342743A (en) | 2001-05-17 | 2002-11-29 | Olympus Optical Co Ltd | Image processing apparatus and image processing method |
KR20040090973A (en) | 2002-01-11 | 2004-10-27 | 포트레이트 이노베이션스 인코포레이티드 | Systems and methods for producing portraits |
JP2003281163A (en) | 2002-03-26 | 2003-10-03 | Canon Inc | Image processor, image processing method and storage medium |
JP2004048648A (en) | 2002-05-13 | 2004-02-12 | Fuji Photo Film Co Ltd | Method of forming special effect image, camera and image server |
JP4315345B2 (en) * | 2003-11-27 | 2009-08-19 | 富士フイルム株式会社 | Image editing apparatus and method, and program |
JP4388905B2 (en) | 2004-02-27 | 2009-12-24 | 富士フイルム株式会社 | Card issuing system, card issuing method, and card issuing program |
WO2006064696A1 (en) * | 2004-12-15 | 2006-06-22 | Nikon Corporation | Image reproducing system |
JP4649980B2 (en) | 2004-12-21 | 2011-03-16 | ソニー株式会社 | Image editing apparatus, image editing method, and program |
JP4148228B2 (en) | 2005-02-10 | 2008-09-10 | ソニー株式会社 | Image recording apparatus, image reproduction control apparatus, image recording / reproduction control apparatus, processing method of these apparatuses, and program causing computer to execute the method |
JP4655212B2 (en) | 2005-08-26 | 2011-03-23 | 富士フイルム株式会社 | Image processing apparatus, image processing method, and image processing program |
JP4762731B2 (en) | 2005-10-18 | 2011-08-31 | 富士フイルム株式会社 | Album creating apparatus, album creating method, and album creating program |
JP2007122431A (en) * | 2005-10-28 | 2007-05-17 | Matsushita Electric Ind Co Ltd | Content extraction device and method |
JP2007133838A (en) * | 2005-11-14 | 2007-05-31 | Fujifilm Corp | Image display method and image display program |
JP2007219713A (en) | 2006-02-15 | 2007-08-30 | Sony Corp | Inquiry system, imaging apparatus, inquiry device, information processing method, and program |
JP4762762B2 (en) * | 2006-03-07 | 2011-08-31 | 富士フイルム株式会社 | Album creating apparatus, album creating method, and program |
JP2007257312A (en) | 2006-03-23 | 2007-10-04 | Fujifilm Corp | Album creation system, album creating method, and program |
JP5239126B2 (en) | 2006-04-11 | 2013-07-17 | 株式会社ニコン | Electronic camera |
JP4642695B2 (en) | 2006-05-18 | 2011-03-02 | 富士フイルム株式会社 | Album creating system, album creating method, program, and album creating apparatus |
US8971667B2 (en) * | 2006-10-23 | 2015-03-03 | Hewlett-Packard Development Company, L.P. | Digital image auto-resizing |
JP5034661B2 (en) | 2007-05-07 | 2012-09-26 | ソニー株式会社 | Image management apparatus, image display apparatus, imaging apparatus, processing method in these, and program causing computer to execute the method |
JP2009294902A (en) | 2008-06-05 | 2009-12-17 | Nikon Corp | Image processor and camera |
JP5733775B2 (en) | 2008-06-06 | 2015-06-10 | 日本電気株式会社 | Object image display system |
US20100026822A1 (en) * | 2008-07-31 | 2010-02-04 | Itt Manufacturing Enterprises, Inc. | Multiplexing Imaging System for Area Coverage and Point Targets |
JP5207940B2 (en) | 2008-12-09 | 2013-06-12 | キヤノン株式会社 | Image selection apparatus and control method thereof |
AU2008264197B2 (en) * | 2008-12-24 | 2012-09-13 | Canon Kabushiki Kaisha | Image selection method |
JP4636190B2 (en) | 2009-03-13 | 2011-02-23 | オムロン株式会社 | Face collation device, electronic device, face collation device control method, and face collation device control program |
JP5532661B2 (en) | 2009-04-10 | 2014-06-25 | 株式会社ニコン | Image extraction program and image extraction apparatus |
JP5423186B2 (en) | 2009-07-07 | 2014-02-19 | 株式会社リコー | Imaging apparatus, area detection method, and program |
2012
- 2012-02-13 JP JP2013504531A patent/JPWO2012124252A1/en active Pending
- 2012-02-13 WO PCT/JP2012/000939 patent/WO2012124252A1/en active Application Filing
2013
- 2013-08-20 US US13/971,704 patent/US20130343659A1/en not_active Abandoned
2015
- 2015-08-04 US US14/818,083 patent/US10275643B2/en active Active
2016
- 2016-04-05 JP JP2016076208A patent/JP2016154366A/en active Pending
2017
- 2017-09-27 JP JP2017186973A patent/JP2018028921A/en active Pending
2019
- 2019-03-11 US US16/298,495 patent/US20190205624A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20150339518A1 (en) | 2015-11-26 |
US10275643B2 (en) | 2019-04-30 |
WO2012124252A1 (en) | 2012-09-20 |
JP2018028921A (en) | 2018-02-22 |
JPWO2012124252A1 (en) | 2014-07-17 |
US20190205624A1 (en) | 2019-07-04 |
JP2016154366A (en) | 2016-08-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190205624A1 (en) | Electronic device, electronic device control method, and computer-readable recording medium having stored thereon electronic device control program | |
US20200019756A1 (en) | Private Photo Sharing System, Method and Network | |
US9058375B2 (en) | Systems and methods for adding descriptive metadata to digital content | |
US9135500B2 (en) | Facial recognition | |
US8650242B2 (en) | Data processing apparatus and data processing method | |
JP5931829B2 (en) | Composite image creation assist device, composite image creation assist method, composite image creation assist program, and recording medium thereof | |
US8949335B2 (en) | Content processing device, content processing method, computer-readable recording medium, and integrated circuit for processing at least one of more contents | |
CN102576373B (en) | Content management device, contents management method, content supervisor and integrated circuit | |
US9521211B2 (en) | Content processing device, content processing method, computer-readable recording medium, and integrated circuit | |
WO2019230275A1 (en) | Image processing device, image processing method, image processing program, and recording medium storing image processing program | |
CN105653676B (en) | A kind of recommending scenery spot method and system | |
KR20170097980A (en) | Method for sharing content group of electronic device and electronic device thereof | |
US20150189118A1 (en) | Photographing apparatus, photographing system, photographing method, and recording medium recording photographing control program | |
US20130308864A1 (en) | Information processing apparatus, information processing method, computer program, and image display apparatus | |
KR20170098113A (en) | Method for creating image group of electronic device and electronic device thereof | |
JP6677527B2 (en) | Server device and program | |
JP2013069024A (en) | Image retrieval program and image retrieval device | |
US9064020B2 (en) | Information providing device, information providing processing program, recording medium having information providing processing program recorded thereon, and information providing method | |
JP6958795B1 (en) | Information processing methods, computer programs and information processing equipment | |
JP5708868B1 (en) | Program, information processing apparatus and method | |
US20150120715A1 (en) | Non-transitory computer readable medium, document recommending apparatus, and document recommending method | |
US20190180042A1 (en) | Image display device, image display control device, and image display control method | |
US20240040232A1 (en) | Information processing apparatus, method thereof, and program thereof, and information processing system | |
JP2015073141A (en) | Computer processing method, program, and information processing apparatus | |
CA2827639A1 (en) | Facial recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NIKON CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKADA, YUKO;TAKE, TOSHINORI;UWAI, HIROKI;AND OTHERS;SIGNING DATES FROM 20130805 TO 20130812;REEL/FRAME:031096/0803 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |