
WO2016008127A1 - Imaging systems, devices and methods for customized automatic image settings - Google Patents


Info

Publication number
WO2016008127A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
settings
scene
manual
auto
Application number
PCT/CN2014/082372
Other languages
French (fr)
Inventor
Ruiduo Yang
Feng Guo
Michel Adib SARKIS
Xuan ZOU
Original Assignee
Qualcomm Incorporated
Application filed by Qualcomm Incorporated
Priority to PCT/CN2014/082372
Publication of WO2016008127A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Definitions

  • This disclosure relates generally to digital image capturing.
  • methods and devices for customizing automatic image settings used with capturing digital images are disclosed.
  • an imaging device with customized automatic image settings comprises an image sensor configured to capture an image having one or more scene characteristics.
  • the device may also include an electronic settings module configured with image settings to be used for capturing images, the settings module configured to operate in a manual mode and operate in an automatic mode, and the settings module configured to receive manual image settings from a user of the device when in the manual mode and configured to provide automatically determined customized image settings when in the automatic mode.
  • the device may further include a processor in data communication connection with the image sensor and the settings module.
  • the processor may be configured to execute a set of instructions to perform a method comprising capturing a first image in the manual mode using the manual image settings from the settings module while identifying one or more scene characteristics of the first image, associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, determining an auto-setting relationship based on the association, and determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
  • associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image comprises determining a first probability defined as the probability that the manual image settings used to capture the first image would be selected from a set of possible manual image settings, and determining a second probability defined as the probability that the identified scene characteristics would be present given the manual image settings used to capture the first image.
  • determining the customized automatic image settings based on the auto-setting relationship and the subsequent scene characteristics of the subsequent image captured in the automatic mode comprises determining the most probable manual image settings using the subsequent scene characteristics by multiplying the first probability times the second probability, wherein the determined most probable manual image settings are the customized automatic image settings to be applied to the subsequent image.
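  • The two-probability selection described above can be read as a maximum a posteriori choice over previously used manual settings. The sketch below is illustrative only and uses hypothetical names (choose_settings, history); it multiplies the first probability (how often a settings combination was chosen) by the second probability (how often the observed scene characteristics accompanied that combination) and returns the most probable combination.

```python
# Hypothetical sketch of the two-probability selection described above.
# Settings combinations and scene characteristics are represented as tuples
# of (name, value) pairs so they can be used as dictionary keys.

from collections import Counter

def choose_settings(observed_characteristics, history):
    """Return the most probable manual settings given observed characteristics.

    `history` is a list of (settings, characteristics) pairs recorded while
    the device was in manual mode.
    """
    settings_counts = Counter(settings for settings, _ in history)
    total = sum(settings_counts.values())

    best, best_score = None, 0.0
    for settings, count in settings_counts.items():
        # First probability: how likely the user is to pick these settings at all.
        prior = count / total
        # Second probability: how often the observed characteristics appeared
        # together with these settings (a simple frequency estimate).
        matches = sum(
            1 for s, chars in history
            if s == settings and set(observed_characteristics) <= set(chars)
        )
        likelihood = matches / count
        score = prior * likelihood
        if score > best_score:
            best, best_score = settings, score
    return best

# Example: manual captures of a close-up indoor baby scene and an outdoor landscape.
history = [
    ((("shutter", "fast"), ("flash", "off"), ("temperature", "cold")),
     (("subject", "baby"), ("scale", "close"), ("brightness", "indoor"))),
    ((("shutter", "slow"), ("flash", "on"), ("temperature", "warm")),
     (("subject", "landscape"), ("scale", "far"), ("brightness", "outdoor"))),
]
print(choose_settings((("subject", "baby"), ("scale", "close")), history))
```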
  • the method performed with the processor may further comprise associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode.
  • the method may further comprise adjusting the auto-setting relationship based on the subsequent association.
  • the method further comprises capturing the subsequent image in the automatic mode using the customized automatic image settings.
  • the manual image settings comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
  • the customized automatic image settings comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
  • the one or more scene characteristics comprise one or more environmental attributes of the scene.
  • the one or more environmental attributes may comprise brightness or scale.
  • the one or more scene characteristics comprise one or more subject attributes of the scene.
  • the one or more subject attributes may comprise facial likeness, body shape, color or movement.
  • an imaging system with customized automatic image settings may comprise means for capturing an image having one or more scene characteristics, means for providing image settings to be used for capturing images, the means for providing configured to operate in a manual mode and operate in an automatic mode, and the means for providing configured to receive customized image settings from a user of the system when in the manual mode and configured to provide automatically determined image settings when in the automatic mode, means for capturing a first image in the manual mode using manual image settings from the means for providing image settings while identifying one or more scene characteristics of the first image, means for associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, means for determining an auto- setting relationship based on the association, and means for determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
  • the one or more scene characteristics may comprise one or more environmental attributes of the scene. In some embodiments, the one or more scene characteristics may comprise one or more subject attributes of the scene.
  • a method of customizing automatic image settings of an imaging device is disclosed, where the device has an image sensor configured to capture an image having one or more scene characteristics and the device further has an electronic settings module configured with image settings to be used for capturing images, the settings module configured to operate in a manual mode and operate in an automatic mode, and the settings module configured to receive manual image settings from a user of the device when in the manual mode and configured to provide automatically determined customized image settings when in the automatic mode.
  • the method comprises capturing a first image in the manual mode using the manual image settings from the settings module while identifying one or more scene characteristics of the first image, associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, determining an auto-setting relationship based on the association, and determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
  • the method further comprises associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode. In some embodiments, the method further comprises adjusting the auto-setting relationship based on the subsequent association. In some embodiments, the method further comprises capturing the subsequent image in the automatic mode using the customized automatic image settings.
  • the manual image settings used in the method may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
  • the automatic image settings used in the method may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
  • the one or more scene characteristics used in the method may comprise one or more environmental attributes of the scene.
  • the one or more environmental attributes may comprise brightness or scale.
  • the one or more scene characteristics comprise one or more subject attributes of the scene.
  • the one or more subject attributes may comprise facial likeness, body shape, color or movement.
  • a non-transient computer readable medium configured to store instructions that when executed by a processor perform a method for customizing automatic image settings.
  • the method comprises capturing a first image in the manual mode using the manual image settings from a settings module while identifying one or more scene characteristics of the first image, associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, determining an auto-setting relationship based on the association, and determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
  • the method further comprises associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode.
  • the method may further comprise adjusting the auto-setting relationship based on the subsequent association.
  • the method may further comprise capturing the subsequent image in the automatic mode using the customized automatic image settings.
  • the manual image settings may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
  • the customized automatic image settings may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
  • the one or more scene characteristics may comprise one or more environmental attributes of the scene. In some embodiments, the one or more scene characteristics may comprise one or more subject attributes of the scene.
  • FIG. 1A shows an embodiment of a scene having scene characteristics captured by an imaging device.
  • FIG. 1B is a table illustrating an example of manual settings that may be manually selected for the scene of FIG. 1A.
  • FIG. 2A shows another embodiment of a scene having scene characteristics captured by an imaging device.
  • FIG. 2B is a table illustrating an example of automatic settings that may be automatically selected for the scene of FIG. 2A.
  • FIG. 3 shows a block diagram of an imaging device having customized automatic image settings.
  • FIG. 4 shows an embodiment of an overview process for determining customized automatic image settings.
  • FIG. 5A shows an embodiment of a process, for determining or adjusting an auto-setting relationship based on manually selected image settings for use in determining customized automatic image settings, that may be used in the process of FIG. 4.
  • FIG. 5B shows an embodiment of a process for determining customized automatic image settings that may be used in the process of FIG. 4.
  • FIG. 6 shows an embodiment of a process for applying custom automatic image settings to an image of a scene based on an auto-setting relationship that takes into account preferred manual settings for the same scene.
  • FIG. 7 is a table showing embodiments of manual settings and scene characteristics that may be used to determine an embodiment of an auto-setting relationship.
  • This disclosure provides methods, devices and systems with customized automatic image settings that may be used to capture images with an imaging device, such as a camera.
  • the imaging device may learn the custom preferences of a user based on manually-selected image settings for past captured images.
  • the scenes captured in manual mode may be associated with the manual settings selected for those scenes to form and build a relationship that is used with subsequent images taken in automatic mode.
  • the user may capture subsequent images in automatic mode, and image settings that take into account the user's previously preferred manual image settings may be applied automatically based on the relationship.
  • the automatic image settings may be customized based on a user's past preferences, and the relationship on which the automatic image settings are based may continue to evolve based on future manually-selected settings.
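  • As a rough sketch of this learn-then-apply cycle, a minimal store could record the user's manual choices keyed by scene characteristics and fall back to factory defaults for unseen scenes. The names below (PreferenceStore, record_manual_capture, auto_settings) are hypothetical and assume scene characteristics can be reduced to a hashable key.

```python
# Minimal sketch of learning preferences in manual mode and reusing them
# in automatic mode. `PreferenceStore` and its methods are hypothetical.

class PreferenceStore:
    def __init__(self, factory_defaults):
        self.factory_defaults = factory_defaults
        self.learned = {}  # scene-characteristics key -> preferred settings

    def record_manual_capture(self, scene_key, manual_settings):
        # Called whenever the user captures an image in manual mode.
        self.learned[scene_key] = manual_settings

    def auto_settings(self, scene_key):
        # Called in automatic mode; fall back to factory defaults when the
        # scene has not been seen in manual mode before.
        return self.learned.get(scene_key, self.factory_defaults)

store = PreferenceStore({"shutter": "slow", "flash": "on", "temperature": "warm"})
store.record_manual_capture(
    ("baby", "close", "indoor"),
    {"shutter": "fast", "flash": "off", "temperature": "cold"},
)
# A later automatic-mode capture of a similar scene reuses the learned values.
print(store.auto_settings(("baby", "close", "indoor")))
```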
  • FIG. 1A shows an embodiment of a scene 100 having scene characteristics 105 that may be viewed and/or captured by an imaging device.
  • the scene 100 may include a variety of scene characteristics 105.
  • the scene characteristics 105 may relate to the environment in the scene 100, to subjects in the scene 100, or to other features of the scene 100.
  • the scene characteristics 105 in the scene 100 include a baby playing on the floor in an indoor room with the curtains drawn.
  • the scene characteristics 105 also include a wall, the floor, a table and a couch.
  • the scene characteristics 105 include environmental attributes.
  • Environmental attributes may include brightness, scale, or other environmental features of the scene 100.
  • the scene characteristics 105 shown in FIG. 1A include the relatively less bright setting of an indoor room with the curtains drawn, as compared to the brightness outside during the day.
  • the scene characteristics 105 shown in FIG. 1A include the relatively large scale of the items in the close up scene 100, as compared to what the scale of these items would be from far away.
  • the scene characteristics 105 include subject attributes.
  • Subject attributes may include features of people, animals or other things that are the subject of the scene 100.
  • subject attributes may include facial likeness, body shape, color, movement, and/or other features of the subject or subjects.
  • the scene characteristics 105 shown in FIG. 1A include the facial likeness of the baby, the shape of the baby, the color of the baby, and the movement of the baby.
  • Other items in the scene 100 can also have subject attributes, such as the floor, wall, window curtains, table or couch.
  • While scene characteristics 105 have been described as one type of characteristic or another, some characteristics 105 may belong to more than one type.
  • the scale of an object may be described as an environmental attribute, a subject attribute, and/or any other type of scene characteristic 105. Therefore, the discussion of a particular object in the context of one particular type of characteristic 105 is not meant to limit the scope of the present disclosure.
  • the image capturing device may have a manual mode.
  • the image settings applied to a captured image may be manually chosen by the user of the device.
  • FIG. 1B is a table 110 having image settings for which values may be manually selected for an image capturing device to capture an image of the scene of FIG. 1A while the device is in manual mode.
  • Manual settings may include a variety of image settings.
  • manual settings include image settings such as auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or others.
  • As shown in FIG. 1B, some image settings, for which values may be selected in manual mode, are listed in the left column under the label "Setting." Corresponding values that may be manually chosen for these settings are shown in the right column under the label "Value." For example, for the image setting of "shutter speed," the corresponding value chosen may be "fast," for the image setting of "flash," the corresponding value chosen may be "off," and for the image setting of "temperature," the corresponding value chosen may be "cold." "Temperature" here refers to the color temperature of the image, which may be warm, cool, or other temperatures. These are merely examples, and other values may be chosen for these settings. Further, other settings may be manually selected besides these three.
  • a user of an image capturing device may want to use the values for the settings shown in table 110 for capturing an image of the scene 100 shown in FIG. 1A. For example, because the scene 100 includes the scene characteristics 105 of a close up baby indoors, a user may thus want to manually apply a fast shutter speed due to quick movements of the baby, no flash so that the baby's eyes are not harmed, and cold temperature since the baby is close up.
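  • A minimal, hypothetical representation of the manually chosen values of table 110 (FIG. 1B) might look like the following; the field names are illustrative and not taken from the patent.

```python
# Hypothetical representation of the manually chosen values of table 110
# (FIG. 1B); field names are illustrative stand-ins.

from dataclasses import dataclass

@dataclass(frozen=True)
class ImageSettings:
    shutter_speed: str = "auto"
    flash: str = "auto"
    temperature: str = "auto"   # color temperature of the image

# Manual choices for the close-up indoor baby scene of FIG. 1A.
manual_settings = ImageSettings(shutter_speed="fast", flash="off", temperature="cold")
print(manual_settings)
```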
  • the image capturing device may have an automatic mode. In automatic mode, the image settings applied to a captured image are automatically chosen by the device.
  • FIG. 2A shows another embodiment of a scene 100 having scene characteristics 105 viewed and/or captured by an imaging device in automatic mode. As shown in FIG. 2A, the scene characteristics 105 include a baby playing on the floor in an indoor room with the curtains drawn. The scene 100 also includes a crib. It is appreciated that the scene 100 shown in FIG. 2A has different particular scene characteristics 105 than the scene 100 shown in FIG. 1A. For instance, the baby in FIG. 2A is a different baby from that in FIG. 1A.
  • FIG. 2B is a table 120 having image settings for which values may be automatically selected by an image capturing device to capture an image of the scene of FIG. 2A while the device is in automatic mode.
  • Automatic settings may include a variety of image settings, including those used for manual mode.
  • automatic settings include image settings such as auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or others. As shown in FIG. 2B, values that may be automatically chosen for these settings are listed under the labels "Setting" and "Value," in the same format as table 110 of FIG. 1B.
  • the device may have been initially programmed to automatically choose a particular set of image settings given a particular set of scene characteristics 105.
  • the device may have been programmed to automatically choose values for the image settings that are different than a user's preference, or that are different than those shown in table 120 of FIG. 2B, to capture an image of the scene 100 in FIG. 2A.
  • the value for the image setting "flash" may have been automatically selected as "on” based on the initial programming of the device.
  • a user may desire to alter this and other automatically-chosen image settings.
  • a user may desire that the device automatically choose different image settings given the scene characteristics 105. For instance, a user may desire that the device automatically choose the value of "off" for the image setting "flash" for the scene 100 of FIG. 2A.
  • the automatic image settings may be customized. In some embodiments, the automatic image settings may be customized based on a user's preferences. Therefore, the devices and methods disclosed herein allow for customizing the automatic image settings of an image capturing device. In some embodiments, the devices and methods disclosed herein allow for customizing the automatic image settings of an image capturing device based on a user's past manually-chosen image settings, as discussed in more detail below.
  • the automatic image settings may be customized based on a user's previous manually-chosen settings.
  • the automatically-chosen settings shown in the table 120 of FIG. 2B may be based on the user's manually-chosen image settings shown in table 110 of FIG. 1B.
  • the device may initially be programmed to automatically choose a value of "slow" for the image setting "shutter speed," a value of "on" for the image setting "flash," and a value of "warm" for the image setting "temperature." A user may then manually choose the values for these image settings that are shown in table 110 of FIG. 1B.
  • the user may then choose the automatic mode for the device to capture a subsequent image that includes the subsequent scene characteristics 105 of scene 100 shown in FIG. 2A.
  • the device will learn from the user's previous manually-chosen image settings to inform and alter the automatic mode. These values may be subsequently automatically chosen because the user manually selected them for a previous scene 100 having the same or similar scene characteristics 105.
  • the scene 100 of FIG. 1A includes a baby, indoors and close up.
  • the scene 100 of FIG. 2A also includes a baby, indoors and close up. Although it is a different baby in a different room, the scene characteristics 105 of the scene 100 of FIG. 2A are the same as or similar to those of the scene 100 of FIG. 1A.
  • a subsequent image of the scene 100 shown in FIG. 2A may have the image settings shown in table 120 of FIG. 2B applied to the captured image, as opposed to the initially-programmed automatic image settings.
  • For instance, instead of the initially-programmed value "slow" for shutter speed, the automatic mode may subsequently choose "fast;" instead of the initially-programmed value "on" for flash, the automatic mode may subsequently choose "off;" and instead of the initially-programmed value "warm" for temperature, the automatic mode may subsequently choose "cold."
  • FIG. 3 shows a block diagram of an imaging device 300 that may have customized automatic image settings.
  • the device 300 that may have customized automatic image settings includes devices that may merely have the ability to have customized automatic image settings.
  • a device 300 that is new may not yet know a user's preferences, and so there are no actual customized automatic image settings yet.
  • the device 300 may have its customized automatic image settings erased. It is understood that these and other examples are included as devices that may have customized automatic image settings and are thus within the scope of the present disclosure.
  • the imaging device 300 may have a set of components including a processor 320 coupled to an imaging sensor 315.
  • a working memory 305, storage 310, electronic display 325, and a memory 330 may also be included in the device 300 and in data communication with the processor 320.
  • the device 300 may be a digital camera, cell phone, tablet, personal digital assistant, or the like.
  • a plurality of applications may be available to the user on device 300. These applications may relate to sensing scene characteristics, applying image settings to capture an image, selecting an image capture mode such as manual or automatic, associating scene characteristics with the image settings used to capture the image of the scene, building and/or adjusting an auto-setting relationship based on the associations. Many other applications that are known in the art may further be included.
  • the processor 320 may be a general purpose processing unit or a processor specially designed for color sensing applications. As shown, the processor 320 is connected to a memory 330 and a working memory 305.
  • the memory 330 stores image sensor control module 335, scene characteristics recognition module 340, mode module 345, setting module 350, association module 355, auto-setting relationship module 357 and operating system 360. These modules include instructions that configure the processor 320 to perform various image processing and device management tasks.
  • Working memory 305 may be used by processor 320 to store a working set of processor instructions contained in the modules of memory 330. Alternatively, working memory 305 may also be used by processor 320 to store dynamic data created during the operation of the device 300.
  • the processor 320 may be configured by several modules.
  • Image sensor control module 335 may include instructions that configure the processor 320 to capture an image using the imaging sensor 315.
  • image sensor control module 335 may include instructions that configure the processor 320 to capture an image of the scene 100, having scene characteristics 105 shown in FIGS. 1A or 2A, using the imaging sensor 315.
  • Scene characteristics recognition module 340 may include instructions that configure the processor 320 to identify scene characteristics.
  • the scene characteristics recognition module 340 may include instructions that configure the processor 320 to identify one or more environmental attributes, one or more subject attributes, and/or other attributes of a scene.
  • scene characteristics recognition module 340 may include instructions that configure the processor 320 to identify scene characteristics of the scene 100 shown in FIG. 1A, such as the close up scale of the baby and other objects, the relative brightness of the ambient environment in the room, the facial likeness of the baby, the movement of the baby, and/or others.
  • Mode module 345 may include instructions that configure the processor 320 to enter an imaging mode.
  • the mode module 345 may include instructions that configure the processor 320 to enter a manual mode, an automatic mode, a hybrid mode, or other modes.
  • mode module 345 may include instructions that configure the processor 320 to enter manual mode for capturing an image with the imaging sensor 315 using the manually-chosen image settings shown in table 110 of FIG. 1B.
  • mode module 345 may include instructions that configure the processor 320 to enter automatic mode for capturing an image with the imaging sensor 315 using the automatically-chosen image settings shown in table 120 of FIG. 2B.
  • the setting module 350 may include instructions that configure the processor 320 to provide image settings for device 300.
  • setting module 350 may include instructions that configure the processor 320 to provide image settings for device 300 to use for capturing an image with the imaging sensor 315.
  • setting module 350 may include instructions that configure the processor 320 to provide image settings for device 300 that relate to auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings.
  • image settings to be applied to an image of a scene captured by the device 300 with the imaging sensor 315 may be provided by setting module 350. These settings may include, for example, the manual or automatic values for the settings shown in tables 110 and 120 of FIGS. 1B and 2B, respectively.
  • Association module 355 may include instructions that configure the processor 320 to associate scene characteristics from the captured scene with the image settings used to capture the image of that scene. In some embodiments, association module 355 may include instructions that configure the processor 320 to associate scene characteristics with image settings that relate to auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings. In some embodiments, association module 355 may include instructions that configure the processor 320 to associate image settings with scene characteristics that relate to subject attributes, environmental attributes, and/or other attributes of the scene. For example, association module 355 may include instructions that configure the processor 320 to associate the scene characteristics 105 of the scene 100 shown in FIG. 2A with the image settings shown in table 120 of FIG. 2B.
  • an association may be generated by association module 355 and stored in storage 310.
  • association module 355 may generate and store in storage 310 the association of A) a close up baby who is moving quickly in an indoor environment with darker lighting conditions, with B) a fast shutter speed, no flash and cold temperature. Further details of associating scene characteristics with image settings are discussed herein, for example with respect to FIGS. 4-7.
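  • A record of the kind the association module 355 might persist could, as a hedged sketch, pair the identified scene characteristics with the settings used for the capture. The Association class and field names below are hypothetical stand-ins for whatever structure is actually written to storage 310.

```python
# Hypothetical sketch of an association record: the scene characteristics
# observed during a manual capture paired with the settings the user chose.

from dataclasses import dataclass
from typing import Dict

@dataclass
class Association:
    scene_characteristics: Dict[str, str]
    image_settings: Dict[str, str]

associations = []  # stands in for records written to storage 310

associations.append(Association(
    scene_characteristics={
        "subject": "baby", "movement": "fast",
        "scale": "close", "brightness": "indoor",
    },
    image_settings={"shutter_speed": "fast", "flash": "off", "temperature": "cold"},
))
print(associations[0])
```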
  • Auto-setting relationship module 357 may include instructions that configure the processor 320 to generate and/or provide automatic image settings, which may be custom automatic image settings, for device 300.
  • setting module 350 may include instructions to provide automatic image settings for device 300 to use for capturing an image with the imaging sensor 315.
  • setting module 350 may include instructions to provide automatic image settings for device 300 that relate to auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings.
  • setting module 350 may provide instructions that configure the processor 320 to provide automatic image settings to be applied to an image of a scene captured by the device 300 with the imaging sensor 315. These settings may include, for example, the values for the automatic image settings shown in table 120 of FIG. 2B.
  • the various modules may further call subroutines in other modules.
  • the mode module 345 may configure the processor 320 to call subroutines in the setting module 350 to provide image settings based on the current imaging mode, such as manual or automatic mode.
  • setting module 350 may configure the processor 320 to call subroutines in scene characteristics recognition module 340 to identify scene characteristics of a scene viewed by the imaging sensor 315.
  • image sensor control module 335 may provide instructions that configure the processor 320 to apply the image settings provided by setting module 350, based on the identified scene characteristics provided by scene characteristics recognition module 340, to capture an image of the scene using the imaging sensor 315.
  • scene characteristics recognition module 340 may provide instructions that configure the processor 320 to provide to association module 355 the identified scene characteristics from the captured scene.
  • setting module 350 may provide instructions that configure the processor 320 to provide to association module 355 the image settings used with scene characteristics.
  • association module 355 may provide instructions that configure the processor 320 to provide to the auto-setting relationship module 357 associations between scene characteristics and image settings.
  • Operating system module 360 may configure the processor 320 to manage the memory and processing resources of device 300.
  • operating system module 360 may include device drivers to manage hardware resources such as the electronic display 325, storage 310, or imaging sensor 315. Therefore, in some embodiments, instructions contained in the other modules discussed above may not interact with these hardware resources directly, but instead interact through standard subroutines or APIs located in the operating system module 360. Instructions within operating system 360 may then interact directly with these hardware components.
  • Processor 320 may write data to storage module 310. While storage module 310 is represented graphically as a traditional disk device, those with skill in the art would understand that multiple embodiments could include either a disk-based storage device or one of several other types of storage media, including a memory disk, USB drive, flash drive, remotely connected storage medium, virtual disk driver, or the like.
  • FIG. 3 depicts a device comprising separate components, including a processor, an imaging sensor, and memory.
  • FIG. 3 illustrates two memory components: memory component 330 comprising several modules, and a separate memory 305 comprising a working memory.
  • a design may utilize ROM or static RAM memory for the storage of processor instructions implementing the modules contained in memory 330.
  • processor instructions may be read at system startup from a disk storage device that is integrated into device 300 or connected via an external device port. The processor instructions may then be loaded into RAM to facilitate execution by the processor.
  • working memory 305 may be a RAM memory, with instructions loaded into working memory 305 before execution by the processor 320.
  • FIG. 4 shows an embodiment of a process 400 for customizing automatic image settings.
  • the process 400 may be performed by or with a variety of imaging devices. In some embodiments, the process 400 may be performed by or with the device 300 described with respect to FIG. 3.
  • the process 400 may include block 410 wherein an imaging mode is selected. As shown, in some embodiments, block 410 may include selecting either manual or automatic mode. In some embodiments, an imaging mode may be selected in block 410 using the mode module 345 and processor 320 of device 300.
  • the process 400 may further include block 420 wherein one or more image settings are selected.
  • block 420 may include selecting image settings including auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other settings.
  • block 420 may include automatically or manually selecting image settings.
  • block 420 may include selecting particular values for these or other settings. For instance, block 420 may include manually selecting the values shown in table 110 of FIG. 1B. Block 420 may also include automatically selecting the values shown in table 120 of FIG. 2B.
  • Block 420 may be performed with a variety of modules and components of an imaging device.
  • image settings may be selected in block 420 using the setting module 350 and processor 320 of device 300.
  • image settings may be selected in block 420 using setting module 350, mode module 345 and processor 320 of device 300.
  • image settings may be selected in block 420 using auto-setting relationship module 357, setting module 350, mode module 345 and processor 320 of device 300.
  • the process 400 may further include block 430 wherein an image of a scene is captured.
  • block 430 may include capturing an image of a scene using the image settings selected in block 420.
  • block 430 may include capturing an image of a scene using the mode selected in block 410.
  • block 430 may include capturing the scene 100 of FIG. 1A using the manually-chosen image settings shown in table 110 of FIG. 1B.
  • block 430 may include capturing the scene 100 of FIG. 2A using the automatically-chosen image settings shown in table 120 of FIG. 2B.
  • Block 430 may be performed with a variety of modules and components of an imaging device.
  • a scene is captured in block 430 using image sensor control module 335, processor 320 and imaging sensor 315 of device 300.
  • a scene is captured in block 430 using setting module 350, image sensor control module 335, processor 320 and imaging sensor 315 of device 300.
  • a scene is captured in block 430 using mode module 345, image sensor control module 335, processor 320 and imaging sensor 315 of device 300.
  • a scene is captured in block 430 using setting module 350, mode module 345, image sensor control module 335, processor 320 and imaging sensor 315 of device 300.
  • mode module 345 may provide instructions that configure the processor 320 to capture an image in manual mode.
  • setting module 350 may provide instructions that configure the processor 320 to use the image settings shown in table 110 of FIG. 1B when capturing the image.
  • image sensor control module 335 may provide instructions that configure the processor 320 to capture the image using the imaging sensor 315.
  • mode module 345 may provide instructions that configure the processor 320 to capture an image in automatic mode.
  • scene characteristics recognition module 340 may identify the scene characteristics of the scene to be captured.
  • auto-setting relationship module 357 may provide instructions that configure the processor 320 to build and/or use an auto-setting relationship to apply automatically-chosen image settings based on the identified scene characteristics.
  • setting module 350 may provide instructions that configure the processor 320 to use the image settings shown in table 120 of FIG. 2B when capturing the image.
  • image sensor control module 335 may provide instructions that configure the processor 320 to capture the image using the imaging sensor 315.
  • the process 400 may further include block 440 wherein scene characteristics are identified.
  • block 440 may include identifying subject attributes, environmental attributes, and/or other scene characteristics.
  • block 440 may include identifying scene characteristics 105 of the scenes 100 of FIG. 1A or FIG. 2A.
  • block 440 may include identifying scene characteristics 105 of the scene 100 in FIG. 2A as a facial likeness of a baby, who is moving quickly, has a close up scale, and is in an indoor lighting environment.
  • Block 440 may be performed with a variety of modules and components of an imaging device.
  • block 440 may include identifying scene characteristics using the scene characteristics recognition module 340 of device 300.
  • block 440 may include identifying and providing the identified scene characteristics to other modules of the device 300.
  • block 440 may include identifying scene characteristics using the scene characteristics recognition module 340 and the image sensor control module 335 of device 300.
  • block 440 may include identifying scene characteristics using the scene characteristics recognition module 340, the image sensor control module 335 and the imaging sensor 315 of device 300.
  • the image sensor control module 335 may provide instructions that configure the processor 320 to view the scene using the imaging sensor 315, the imaging sensor 315 may then view and/or detect one or more scene characteristics in the scene, and the scene characteristics recognition module 340 may configure the processor 320 to identify the viewed and/or detected scene characteristics. Applying this example using the scene 100 in FIG. 2A:
  • the image sensor control module 335 may provide instructions that configure the processor 320 to view the room and objects in it using the imaging sensor 315, the imaging sensor 315 may then detect that there are three objects in an indoor environment in the scene 100, and the scene characteristics recognition module 340 may configure the processor 320 to identify the objects as a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting.
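  • As a hedged illustration of how a couple of simple scene characteristics could be derived from raw frames, the sketch below labels brightness from mean luminance and movement from a mean frame difference. The thresholds and labels are hypothetical; an actual recognition module would use far richer detectors (faces, depth, scale, and so on).

```python
# Hedged sketch of deriving two simple scene characteristics from raw frames.
# Thresholds and labels are hypothetical.

import numpy as np

def brightness_label(frame, threshold=100):
    # Coarse bright/dim label from the mean luminance of an 8-bit frame.
    return "bright" if frame.mean() > threshold else "dim"

def movement_label(prev_frame, frame, threshold=10):
    # Coarse motion label from the mean absolute frame difference.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)).mean()
    return "moving" if diff > threshold else "still"

prev = np.random.randint(0, 60, (480, 640), dtype=np.uint8)   # simulated dim frame
curr = np.random.randint(0, 60, (480, 640), dtype=np.uint8)
print(brightness_label(curr), movement_label(prev, curr))
```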
  • the process 400 may further include block 450 wherein scene characteristics are associated with the image settings used to capture the image of the scene.
  • block 450 may include associating subject attributes, environmental attributes, and/or other scene characteristics with particular values for auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings.
  • block 450 may include associating the scene characteristics of the scene with manually-chosen image settings.
  • block 450 may include associating the scene characteristics 105 of the scene 100 of FIG. 1A with the manually-chosen image settings shown in table 110 of FIG. 1B.
  • block 450 may include associating the scene characteristics of the scene with automatically-chosen image settings.
  • block 450 may include associating the scene characteristics 105 of the scene 100 of FIG. 2A with the automatically-chosen image settings given in table 120 of FIG. 2B.
  • Block 450 may be performed with a variety of modules and components of an imaging device.
  • block 450 may include associating scene characteristics using association module 355.
  • block 450 may include associating scene characteristics using one or more modules and/or components of an imaging device.
  • block 450 may include associating scene characteristics using the image sensor control module 335, the scene characteristics recognition module 340, the mode module 345, the setting module 350, the association module 355, the operating system 360, the processor 320, the imaging sensor 315, the storage 310, and/or the working memory 305 of device 300.
  • the scene characteristics recognition module 340 may provide instructions that configure the processor 320 to provide identified scene characteristics to the association module 355, the setting module 350 may provide instructions that configure the processor 320 to provide image settings used with those identified scene characteristics to the association module 355, and the association module 355 may provide instructions that configure the processor 320 to generate one or more associations between the provided scene characteristics and the provided image settings. Applying this example using the scene 100 in FIG. 2A:
  • the scene characteristics recognition module 340 may provide instructions that configure the processor 320 to provide to the association module 355 the identified scene characteristics 105 of a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting
  • the setting module 350 may provide instructions that configure the processor 320 to provide to the association module 355 the image settings shown in table 120 of FIG. 2B
  • the association module 355 may provide instructions that configure the processor 320 to generate one or more associations between (A) a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting, and (B) a fast shutter speed, no flash, and a cold temperature.
  • the process 400 may further include block 460 wherein an auto-setting relationship is determined or adjusted.
  • block 460 may include determining an auto-setting relationship that is used in automatic mode to automatically provide image settings based on one or more identified scene characteristics.
  • block 460 may include determining an auto-setting relationship based on the one or more associations formed in block 450.
  • block 460 may include determining an auto-setting relationship based on one or more associations formed between (A) subject attributes, environmental attributes, and/or other scene characteristics and (B) particular values for auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings.
  • block 460 may include determining an auto-setting relationship based on one or more associations of the scene characteristics with manually-chosen image settings.
  • block 460 may include determining an auto-setting relationship based on one or more associations between the scene characteristics 105 of the scene 100 of FIG. 1A and the manually-chosen image settings shown in table 110 of FIG. 1B.
  • an auto-setting relationship may be determined that provides the image settings of fast shutter speed, no flash, and cold temperature when the scene characteristics of a close up, quickly-moving baby in an indoor lighting environment are identified.
  • block 460 may include adjusting an existing auto-setting relationship. In some embodiments, block 460 may include adjusting an auto-setting relationship that is used in automatic mode to automatically provide image settings based on one or more identified scene characteristics. In some embodiments, block 460 may include adjusting an existing auto-setting relationship based on the one or more associations formed in block 450. In some embodiments, block 460 may include adjusting an existing auto-setting relationship based on one or more associations formed between (A) subject attributes, environmental attributes, and/or other scene characteristics and (B) particular values for auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings.
  • block 460 may include adjusting an auto-setting relationship based on one or more associations of the scene characteristics with manually-chosen image settings.
  • an auto-setting relationship may exist in the device 300 that will provide the image settings of a flash turned on and slow shutter speed when the scene characteristic of an indoor lighting environment is encountered.
  • the device 300 may be used to capture an image of the scene 100 in FIG. 1A using the manually-chosen image settings shown in table 110 of FIG. 1B, e.g. a flash turned off and a fast shutter speed.
  • This may adjust the auto-setting relationship such that the image settings shown in table 120 of FIG. 2B will now be applied to subsequent scenes 100 that include an indoor lighting environment with a close up of a quickly moving baby.
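  • One simple way such an adjustment could work, sketched under assumed names (adjust, auto_value), is to keep per-scene counts of the values the user chooses manually and let the most frequently chosen value be the one applied in automatic mode.

```python
# Hypothetical sketch of adjusting an existing auto-setting relationship:
# each new manual capture increments a per-scene count, and the value with
# the highest count becomes the one applied in automatic mode.

from collections import defaultdict, Counter

# relationship[scene_key][setting_name] -> Counter of values the user chose
relationship = defaultdict(lambda: defaultdict(Counter))

def adjust(scene_key, manual_settings):
    for name, value in manual_settings.items():
        relationship[scene_key][name][value] += 1

def auto_value(scene_key, name, default):
    counts = relationship[scene_key][name]
    return counts.most_common(1)[0][0] if counts else default

# Before any manual captures, the factory default "on" would be used for flash.
print(auto_value(("baby", "close", "indoor"), "flash", "on"))
# After the manual capture of FIG. 1A, "off" wins for similar scenes.
adjust(("baby", "close", "indoor"), {"shutter_speed": "fast", "flash": "off"})
print(auto_value(("baby", "close", "indoor"), "flash", "on"))
```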
  • This is merely an example, and many other auto-setting relationships may be adjusted in many other manners. Further details of adjusting an auto-setting relationship are discussed herein, for example with respect to FIGS. 5A-7.
  • Block 460 may be performed with a variety of modules and components of an imaging device.
  • block 460 may include determining or adjusting an auto-setting relationship using auto-setting relationship module 357.
  • block 460 may include determining or adjusting an auto-setting relationship using one or more modules and/or components of an imaging device.
  • block 460 may include determining or adjusting an auto-setting relationship using the image sensor control module 335, the scene characteristics recognition module 340, the mode module 345, the setting module 350, the association module 355, the operating system 360, the processor 320, the imaging sensor 315, the storage 310, and/or the working memory 305 of device 300.
  • the association module 355 may provide instructions that configure the processor 320 to provide one or more associations to the auto-setting relationship module 357, and the auto-setting relationship module 357 may provide instructions that configure the processor 320 to generate an auto-setting relationship to be used with subsequent images and to store this auto-setting relationship in storage 310. Applying this example using the scene 100 in FIG. 2A:
  • the association module 355 may provide instructions that configure the processor 320 to provide to the auto-setting relationship module 357 the association of (A) a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting, with (B) a fast shutter speed, no flash, and a cold temperature, and the auto- setting relationship module 357 may provide instructions that configure the processor 320 to generate and store in storage 310 an auto-setting relationship, such that when the scene 100 of FIG. 2A is subsequently encountered in automatic mode this auto-setting relationship in storage 310 will be accessed and used and thus the settings shown in table 120 of FIG. 2B will be automatically applied to the subsequently captured image.
  • the process 400 may further include block 470 wherein one or more customized automatic image settings are determined.
  • customized automatic image settings may be determined in block 470 using the auto-setting relationship determined or adjusted in block 460.
  • customized automatic image settings may be determined in block 470 when subsequent scene characteristics in a subsequent scene are identified. For example, the scene characteristics 105 of the scene 100 shown in FIG. 2A may be identified, and an auto-setting relationship may be used to determine the customized automatic image settings to apply to the captured image of that scene 100.
  • the auto-setting relationship may be based on associations between the scene characteristics 105 of FIG. 1A and the image settings shown in table 110 of FIG. 1B.
  • For instance, if the scene characteristics 105 of the scene 100 shown in FIG. 2A are identified, an auto-setting relationship based on associations between the scene characteristics 105 of FIG. 1A and the image settings shown in table 110 of FIG. 1B may be used such that the image is taken with the customized image settings that are the same as those shown in table 120 of FIG. 2B, i.e. a fast shutter speed, the flash off, and the temperature set to cold. Further details of determining customized image settings are discussed herein, for example with respect to FIGS. 5A-7.
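  • A hedged sketch of this determination step: given the characteristics of the subsequent scene, find the stored scene with the largest overlap of characteristics and reuse its associated settings, otherwise keep the factory defaults. The function name and data layout below are hypothetical.

```python
# Hypothetical sketch of block 470: given the characteristics of a subsequent
# scene, find the stored scene with the largest overlap and reuse the settings
# associated with it; otherwise keep the factory defaults.

def customized_auto_settings(subsequent_chars, relationship, defaults):
    best_key, best_overlap = None, 0
    for stored_chars in relationship:
        overlap = len(set(subsequent_chars) & set(stored_chars))
        if overlap > best_overlap:
            best_key, best_overlap = stored_chars, overlap
    return relationship[best_key] if best_key is not None else defaults

relationship = {
    ("baby", "close", "indoor", "moving"):
        {"shutter_speed": "fast", "flash": "off", "temperature": "cold"},
}
defaults = {"shutter_speed": "slow", "flash": "on", "temperature": "warm"}

# The scene of FIG. 2A shares most characteristics with the scene of FIG. 1A,
# so the learned settings are applied instead of the defaults.
print(customized_auto_settings(("baby", "close", "indoor"), relationship, defaults))
```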
  • FIG. 5A shows an embodiment of a process 500 for determining or adjusting an auto-setting relationship based on manually selected image settings for use in determining customized automatic image settings to be used with subsequently captured images.
  • the process 500 may be used in the overview process 400 shown in FIG. 4.
  • the process 500 may include block 505 wherein manual mode is selected.
  • block 505 includes a user of an image capture device selecting the manual mode.
  • a user of the device 300 may manually select manual mode using a touch-screen display 325 that provides a menu option for selecting manual mode.
  • the mode module 345 may provide instructions that configure the processor 320 to respond to user input selecting manual mode by entering device 300 into manual mode.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to selecting the manual mode including those discussed with respect to block 410 of process 400 shown in FIG. 4 that relate to selecting the manual mode, may apply to block 505 of process 500.
  • the process 500 may further include block 510 wherein image settings are manually selected.
  • block 510 includes a user of an image capture device manually selecting image settings.
  • block 510 may be performed by or with the device 300 shown in FIG. 3.
  • a user of the device 300 may manually select image settings using a touch-screen display 325 that provides a menu option for manually selecting image settings.
  • the setting module 350 may provide instructions that configure the processor 320 to respond to manually-selected image settings by applying those settings to an image captured with the imaging sensor 315.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to manually selecting image settings including those discussed with respect to block 420 of process 400 shown in FIG. 4 that relate to manually selecting image settings, may apply to block 510 of process 500.
  • the process 500 may further include block 515 wherein an image of a scene is captured.
  • block 515 includes capturing an image of a scene using the image settings manually selected in block 510.
  • block 515 may be performed by or with the device 300 shown in FIG. 3. For instance, a user of the device 300 may capture an image of the scene using a touch-screen display 325 that provides a menu option for capturing an image with the imaging sensor 315.
  • setting module 350 may provide instructions that configure the processor 320 to capture an image with the imaging sensor 315 using the manually-selected image settings.
  • the process 500 may further include block 520 wherein scene characteristics of a scene are identified.
  • block 520 includes identifying the scene characteristics of the scene captured in block 515.
  • block 520 may be performed by or with the device 300 shown in FIG. 3.
  • the scene characteristics recognition module 340 of device 300 may provide instructions that configure the processor 320 to identify the scene characteristics of a scene viewed by an imaging sensor 315.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to identifying scene characteristics, including those discussed with respect to block 440 of process 400 shown in FIG. 4, may apply to block 520 of process 500.
  • the process 500 may further include block 525 wherein scene characteristics are associated with manually-selected image settings.
  • block 525 includes associating scene characteristics identified in block 520 with the manually-selected image settings selected in block 510.
  • block 525 may be performed by or with the device 300 shown in FIG. 3.
  • the association module 355 of the device 300 may provide instructions that configure the processor 320 to associate the scene characteristics with the manually-selected image settings.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to associating scene characteristics with manually-selected image settings, including those discussed with respect to block 450 of process 400 shown in FIG. 4, may apply to block 525 of process 500.
  • the process 500 may further include decision block 530 wherein it is determined whether an auto-setting relationship is defined.
  • block 530 includes using various modules and/or components to determine whether an auto- setting relationship is defined.
  • block 530 may be performed by or with the device 300 shown in FIG. 3.
  • the association module 355 of device 300 may provide instructions that configure the processor 320 to determine whether an auto-setting relationship has already been determined.
  • the processor 320 may search the storage 310 or working memory 305 to determine whether an auto-setting relationship already exists.
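  • A minimal sketch of such a check, assuming the relationship is serialized to a file in storage (the file name below is hypothetical), is to attempt to load it and branch on the result.

```python
# Minimal sketch of decision block 530: the device checks whether a previously
# stored auto-setting relationship can be loaded. The file name is hypothetical.

import json
from pathlib import Path

RELATIONSHIP_PATH = Path("auto_setting_relationship.json")

def load_relationship():
    """Return the stored relationship, or None when it is not yet defined."""
    if not RELATIONSHIP_PATH.exists():
        return None
    with RELATIONSHIP_PATH.open() as f:
        return json.load(f)

relationship = load_relationship()
if relationship is None:
    print("No auto-setting relationship defined; determine one (block 535).")
else:
    print("Relationship found; adjust it with the new association (block 540).")
```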
  • If it is determined in block 530 that an auto-setting relationship is not yet defined, the process 500 may continue to block 535 wherein an auto-setting relationship is determined. In some embodiments, block 535 includes determining an auto-setting relationship based on the scene characteristics identified in block 520 and the image settings manually chosen in block 510.
  • block 535 may be performed by or with the device 300 shown in FIG. 3.
  • the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to determine an auto-setting relationship.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to determining an auto-setting relationship, including those discussed with respect to block 460 of process 400 shown in FIG. 4, may apply to block 535 of process 500.
  • the process 500 may continue to block 545.
  • If it is determined in block 530 that an auto-setting relationship is already defined, the process 500 may continue to block 540 wherein the auto-setting relationship is adjusted. In some embodiments, block 540 includes adjusting the auto-setting relationship based on the scene characteristics identified in block 520 and the image settings manually chosen in block 510.
  • block 540 may be performed by or with the device 300 shown in FIG. 3.
  • the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to adjust the auto-setting relationship.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to adjusting an auto-setting relationship, including those discussed with respect to block 460 of process 400 shown in FIG. 4, may apply to block 540 of process 500.
  • the process 500 may continue to block 545.
  • the process 500 may further include block 545 wherein customized automatic image settings are determined.
  • block 545 includes determining customized automatic image settings based on the auto-setting relationship determined in block 535.
  • block 545 includes determining customized automatic image settings based on the auto-setting relationship adjusted in block 540.
  • block 545 includes determining customized automatic image settings based on applying an auto-setting relationship to scene characteristics identified in subsequent scenes 100 captured in subsequent images.
  • block 545 may be performed by or with the device 300 shown in FIG. 3.
  • the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to apply the auto-setting relationship to subsequently-identified scene characteristics to generate image settings to be automatically applied to images subsequently-captured with the imaging sensor 315.
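  • as a minimal, illustrative Python sketch of this manual-mode flow (blocks 505 through 545), assuming string-valued scene characteristics and settings; the function names and data structures below are hypothetical and are not taken from the disclosure:

```python
from collections import defaultdict

# Hypothetical auto-setting relationship, kept as co-occurrence counts between
# identified scene characteristics and manually selected settings.
association_counts = defaultdict(int)   # (characteristic, setting) -> count
setting_counts = defaultdict(int)       # setting -> count

def identify_scene_characteristics(frame):
    # Stand-in for block 520: a real device would analyze the sensor frame.
    return frame["characteristics"]

def record_manual_capture(frame, manual_settings):
    # Blocks 520-540: identify characteristics, associate them with the
    # manually selected settings, and determine or adjust the relationship.
    characteristics = identify_scene_characteristics(frame)
    for setting in manual_settings:
        setting_counts[setting] += 1
        for characteristic in characteristics:
            association_counts[(characteristic, setting)] += 1

# Example manual capture of a scene like FIG. 1A (blocks 505-515).
frame = {"characteristics": ["indoor", "low_light", "close_up", "moving_subject"]}
record_manual_capture(frame, ["shutter_speed=fast", "flash=off", "temperature=cold"])
print(dict(setting_counts))
```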
  • FIG. 5B shows an embodiment of a process 550 for determining customized automatic image settings.
  • the process 550 may be used in the process 400 shown in FIG. 4.
  • process 550 may be used in block 470 of process 400.
  • process 550 may be used after the process 500 shown in FIG. 5A is performed.
  • process 500 may be performed to determine or adjust an auto- setting relationship.
  • process 550 may be performed to apply the auto-setting relationship to subsequent images in the automatic mode to determine customized automatic image settings.
  • process 500 may be performed multiple times before process 550 is performed.
  • process 500 may be performed to determine an auto-setting relationship, then process 550 may be performed to determine customized automatic image settings, and then process 500 may be performed again to adjust the auto-setting relationship.
  • the process 550 may include block 560 wherein automatic mode is selected.
  • block 560 includes a user of an image capture device selecting the automatic mode. For instance, a user of the device 300 may select automatic mode using a touch-screen display 325 that provides a menu option for selecting automatic mode. Further, for example, the mode module 345 may provide instructions that configure the processor 320 to respond to user input selecting automatic mode by entering device 300 into automatic mode. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to selecting the automatic mode, including those discussed with respect to block 410 of process 400 shown in FIG. 4 that relate to selecting the automatic mode, may apply to block 560 of process 550.
  • the process 550 may further include block 565 wherein scene characteristics of a scene are identified.
  • block 565 includes identifying the scene characteristics of the scene viewed by an imaging sensor.
  • block 565 may be performed by or with the device 300 shown in FIG. 3.
  • the scene characteristics recognition module 340 of device 300 may provide instructions that configure the processor 320 to identify the scene characteristics of a scene viewed by an imaging sensor 315.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to identifying scene characteristics including those discussed with respect to block 520 of process 500 shown in FIG. 5A and block 440 of process 400 shown in FIG. 4, may apply to block 565 of process 550.
  • the process 550 may further include block 570 wherein customized automatic image settings are determined.
  • block 570 includes determining customized automatic image settings based on an auto-setting relationship.
  • block 570 includes determining customized automatic image settings based on applying an auto-setting relationship to scene characteristics identified in a scene.
  • block 570 includes determining customized automatic image settings based on applying an auto-setting relationship to scene characteristics identified in block 565.
  • block 570 may be performed by or with the device 300 shown in FIG. 3.
  • the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to apply the auto- setting relationship to the scene characteristics to generate image settings to be automatically applied to an image of the scene captured with the imaging sensor 315.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to determining customized automatic image settings, including those discussed with respect to block 545 of process 500, may apply to block 570 of process 550.
  • the process 550 may further include block 575 wherein an image of a scene is captured.
  • block 575 includes capturing an image of the scene using the customized automatic image settings determined in block 570.
  • block 575 may be performed by or with the device 300 shown in FIG. 3.
  • the image sensor control module 335 and/or setting module 350 of device 300 may provide instructions that configure the processor 320 to apply the customized automatic image settings determined in block 570 to an image of the scene captured with the imaging sensor 315.
  • any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to capturing an image using customized automatic image settings may apply to block 575 of process 550.
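  • a minimal sketch of the automatic-mode flow of process 550 (blocks 560 through 575), assuming the auto-setting relationship has already been reduced to a lookup from characteristic sets to preferred settings; all names and structures here are illustrative assumptions rather than the disclosed implementation:

```python
# Learned relationship: characteristic set -> settings preferred by the user.
learned_relationship = {
    frozenset({"indoor", "low_light", "close_up", "moving_subject"}):
        {"shutter_speed": "fast", "flash": "off", "temperature": "cold"},
}
factory_defaults = {"shutter_speed": "slow", "flash": "on", "temperature": "warm"}

def determine_auto_settings(scene_characteristics):
    # Block 570: choose the learned entry sharing the most characteristics
    # with the current scene; otherwise keep the factory defaults.
    scene = set(scene_characteristics)
    best_overlap, best_settings = 0, factory_defaults
    for known, settings in learned_relationship.items():
        overlap = len(scene & known)
        if overlap > best_overlap:
            best_overlap, best_settings = overlap, settings
    return best_settings

def auto_capture(scene_characteristics):
    # Blocks 565-575: identify characteristics, determine settings, capture.
    settings = determine_auto_settings(scene_characteristics)
    return {"settings": settings, "image": "<raw sensor data>"}

print(auto_capture(["indoor", "low_light", "close_up"]))
```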
  • FIG. 6 shows an embodiment of a process for applying custom automatic image settings to an image of a scene based on an auto-setting relationship that takes into account preferred manual settings for the same scene.
  • the process 600 may include block 605 wherein manual mode is selected.
  • Block 605 may include similar features and functionalities as discussed above with respect to block 505 of process 500 shown in FIG. 5A.
  • the process 600 may further include block 610 wherein scene characteristics of a scene are identified.
  • Block 610 may include similar features and functionalities as discussed above with respect to block 520 of process 500 shown in FIG. 5A.
  • the process 600 may further include block 615 wherein image settings are manually selected.
  • Block 615 may include similar features and functionalities as discussed above with respect to block 510 of process 500 shown in FIG. 5A.
  • the process 600 may further include block 620 wherein scene characteristics are associated with manually-selected image settings.
  • block 620 includes associating the scene characteristics identified in block 610 with the image settings manually selected in block 615.
  • Block 620 may further include similar features and functionalities as discussed above with respect to block 525 of process 500 shown in FIG. 5A.
  • the process 600 may further include decision block 625 wherein it is determined whether an auto-setting relationship is defined. Decision block 625 may further include similar features and functionalities as discussed above with respect to decision block 530 of process 500 shown in FIG. 5A. If it is determined in decision block 625 that an auto-setting relationship is not defined, then the process 600 may continue to block 630, wherein an auto-setting relationship is determined. If it is determined in decision block 625 that an auto-setting relationship is defined, then the process 600 may continue to block 635, wherein an existing auto-setting relationship is adjusted.
  • the process 600 may further include block 630 wherein an auto-setting relationship is determined.
  • Block 630 may include similar features and functionalities as discussed above with respect to block 535 of process 500 shown in FIG. 5A. After block 630, the process 600 may continue to block 640.
  • the process 600 may further include block 635 wherein an auto-setting relationship is adjusted. Block 635 may include similar features and functionalities as discussed above with respect to block 540 of process 500 shown in FIG. 5A. After block 635, the process 600 may continue to block 640.
  • the process 600 may further include block 640 wherein customized automatic image settings are determined.
  • block 640 includes determining customized automatic image settings based on the auto-setting relationship determined in block 630.
  • block 640 includes determining customized automatic image settings based on the auto-setting relationship adjusted in block 635.
  • Block 640 may further include similar features and functionalities as discussed above with respect to block 545 of process 500 shown in FIG. 5A.
  • the process 600 may further include block 645 wherein an image of the scene is captured.
  • block 645 includes capturing an image of the scene using the customized automatic image settings determined in block 640.
  • the scene is captured in block 645 using an auto-setting relationship that is determined or adjusted based on the association determined in block 620, which was determined based on manually- selected image settings for that same scene.
  • a user may therefore use an auto-setting relationship that takes into account the user's preferences with respect to a scene to capture an image of that same scene.
  • custom automatic image settings may be determined for a particular scene using an auto-setting relationship that is adjusted or determined on the fly.
  • an auto- setting relationship is used that takes into account those manual image settings selected in block 615 to capture an image of that same scene.
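  • a short Python sketch of this on-the-fly behavior, covering the decision of blocks 625 through 635 (determine a new relationship or adjust an existing one) and the immediate reuse of the relationship for the same scene in blocks 640 and 645; the structures shown are assumptions made for illustration only:

```python
relationship = None  # no auto-setting relationship defined yet

def determine_or_adjust(characteristics, manual_settings):
    # Decision block 625: determine (block 630) or adjust (block 635).
    global relationship
    if relationship is None:
        relationship = {c: dict(manual_settings) for c in characteristics}
    else:
        for c in characteristics:
            relationship.setdefault(c, {}).update(manual_settings)

def customized_auto_settings(characteristics):
    # Block 640: merge whatever has been learned for these characteristics.
    settings = {}
    for c in characteristics:
        settings.update(relationship.get(c, {}))
    return settings

scene = ["indoor", "close_up", "moving_subject"]
determine_or_adjust(scene, {"shutter_speed": "fast", "flash": "off"})
print(customized_auto_settings(scene))  # block 645 would capture with these
```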
  • FIG. 7 is a table showing embodiments of manual settings and scene characteristics that may be used to determine an embodiment of an auto-setting relationship. This auto-setting relationship may in turn be used to determine custom automatic image settings.
  • the various associations and auto-setting relationships that will be discussed with respect to the manual settings and scene characteristics shown in FIG. 7 may be used in the various devices, systems and methods discussed above with respect to FIGS. 1A-6.
  • the table 700 may include column 710 with various manual settings that may be applied in manual mode.
  • the settings may be identified generically as M1, M2, M3 ... Mm. There are therefore "m" possible manual settings "M" shown.
  • the table 700 may further include column 720 with various scene characteristics that may be present in a scene; these may be identified generically as S1, S2, S3 ... Sn, so that there are "n" possible scene characteristics "S" shown.
  • an auto-setting relationship may be determined based on associations formed between the manual settings and scene characteristics shown in table 700.
  • all Si for 1 ≤ i ≤ n are independent, and similarly all Mj for 1 ≤ j ≤ m are independent.
  • a conditional probability that a particular manual setting will be chosen, given the "n" possible choices of scene characteristics "S," may be determined as follows: P(M|S) ∝ P(S|M) * P(M).
  • P(M|S) represents the conditional probability that a particular manual setting "M" will be chosen given all possible choices of scene characteristics "S" present in a given scene
  • P(S|M) represents the conditional probability that a particular scene characteristic "S" is present in the scene given the selected manual settings "M"
  • P(M) represents the probability that a particular manual setting "M" may be chosen out of all possible manual settings for a device 300. Therefore, in this manner, customized automatic image settings may be determined by computing the highest probability for P(M|S) by multiplying P(S|M) times P(M).
  • the auto-setting relationship may be adjusted based on future or subsequent associations.
  • the conditional probability P(M|S) may change as more and more associations between manual settings and scene characteristics are determined.
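  • a worked Python sketch of this computation, estimating P(M) and P(S|M) from counted associations and scoring each candidate setting by their product; the history below is invented for illustration and applies the independence assumption stated above directly:

```python
from collections import Counter

# Illustrative association history: each entry pairs the manual setting chosen
# (M) with the scene characteristics (S) present when it was chosen.
history = [
    ("M1", {"S1", "S2"}),
    ("M1", {"S1"}),
    ("M2", {"S2"}),
]

m_counts = Counter(m for m, _ in history)                            # for P(M)
ms_counts = Counter((m, s) for m, chars in history for s in chars)   # for P(S|M)
total = sum(m_counts.values())

def p_m(m):
    return m_counts[m] / total

def p_s_given_m(s, m):
    # Laplace-smoothed so unseen (setting, characteristic) pairs do not zero
    # out the whole product.
    return (ms_counts[(m, s)] + 1) / (m_counts[m] + 2)

def most_probable_setting(scene):
    """Score each candidate M by P(S|M) * P(M), which is proportional to
    P(M|S) when the characteristics are treated as independent."""
    def score(m):
        p = p_m(m)
        for s in scene:
            p *= p_s_given_m(s, m)
        return p
    return max(m_counts, key=score)

print(most_probable_setting({"S1", "S2"}))  # "M1" wins for this history
```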
  • M1 and M2 may be associated with S1, while M2 may also be associated with S2.
  • the auto-setting relationship may determine that if S2 is encountered, then M1 should be applied to the captured image. The reason is that there is a link between S2 and M1 given the association of S1 with both M2 and M1. In other words, S2 is associated with M2, which is associated with S1, which is associated with M1. In this manner, M1 may be applied when S2 is identified.
  • the associations between "S" and "M" may be selectively used to determine the auto-setting relationship. For instance, there may be associations between M1 and S2, M2 and S1, and M3 and S1. In some embodiments, all of these associations may be used to determine the auto-setting relationship. In other embodiments, some of these associations may be selectively ignored to determine the auto-setting relationship. Thus, in some embodiments, only the associations between M1 and S2 and M2 and S1 may be used to determine the auto-setting relationship, while the association between M3 and S1 may not be included in determining the auto-setting relationship.
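  • a small Python sketch of this linked-association idea, walking the association graph so that a setting reachable through shared characteristics can still be applied; the association list is illustrative only:

```python
# S2 has no direct association with M1, but S2 is associated with M2, M2 with
# S1, and S1 with M1, so M1 may still be applied when S2 is identified.
associations = [("M1", "S1"), ("M2", "S1"), ("M2", "S2")]

def settings_linked_to(characteristic, max_hops=3):
    """Walk the bipartite graph characteristic -> setting -> characteristic
    and collect every manual setting reachable within max_hops expansions."""
    reachable_s, reachable_m = {characteristic}, set()
    for _ in range(max_hops):
        reachable_m |= {m for m, s in associations if s in reachable_s}
        reachable_s |= {s for m, s in associations if m in reachable_m}
    return reachable_m

# Contains both 'M1' and 'M2'; M1 is reached via S2 -> M2 -> S1 -> M1.
print(settings_linked_to("S2"))
```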
  • DSP: digital signal processor
  • ASIC: application specific integrated circuit
  • FPGA: field programmable gate array
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.
  • the technology is operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, a microcontroller or microcontroller based system, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • instructions refer to computer-implemented steps for processing information in the system. Instructions may be implemented in software, firmware or hardware and include any type of programmed step undertaken by components of the system.
  • a microprocessor may be any conventional general purpose single- or multi-chip microprocessor such as a Pentium® processor, a Pentium® Pro processor, an 8051 processor, a MIPS® processor, a Power PC® processor, an Alpha® processor, or a duo core or quad core processor.
  • the microprocessor may be any conventional special purpose microprocessor such as a digital signal processor or a graphics processor.
  • the microprocessor typically has conventional address lines, conventional data lines, and one or more conventional control lines.
  • the system may be used in connection with various operating systems such as Linux®, UNIX® or Microsoft Windows®.
  • the system control may be written in any conventional programming language such as C, C++, BASIC, Pascal, .NET (e.g., C#), or Java, and run under a conventional operating system.
  • C, C++, BASIC, Pascal, Java, and FORTRAN are industry standard programming languages for which many commercial compilers may be used to create executable code.
  • the system control may also be written using interpreted languages such as Perl, Python or Ruby. Other languages may also be used such as PHP, JavaScript, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Systems, devices and methods for customized automatic image settings are disclosed. The imaging device can learn the manual settings set by a user of the device to capture an image of a scene. Examples include a fast exposure setting for a moving scene, a flash setting for a darker scene, and/or a cold temperature setting for a close-up. Those characteristics of the scene may then be associated with the user's manual settings to form a relationship to be used by the device's automatic settings for future images. Techniques for forming and modifying the relationship with each subsequent image taken with manual settings are also disclosed.

Description

IMAGING SYSTEMS, DEVICES AND METHODS FOR CUSTOMIZED
AUTOMATIC IMAGE SETTINGS
BACKGROUND
Field of the Invention
[0001] This disclosure relates generally to digital image capturing. In particular, methods and devices for customizing automatic image settings used with capturing digital images are disclosed.
Description of the Related Art
[0002] Photography is a field of interest to many people. With the advent of cheaper digital photography devices, more people have digital image capturing devices. With this increased interest and use of such devices, consumers want more features and flexibility with respect to the images they are capturing. In particular, users want to be able to apply their own individualized settings to the images they capture and to do so easily.
[0003] Conventional image capturing devices, such as cameras, have automatic settings that are automatically applied to the captured images. These automatic settings are set when the product is manufactured. For instance, a camera from the factory may include light sensors that determine when the flash should be used based on sensing the environmental light conditions. However, users may prefer to have a different automatic response from the camera than what has been programmed at the factory.
[0004] While some cameras allow for manual settings to be applied by the user, these manual settings are separate from the automatic settings. For instance, a user's selection of manual settings typically does not affect the automatic settings. Further, the manual settings must be set manually by the user every time he or she wants to apply settings different from the automatic settings. For example, the automatic settings may want to apply a particular shutter speed given the ambient light, but the user may want a faster shutter speed given the fast moving object being captured. In such a case, the user must manually set the camera to use the faster shutter speed in that situation every time it is encountered.
[0005] It is desirable therefore to have an image capturing device that takes into account the individual user's preference with respect to automatic image settings and that will do so in a simple and convenient manner for the user.
SUMMARY
[0006] The embodiments disclosed herein each have several aspects no single one of which is solely responsible for the disclosure's desirable attributes. Without limiting the scope of this disclosure in any way, certain prominent features will now be briefly discussed, and such features may appear together or separately in one or more embodiments. After considering this discussion, and particularly after reading the section entitled "Detailed Description," one will understand how the features of the embodiments described herein provide advantages over existing digital image capturing systems, devices and methods.
[0007] In an aspect, an imaging device with customized automatic image settings is disclosed. In some embodiments, the device comprises an image sensor configured to capture an image having one or more scene characteristics. The device may also include an electronic settings module configured with image settings to be used for capturing images, the settings module configured to operate in a manual mode and operate in an automatic mode, and the settings module configured to receive manual image settings from a user of the device when in the manual mode and configured to provide automatically determined customized image settings when in the automatic mode. The device may further include a processor in data communication connection with the image sensor and the settings module. The processor may be configured to execute a set of instructions to perform a method comprising capturing a first image in the manual mode using the manual image settings from the settings module while identifying one or more scene characteristics of the first image, associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, determining an auto-setting relationship based on the association, and determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
[0008] In some embodiments, associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image comprises determining a first probability defined as the probability that the manual image settings used to capture the first image would be selected from a set of possible manual image settings, and determining a second probability defined as the probability that the identified scene characteristics would be present given the manual image settings used to capture the first image.
[0009] In some embodiments, determining the customized automatic image settings based on the auto-setting relationship and the subsequent scene characteristics of the subsequent image captured in the automatic mode comprises determining the most probable manual image settings using the subsequent scene characteristics by multiplying the first probability times the second probability, wherein the determined most probable manual image settings are the customized automatic image settings to be applied to the subsequent image.
[0010] In some embodiments, the method performed with the processor may further comprise associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode. The method may further comprise adjusting the auto-setting relationship based on the subsequent association. The method may further comprise capturing the subsequent image in the automatic mode using the customized automatic image settings.
[0011] In some embodiments, the manual image settings comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature. In some embodiments, the customized automatic image settings comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
[0012] In some embodiments, the one or more scene characteristics comprise one or more environmental attributes of the scene. The one or more environmental attributes may comprise brightness or scale. In some embodiments, the one or more scene characteristics comprise one or more subject attributes of the scene. The one or more subject attributes may comprise facial likeness, body shape, color or movement.
[0013] In another aspect, an imaging system with customized automatic image settings is disclosed. In some embodiments, the system may comprise means for capturing an image having one or more scene characteristics, means for providing image settings to be used for capturing images, the means for providing configured to operate in a manual mode and operate in an automatic mode, and the means for providing configured to receive customized image settings from a user of the system when in the manual mode and configured to provide automatically determined image settings when in the automatic mode, means for capturing a first image in the manual mode using manual image settings from the means for providing image settings while identifying one or more scene characteristics of the first image, means for associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, means for determining an auto- setting relationship based on the association, and means for determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
[0014] In some embodiments, the one or more scene characteristics may comprise one or more environmental attributes of the scene. In some embodiments, the one or more scene characteristics may comprise one or more subject attributes of the scene.
[0015] In another aspect, a method of customizing automatic image settings of an imaging device is disclosed, where the device has an image sensor configured to capture an image having one or more scene characteristics and the device further has an electronic settings module configured with image settings to be used for capturing images, the settings module configured to operate in a manual mode and operate in an automatic mode, and the settings module configured to receive manual image settings from a user of the device when in the manual mode and configured to provide automatically determined customized image settings when in the automatic mode.
[0016] In some embodiments, the method comprises capturing a first image in the manual mode using the manual image settings from the settings module while identifying one or more scene characteristics of the first image, associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, determining an auto-setting relationship based on the association, and determining customized automatic image settings based on the auto- setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
[0017] In some embodiments, the method further comprises associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode. In some embodiments, the method further comprises adjusting the auto-setting relationship based on the subsequent association. In some embodiments, the method further comprises capturing the subsequent image in the automatic mode using the customized automatic image settings.
[0018] In some embodiments, the manual image settings used in the method may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature. In some embodiments, the automatic image settings used in the method may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
[0019] In some embodiments, the one or more scene characteristics used in the method may comprise one or more environmental attributes of the scene. The one or more environmental attributes may comprise brightness or scale. In some embodiments, the one or more scene characteristics comprise one or more subject attributes of the scene. The one or more subject attributes may comprise facial likeness, body shape, color or movement.
[0020] In another aspect, a non-transient computer readable medium configured to store instructions that when executed by a processor perform a method for customizing automatic image settings is disclosed. In some embodiments, the method comprises capturing a first image in the manual mode using the manual image settings from a settings module while identifying one or more scene characteristics of the first image, associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image, determining an auto-setting relationship based on the association, and determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
[0021] In some embodiments of the non-transient computer readable medium, the method further comprises associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode. The method may further comprise adjusting the auto-setting relationship based on the subsequent association. The method may further comprise capturing the subsequent image in the automatic mode using the customized automatic image settings.
[0022] In some embodiments of the non-transient computer readable medium, the manual image settings may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature. In some embodiments, the customized automatic image settings may comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
[0023] In some embodiments of the non-transient computer readable medium, the one or more scene characteristics may comprise one or more environmental attributes of the scene. In some embodiments, the one or more scene characteristics may comprise one or more subject attributes of the scene.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict only several embodiments in accordance with the disclosure and are not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings. In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the drawings, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and make part of this disclosure.
[0025] FIG. 1A shows an embodiment of a scene having scene characteristics captured by an imaging device.
[0026] FIG. 1B is a table illustrating an example of manual settings that may be manually selected for the scene of FIG. 1A.
[0027] FIG. 2A shows another embodiment of a scene having scene characteristics captured by an imaging device.
[0028] FIG. 2B is a table illustrating an example of automatic settings that may be automatically selected for the scene of FIG. 2A.
[0029] FIG. 3 shows a block diagram of an imaging device having customized automatic image settings.
[0030] FIG. 4 shows an embodiment of an overview process for determining customized automatic image settings.
[0031] FIG. 5A shows an embodiment of a process, for determining or adjusting an auto-setting relationship based on manually selected image settings for use in determining customized automatic image settings, that may be used in the process of FIG. 4.
[0032] FIG. 5B shows an embodiment of a process for determining customized automatic image settings that may be used in the process of FIG. 4.
[0033] FIG. 6 shows an embodiment of a process for applying custom automatic image settings to an image of a scene based on an auto-setting relationship that takes into account preferred manual settings for the same scene.
[0034] FIG. 7 is a table showing embodiments of manual settings and scene characteristics that may be used to determine an embodiment of an auto-setting relationship.
DETAILED DESCRIPTION
[0035] The following detailed description is directed to certain specific embodiments of the development as described with reference to the accompanying figures. In this description, reference is made to the drawings wherein like parts or steps may be designated with like numerals throughout for clarity. Reference in this specification to "one embodiment," "an embodiment," or "in some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrases "one embodiment," "an embodiment," or "in some embodiments" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
[0036] Embodiments and examples of the invention will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the invention. Furthermore, embodiments of the invention may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the invention described herein.
[0037] This disclosure provides methods, devices and systems with customized automatic image settings that may be used to capture images with an imaging device, such as a camera. The imaging device may learn the custom preferences of a user based on manually-selected image settings for past captured images. The scenes captured in manual mode may be associated with the manual settings selected for those scenes to form and build a relationship that is used with subsequent images taken in automatic mode. Then, the user may capture subsequent images in automatic mode, and image settings that take into account the user's previously preferred manual image settings may be applied automatically based on the relationship. Thus, the automatic image settings may be customized based on a user's past preferences, and the relationship on which the automatic image settings are based may continue to evolve based on future manually- selected settings.
[0038] FIG. 1A shows an embodiment of a scene 100 having scene characteristics 105 that may be viewed and/or captured by an imaging device. The scene 100 may include a variety of scene characteristics 105. The scene characteristics 105 may relate to the environment in the scene 100, to subjects in the scene 100, or to other features of the scene 100. As shown in FIG. 1A, the scene characteristics 105 in the scene 100 include a baby playing on the floor in an indoor room with the curtains drawn. The scene characteristics 105 also include a wall, the floor, a table and a couch.
[0039] In some embodiments, the scene characteristics 105 include environmental attributes. Environmental attributes may include brightness, scale, or other environmental features of the scene 100. For example, the scene characteristics 105 shown in FIG. 1A include the relative less bright setting of an indoor room with the curtains drawn, as compared to the brightness outside during the day. As another example, the scene characteristics 105 shown in FIG. 1A include the relatively large scale of the items in the close up scene 100, as compared to what the scale of these items would be from far away.
[0040] In some embodiments, the scene characteristics 105 include subject attributes. Subject attributes may include features of people, animals or other things that are the subject of the scene 100. In some embodiments, subject attributes may include facial likeness, body shape, color, movement, and/or other features of the subject or subjects. For example, the scene characteristics 105 shown in FIG. 1A include the facial likeness of the baby, the shape of the baby, the color of the baby, and the movement of the baby. Other items in the scene 100 can also have subject attributes, such as the floor, wall, window curtains, table or couch.
[0041] It is appreciated that while certain scene characteristics 105 have been described as one or another type of characteristic, some characteristics 105 may belong to more than one type. For example, the scale of an object may be described as an environmental attribute, a subject attribute, and/or any other type of scene characteristic 105. Therefore, the discussion of a particular object in the context of one particular type of characteristic 105 is not meant to limit the scope of the present disclosure.
[0042] In some embodiments, the image capturing device may have a manual mode. In the manual mode, the image settings applied to a captured image may be manually chosen by the user of the device. FIG. IB is a table 110 having image settings for which values may be manually selected for an image capturing device to capture an image of the scene of FIG. 1A while the device is in manual mode. Manual settings may include a variety of image settings. In some embodiments, manual settings include image settings such as auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or others. As shown in FIG. IB, some image settings, for which values may be selected in manual mode, are listed in the left column under the label "Setting." Corresponding values that may be manually chosen for these settings are shown in the right column under the label "Value." For example, for the image setting of "shutter speed," the corresponding value chosen may be "fast," for the image setting of "flash," the corresponding value chosen may be "off," and for the image setting of "temperature," the corresponding value chosen may be "cold." "Temperature" here refers to the color temperature of the image, which may be warm, cool, or other temperatures. These are merely examples, and other values may be chosen for these settings. Further, other settings may be manually selected besides these three.
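A minimal illustration of how the values of table 110 might be represented in software, assuming plain string-valued settings; the key names below are assumptions made for illustration and are not taken from the disclosure:

```python
# Manually selected values corresponding to table 110 of FIG. 1B.
manual_settings_table_110 = {
    "shutter_speed": "fast",   # quick movements in the scene
    "flash": "off",            # no flash for the close-up subject
    "temperature": "cold",     # cold color temperature for the close-up
}

for setting, value in manual_settings_table_110.items():
    print(f"{setting}: {value}")
```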
[0043] A user of an image capturing device may want to use the values for the settings shown in table 110 for capturing an image of the scene 100 shown in FIG. 1A. For example, because the scene 100 includes the scene characteristics 105 of a close up baby indoors, a user may thus want to manually apply a fast shutter speed due to quick movements of the baby, no flash so that the baby's eyes are not harmed, and cold temperature since the baby is close up.
[0044] In some embodiments, the image capturing device may have an automatic mode. In automatic mode, the image settings applied to a captured image are automatically chosen by the device. FIG. 2A shows another embodiment of a scene 100 having scene characteristics 105 viewed and/or captured by an imaging device in automatic mode. As shown in FIG. 2A, the scene characteristics 105 include a baby playing on the floor in an indoor room with the curtains drawn. The scene 100 also includes a crib. It is appreciated that the scene 100 shown in FIG. 2A has different particular scene characteristics 105 than the scene 100 shown in FIG. 1A. For instance, the baby in FIG. 2A is a different baby from that in FIG. 1A.
[0045] FIG. 2B is a table 120 having image settings for which values may be automatically selected by an image capturing device to capture an image of the scene of FIG. 2A while the device is in automatic mode. Automatic settings may include a variety of image settings, including those used for manual mode. In some embodiments, automatic settings include image settings such as auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or others. As shown in FIG. 2B, some image settings, for which values may be selected in automatic mode, are listed in the left column under the label "Setting." Corresponding values that may be automatically chosen for these settings are shown in the right column under the label "Value." For example, for the image setting of "shutter speed," the corresponding automatically-chosen value may be "fast," for the image setting of "flash," the corresponding automatically-chosen value may be "off," and for the image setting of "temperature," the corresponding automatically-chosen value may be "cold." These are merely examples, and other values may be chosen for these settings. Further, other settings may be automatically selected besides these three.
[0046] The values automatically chosen for the image settings as shown in the table 120 of FIG. 2B may or may not coincide with a user's desired values for those image settings. For instance, the device may have been initially programmed to automatically choose a particular set of image settings given a particular set of scene characteristics 105. Thus, the device may have been programmed to automatically choose values for the image settings that are different than a user's preference, or that are different than those shown in table 120 of FIG. 2B, to capture an image of the scene 100 in FIG. 2A. For example, because the scene 100 is indoors, the value for the image setting "flash" may have been automatically selected as "on" based on the initial programming of the device. In some embodiments, a user may desire to alter this and other automatically-chosen image settings. In some embodiments, a user may desire that the device automatically choose different image settings given the scene characteristics 105. For instance, a user may desire that the device automatically choose the value of "off" for the image setting "flash" for the scene 100 of FIG. 2A. Thus, in some embodiments, the automatic image settings may be customized. In some embodiments, the automatic image settings may be customized based on a user's preferences. Therefore, the devices and methods disclosed herein allow for customizing the automatic image settings of an image capturing device. In some embodiments, the devices and methods disclosed herein allow for customizing the automatic image settings of an image capturing device based on a user's past manually-chosen image settings, as discussed in more detail below.
[0047] As mentioned, in some embodiments, the automatic image settings may be customized based on a user's previous manually-chosen settings. For instance, the automatically-chosen settings shown in the table 120 of FIG. 2B may be based on the user's manually-chosen image settings shown in table 110 of FIG. 1B. In some embodiments, for the scene 100 shown in FIG. 2A, the device may initially be programmed to automatically choose a value of "slow" for the image setting "shutter speed," a value of "on" for the image setting "flash," and a value of "warm" for the image setting "temperature." A user may then manually choose the values for these image settings that are shown in table 110 of FIG. 1B to capture the scene characteristics 105 of the scene 100 shown in FIG. 1A. Subsequently, the user may then choose the automatic mode for the device to capture a subsequent image that includes the subsequent scene characteristics 105 of scene 100 shown in FIG. 2A. In some embodiments, the device will learn from the user's previous manually-chosen image settings to inform and alter the automatic mode. These values may be subsequently automatically chosen because the user manually selected them for a previous scene 100 having the same or similar scene characteristics 105. For instance, the scene 100 of FIG. 1A includes a baby, indoors and close up. Similarly, the scene 100 of FIG. 2A includes a baby, indoors and close up. Although it is a different baby in a different room in the scene 100 of FIG. 2A, similar scene characteristics 105 are present as were in the scene 100 of FIG. 1A. Therefore, for example, a subsequent image of the scene 100 shown in FIG. 2A may have the image settings shown in table 120 of FIG. 2B applied to the captured image, as opposed to the initially-programmed automatic image settings. Thus, for example, for the scene characteristics 105 of FIG. 2A, instead of the initially-programmed value "slow" for shutter speed, the automatic mode may subsequently choose "fast;" instead of the initially-programmed value "on" for flash, the automatic mode may subsequently choose "off;" and instead of the initially-programmed value "warm" for temperature, the automatic mode may subsequently choose "cold."
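To illustrate the behavior described above, the following Python sketch starts from the initially-programmed automatic values and overrides them with values learned from the user's earlier manual choices whenever the new scene shares the relevant characteristics; every name and structure here is an assumption made for illustration, not the disclosed implementation:

```python
factory_auto = {"shutter_speed": "slow", "flash": "on", "temperature": "warm"}

# Settings the user chose manually for a scene with these characteristics.
learned = {
    ("baby", "indoor", "close_up"): {"shutter_speed": "fast",
                                     "flash": "off",
                                     "temperature": "cold"},
}

def customized_auto(scene_characteristics):
    settings = dict(factory_auto)
    for learned_chars, learned_settings in learned.items():
        # A different baby in a different room still shares these
        # characteristics, so the learned values replace the defaults.
        if set(learned_chars) <= set(scene_characteristics):
            settings.update(learned_settings)
    return settings

print(customized_auto(["baby", "indoor", "close_up", "crib"]))
# -> {'shutter_speed': 'fast', 'flash': 'off', 'temperature': 'cold'}
```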
[0048] FIG. 3 shows a block diagram of an imaging device 300 that may have customized automatic image settings. It is understood that the device 300 that may have customized automatic image settings includes devices that may merely have the ability to have customized automatic image settings. For example, a device 300 that is new may not yet know a user's preferences, and so there are no actual customized automatic image settings yet. Or, as another example, the device 300 may have its customized automatic image settings erased. It is understood that these and other examples are included as devices that may have customized automatic image settings and are thus within the scope of the present disclosure.
[0049] As shown in FIG. 3, the imaging device 300 may have a set of components including a processor 320 coupled to an imaging sensor 315. A working memory 305, storage 310, electronic display 325, and a memory 330 may also be included in the device 300 and in data communication with the processor 320.
[0050] The device 300 may be a digital camera, cell phone, tablet, personal digital assistant, or the like. A plurality of applications may be available to the user on device 300. These applications may relate to sensing scene characteristics, applying image settings to capture an image, selecting an image capture mode such as manual or automatic, associating scene characteristics with the image settings used to capture the image of the scene, building and/or adjusting an auto-setting relationship based on the associations. Many other applications that are known in the art may further be included.
[0051] In the example illustrated in FIG. 3, the processor 320 may be a general purpose processing unit or a processor specially designed for color sensing applications. As shown, the processor 320 is connected to a memory 330 and a working memory 305. In the illustrated embodiment, the memory 330 stores image sensor control module 335, scene characteristics recognition module 340, mode module 345, setting module 350, association module 355, auto-setting relationship module 357 and operating system 360. These modules include instructions that configure the processor 320 to perform various image processing and device management tasks. Working memory 305 may be used by processor 320 to store a working set of processor instructions contained in the modules of memory 330. Alternatively, working memory 305 may also be used by processor 320 to store dynamic data created during the operation of the device 300.
[0052] The processor 320 may be configured by several modules. Image sensor control module 335 may include instructions that configure the processor 320 to capture an image using the imaging sensor 315. For example, image sensor control module 335 may include instructions that configure the processor 320 to capture an image of the scene 100, having scene characteristics 105 shown in FIGS. 1A or IB, using the imaging sensor 315.
[0053] Scene characteristics recognition module 340 may include instructions that configure the processor 320 to identify scene characteristics. In some embodiments, the scene characteristics recognition module 340 may include instructions that configure the processor 320 to identify one or more environmental attributes, one or more subject attributes, and/or other attributes of a scene. For example, scene characteristics recognition module 340 may include instructions that configure the processor 320 to identify scene characteristics of the scene 100 shown in FIG. 1A, such as the close up scale of the baby and other objects, the relative brightness of the ambient environment in the room, the facial likeness of the baby, the movement of the baby, and/or others.
[0054] Mode module 345 may include instructions that configure the processor 320 to enter an imaging mode. In some embodiments, the mode module 345 may include instructions that configure the processor 320 to enter a manual mode, an automatic mode, a hybrid mode, or other modes. For example, mode module 345 may include instructions that configure the processor 320 to enter manual mode for capturing an image with the imaging sensor 315 using the manually-chosen image settings shown in table 110 of FIG. IB. As another example, mode module 345 may include instructions that configure the processor 320 to enter automatic mode for capturing an image with the imaging sensor 315 using the automatically-chosen image settings shown in table 120 of FIG. 2B.
[0055] The setting module 350 may include instructions that configure the processor 320 to provide image settings for device 300. In some embodiments, setting module 350 may include instructions that configure the processor 320 to provide image settings for device 300 to use for capturing an image with the imaging sensor 315. In some embodiments, setting module 350 may include instructions that configure the processor 320 to provide image settings for device 300 that relate to auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings. For example, image settings to be applied to an image of a scene captured by the device 300 with the imaging sensor 315 may be provided by setting module 350. These settings may include, for example, the manual or automatic values for the settings shown in tables 110 and 120 of FIGS. IB and 2B, respectively.
[0056] Association module 355 may include instructions that configure the processor 320 to associate scene characteristics from the captured scene with the image settings used to capture the image of that scene. In some embodiments, association module 355 may include instructions that configure the processor 320 to associate scene characteristics with image settings that relate to auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings. In some embodiments, association module 355 may include instructions that configure the processor 320 to associate image settings with scene characteristics that relate to subject attributes, environmental attributes, and/or other attributes of the scene. For example, association module 355 may include instructions that configure the processor 320 to associate the scene characteristics 105 of the scene 100 shown in FIG. 2A with the image settings shown in table 120 of FIG. 2B. In some embodiments, an association may be generated by association module 355 and stored in memory 310. For example, association module 355 may generate and store in memory 310 the association of A) a close up baby who is moving quickly in an indoor environment with darker lighting conditions, with B) a fast shutter speed, no flash and cold temperature. Further details of associating scene characteristics with image settings are discussed herein, for example with respect to FIGS. 4-7.
[0057] Auto-setting relationship module 357 may include instructions that configure the processor 320 to generate and/or provide automatic image settings, which may be custom automatic image settings, for device 300. In some embodiments, setting module 350 may include instructions to provide automatic image settings for device 300 to use for capturing an image with the imaging sensor 315. In some embodiments, setting module 350 may include instructions to provide automatic image settings for device 300 that relate to auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings. For example, setting module 350 may provide instructions that configure the processor 320 to provide automatic image settings to be applied to an image of a scene captured by the device 300 with the imaging sensor 315. These settings may include, for example, the values for the automatic image settings shown in table 120 of FIG. 2B.
[0058] The various modules may further call subroutines in other modules. In some embodiments, the mode module 345 may configure the processor 320 to call subroutines in the setting module 350 to provide image settings based on the current imaging mode, such as manual or automatic mode. In some embodiments, setting module 350 may configure the processor 320 to call subroutines in scene characteristics recognition module 340 to identify scene characteristics of a scene viewed by the imaging sensor 315. In some embodiments, image sensor control module 335 may provide instructions that configure the processor 320 to apply the image settings provided by setting module 350, based on the identified scene characteristics provided by scene characteristics recognition module 340, to capture an image of the scene using the imaging sensor 315. In some embodiments, scene characteristics recognition module 340 may provide instructions that configure the processor 320 to provide to association module 355 the identified scene characteristics from the captured scene. In some embodiments, setting module 350 may provide instructions that configure the processor 320 to provide to association module 355 the image settings used with scene characteristics. In some embodiments, association module 355 may provide instructions that configure the processor 320 to provide the auto-setting relationship module 357 with associations between scene characteristics and image settings.
[0059] Operating system module 360 may configure the processor 320 to manage the memory and processing resources of device 300. For example, operating system module 360 may include device drivers to manage hardware resources such as the electronic display 325, storage 310, or imaging sensor 315. Therefore, in some embodiments, instructions contained in the other modules discussed above may not interact with these hardware resources directly, but instead interact through standard subroutines or APIs located in the operating system module 360. Instructions within operating system 360 may then interact directly with these hardware components.
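The call pattern of paragraph [0058] can be pictured with a schematic Python skeleton such as the one below; the class and method names are assumptions for illustration and do not describe an actual implementation of device 300:

```python
class SceneCharacteristicsRecognition:
    def identify(self, frame):
        # Stand-in for module 340: a real device would analyze the frame.
        return frame.get("characteristics", [])

class AssociationModule:
    def __init__(self):
        self.associations = []  # (characteristics, settings) pairs (module 355)
    def associate(self, characteristics, settings):
        self.associations.append((tuple(characteristics), dict(settings)))

class AutoSettingRelationship:
    def __init__(self, association_module):
        self.association_module = association_module  # fed by module 355
    def settings_for(self, characteristics):
        # Reuse the settings of the most recent association that shares at
        # least one characteristic with the current scene.
        for chars, settings in reversed(self.association_module.associations):
            if set(chars) & set(characteristics):
                return settings
        return {"shutter_speed": "auto", "flash": "auto"}  # fallback defaults

# Wiring the pieces together, roughly as the setting and mode modules might.
recognizer = SceneCharacteristicsRecognition()
associations = AssociationModule()
relationship = AutoSettingRelationship(associations)

manual_frame = {"characteristics": ["indoor", "close_up"]}
associations.associate(recognizer.identify(manual_frame), {"flash": "off"})
print(relationship.settings_for(["indoor", "crib"]))  # -> {'flash': 'off'}
```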
[0060] Processor 320 may write data to storage module 310. While storage module 310 is represented graphically as a traditional disk device, those with skill in the art would understand multiple embodiments could include either a disk based storage device or one of several other type storage mediums to include a memory disk, USB drive, flash drive, remotely connected storage medium, virtual disk driver, or the like.
[0061] Although FIG. 3 depicts a device comprising separate components to include a processor, imaging sensor, and memory, one skilled in the art would recognize that these separate components may be combined in a variety of ways to achieve particular design objectives. For example, in an alternative embodiment, the memory components may be combined with processor components to save cost and improve performance.
[0062] Additionally, although FIG. 3 illustrates two memory components, to include memory component 330 comprising several modules, and a separate memory 305 comprising a working memory, one with skill in the art would recognize several embodiments utilizing different memory architectures. For example, a design may utilize ROM or static RAM memory for the storage of processor instructions implementing the modules contained in memory 330. Alternatively, processor instructions may be read at system startup from a disk storage device that is integrated into device 300 or connected via an external device port. The processor instructions may then be loaded into RAM to facilitate execution by the processor. For example, working memory 305 may be a RAM memory, with instructions loaded into working memory 305 before execution by the processor 320.
[0063] FIG. 4 shows an embodiment of a process 400 for customizing automatic image settings. The process 400 may be performed by or with a variety of imaging devices. In some embodiments, the process 400 may be performed by or with the device 300 described with respect to FIG. 3.
[0064] The process 400 may include block 410 wherein an imaging mode is selected. As shown, in some embodiments, block 410 may include selecting either manual or automatic mode. In some embodiments, an imaging mode may be selected in block 410 using the mode module 345 and processor 320 of device 300.
[0065] The process 400 may further include block 420 wherein one or more image settings are selected. In some embodiments, block 420 may include selecting image settings including auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other settings. In some embodiments, block 420 may include automatically or manually selecting image settings. In some embodiments, block 420 may include selecting particular values for these or other settings. For instance, block 420 may include manually selecting the values shown in table 110 of FIG. 1B. Block 420 may also include automatically selecting the values shown in table 120 of FIG. 2B.
[0066] Block 420 may be performed with a variety of modules and components of an imaging device. In some embodiments, image settings may be selected in block 420 using the setting module 350 and processor 320 of device 300. In some embodiments, image settings may be selected in block 420 using setting module 350, mode module 345 and processor 320 of device 300. In some embodiments, image settings may be selected in block 420 using auto-setting relationship module 357, setting module 350, mode module 345 and processor 320 of device 300.
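For illustration only (not part of the disclosed embodiments), the image settings selected in block 420 can be pictured as a simple mapping from setting name to chosen value. The Python sketch below assumes such a representation; the field names and the "iso" and "focus_area" entries are illustrative assumptions, while the fast shutter speed, disabled flash, and cold temperature mirror the example settings discussed for table 110.

# A minimal sketch, assuming image settings are stored as a name-to-value mapping.
manual_image_settings = {
    "shutter_speed": "fast",   # short exposure, e.g. for a quickly moving subject
    "flash": "off",
    "temperature": "cold",     # white-balance temperature preference
    "iso": "auto",             # assumed additional setting
    "focus_area": "center",    # assumed additional setting
}

if __name__ == "__main__":
    for name, value in manual_image_settings.items():
        print(f"{name}: {value}")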
[0067] The process 400 may further include block 430 wherein an image of a scene is captured. In some embodiments, block 430 may include capturing an image of a scene using the image settings selected in block 420. In some embodiments, block 430 may include capturing an image of a scene using the mode selected in block 410. For example, block 430 may include capturing the scene 100 of FIG. 1A using the manually-chosen image settings shown in table 110 of FIG. 1B. As another example, block 430 may include capturing the scene 100 of FIG. 2A using the automatically-chosen image settings shown in table 120 of FIG. 2B.
[0068] Block 430 may be performed with a variety of modules and components of an imaging device. In some embodiments, a scene is captured in block 430 using image sensor control module 335, processor 320 and imaging sensor 315 of device 300. In some embodiments, a scene is captured in block 430 using setting module 350, image sensor control module 335, processor 320 and imaging sensor 315 of device 300. In some embodiments, a scene is captured in block 430 using mode module 345, image sensor control module 335, processor 320 and imaging sensor 315 of device 300. In some embodiments, a scene is captured in block 430 using setting module 350, mode module 345, image sensor control module 335, processor 320 and imaging sensor 315 of device 300.
[0069] As an example of block 430 performed in manual mode by device 300, mode module 345 may provide instructions that configure the processor 320 to capture an image in manual mode, setting module 350 may provide instructions that configure the processor 320 to use the image settings shown in table 110 of FIG. 1B when capturing the image, and image sensor control module 335 may provide instructions that configure the processor 320 to capture the image using the imaging sensor 315. As an example of block 430 performed in automatic mode by device 300, mode module 345 may provide instructions that configure the processor 320 to capture an image in automatic mode, scene characteristics recognition module 340 may identify the scene characteristics of the scene to be captured, auto-setting relationship module 357 may provide instructions that configure the processor 320 to build and/or use an auto-setting relationship to apply automatically-chosen image settings based on the identified scene characteristics, setting module 350 may provide instructions that configure the processor 320 to use the image settings shown in table 120 of FIG. 2B when capturing the image, and image sensor control module 335 may provide instructions that configure the processor 320 to capture the image using the imaging sensor 315.
[0070] The process 400 may further include block 440 wherein scene characteristics are identified. In some embodiments, block 440 may include identifying subject attributes, environmental attributes, and/or other scene characteristics. For example, block 440 may include identifying scene characteristics 105 of the scenes 100 of FIG. 1A or FIG. 2A. For example, block 440 may include identifying scene characteristics 105 of the scene 100 in FIG. 2A as a facial likeness of a baby, who is moving quickly, has a close up scale, and is in an indoor lighting environment.
[0071] Block 440 may be performed with a variety of modules and components of an imaging device. In some embodiments, block 440 may include identifying scene characteristics using the scene characteristics recognition module 340 of device 300. In some embodiments, block 440 may include identifying and providing the identified scene characteristics to other modules of the device 300. In some embodiments, block 440 may include identifying scene characteristics using the scene characteristics recognition module 340 and the image sensor control module 335 of device 300. In some embodiments, block 440 may include identifying scene characteristics using the scene characteristics recognition module 340, the image sensor control module 335 and the imaging sensor 315 of device 300.
[0072] As an example of block 440 performed by or with the device 300, the image sensor control module 335 may provide instructions that configure the processor 320 to view the scene using the imaging sensor 315, the imaging sensor 315 may then view and/or detect one or more scene characteristics in the scene, and the scene characteristics recognition module 340 may configure the processor 320 to identify the viewed and/or detected scene characteristics. Applying this example using the scene 100 in FIG. 2A, the image sensor control module 335 may provide instructions that configure the processor 320 to view the room and objects in it using the imaging sensor 315, the imaging sensor 315 may then detect that there are three objects in an indoor environment in the scene 100, and the scene characteristics recognition module 340 may configure the processor 320 to identify the objects as a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting.
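For illustration only, the scene characteristics identified in block 440 can be pictured as a small set of labels. The sketch below encodes the FIG. 2A example (a baby's facial likeness, fast movement, close-up scale, indoor lighting) under the assumption that characteristics are reduced to strings; the label format is not taken from the patent.

# A minimal sketch, assuming identified scene characteristics are string labels.
identified_scene_characteristics = frozenset({
    "subject:baby_face",   # facial likeness of a baby
    "movement:fast",       # subject moving quickly
    "scale:close_up",      # close-up framing
    "lighting:indoor",     # relatively darker indoor ambient lighting
})

if __name__ == "__main__":
    print(sorted(identified_scene_characteristics))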
[0073] The process 400 may further include block 450 wherein scene characteristics are associated with the image settings used to capture the image of the scene. In some embodiments, block 450 may include associating subject attributes, environmental attributes, and/or other scene characteristics with particular values for auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings. In some embodiments, block 450 may include associating the scene characteristics of the scene with manually-chosen image settings. For example, block 450 may include associating the scene characteristics 105 of the scene 100 of FIG. 1A with the manually-chosen image settings shown in table 110 of FIG. 1B. Thus, for example, the scene characteristics of a close up, quickly-moving baby in an indoor lighting environment may be associated with a fast shutter speed, no flash, and cold temperature. In some embodiments, block 450 may include associating the scene characteristics of the scene with automatically-chosen image settings. For example, block 450 may include associating the scene characteristics 105 of the scene 100 of FIG. 2A with the automatically-chosen image settings given in table 120 of FIG. 2B. These are merely examples, and many other types and numbers of scene characteristics may be associated with many other types and numbers of image settings. Further details of associating scene characteristics with image settings are discussed herein, for example with respect to FIGS. 5A-7.
[0074] Block 450 may be performed with a variety of modules and components of an imaging device. In some embodiments, block 450 may include associating scene characteristics using association module 355. In some embodiments, block 450 may include associating scene characteristics using one or more modules and/or components of an imaging device. In some embodiments, block 450 may include associating scene characteristics using the image sensor control module 335, the scene characteristics recognition module 340, the mode module 345, the setting module 350, the association module 355, the operating system 360, the processor 320, the imaging sensor 315, the storage 310, and/or the working memory 305 of device 300.
[0075] As an example of block 450 performed by or with the device 300, the scene characteristics recognition module 340 may provide instructions that configure the processor 320 to provide identified scene characteristics to the association module 355, the setting module 350 may provide instructions that configure the processor 320 to provide image settings used with those identified scene characteristics to the association module 355, and the association module 355 may provide instructions that configure the processor 320 to generate one or more associations between the provided scene characteristics and the provided image settings. Applying this example using the scene 100 in FIG. 2A, the scene characteristics recognition module 340 may provide instructions that configure the processor 320 to provide to the association module 355 the identified scene characteristics 105 of a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting, the setting module 350 may provide instructions that configure the processor 320 to provide to the association module 355 the image settings shown in table 120 of FIG. 2B, and the association module 355 may provide instructions that configure the processor 320 to generate one or more associations between (A) a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting, and (B) a fast shutter speed, no flash, and a cold temperature.
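One hypothetical way block 450 could be realized, assuming the label and mapping representations sketched above, is to count how often each setting value is used while a given characteristic is present. The function name and data structure below are assumptions for illustration only.

from collections import defaultdict

# Hypothetical sketch of block 450: accumulate co-occurrence counts between each
# identified scene characteristic and each (setting, value) pair used for capture.
cooccurrence_counts = defaultdict(lambda: defaultdict(int))

def associate(scene_characteristics, image_settings):
    """Record one association between a captured scene and the settings used."""
    for characteristic in scene_characteristics:
        for setting_name, value in image_settings.items():
            cooccurrence_counts[characteristic][(setting_name, value)] += 1

if __name__ == "__main__":
    associate(
        {"subject:baby_face", "movement:fast", "scale:close_up", "lighting:indoor"},
        {"shutter_speed": "fast", "flash": "off", "temperature": "cold"},
    )
    print(dict(cooccurrence_counts["lighting:indoor"]))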
[0076] The process 400 may further include block 460 wherein an auto-setting relationship is determined or adjusted. In some embodiments, block 460 may include determining an auto-setting relationship that is used in automatic mode to automatically provide image settings based on one or more identified scene characteristics. In some embodiments, block 460 may include determining an auto-setting relationship based on the one or more associations formed in block 450. In some embodiments, block 460 may include determining an auto-setting relationship based on one or more associations formed between (A) subject attributes, environmental attributes, and/or other scene characteristics and (B) particular values for auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings.
[0077] In some embodiments, block 460 may include determining an auto-setting relationship based on one or more associations of the scene characteristics with manually-chosen image settings. For example, block 460 may include determining an auto-setting relationship based on one or more associations of the scene characteristics 105 of the scene 100 of FIG. 1A with the manually-chosen image settings shown in table 110 of FIG. 1B. Thus, for example, an auto-setting relationship may be determined that provides the image settings of fast shutter speed, no flash, and cold temperature when the scene characteristics of a close up, quickly-moving baby in an indoor lighting environment are identified. These are merely examples, and many other types and numbers of image settings may be provided based on many other types and numbers of scene characteristics. Further details of determining an auto-setting relationship are discussed herein, for example with respect to FIGS. 5A-7.
[0078] In some embodiments, block 460 may include adjusting an existing auto-setting relationship. In some embodiments, block 460 may include adjusting an auto-setting relationship that is used in automatic mode to automatically provide image settings based on one or more identified scene characteristics. In some embodiments, block 460 may include adjusting an existing auto-setting relationship based on the one or more associations formed in block 450. In some embodiments, block 460 may include adjusting an existing auto-setting relationship based on one or more associations formed between (A) subject attributes, environmental attributes, and/or other scene characteristics and (B) particular values for auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, temperature, and/or other image settings.
[0079] In some embodiments, block 460 may include adjusting an auto-setting relationship based on one or more associations of the scene characteristics with manually-chosen image settings. For example, an auto-setting relationship may exist in the device 300 that will provide the image settings of a flash turned on and slow shutter speed when the scene characteristic of an indoor lighting environment is encountered. Next, the device 300 may be used to capture an image of the scene 100 in FIG. 1A using the manually-chosen image settings shown in table 110 of FIG. 1B, e.g. a flash turned off and a fast shutter speed. This may adjust the auto-setting relationship such that the image settings shown in table 120 of FIG. 2B will now be applied to subsequent scenes 100 that include an indoor lighting environment with a close up of a quickly moving baby. This is merely an example, and many other auto-setting relationships may be adjusted in many other manners. Further details of adjusting an auto-setting relationship are discussed herein, for example with respect to FIGS. 5A-7.
[0080] Block 460 may be performed with a variety of modules and components of an imaging device. In some embodiments, block 460 may include determining or adjusting an auto-setting relationship using auto-setting relationship module 357. In some embodiments, block 460 may include determining or adjusting an auto-setting relationship using one or more modules and/or components of an imaging device. In some embodiments, block 460 may include determining or adjusting an auto-setting relationship using the image sensor control module 335, the scene characteristics recognition module 340, the mode module 345, the setting module 350, the association module 355, the operating system 360, the processor 320, the imaging sensor 315, the storage 310, and/or the working memory 305 of device 300.
[0081] As an example of block 460 performed by or with the device 300 to determine an auto-setting relationship, the association module 355 may provide instructions that configure the processor 320 to provide one or more associations to the auto-setting relationship module 357, and the auto-setting relationship module 357 may provide instructions that configure the processor 320 to generate an auto-setting relationship to be used with subsequent images and to store this auto-setting relationship in storage 310. Applying this example using the scene 100 in FIG. 1A, the association module 355 may provide instructions that configure the processor 320 to provide to the auto-setting relationship module 357 the association of (A) a baby that is close up and moving quickly, a close up and stationary crib and window, and an environment that has relatively darker indoor ambient lighting, with (B) a fast shutter speed, no flash, and a cold temperature, and the auto-setting relationship module 357 may provide instructions that configure the processor 320 to generate and store in storage 310 an auto-setting relationship, such that when the scene 100 of FIG. 2A is subsequently encountered in automatic mode, this auto-setting relationship in storage 310 will be accessed and used and thus the settings shown in table 120 of FIG. 2B will be automatically applied to the subsequently captured image.
[0082] The process 400 may further include block 470 wherein one or more customized automatic image settings are determined. In some embodiments, customized automatic image settings may be determined in block 470 using the auto-setting relationship determined or adjusted in block 460. In some embodiments, customized automatic image settings may be determined in block 470 when subsequent scene characteristics in a subsequent scene are identified. For example, the scene characteristics 105 of the scene 100 shown in FIG. 2A may be identified, and an auto-setting relationship may be used to determine the customized automatic image settings to apply to the captured image of that scene 100. In this example, the auto-setting relationship may be based on associations between the scene characteristics 105 of FIG. 1A and the image settings shown in table 110 of FIG. 1B. For instance, if the scene characteristics 105 of the scene 100 shown in FIG. 2A are identified, then an auto-setting relationship based on associations between the scene characteristics 105 of FIG. 1A and the image settings shown in table 110 of FIG. 1B may be used such that the image is taken with the customized image settings that are the same as those shown in table 120 of FIG. 2B, i.e. a fast shutter speed, the flash off, and the temperature set to cold. Further details of determining customized image settings are discussed herein, for example with respect to FIGS. 5A-7.
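Blocks 460 and 470 can be pictured together as a small learning structure: each association updates it, and it later suggests settings for newly identified characteristics. The sketch below is one assumed realization using per-characteristic counts and a simple vote; the probability-based formulation is discussed with respect to FIG. 7.

from collections import defaultdict

class AutoSettingRelationship:
    """Hypothetical sketch of blocks 460 and 470, not the patent's implementation."""

    def __init__(self):
        # characteristic -> (setting name, value) -> count of joint occurrences
        self._counts = defaultdict(lambda: defaultdict(int))

    def update(self, scene_characteristics, image_settings):
        """Determine or adjust the relationship from one new association (block 460)."""
        for characteristic in scene_characteristics:
            for setting_name, value in image_settings.items():
                self._counts[characteristic][(setting_name, value)] += 1

    def suggest(self, scene_characteristics):
        """Derive customized automatic settings for a subsequent scene (block 470)."""
        votes = defaultdict(lambda: defaultdict(int))
        for characteristic in scene_characteristics:
            for (setting_name, value), count in self._counts[characteristic].items():
                votes[setting_name][value] += count
        # For each setting, pick the value with the most accumulated support.
        return {name: max(values, key=values.get) for name, values in votes.items()}

if __name__ == "__main__":
    relationship = AutoSettingRelationship()
    relationship.update(
        {"subject:baby_face", "movement:fast", "lighting:indoor"},
        {"shutter_speed": "fast", "flash": "off", "temperature": "cold"},
    )
    print(relationship.suggest({"movement:fast", "lighting:indoor"}))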
[0083] FIG. 5A shows an embodiment of a process 500 for determining or adjusting an auto-setting relationship based on manually selected image settings for use in determining customized automatic image settings to be used with subsequently captured images. The process 500 may be used in the overview process 400 shown in FIG. 4.
[0084] The process 500 may include block 505 wherein manual mode is selected. In some embodiments, block 505 includes a user of an image capture device selecting the manual mode. For instance, a user of the device 300 may manually select manual mode using a touch-screen display 325 that provides a menu option for selecting manual mode. Further, for example, the mode module 345 may provide instructions that configure the processor 320 to respond to user input selecting manual mode by entering device 300 into manual mode. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to selecting the manual mode, including those discussed with respect to block 410 of process 400 shown in FIG. 4 that relate to selecting the manual mode, may apply to block 505 of process 500.
[0085] The process 500 may further include block 510 wherein image settings are manually selected. In some embodiments, block 510 includes a user of an image capture device manually selecting image settings. In some embodiments, block 510 may be performed by or with the device 300 shown in FIG. 3. For instance, a user of the device 300 may manually select image settings using a touch-screen display 325 that provides a menu option for manually selecting image settings. Further, for example, the setting module 350 may provide instructions that configure the processor 320 to respond to manually-selected image settings by applying those settings to an image captured with the imaging sensor 315. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to manually selecting image settings, including those discussed with respect to block 420 of process 400 shown in FIG. 4 that relate to manually selecting image settings, may apply to block 510 of process 500.
[0086] The process 500 may further include block 515 wherein an image of a scene is captured. In some embodiments, block 515 includes capturing an image of a scene using the image settings manually selected in block 510. In some embodiments, block 515 may be performed by or with the device 300 shown in FIG. 3. For instance, a user of the device 300 may capture an image of the scene using a touch-screen display 325 that provides a menu option for capturing an image with the imaging sensor 315. Further, for example, setting module 350 may provide instructions that configure the processor 320 to capture an image with the imaging sensor 315 using the manually-selected image settings. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to capturing an image, including those discussed with respect to block 430 of process 400 shown in FIG. 4 that relate to capturing an image, may apply to block 515 of process 500.

[0087] The process 500 may further include block 520 wherein scene characteristics of a scene are identified. In some embodiments, block 520 includes identifying the scene characteristics of the scene captured in block 515. In some embodiments, block 520 may be performed by or with the device 300 shown in FIG. 3. For instance, the scene characteristics recognition module 340 of device 300 may provide instructions that configure the processor 320 to identify the scene characteristics of a scene viewed by an imaging sensor 315. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to identifying scene characteristics, including those discussed with respect to block 440 of process 400 shown in FIG. 4, may apply to block 520 of process 500.
[0088] The process 500 may further include block 525 wherein scene characteristics are associated with manually-selected image settings. In some embodiments, block 525 includes associating scene characteristics identified in block 520 with the manually-selected image settings selected in block 510. In some embodiments, block 525 may be performed by or with the device 300 shown in FIG. 3. For instance, the association module 355 of the device 300 may provide instructions that configure the processor 320 to associate the scene characteristics with the manually-selected image settings. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to associating scene characteristics with manually-selected image settings, including those discussed with respect to block 450 of process 400 shown in FIG. 4, may apply to block 525 of process 500.
[0089] The process 500 may further include decision block 530 wherein it is determined whether an auto-setting relationship is defined. In some embodiments, block 530 includes using various modules and/or components to determine whether an auto-setting relationship is defined. In some embodiments, block 530 may be performed by or with the device 300 shown in FIG. 3. For instance, the association module 355 of device 300 may provide instructions that configure the processor 320 to determine whether an auto-setting relationship has already been determined. As a result, for example, the processor 320 may search the storage 310 or working memory 305 to determine whether an auto-setting relationship already exists. These are just some examples of how block 530 may be performed by or with the device 300, and other modules and/or components of the device 300 may be used to perform block 530.

[0090] If it is determined in decision block 530 that an auto-setting relationship is not defined, then the process 500 may continue to block 535, wherein an auto-setting relationship is determined. In some embodiments, block 535 includes determining an auto-setting relationship based on the scene characteristics identified in block 520 and the image settings manually chosen in block 510. In some embodiments, block 535 may be performed by or with the device 300 shown in FIG. 3. For instance, the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to determine an auto-setting relationship. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to determining an auto-setting relationship, including those discussed with respect to block 460 of process 400 shown in FIG. 4, may apply to block 535 of process 500. After block 535, the process 500 may continue to block 545.
[0091] If it is determined in decision block 530 that an auto-setting relationship is defined, then the process 500 may continue to block 540, wherein the auto-setting relationship is adjusted. In some embodiments, block 540 includes adjusting the auto-setting relationship based on the scene characteristics identified in block 520 and the image settings manually chosen in block 510. In some embodiments, block 540 may be performed by or with the device 300 shown in FIG. 3. For instance, the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to adjust the auto-setting relationship. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to adjusting an auto-setting relationship, including those discussed with respect to block 460 of process 400 shown in FIG. 4, may apply to block 540 of process 500. After block 540, the process 500 may continue to block 545.
[0092] The process 500 may further include block 545 wherein customized automatic image settings are determined. In some embodiments, block 545 includes determining customized automatic image settings based on the auto-setting relationship determined in block 535. In some embodiments, block 545 includes determining customized automatic image settings based on the auto-setting relationship adjusted in block 540. In some embodiments, block 545 includes determining customized automatic image settings based on applying an auto-setting relationship to scene characteristics identified in subsequent scenes 100 captured in subsequent images. In some embodiments, block 545 may be performed by or with the device 300 shown in FIG. 3. For instance, the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to apply the auto-setting relationship to subsequently-identified scene characteristics to generate image settings to be automatically applied to images subsequently-captured with the imaging sensor 315. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to determining customized automatic image settings, including those discussed with respect to block 470 of process 400 shown in FIG. 4, may apply to block 545 of process 500.
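Taken end to end, process 500 is a learning pass performed while the user shoots in manual mode. The sketch below strings blocks 510-540 together; every helper is an assumed stand-in for device behavior described above, not a real camera API.

# Hypothetical sketch of process 500 as a single pass in manual mode.

def identify_scene_characteristics(frame):
    # Stand-in for block 520; a real device would analyze the viewed scene.
    return frame["characteristics"]

def capture_image(frame, settings):
    # Stand-in for block 515; records what was captured and with which settings.
    return {"pixels": frame["pixels"], "settings": dict(settings)}

def manual_mode_pass(frame, manual_settings, relationship):
    """Blocks 510-540: capture with manual settings, then learn from that choice."""
    image = capture_image(frame, manual_settings)             # block 515
    characteristics = identify_scene_characteristics(frame)   # block 520
    relationship.setdefault("associations", []).append(       # blocks 525-540
        (frozenset(characteristics), dict(manual_settings))
    )
    return image

if __name__ == "__main__":
    relationship = {}
    frame = {"pixels": b"...", "characteristics": {"movement:fast", "lighting:indoor"}}
    manual_mode_pass(frame, {"shutter_speed": "fast", "flash": "off"}, relationship)
    print(relationship["associations"])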
[0093] FIG. 5B shows an embodiment of a process 550 for determining customized automatic image settings. The process 550 may be used in the process 400 shown in FIG. 4. For example, process 550 may be used in block 470 of process 400. In some embodiments, process 550 may be used after the process 500 shown in FIG. 5A is performed. For instance, process 500 may be performed to determine or adjust an auto-setting relationship. Then, process 550 may be performed to apply the auto-setting relationship to subsequent images in the automatic mode to determine customized automatic image settings. In some embodiments, process 500 may be performed multiple times before process 550 is performed. In some embodiments, process 500 may be performed to determine an auto-setting relationship, then process 550 may be performed to determine customized automatic image settings, and then process 500 may be performed again to adjust the auto-setting relationship.
[0094] The process 550 may include block 560 wherein automatic mode is selected. In some embodiments, block 560 includes a user of an image capture device selecting the automatic mode. For instance, a user of the device 300 may select automatic mode using a touch-screen display 325 that provides a menu option for selecting automatic mode. Further, for example, the mode module 345 may provide instructions that configure the processor 320 to respond to user input selecting automatic mode by entering device 300 into automatic mode. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to selecting the automatic mode, including those discussed with respect to block 410 of process 400 shown in FIG. 4 that relate to selecting the automatic mode, may apply to block 560 of process 550.
[0095] The process 550 may further include block 565 wherein scene characteristics of a scene are identified. In some embodiments, block 565 includes identifying the scene characteristics of the scene viewed by an imaging sensor. In some embodiments, block 565 may be performed by or with the device 300 shown in FIG. 3. For instance, the scene characteristics recognition module 340 of device 300 may provide instructions that configure the processor 320 to identify the scene characteristics of a scene viewed by an imaging sensor 315. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to identifying scene characteristics, including those discussed with respect to block 520 of process 500 shown in FIG. 5A and block 440 of process 400 shown in FIG. 4, may apply to block 565 of process 550.
[0096] The process 550 may further include block 570 wherein customized automatic image settings are determined. In some embodiments, block 570 includes determining customized automatic image settings based on an auto-setting relationship. In some embodiments, block 570 includes determining customized automatic image settings based on applying an auto-setting relationship to scene characteristics identified in a scene. In some embodiments, block 570 includes determining customized automatic image settings based on applying an auto-setting relationship to scene characteristics identified in block 565. In some embodiments, block 570 may be performed by or with the device 300 shown in FIG. 3. For instance, the auto-setting relationship module 357 of device 300 may provide instructions that configure the processor 320 to apply the auto-setting relationship to the scene characteristics to generate image settings to be automatically applied to an image of the scene captured with the imaging sensor 315. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to determining customized automatic image settings, including those discussed with respect to block 470 of process 400 shown in FIG. 4 and block 545 of process 500 shown in FIG. 5A, may apply to block 570 of process 550.
[0097] The process 550 may further include block 575 wherein an image of a scene is captured. In some embodiments, block 575 includes capturing an image of a scene using the customized automatic image settings determined in block 570. In some embodiments, block 575 may be performed by or with the device 300 shown in FIG. 3. For instance, the image sensor control module 335 and/or setting module 350 of device 300 may provide instructions that configure the processor 320 to apply the customized automatic image settings determined in block 570 to an image of the scene captured with the imaging sensor 315. Further, any of the features, capabilities, methods, steps, modules and/or components discussed above with respect to capturing an image using customized automatic image settings may apply to block 575 of process 550.
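Process 550 is the complementary pass in automatic mode: identify the scene characteristics, consult the stored relationship, then capture. The sketch below assumes the list-of-associations representation used in the earlier sketches and an overlap-weighted vote as the lookup rule; both are illustrative assumptions.

from collections import defaultdict

# Hypothetical sketch of process 550 (blocks 560-575).

def customized_settings(scene_characteristics, associations):
    """Blocks 565-570: derive settings for the identified scene characteristics."""
    votes = defaultdict(lambda: defaultdict(int))
    for stored_characteristics, settings in associations:
        overlap = len(stored_characteristics & scene_characteristics)
        if overlap == 0:
            continue  # this past association shares nothing with the current scene
        for name, value in settings.items():
            votes[name][value] += overlap
    return {name: max(values, key=values.get) for name, values in votes.items()}

if __name__ == "__main__":
    associations = [
        (frozenset({"movement:fast", "lighting:indoor"}),
         {"shutter_speed": "fast", "flash": "off", "temperature": "cold"}),
    ]
    scene = frozenset({"subject:baby_face", "movement:fast", "lighting:indoor"})
    print(customized_settings(scene, associations))  # then capture with these (block 575)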
[0098] FIG. 6 shows an embodiment of a process for applying custom automatic image settings to an image of a scene based on an auto-setting relationship that takes into account preferred manual settings for the same scene.
[0099] The process 600 may include block 605 wherein manual mode is selected. Block 605 may include similar features and functionalities as discussed above with respect to block 505 of process 500 shown in FIG. 5A.
[0100] The process 600 may further include block 610 wherein scene characteristics of a scene are identified. Block 610 may include similar features and functionalities as discussed above with respect to block 520 of process 500 shown in FIG. 5A.
[0101] The process 600 may further include block 615 wherein image settings are manually selected. Block 615 may include similar features and functionalities as discussed above with respect to block 510 of process 500 shown in FIG. 5A.
[0102] The process 600 may further include block 620 wherein scene characteristics are associated with manually-selected image settings. In some embodiments, block 620 includes associating the scene characteristics identified in block 610 with the image settings manually selected in block 615. Block 620 may further include similar features and functionalities as discussed above with respect to block 525 of process 500 shown in FIG. 5A.
[0103] The process 600 may further include decision block 625 wherein it is determined whether an auto-setting relationship is defined. Decision block 625 may further include similar features and functionalities as discussed above with respect to decision block 530 of process 500 shown in FIG. 5A. If it is determined in decision block 625 that an auto-setting relationship is not defined, then the process 600 may continue to block 630, wherein an auto-setting relationship is determined. If it is determined in decision block 625 that an auto-setting relationship is defined, then the process 600 may continue to block 635, wherein an existing auto-setting relationship is adjusted.
[0104] The process 600 may further include block 630 wherein an auto-setting relationship is determined. Block 630 may include similar features and functionalities as discussed above with respect to block 535 of process 500 shown in FIG. 5A. After block 630, the process 600 may continue to block 640.

[0105] The process 600 may further include block 635 wherein an auto-setting relationship is adjusted. Block 635 may include similar features and functionalities as discussed above with respect to block 540 of process 500 shown in FIG. 5A. After block 635, the process 600 may continue to block 640.
[0106] The process 600 may further include block 640 wherein customized automatic image settings are determined. In some embodiments, block 640 includes determining customized automatic image settings based on the auto-setting relationship determined in block 630. In some embodiments, block 640 includes determining customized automatic image settings based on the auto-setting relationship adjusted in block 635. Block 640 may further include similar features and functionalities as discussed above with respect to block 545 of process 500 shown in FIG. 5A.
[0107] The process 600 may further include block 645 wherein an image of the scene is captured. In some embodiments, block 645 includes capturing an image of the scene using the customized automatic image settings determined in block 640. In some embodiments, therefore, the scene is captured in block 645 using an auto-setting relationship that is determined or adjusted based on the association determined in block 620, which was determined based on manually-selected image settings for that same scene. In this manner, a user may therefore use an auto-setting relationship that takes into account the user's preferences with respect to a scene to capture an image of that same scene. In other words, custom automatic image settings may be determined for a particular scene using an auto-setting relationship that is adjusted or determined on the fly. Thus, instead of merely using the manual image settings selected in block 615 to capture an image of the scene for which those manual settings were selected, an auto-setting relationship is used that takes into account those manual image settings selected in block 615 to capture an image of that same scene.
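What distinguishes process 600 in code is that the same scene first adjusts the relationship and then consumes it before the picture is taken. A minimal sketch, assuming the representations used in the earlier sketches:

from collections import defaultdict

# Hypothetical sketch of process 600: the manual choice for this scene adjusts the
# auto-setting relationship on the fly, and the adjusted relationship then supplies
# the settings actually applied to the capture. Helper names are assumptions.

def derive_settings(scene, associations):
    votes = defaultdict(lambda: defaultdict(int))
    for stored, settings in associations:
        if stored & scene:
            for name, value in settings.items():
                votes[name][value] += 1
    return {name: max(vals, key=vals.get) for name, vals in votes.items()}

def on_the_fly_capture(scene, manual_settings, associations):
    associations.append((frozenset(scene), dict(manual_settings)))  # blocks 620-635
    settings = derive_settings(scene, associations)                 # block 640
    return {"captured_with": settings}                              # block 645

if __name__ == "__main__":
    associations = []
    scene = {"movement:fast", "lighting:indoor"}
    print(on_the_fly_capture(scene, {"shutter_speed": "fast", "flash": "off"}, associations))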
[0108] FIG. 7 is a table showing embodiments of manual settings and scene characteristics that may be used to determine an embodiment of an auto-setting relationship. This auto-setting relationship may in turn be used to determine custom automatic image settings. The various associations and auto-setting relationships that will be discussed with respect to the manual settings and scene characteristics shown in FIG. 7 may be used in the various devices, systems and methods discussed above with respect to FIGS. 1A-6.

[0109] As shown in FIG. 7, the table 700 may include column 710 with various manual settings that may be applied in manual mode. The settings may be identified generically as M1, M2, M3...Mm. There are therefore "m" number of possible manual settings "M" shown. The table 700 may further include column 720 with various scene characteristics that may be in a scene. The scene characteristics are identified generically as S1, S2, S3...Sn. There are therefore "n" number of possible scene characteristics "S" shown. It is further understood that "m" does not necessarily equal "n." There may be more manual settings than scene characteristics or vice versa, or there may be the same number of manual settings as scene characteristics. For instance, there may be ten manual settings, in which case m=10, and one hundred scene characteristics, in which case n=100. This is merely an example. Thus, the variables "m" and "n" are used as generic subscripts that may take the value of any positive integer and may or may not be the same number.
[0110] In some embodiments, an auto-setting relationship may be determined based on associations formed between the manual settings and scene characteristics shown in table 700. In some embodiments, all Si for 1 ≤ i ≤ n are independent, and similarly all Mj for 1 ≤ j ≤ m are independent. Thus, in some embodiments, a conditional probability that a particular manual setting will be chosen, given the "n" possible choices of scene characteristics "S," may be determined as follows: P(M|S) ∝ P(S|M) * P(M). In this example, P(M|S) represents the conditional probability that a particular manual setting "M" will be chosen given all possible choices of scene characteristics "S" present in a given scene, P(S|M) represents the conditional probability that a particular scene characteristic "S" is present in the scene given the selected manual settings "M," and P(M) represents the probability that a particular manual setting "M" may be chosen out of all possible manual settings for a device 300. Under the independence assumption above, P(S|M) may be computed as the product of the individual conditional probabilities P(Si|M) for the characteristics present in the scene. Therefore, in this manner, customized automatic image settings may be determined by computing the highest probability for P(M|S) by multiplying P(S|M) times P(M).
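A compact numerical sketch of this computation follows. Each past manual capture is treated as an observation of a settings combination M together with scene characteristics S; P(M) and P(Si|M) are estimated from counts, P(S|M) is taken as the product of the P(Si|M) under the independence assumption, and the settings with the highest P(S|M) * P(M) score are selected. The history data, the label encoding, and the add-one smoothing are assumptions added only to make the sketch runnable.

from collections import Counter, defaultdict

# Illustrative sketch of P(M|S) ∝ P(S|M) * P(M) with P(S|M) = product of P(Si|M).

history = [  # (manual settings combination M, scene characteristics S) per capture
    (("shutter_speed=fast", "flash=off"), {"movement:fast", "lighting:indoor"}),
    (("shutter_speed=fast", "flash=off"), {"movement:fast", "scale:close_up"}),
    (("shutter_speed=slow", "flash=on"),  {"lighting:indoor", "movement:none"}),
]

settings_count = Counter(m for m, _ in history)   # counts used to estimate P(M)
char_given_settings = defaultdict(Counter)        # counts used to estimate P(Si|M)
for m, characteristics in history:
    char_given_settings[m].update(characteristics)

def score(m, scene_characteristics):
    p_m = settings_count[m] / len(history)
    p_s_given_m = 1.0
    for s in scene_characteristics:
        # Add-one smoothing keeps unseen (characteristic, settings) pairs nonzero.
        p_s_given_m *= (char_given_settings[m][s] + 1) / (settings_count[m] + 2)
    return p_s_given_m * p_m

if __name__ == "__main__":
    scene = {"movement:fast", "lighting:indoor"}
    best = max(settings_count, key=lambda m: score(m, scene))
    print(best, round(score(best, scene), 4))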
[0111] In some embodiments, the auto-setting relationship may be adjusted based on future or subsequent associations. In some embodiments, the conditional probability P(M|S) may change as more and more associations between manual settings and scene characteristics are determined. For example, M1 and M2 may be associated with S1, while M2 may also be associated with S2. In some embodiments, the auto-setting relationship may determine that if S2 is encountered, then M1 should be applied to the captured image. The reason is that there is a link between S2 and M1, given the association of S1 with both M2 and M1. In other words, S2 is associated with M2, which is associated with S1, which is associated with M1. In this manner, M1 may be applied when S2 is identified.
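The transitive link described here can be read as a short walk on a bipartite graph of settings and characteristics. The sketch below is one assumed way of following such links, using the node names from the example in the text (S2 leads to M2, M2 leads to S1, and S1 leads to M1).

# Hypothetical sketch of the transitive link S2 -> M2 -> S1 -> M1.
associations = [("M1", "S1"), ("M2", "S1"), ("M2", "S2")]

def settings_for(characteristic):
    return {m for m, s in associations if s == characteristic}

def characteristics_for(setting):
    return {s for m, s in associations if m == setting}

def linked_settings(characteristic):
    """Settings reachable from a characteristic within two association hops."""
    direct = settings_for(characteristic)        # S2 -> {M2}
    reachable = set(direct)
    for m in direct:
        for s in characteristics_for(m):         # M2 -> {S1, S2}
            reachable |= settings_for(s)         # S1 -> {M1, M2}
    return reachable

if __name__ == "__main__":
    print(sorted(linked_settings("S2")))         # ['M1', 'M2']: M1 becomes a candidate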
[0112] In some embodiments, the associations between "S" and "M" may be selectively used to determine the auto-setting relationship. For instance, there may be associations between M1 and S1, M1 and S2, M2 and S1, and M3 and S1. In some embodiments, all of these associations may be used to determine the auto-setting relationship. In other embodiments, some of these associations may be selectively ignored to determine the auto-setting relationship. Thus, in some embodiments, only the associations between M1 and S1, M1 and S2, and M2 and S1 may be used to determine the auto-setting relationship, while the association between M3 and S1 may not be included in determining the auto-setting relationship.
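One assumed way to realize this selectivity is to pass the recorded associations through a filter before they contribute to the auto-setting relationship; the exclusion rule shown below (ignore anything involving M3) is purely illustrative.

# Hypothetical sketch of selectively using associations when building the
# auto-setting relationship. The filter rule is an arbitrary illustration.
associations = [("M1", "S1"), ("M1", "S2"), ("M2", "S1"), ("M3", "S1")]

def keep(association):
    setting, _characteristic = association
    return setting != "M3"   # e.g. exclude associations involving M3

selected_associations = [a for a in associations if keep(a)]

if __name__ == "__main__":
    print(selected_associations)   # the M3/S1 association is not included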
[0113] The logical blocks, modules and flow chart sequences are illustrative only. A person of skill in the art will understand that the steps, decisions, and processes embodied in the flowcharts described herein may be performed in an order other than that described herein. Thus, the particular flowcharts and descriptions are not intended to limit the associated processes to being performed in the specific order described.
[0114] Those of skill in the art will recognize that the various illustrative logical blocks, modules, and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, software stored on a computer readable medium and executable by a processor, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
[0115] The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
[0116] A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
[0117] While the above detailed description has shown, described, and pointed out novel features of the invention as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made by those skilled in the art without departing from the spirit of the invention. As will be recognized, the present invention may be embodied within a form that does not provide all of the features and benefits set forth herein, as some features may be used or practiced separately from others. The scope of the invention is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
[0118] A person skilled in the art will recognize that each of these sub-systems may be inter-connected and controllably connected using a variety of techniques and hardware and that the present disclosure is not limited to any specific method of connection or connection hardware.
[0119] The technology is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well- known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, a microcontroller or microcontroller based system, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
[0120] As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions may be implemented in software, firmware or hardware and include any type of programmed step undertaken by components of the system.
[0121] A microprocessor may be any conventional general purpose single- or multi-chip microprocessor such as a Pentium® processor, a Pentium® Pro processor, an 8051 processor, a MIPS® processor, a Power PC® processor, an Alpha® processor, or a dual-core or quad-core processor. In addition, the microprocessor may be any conventional special purpose microprocessor such as a digital signal processor or a graphics processor. The microprocessor typically has conventional address lines, conventional data lines, and one or more conventional control lines.
[0122] The system may be used in connection with various operating systems such as Linux®, UNIX® or Microsoft Windows®. The system control may be written in any conventional programming language such as C, C++, BASIC, Pascal, .NET (e.g., C#), or Java, and run under a conventional operating system. C, C++, BASIC, Pascal, Java, and FORTRAN are industry standard programming languages for which many commercial compilers may be used to create executable code. The system control may also be written using interpreted languages such as Perl, Python or Ruby. Other languages may also be used such as PHP, JavaScript, and the like.
[0123] The foregoing description details certain embodiments of the systems, devices, and methods disclosed herein. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems, devices, and methods may be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the technology with which that terminology is associated.
[0124] It will be appreciated by those skilled in the art that various modifications and changes may be made without departing from the scope of the described technology. Such modifications and changes are intended to fall within the scope of the embodiments. It will also be appreciated by those of skill in the art that parts included in one embodiment are interchangeable with other embodiments; one or more parts from a depicted embodiment may be included with other depicted embodiments in any combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged or excluded from other embodiments.
[0125] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art may translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
[0126] It will be understood by those within the art that, in general, terms used herein are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to embodiments containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); the same holds true for the use of definite articles used to introduce claim recitations.
[0127] In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
[0128] It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" will be understood to include the possibilities of "A" or "B" or "A and B."
[0129] All references cited herein are incorporated herein by reference in their entirety. To the extent publications and patents or patent applications incorporated by reference contradict the disclosure contained in the specification, the specification is intended to supersede and/or take precedence over any such contradictory material.
[0130] The term "comprising" as used herein is synonymous with "including," "containing," or "characterized by," and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps.
[0131] All numbers expressing quantities used in the specification and claims are to be understood as being modified in all instances by the term "about." Accordingly, unless indicated to the contrary, the numerical parameters set forth in the specification and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by the present invention. At the very least, and not as an attempt to limit the application of the doctrine of equivalents to the scope of the claims, each numerical parameter should be construed in light of the number of significant digits and ordinary rounding approaches.
[0132] The above description discloses several methods, devices and systems of the present invention. This invention is susceptible to modifications in the methods, devices and systems. Such modifications will become apparent to those skilled in the art from a consideration of this disclosure or practice of the invention disclosed herein. Consequently, it is not intended that this invention be limited to the specific embodiments disclosed herein, but that it cover all modifications and alternatives coming within the true scope and spirit of the invention as embodied in the following claims. Therefore, although this invention has been disclosed in the context of certain preferred embodiments and examples, it will be understood by those skilled in the art that the present invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the invention and equivalents thereof. It is also contemplated that various combinations or sub-combinations of the specific features and aspects of the embodiments may be made and still fall within the scope of the invention. Accordingly, it should be understood that various features and aspects of the disclosed embodiments can be combined with or substituted for one another in order to form varying modes of the disclosed invention. Thus, it is intended that the scope of the present invention herein disclosed should not be limited by the particular disclosed embodiments described above, but should be determined only by a fair reading of the claims that follow.

Claims

WHAT IS CLAIMED IS:
1. An imaging device with customized automatic image settings, the device comprising:
an image sensor configured to capture an image having one or more scene characteristics;
an electronic settings module configured with image settings to be used for capturing images, the settings module configured to operate in a manual mode and operate in an automatic mode, and the settings module configured to receive manual image settings from a user of the device when in the manual mode and configured to provide automatically determined customized image settings when in the automatic mode; and
a processor in data communication connection with the image sensor and the settings module, the processor configured to execute a set of instructions to perform a method comprising
capturing a first image in the manual mode using the manual image settings from the settings module while identifying one or more scene characteristics of the first image;
associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image;
determining an auto-setting relationship based on the association; and
determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
2. The device of claim 1, wherein associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image comprises:
determining a first probability defined as the probability that the manual image settings used to capture the first image would be selected from a set of possible manual image settings; and
determining a second probability defined as the probability that the identified scene characteristics would be present given the manual image settings used to capture the first image.
3. The device of claim 2, wherein determining the customized automatic image settings based on the auto-setting relationship and the subsequent scene characteristics of the subsequent image captured in the automatic mode comprises:
determining the most probable manual image settings using the subsequent scene characteristics by multiplying the first probability times the second probability,
wherein the determined most probable manual image settings are the customized automatic image settings to be applied to the subsequent image.
4. The device of claim 1, the method further comprising associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode.
5. The device of claim 4, the method further comprising adjusting the auto- setting relationship based on the subsequent association.
6. The device of claim 1, the method further comprising capturing the subsequent image in the automatic mode using the customized automatic image settings.
7. The device of claim 1, wherein the manual image settings comprise:
auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature, and
wherein the customized automatic image settings comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
8. The device of claim 1, wherein the one or more scene characteristics comprise one or more environmental attributes of the scene.
9. The device of claim 8, wherein the one or more environmental attributes comprise brightness or scale.
10. The device of claim 1, wherein the one or more scene characteristics comprise one or more subject attributes of the scene.
11. The device of claim 10, wherein the one or more subject attributes comprise facial likeness, body shape, color or movement.
12. An imaging system with customized automatic image settings, the system comprising:
means for capturing an image having one or more scene characteristics;
means for providing image settings to be used for capturing images, the means for providing configured to operate in a manual mode and operate in an automatic mode, and the means for providing configured to receive customized image settings from a user of the system when in the manual mode and configured to provide automatically determined image settings when in the automatic mode;
means for capturing a first image in the manual mode using manual image settings from the means for providing image settings while identifying one or more scene characteristics of the first image;
means for associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image;
means for determining an auto-setting relationship based on the association; and
means for determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
13. The system of claim 12, wherein the one or more scene characteristics comprise one or more environmental attributes of the scene.
14. The system of claim 12, wherein the one or more scene characteristics comprise one or more subject attributes of the scene.
15. A method of customizing automatic image settings of an imaging device, the device having an image sensor configured to capture an image having one or more scene characteristics and the device further having an electronic settings module configured with image settings to be used for capturing images, the settings module configured to operate in a manual mode and operate in an automatic mode, and the settings module configured to receive manual image settings from a user of the device when in the manual mode and configured to provide automatically determined customized image settings when in the automatic mode, the method comprising:
capturing a first image in the manual mode using the manual image settings from the settings module while identifying one or more scene characteristics of the first image;
associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image;
determining an auto-setting relationship based on the association; and
determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in the automatic mode.
16. The method of claim 15, further comprising:
associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode.
17. The method of claim 16, further comprising:
adjusting the auto-setting relationship based on the subsequent association.
18. The method of claim 15, further comprising:
capturing the subsequent image in the automatic mode using the customized automatic image settings.
19. The method of claim 15, wherein the manual image settings comprise:
auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature, and
wherein the customized automatic image settings comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
20. The method of claim 15, wherein the one or more scene characteristics comprise one or more environmental attributes of the scene.
21. The method of claim 20, wherein the one or more environmental attributes comprise brightness or scale.
22. The method of claim 15, wherein the one or more scene characteristics comprise one or more subject attributes of the scene.
23. The method of claim 22, wherein the one or more subject attributes comprise facial likeness, body shape, color or movement.
24. A non-transient computer readable medium configured to store instructions that when executed by a processor perform a method for customizing automatic image settings, the method comprising:
capturing a first image in a manual mode using manual image settings from a settings module while identifying one or more scene characteristics of the first image;
associating the manual image settings used to capture the first image with one or more of the identified scene characteristics of the first image;
determining an auto-setting relationship based on the association; and
determining customized automatic image settings based on the auto-setting relationship and subsequent scene characteristics of a subsequent image captured in an automatic mode.
25. The non-transient computer readable medium of claim 24, wherein the method further comprises associating subsequent manual image settings, used with one or more subsequent images captured in the manual mode, with one or more subsequent scene characteristics of the one or more subsequent images captured in the manual mode.
26. The non-transient computer readable medium of claim 25, wherein the method further comprises adjusting the auto-setting relationship based on the subsequent association.
27. The non-transient computer readable medium of claim 24, wherein the method further comprises capturing the subsequent image in the automatic mode using the customized automatic image settings.
28. The non-transient computer readable medium of claim 24, wherein the manual image settings comprise:
auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature, and
wherein the customized automatic image settings comprise auto focus area, auto exposure area, ISO, white-balance, aperture, shutter speed, focus area, flash, or temperature.
29. The non-transient computer readable medium of claim 24, wherein the one or more scene characteristics comprise one or more environmental attributes of the scene.
30. The non-transient computer readable medium of claim 24, wherein the one or more scene characteristics comprise one or more subject attributes of the scene.
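For readers working through the probabilistic language of claims 1-3, the following is a minimal, hypothetical Python sketch of one way the claimed association and selection could be realized; it is not the disclosed implementation. Manual settings are treated as a discrete profile, scene characteristics as a set of labels, and the auto-setting relationship as counts from which the first probability P(settings) and the second probability P(scene characteristics | settings) are estimated. All identifiers (AutoSettingModel, associate, customized_auto_settings, smoothing) and the Laplace smoothing are assumptions introduced for illustration.

# Illustrative sketch only (not the patent's actual implementation): a discrete,
# count-based model of the "auto-setting relationship" in claims 1-3.
from collections import Counter, defaultdict


class AutoSettingModel:
    """Learns P(settings) and P(scene characteristic | settings) from manual captures."""

    def __init__(self):
        self.setting_counts = Counter()            # how often each manual settings profile was used
        self.scene_counts = defaultdict(Counter)   # per-profile counts of observed scene characteristics
        self.total_captures = 0

    def associate(self, manual_settings, scene_characteristics):
        """Claim 1: associate one manual capture's settings with its scene characteristics."""
        self.setting_counts[manual_settings] += 1
        self.total_captures += 1
        for trait in scene_characteristics:
            self.scene_counts[manual_settings][trait] += 1

    def customized_auto_settings(self, scene_characteristics, smoothing=1.0):
        """Claims 2-3: return the settings maximizing P(settings) * P(scene | settings)."""
        if self.total_captures == 0:
            return None                            # no manual history yet; caller falls back to defaults
        best_profile, best_score = None, -1.0
        for profile, count in self.setting_counts.items():
            p_settings = count / self.total_captures              # first probability
            p_scene = 1.0                                          # second probability
            for trait in scene_characteristics:                    # naive independence across traits
                seen = self.scene_counts[profile][trait]
                p_scene *= (seen + smoothing) / (count + 2 * smoothing)   # Laplace-smoothed estimate
            score = p_settings * p_scene
            if score > best_score:
                best_profile, best_score = profile, score
        return best_profile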
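Continuing the same hypothetical sketch, the short usage example below shows how subsequent manual-mode captures adjust the learned relationship (claims 4-5, 16-17 and 25-26) and how the resulting customized automatic image settings would then be chosen for a subsequent automatic-mode capture (claims 3, 6, 18 and 27). The setting profiles and scene labels are invented for illustration.

# Hypothetical usage of the AutoSettingModel sketch above.
model = AutoSettingModel()
model.associate(("ISO 800", "f/1.8", "flash off"), {"low brightness", "face"})
model.associate(("ISO 100", "f/8", "flash off"), {"high brightness", "landscape"})
print(model.customized_auto_settings({"low brightness", "face"}))   # -> ISO 800 profile

# Later manual-mode captures adjust the relationship; the recommendation adapts.
model.associate(("ISO 1600", "f/1.8", "flash on"), {"low brightness", "face"})
model.associate(("ISO 1600", "f/1.8", "flash on"), {"low brightness", "face"})
print(model.customized_auto_settings({"low brightness", "face"}))   # -> ISO 1600, flash on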

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/082372 WO2016008127A1 (en) 2014-07-17 2014-07-17 Imaging systems, devices and methods for customized automatic image settings

Publications (1)

Publication Number Publication Date
WO2016008127A1 (en) 2016-01-21

Family

ID=55077824

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/082372 WO2016008127A1 (en) 2014-07-17 2014-07-17 Imaging systems, devices and methods for customized automatic image settings

Country Status (1)

Country Link
WO (1) WO2016008127A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1786809A (en) * 2004-12-10 2006-06-14 索尼株式会社 Image pickup apparatus and control method of the same
US20080024624A1 (en) * 2006-07-25 2008-01-31 Fujifilm Corporation Photographing apparatus and exposure control method
US20130335792A1 (en) * 2012-06-19 2013-12-19 Xerox Corporation Detecting common errors in repeated scan workflows by use of job profile metrics

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10944907B2 (en) 2017-12-13 2021-03-09 Qualcomm Incorporated Generating an image using automatic mode settings while in manual mode
US11368619B2 (en) 2017-12-13 2022-06-21 Qualcomm Incorporated Generating an image using automatic mode settings while in manual mode

Similar Documents

Publication Publication Date Title
EP2878121B1 (en) Method and apparatus for dual camera shutter
US11070717B2 (en) Context-aware image filtering
US20130021512A1 (en) Framing of Images in an Image Capture Device
WO2019091412A1 (en) Image capture method, apparatus, terminal, and storage medium
US10629167B2 (en) Display apparatus and control method thereof
CN103841324A (en) Shooting processing method and device and terminal device
JP2016530827A (en) System, device and method for tracking objects on a display
US9185300B2 (en) Photographing apparatus for scene catergory determination and method for controlling thereof
WO2016061011A2 (en) Camera capture recommendation for applications
US20170178356A1 (en) System and method to modify display of augmented reality content
US20190114062A1 (en) A computer implemented method for creating a dynamic light effect and controlling a lighting device according to the dynamic light effect
US20170178298A1 (en) System and method for adjusting perceived depth of an image
CN112153360B (en) Method, device and equipment for determining exposure parameters of head-mounted equipment
US20160140748A1 (en) Automated animation for presentation of images
CN111866393A (en) Display control method, device and storage medium
TW202318331A (en) Camera initialization for reduced latency
WO2022143311A1 (en) Photographing method and apparatus for intelligent view-finding recommendation
CN114926351B (en) Image processing method, electronic device, and computer storage medium
CN117135257A (en) Image display method and electronic device
WO2016008127A1 (en) Imaging systems, devices and methods for customized automatic image settings
US20230388659A1 (en) Customized image reprocessing system using a machine learning model
US11770621B2 (en) Customized image reprocessing system using a machine learning model
CN115883958A (en) Portrait shooting method
KR20230149615A (en) Method and apparatus for light estimation
CN115776615A (en) Exposure adjustment method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 14897798
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 14897798
Country of ref document: EP
Kind code of ref document: A1
