WO2006095110A2 - Method for controlling an action, in particular a sharpness modification, from a digital color image - Google Patents
Method for controlling an action, in particular a sharpness modification, from a digital color image
- Publication number: WO2006095110A2 (PCT application PCT/FR2006/050197)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- sharpness
- color
- digital image
- capture apparatus
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
- H04N25/615—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4" involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF]
- H04N25/6153—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4" involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF] for colour signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
Definitions
- the focusing point of the lens for a given wavelength varies with the distance of the object 4 represented in the image.
- FIG. 1b shows the new locations 4.1 and 4.2 of the focusing points associated, respectively, with the wavelengths λ1 and λ2 when the represented object has moved from a very great distance (FIG. 1a) to a closer distance (FIG. 1b).
- the sensor is now located at the focal point of the color (λ1) which previously did not form a sharp image.
- the focusing point of the lens, for a given wavelength and object distance, also varies with the position in the image of the represented object.
- FIG. 2 is an example of the spectral distribution of an image along axis 6.1.
- the images are generally composed of several colors whose intensities (ordinate axis 6.2) can be close.
- the blue 5.1 (wavelength around 450 nm), green 5.2 (around 550 nm) and red (around 600 nm) components are represented, but it is clear that the invention applies to an image regardless of its distribution of colors and of the wavelengths considered (e.g. infrared or ultraviolet).
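The chromatic behavior described above can be illustrated with a small numerical sketch. All focal lengths, distances and apertures below are invented for illustration and are not the patent's optical design: with longitudinal chromatic aberration each channel has its own effective focal length, so which channel is sharpest on a fixed sensor plane depends on the object distance.

```python
# Illustrative sketch (hypothetical numbers): with longitudinal chromatic
# aberration, each wavelength focuses at a slightly different distance, so
# the channel whose image lands closest to the sensor plane depends on how
# far away the object is.

def blur_spot_diameter(focal_mm, object_mm, sensor_mm, aperture_mm):
    """Geometric blur-spot diameter on the sensor for a thin lens."""
    # Thin-lens equation: 1/f = 1/do + 1/di  ->  ideal image distance di
    di = 1.0 / (1.0 / focal_mm - 1.0 / object_mm)
    # Defocus blur: aperture scaled by the relative focus error
    return aperture_mm * abs(sensor_mm - di) / di

# Hypothetical per-channel focal lengths (blue focuses shorter than red)
focals = {"blue": 9.98, "green": 10.00, "red": 10.02}
sensor_position = 10.03  # mm, fixed sensor plane

for distance in (300.0, 10000.0):  # near and far object, in mm
    spots = {c: blur_spot_diameter(f, distance, sensor_position, 2.0)
             for c, f in focals.items()}
    sharpest = min(spots, key=spots.get)
    print(f"object at {distance:.0f} mm -> sharpest channel: {sharpest}")
    # 300 mm -> blue, 10000 mm -> red
```

With these invented values, the blue channel is sharpest for the near object and the red channel for the far one, which is exactly the distance-dependent "sharp color" effect the method exploits.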
- the invention generally relates to a method for improving the sharpness of at least one color of a digital image, comprising the step of choosing, from the colors of the image, at least one color called the sharp color,
- the invention also relates to a method of producing a capture apparatus which comprises a capture optical system, a sensor and/or a servo system, the image being processed, for its improvement, by digital image processing means; a method in which the parameters of the optical system and/or the sensor and/or the servo system are determined or selected according to the capabilities of the digital image processing means, so as to minimize implementation costs and/or optimize the performance of the capture apparatus.
- the method further comprises the step of decomposing the digital image into regions, said choice of the sharp color being made for each region.
- said choice of the sharp color consists of choosing the sharpest color according to a predetermined rule. In one embodiment, said choice of the sharp color is predetermined.
- said digital image is derived from a capture apparatus, and said choice of the sharp color is a function of the distance between the capture apparatus and at least one object of the scene captured to obtain said digital image.
- the method further comprises the step of determining a servo instruction for said capture apparatus from the sharpness of at least two colors, so that focusing is done in fewer steps and is accelerated.
- said method further comprises the step of selecting an optic from a set of predetermined optics, said optic having characteristics such that the images of an object at at least two predetermined distances have distinct sharp colors, so that the depth of field is improved and/or the cost of the optic is decreased.
- said method further comprises the step of designing an optic taking the method according to the invention into account, said optic having characteristics such that the images of an object at at least two predetermined distances have distinct sharp colors, so that the depth of field and/or the aperture and/or any other optical characteristic is improved and/or the cost of the optic is decreased.
- the invention also relates to a method for producing a device (20) for capturing and/or rendering images, which comprises an optical system (22, 22') for capturing and/or rendering images, a sensor (24) and/or an image generator (24'), and/or a servo system (26), the image being processed, for its improvement, by digital image processing means (28, 28');
- the method being such that the parameters of the optical system and/or the sensor and/or the image generator and/or the servo system are determined or selected on the basis of the capabilities of the digital image processing means, and in particular of the improvement of the sharpness of one color according to the sharpness of another color by a process according to one of the preceding claims.
- the invention also relates to a digital image obtained by a method according to one of the preceding embodiments or from an apparatus according to the preceding embodiment.
- "digital image" means an image in digital form.
- the image may come from an image capture apparatus.
- the digital image may be represented by a set of numerical values, hereinafter called gray levels, each numerical value being associated with a color sensitivity and a relative geometric position on a surface or in a volume.
- "gray level" means one of these numerical values.
- "color" refers to the set of numerical values associated with the same color sensitivity.
- the digital image is preferably the raw image of the sensor ("raw" format), before the demosaicing operation.
- the digital image may also have undergone a treatment, for example demosaicing or white balance.
- the digital image has not undergone subsampling.
- the image capture apparatus includes a sensor with sensitive elements.
- "sensitive element" means a sensor element for converting a flow of energy into an electrical signal.
- the energy flow can take the form of a luminous flux, X-rays, a magnetic field, an electromagnetic field or sound waves.
- the sensitive elements may be, depending on the case, juxtaposed on a surface and / or superimposed in a volume.
- the sensitive elements may be arranged in a rectangular matrix, a hexagonal matrix or other geometry.
- the invention applies to sensors comprising sensitive elements of at least two different types, each type having a color sensitivity, each color sensitivity corresponding to the portion of the energy flow converted into an electrical signal by the sensitive element of the sensor.
- in the case of a visible-light image sensor, the sensor generally has a sensitivity in 3 colors and the digital image has 3 colors: red 5.1, green 5.2 and blue 5.3, represented in FIG. 2, which shows on the vertical axis 6.2 the amount of energy converted and on the horizontal axis 6.1 the wavelength. Some sensors have a sensitivity in 4 colors: red, green, emerald, blue.
- "color" also means a combination, in particular linear, of the signals delivered by the sensor.
- the sharpness of a color may correspond to the measurement of a value called BXU, which is a measure of the area of the blur spot, as described in the article published in the Proceedings of the IEEE International Conference on Image Processing, Singapore, 2004, entitled "Uniqueness of Blur Measure", by Jérôme BUZZI and Frédéric GUICHARD.
- BXU is the variance of the impulse response (that is, its average surface area). Processing capabilities can be limited to a maximum value of BXU.
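As an illustration of a blur measure in the spirit of BXU, a minimal sketch is to take the second central moment (variance) of a discrete impulse response; the exact definition is the one given in Buzzi and Guichard's paper, so treat this as an approximation for illustration only.

```python
# Sketch of a BXU-like blur measure: the variance of a normalized 2-D
# point spread function (PSF). A perfect impulse gives 0; wider blur
# spots give larger values.

def psf_variance(psf):
    """Second central moment of a 2-D PSF, summed over both axes."""
    total = sum(v for row in psf for v in row)
    h, w = len(psf), len(psf[0])
    # Centroid of the PSF
    cy = sum(y * psf[y][x] for y in range(h) for x in range(w)) / total
    cx = sum(x * psf[y][x] for y in range(h) for x in range(w)) / total
    # Variance: energy-weighted squared distance from the centroid
    return sum(psf[y][x] * ((y - cy) ** 2 + (x - cx) ** 2)
               for y in range(h) for x in range(w)) / total

impulse = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]  # ideal sharp response
box = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]      # 3x3 box blur
print(psf_variance(impulse))  # 0.0
print(psf_variance(box))      # ~1.33
```

Capping processing at "a maximum value of BXU", as the text puts it, then amounts to rejecting impulse responses whose variance exceeds a threshold.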
- the sharpness of a color is obtained by calculating a gradient.
- the sharpness of a color can be obtained by a gradient calculation of 9 gray levels taken in neighboring geometric positions in the color considered.
- the invention refers to the sharpness of at least two colors. According to one embodiment, the sharpness of at least two colors is considered only relative to each other. For this embodiment, a gradient makes it possible to simply calculate a relative sharpness between two colors independently of the content of the image.
- the invention provides for choosing, from the colors, at least one color called the "sharp color". In one embodiment, this choice can be made by determining which of at least two colors is the sharpest. For this embodiment, a gradient makes it possible to simply determine the sharpest color among at least two colors.
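A minimal sketch of such a gradient-based comparison (hypothetical, not the patented formula): the channel with the larger gradient energy over a region is taken as the sharp color, since a blurred channel has smaller local differences between neighboring gray levels.

```python
# Hypothetical sketch: compare per-channel gradient energy and pick the
# channel with the largest value as the "sharp color".

def gradient_energy(channel):
    """Sum of squared horizontal and vertical differences of gray levels."""
    h, w = len(channel), len(channel[0])
    e = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                e += (channel[y][x + 1] - channel[y][x]) ** 2
            if y + 1 < h:
                e += (channel[y + 1][x] - channel[y][x]) ** 2
    return e

def sharpest_color(channels):
    """Name of the channel with the highest gradient energy."""
    return max(channels, key=lambda name: gradient_energy(channels[name]))

# Toy one-row images: a crisp edge in red vs. the same edge blurred in green
red   = [[0, 0, 100, 100]]
green = [[0, 33, 66, 100]]
print(sharpest_color({"red": red, "green": green}))  # red
```

Because both channels see the same scene content, the ratio of their gradient energies reflects their relative sharpness largely independently of what the image depicts, as the text notes.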
- An image capture apparatus is, for example, a disposable camera, a digital camera, a reflex camera (digital or not), a scanner, a fax machine, an endoscope, a movie camera, a camcorder, a surveillance camera, a toy, a camera integrated into or connected to a telephone, personal assistant or computer, a thermal camera, an ultrasound machine, an MRI (magnetic resonance imaging) machine, or an X-ray machine.
- "servo system" means mechanical, chemical, electronic or computer means allowing elements or parameters of the apparatus to meet a setpoint. These include the autofocus system, automatic white balance control, automatic exposure control, control of optical elements (for example to maintain consistent image quality), an image stabilization system, an optical and/or digital zoom factor control system, a saturation control system, or a contrast control system.
- the digital image processing means can take various forms depending on the application.
- the digital image processing means may be integrated wholly or partly into the apparatus, as in the following examples:
- An image capture apparatus which produces modified images, for example a digital camera which incorporates image processing means.
- An image rendering apparatus which displays or prints modified images, for example a video projector or a printer including image processing means.
- a mixed apparatus that corrects the defects of its elements, for example a scanner/printer/fax including image processing means.
- a professional image capture apparatus that produces modified images, for example an endoscope including image processing means.
- the determined or selected parameters of the optical system are chosen from the group comprising: the number of optical elements of the system, the nature of the materials composing the optical elements, the cost of the materials of the optical system, the treatment of optical surfaces, assembly tolerances, the parallax value as a function of focal length, aperture characteristics, aperture mechanisms, the range of possible focal lengths, focusing characteristics, focusing mechanisms, anti-aliasing filters, bulk, depth of field, focal length and focus characteristics, geometric distortions, chromatic aberrations, decentering, vignetting, and sharpness characteristics; and/or the determined or selected parameters of the sensor and/or image generator are selected from the group consisting of: pixel quality, pixel area, the number of pixels, the microlens matrix, anti-aliasing filters, the geometry of the pixels, and the arrangement of the pixels; and/or the determined or selected parameters of the servo system are selected from the group consisting of: the focus measurement, the exposure measurement, the white balance measurement, the focus setpoint, the exposure setpoint, the exposure-time setpoint, …
- the focusing can be carried out in various ways, in particular by controlling the position of moving elements of the optical system or by controlling the geometry of deformable optical elements.
- the depth of field can be defined as the range of distances within which an object generates a sharp image, that is to say an image whose sharpness is greater than a given threshold for a color, usually green; or again as the distance between the nearest object plane and the farthest object plane for which the blur spot does not exceed predetermined dimensions.
- the invention relates to a method for controlling an action from a measurement made on at least one digital image, having at least two colors, coming from an image capture apparatus, in which: - the relative sharpness between at least two colors is measured on at least one region R of the image, and
- "region" means part or all of the image.
- a region comprises one or more pixels, contiguous or not.
- the sharpness and/or relative sharpness in a region can be expressed as a single numerical value, for example reflecting the average relative sharpness in the region, or by several numerical values that account for the relative sharpness in different parts of the region.
- At least one action is controlled according to the measured relative sharpness.
- This action is notably (without the list being limiting):
- the treatment may consist (without the list being limiting) in one of the following actions:
- the known methods do not make it possible to control this type of action from a relative sharpness measurement on at least one region of the image, but require the use of a particular device, in addition to the image sensor, to estimate a distance.
- the known methods allow a distance measurement at only one point or a limited number of points, whereas the invention makes it possible to measure the distance at a large number of points simultaneously.
- the controlled action is included in the group comprising:
- the controlled action comprises processing on at least one zone Z 'of the digital image and / or another digital image.
- in a digital camera, the sharpness measurement is performed on the image displayed before shooting, and the image taken later is processed at full resolution (whereas the measurement performed on the image displayed before shooting is usually at lower resolution), from the last measurement or from a combination of the last measurements.
- the zone Z ' constitutes all or part of the region (on which the relative sharpness measurement has been made) of the digital image, and / or the entire digital image, and / or a zone distinct from the region of the digital image, and / or an area of another digital image, and / or another entire digital image.
- the zone Z' is a pixel, and a region of N pixels is defined on which the relative sharpness is measured; depending on this relative sharpness, a filter is applied that conveys the sharpness of the sharpest color to the other color, so that the sharpness of the pixel is increased.
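One common way to realize such a sharpness-conveying filter, shown here as a hypothetical one-dimensional sketch rather than the patent's actual filter, is to add the high-frequency part of the sharp channel (its values minus a low-pass version of itself) to the blurred channel.

```python
# Hypothetical sketch of conveying the sharpness of the sharpest color to
# another color: blurred + (sharp - lowpass(sharp)).

def box_blur(signal):
    """Simple 3-tap low-pass filter with edge clamping."""
    n = len(signal)
    return [(signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3
            for i in range(n)]

def transfer_sharpness(blurred, sharp):
    """Add the high-frequency detail of the sharp channel to the blurred one."""
    low = box_blur(sharp)
    return [b + (s - l) for b, s, l in zip(blurred, sharp, low)]

sharp_green = [0, 0, 0, 90, 90, 90]   # crisp edge in the sharp color
blurred_red = [0, 0, 30, 60, 90, 90]  # same edge, blurred in the other color
print(transfer_sharpness(blurred_red, sharp_green))
# [0.0, 0.0, 0.0, 90.0, 90.0, 90.0]
```

In this toy case the blurred channel recovers the crisp edge exactly; in practice the filter parameters would depend on the measured relative sharpness, as the text states.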
- the depth of field is increased.
- the zone Z 'on which the treatment is carried out can constitute an entire digital image, especially when the sharpness of the entire image is increased.
- the correspondence between the pixels of the two images can be done by associating the pixels of the two images located in the same place.
- this case has the advantage of avoiding storing the digital image between the measurement and the processing, without annoying artifacts if the images are captured with a small time interval, for example 1/15 s.
- the treatment may consist of brightening nearby objects and darkening the background, for example for a videoconference.
- the brightness treatment will consist of brightening the background and darkening the nearest objects to compensate for the effect of the flash.
- an MPEG-4 codec is provided with a close-object/remote-object segmentation, in order to strongly compress the remote objects while keeping maximum quality for the main subject, which is close.
- Example of changing the compression ratio: as above in the case of a videoconference, the compression ratio may be higher for the background than for the main subject.
- the treatment consists of replacing a background with a landscape or a setting.
- the processing comprises a sharpness modification for each pixel of the zone Z' by means of a filter mixing the values attached to the pixels in a predetermined neighborhood of each pixel, the parameters of the filter being a function of the measured relative sharpness.
- zone Z ' is determined from the measured relative sharpness.
- the distance-dependent information may be a distance, for example with an indication of accuracy, or a range of distance values such as, for example: less than one centimeter, between 1 and 10 centimeters, between 10 centimeters and 1 meter, or beyond one meter.
- distance-based information can also be represented by a label such as "too close", "near", "far" or "macro". It can also be translated into information on the nature of objects or subjects, such as "portrait" or "landscape".
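These coarse ranges can be sketched as a simple lookup; the thresholds below and the pairing of each label to a range are assumptions for illustration, since the patent only lists the ranges and labels separately.

```python
# Hypothetical mapping from an estimated distance (derived, e.g., from the
# measured relative sharpness) to a coarse label. Thresholds are invented.

def distance_label(distance_m):
    """Return a coarse label for a distance in meters."""
    if distance_m < 0.01:   # under one centimeter
        return "too close"
    if distance_m < 0.10:   # 1 to 10 centimeters
        return "macro"
    if distance_m < 1.0:    # 10 centimeters to 1 meter
        return "near"
    return "far"            # beyond one meter

print(distance_label(0.05))  # macro
print(distance_label(2.5))   # far
```

Such a label could then drive the indexing or user-signal actions described elsewhere in the text.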
- a focusing servo made from measurements taken directly on a single digital image is particularly advantageous compared with known focusing servos, or "autofocus", for which it is necessary to perform measurements on successive images.
- the illumination control can be carried out according to this main subject, whereas in the state of the art the flash power is adjusted according to the focus without determining the main subject, that is to say, in particular, the closest subject.
- less illuminated subjects can be digitally brightened.
- Example of control of another device: when a mobile robot has to move, the regions closest to the robot are determined and a trajectory free of any obstacle is computed from the objects closest to the robot.
- the controlled action includes a signal supply such as an indication signal of the object of main interest of the digital image, and / or a focus area, and / or an alarm signal indicating a change in the digitally monitored and imaged scene, and / or a distance from at least a portion of the imaged scene to the capture apparatus.
- in a digital camera, a frame, in particular of predetermined shape, can be displayed around the main subject to tell the photographer which main subject the camera has detected when shooting.
- this indication signal of the main subject is used especially before the shooting itself, to tell the photographer which subject or object is the sharpest.
- this signal can also be an indication that the nearest object or subject is too close to the camera to be in focus.
- the signal is constituted, for example, by the plain-text message "Foreground too close", by an exaggeration of the blur of the foreground, or by a visible change in the color of the foreground.
- the signal indicating that the foreground scene or object is too far away may take into account the final use of the image to be taken, including the resolution chosen for that purpose.
- a subject that is blurred on a television or computer screen may appear sharp on a small screen, such as that of a camera.
- a subject that is blurry for a 24 cm × 30 cm print on paper is not necessarily blurry for a 10 cm × 15 cm print.
- the blur indication signal can also take the subject into account.
- the detection of a bar code is more tolerant of blur than a natural image.
- in a video surveillance system, the camera is set to monitor two regions. The first of these regions is the one where the object to be monitored is located, and the second region is the entire field of the camera. If an object in the shooting field approaches the object to be monitored, an alarm is triggered.
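The surveillance rule above can be sketched as follows. The per-block distance estimates are assumed as inputs (however obtained, for example from the relative sharpness measurement), and the approach margin is an invented parameter.

```python
# Hypothetical sketch of the two-region surveillance rule: compare distance
# estimates over the whole camera field with the distance of the monitored
# object, and raise an alarm when something gets close to it.

def approach_alarm(block_distances, object_distance, margin=0.5):
    """True if any scene block lies within `margin` meters of the
    monitored object's distance, i.e. something approached it."""
    return any(abs(d - object_distance) < margin for d in block_distances)

# Monitored object at 3 m; one scene block now reports 2.8 m -> alarm
print(approach_alarm([10.0, 7.5, 2.8, 12.0], 3.0))  # True
print(approach_alarm([10.0, 9.0], 3.0))             # False
```

A real system would also track these estimates over time, since the text elsewhere describes detecting motion from the variation of relative sharpness.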
- the controlled action is made to depend on at least one characteristic of the capture apparatus during the shooting, in particular the focal length, the aperture, the focusing distance, the exposure parameters, white balance settings, resolution, compression, or a user-made setting.
- the controlled action is a function of the measured relative sharpness, and the relative sharpness between at least two colors depends on the settings of the camera, including the focal length, the aperture and the focusing distance.
- the digital image constitutes a raw image derived from the sensor of the capture apparatus.
- this arrangement facilitates the measurement of relative sharpness because, if a raw image is used, the measurement is not affected by treatments such as demosaicing, sharpening filtering, color space changes or the tone curve.
- the raw image from the sensor may, however, have undergone processing such as denoising, digital gain, black level compensation.
- the relative sharpness measurement and / or the controlled action can be performed in the capture apparatus.
- the command comprises a command for detecting and/or recognizing a portion of the image, such as face detection and/or recognition.
- the controlled action includes measuring the position and / or movement of the capture apparatus.
- one or more objects intended to remain stationary in a scene of a captured image are stored in memory and motion or position detection is performed by determining the variation of the relative sharpness over time. This arrangement can, for example, be used to make a visual computer interface of the "mouse" type in three dimensions.
- a scene is lit by one or more natural or artificial sources as well as possibly by one (or more) flash (es) controlled by the camera.
- an image capture apparatus performs an exposure servo (exposure time, sensor gain and, if appropriate, aperture), a white balance servo (gain of each color over the entire image) and possibly a flash servo (duration and power of the flash), based on measurements made in a digital image of the scene (e.g. saturated-area analysis, histogram analysis, average color analysis) and/or with a complementary device (infrared range finder, flash pre-flash, etc.), and a focus servo to find the focus producing the sharpest image by comparing the sharpness of several images taken with different focus settings.
- These controls change the contrast and / or brightness and / or color of the image but do not use a measure of the relative sharpness between at least two colors on at least one region R of the image.
- the white balance servo-control can be performed, for example, on a subject of significant size in the center of the image, possibly at the expense of a differently lit background.
- the method includes determining a close portion and a remote portion in the image, and the white balance control performs separate measurements on these two regions to determine the presence or absence of multiple light sources and to apply separate corrections for each of these regions. If the position of the area of interest is provided to the focus servo, focusing will be faster and the main subject (area of interest) can be tracked, even if it is in motion.
- the controlled action includes providing a signal to the user indicating that the image is too close to be sharp.
- the indexing may consist of the provision of a signal indicating that it is a portrait or group of people.
- the distinction between these two situations is made according to whether the imaged scene includes one or more related objects or subjects. If the distance of the objects or subjects is greater than a predetermined limit, then the image can be considered to represent a landscape.
- the controlled action includes providing to a sound capture device distance and/or direction information, relative to the capture apparatus, of a subject or object in the digital image.
- in a camcorder or a cameraphone, one can determine the main subject(s), determine the distances and/or directions of these main subjects, and focus the sound capture on the main subject(s), thus eliminating background noise.
- the directivity control of the sound capture can be performed using two microphones and a phase shift between the signals of these microphones.
- a particular application of this latter provision is, in a video conference, the use of a wide-angle image capture apparatus and automatic tracking of the subject who is speaking.
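By way of illustration only (the patent gives no implementation), the two-microphone phase-shift directivity control can be sketched as a delay-and-sum beamformer; the microphone spacing, sampling rate and steering angle below are hypothetical, and the direction of the main subject would come from the relative-sharpness measurement described above.

```python
import numpy as np

C = 343.0  # approximate speed of sound in air, m/s

def steering_delay(d: float, theta: float) -> float:
    # A plane wave arriving at angle theta reaches the second
    # microphone d*sin(theta)/C seconds after the first.
    return d * np.sin(theta) / C

def delay_and_sum(mic1: np.ndarray, mic2: np.ndarray,
                  fs: float, d: float, theta: float) -> np.ndarray:
    # Compensate the inter-microphone delay for direction theta,
    # then average: sound from that direction adds coherently,
    # off-axis sounds (background noise) add incoherently.
    shift = int(round(steering_delay(d, theta) * fs))
    return 0.5 * (mic1 + np.roll(mic2, -shift))
```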
- the controlled action includes the setting of a high compression for the background and a lower compression for the main subject(s), the main subject(s) being determined as the area of the image satisfying criteria based on the measured relative sharpness.
- the invention also relates to a sensor thus defined, independently of a capture apparatus and method, according to the invention, defined above.
- the invention also relates to a capture apparatus comprising such a sensor, this capture apparatus may also be used independently of the method defined above.
- the invention also relates, according to a provision that can be used in combination with (or independently of) the provisions defined above, to a digital image capture apparatus which comprises a sensor having, on the one hand, pixels whose spectral response is mainly in the domain visible to the human eye and, on the other hand, additional pixels having a spectral response mainly outside the spectrum visible to the human eye, this sensor being such that the image portion from the additional pixels has, in at least one range of distances between the capture apparatus and the imaged scene, a sharpness greater than the sharpness of the image portion from the pixels whose spectral response is mainly in the visible range.
- the additional pixels may be sensitive to infrared and / or ultraviolet radiation.
- Pixels sensitive to ultraviolet radiation can be used to improve sharpness for short distances, while pixels sensitive to infrared radiation can be used to improve sharpness at great distances.
- by infrared and/or ultraviolet is meant any part of the spectrum beyond or below the visible spectrum, including the near infrared, such as 700 to 800 nm or 700 to 900 nm, or the near ultraviolet, near 400 nm.
- the capture apparatus is provided with a zoom lens without a moving or deformable focusing element, the relative sharpness between at least two colors on at least one region R of the image being variable according to the focal length and/or the position of the imaged object with respect to the apparatus.
- variable-focus optics comprises, for example, a single mobile or deformable optical group.
- a zoom is made with at least two mobile groups, for example one or two for the focal length and the other for focusing.
- the focal length and the focus are independent, i.e. when the focal length changes, it is not necessary to refocus. This eliminates the time needed for focusing.
- there are also less expensive varifocal lenses, in which the focus must be changed when the focal length varies.
- there are also zooms in which two mobile optical groups, linked in a complex way, are used to vary the focal length, the focusing being achieved by a third group.
- the digital image is derived from at least two sensors.
- each sensor is dedicated to a specific color.
- the controlled action includes adding an object to an image and / or replacing a portion of an image based on the relative sharpness measured on the digital image.
- the method adds a character next to the main subject. It is also possible, by way of example, to add an object at a given position in the image; the object will have the correct size in the image if the distance of the imaged scene at that position is taken into account.
- the method includes capturing a sequence of images, the digital image being part of the sequence and the controlled action being performed on at least one other image of the sequence.
- the estimate of the relative sharpness can be performed on preview images before shooting, at lower resolution, while the correction can be performed on an image permanently stored, for example by means of a choice of filters resulting from a measurement made on the preview images.
- the controlled action includes modifying a setting of the capture apparatus, including focal length, aperture, focus distance.
- the digital image is preferably the raw image from the sensor, before the demosaicing operation.
- the digital image may also have undergone processing, for example a white balance.
- the digital image has not undergone subsampling.
- an optical system, sensor and image processing means that produce a raw image having better quality or particular characteristics, for example an extended depth of field, while maintaining the characteristics of a raw image directly from the sensor, and in particular compatibility with the known functional blocks or components performing the raw-to-visible image conversion ("image pipe" or "image signal processor").
- the raw image has undergone demosaicing.
- the optics of the capture apparatus exhibit strong longitudinal chromatic aberrations, for example such that, for a determined focus, aperture and focal length, there is at least one color for which the object distance of best sharpness is less than k·f²/(O·P), k being a coefficient less than 0.7, preferably less than 0.5, f being the focal length, O the aperture and P the smallest (among all the colors of the image) of the diameters of the blur spot of an object point located at infinity.
- the comparison of the sharpnesses is carried out by using a measurement M on pixels of the digital image.
- the measurement M at a given pixel P for a given color channel C corresponds to the gradient of the variation of C in a neighborhood of P. It is obtained by the following calculation:
- let V(P) be a neighborhood of the pixel P.
- the measure M at the pixel P for a color C can be defined as the ratio between SM and GM. This gives a value M(P, C).
- This measurement does not, in itself, characterize precisely and completely the sharpness of the color C. In fact, it depends on the content of the image (type of imaged scene: textures, gradients, etc.) in the neighborhood V(P) of the pixel P.
- an abrupt transition in the imaged scene will, for the same color sharpness, generate a higher measurement M than a smooth transition in the imaged scene.
- a transition will, however, be present in the same way in each color, thus affecting the measurement M in the same way across colors. In other words, when a sharp transition appears in one color C, the same type of transition appears in the other colors.
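By way of illustration, a minimal sketch of such a measurement follows. The exact definitions of SM and GM are not given in this passage, so the choices below (SM as the sum of absolute differences between neighboring pixels over V(P), GM as a normalizing count of those differences) are assumptions; the ratio between channels cancels the content-dependence described above.

```python
import numpy as np

def measure_M(channel: np.ndarray, p: tuple, radius: int = 2) -> float:
    """Gradient-based sharpness measure M(P, C) for one color channel
    in a neighborhood V(P) of pixel P (assumed SM/GM construction)."""
    y, x = p
    v = channel[max(0, y - radius):y + radius + 1,
                max(0, x - radius):x + radius + 1].astype(float)
    dy = np.abs(np.diff(v, axis=0))   # variations between vertical neighbors
    dx = np.abs(np.diff(v, axis=1))   # variations between horizontal neighbors
    sm = dy.sum() + dx.sum()          # total variation of C over V(P)
    gm = dy.size + dx.size            # normalizing count of pixel pairs
    return sm / gm

def relative_sharpness(ch_a: np.ndarray, ch_b: np.ndarray, p: tuple) -> float:
    # Image content (abrupt vs smooth transitions) affects both
    # channels alike, so it largely cancels out in the ratio.
    return measure_M(ch_a, p) / (measure_M(ch_b, p) + 1e-12)
```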
- when the controlled action is to determine the position of the main subject in the image, the controlled action further comprises automatic framing, including centering the image on the main subject.
- the method may be implemented in an image capture or image processing apparatus or device.
- These apparatuses or devices are part of the group comprising: an electronic component, integrating a sensor or not; an electronic subassembly integrating an optic, a sensor and possibly an image processing module ("camera module"); or any other form as defined above.
- FIG. 2, already described, is the color spectral diagram of an image;
- FIGS. 3a and 3b are diagrams representing the improvement of the sharpness of a color by means of the same sharp color, according to the invention;
- FIG. 4 is a diagram representing the improvement of the sharpness of a color by means of distinct sharp colors associated with distinct regions of an image, according to the invention;
- FIGS. 5, 6 and 7 are diagrams representing the improvement of the sharpness of a color by means of distinct sharp colors associated with the whole of an image, according to the invention;
- FIG. 8 is a diagram representing the servo-control of an apparatus according to a sharpness difference between the sharp color and the color to be improved, according to the invention;
- FIG. 9 is a diagram representing the choice of a sharp color from a distance measured between an object and the apparatus capturing the image of this object;
- FIGS. 15, 15a, and 15b illustrate a property of an image capture apparatus according to the invention and of a conventional apparatus
- FIGS. 18.1 and 18.2 represent means for implementing the method according to the invention
- Figures 19.1, 19.2 and 19.3 represent steps of the method according to the invention according to several embodiments
- Figures 20.1 and 20.2 show other embodiments of the invention.
- the sharpness of these two colors varies differently depending on this distance but, overall, in this example, the first color 13.2 has better sharpness than the second color 13.1 of the same image.
- CA = CN + F(CO - CN)
- the filter F will have the particularity of removing the details of the image to which it is applied.
- a linear low-pass filter or averager
- one of the numerous known nonlinear filters having the particularity of removing details such as, for example, a median filter.
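A minimal sketch of this correction, assuming (consistently with the later description of sharpness transfer) that CN denotes the sharp color, CO the color to be improved, CA the improved color, and taking F as a linear averaging (detail-removing) filter:

```python
import numpy as np

def average_filter(img: np.ndarray, k: int = 5) -> np.ndarray:
    # F: separable box blur, a linear averaging filter that removes
    # the details of the image it is applied to.
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kern, mode="same"), 0, out)

def improve_color(cn: np.ndarray, co: np.ndarray, k: int = 5) -> np.ndarray:
    # CA = CN + F(CO - CN)
    #    = high frequencies (details) of the sharp color CN
    #      + low frequencies of the color to improve CO.
    return cn + average_filter(co - cn, k)
```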
- the improvement of a color in the region 11.2 is done by considering the color 8.3 as the sharp color, while the improvement of a color in the region
- the regions of an image may be predetermined or not.
- a region may be a spatial area delimited by one or more pixels.
- FIG. 5 represents the sharpness (axis of ordinates 7.2) of two colors 8.2 and 8.3 as a function of the distance (7.1) between at least one object of the scene captured to obtain said image and the apparatus of capture.
- a method according to the invention can consider the color 8.3 as the sharp color, used to correct the sharpness of a color, over the range of distances 9.1, while the color 8.2 is considered the sharp color used to improve a color for an object of the captured scene situated at a distance from the capture apparatus in the range 9.2.
- the sharpness of the colors in the image can thus be improved towards a profile as shown in diagram 6, namely the juxtaposition of the sharpest colors in the image.
- the sharpest color in a region of the image may vary depending on the geometric position of the region and/or other image capture parameters such as focal length, aperture, focus, etc. To determine the sharpest color in the sense of the invention, it is not necessary to know the parameters indicated above.
- the choice of the sharp color can also be determined by the software activation of at least one image capture mode, such as a macro mode as described later. In such a context, the image can be considered as a single region.
- a consequence of the invention is therefore to allow an extension of the depth of field of an optical system as detailed below with the help of Figure 9.
- the depth of field of a capture device initially limited by the sharpness of the color 8.2 and the sharpness threshold 8.1, is increased by using a second color 8.3 having a satisfactory sharpness (below the threshold 8.1) over a new range of distances between at least one object of the scene captured to obtain said image and the capture apparatus.
- such an application is implemented in fixed focus cameras, such as cameraphones.
- the optical design of these devices provides a sharpness range from large distances down to a few tens of centimeters at best, based on the green color, similar to the color 8.2 of Figure 5.
- since the blue color does not focus in the same way, it can exhibit sharpness at distances smaller than the green color, similar to the color 8.3.
- the invention makes it possible to increase the sharpness of an image at a short distance from a cameraphone by attributing to the green color, and to the other colors, the sharpness of the blue color, correspondingly increasing the depth of field of the device.
- a macro function is provided to allow imaging of objects near the capture apparatus within a predetermined range of distances, referred to as the macro range of distances 9.1, to the apparatus.
- a capture apparatus makes it possible to move all or part of the optics to perform the macro function.
- the method or system that is the subject of the invention makes it possible to dispense with such a displacement.
- the sharpest color is predetermined for the macro range 9.1, for example by measuring the sharpnesses 8.2 and 8.3 of the colors of the digital images obtained by the capture apparatus, producing for each color digital images of objects located at different distances from the capture apparatus.
- the sharpest color (Figure 5) is the one corresponding to 8.3. This predetermination can be carried out definitively, for example at the time of the design of the apparatus.
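Such a design-time predetermination can be sketched as a small calibration table; the channel names, distance ranges and measurement values below are purely illustrative, not values from the patent.

```python
def sharpest_channel_per_range(calibration: dict) -> dict:
    """calibration: {(d_min_mm, d_max_mm): {"R": M, "G": M, "B": M}},
    where M is a sharpness measure (higher = sharper) averaged over
    test shots of objects in that distance range. Returns, for each
    range, the channel to use as the predetermined sharp color."""
    return {rng: max(m, key=m.get) for rng, m in calibration.items()}

# Hypothetical measurements: blue focuses closest (macro range 9.1),
# green at ordinary distances, as in the curves of Figure 5.
table = sharpest_channel_per_range({
    (50, 300):    {"R": 0.3, "G": 0.6, "B": 1.5},
    (300, 10000): {"R": 0.7, "G": 1.4, "B": 0.5},
})
```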
- the depth of field is increased without increasing the cost, complexity or bulk of the optics and without the need to change the exposure, i.e. without reducing the aperture, increasing the noise level or increasing motion blur.
- the increase in depth of field produced according to the invention also benefits an apparatus having variable parameters at the time of capturing the digital image and having an influence on the sharpness of the colors, in particular a zoom capture apparatus, and / or an optics with variable focus and / or variable aperture.
- the sharpness curves 8.2 and 8.3 corresponding to the value of the variable parameters according to the digital image are then used.
- a method and a device are also provided for reducing longitudinal chromatic aberrations of a digital image.
- the distance measurement performed according to the invention benefits especially fixed optics, including telephones.
- the invention consists, according to one of its aspects, which can be used independently of the previously described aspects, in determining or selecting, from the capabilities of the digital image processing means 128, 128', the parameters of the optical system 122, 122' and/or of the sensor or image generator 124, 124' and/or of the servo system 126.
- an optical system can distort images in such a way that a rectangle is deformed into a pincushion shape, with each side concave, or into a barrel shape, with each side convex.
- Moiré is corrected by adjusting the anti-aliasing filters.
- FIG. 15b is a diagram similar to that of FIG. 15a showing the properties of a servo-control of an apparatus produced according to the invention, on the assumption that the digital image processing means make it possible to correct the blur up to a BXU value of 14.
- this value is the limit for the blur to be correctable by the digital processing means.
- the optical system provides a small image spot 1100.
- This system has a modulation transfer function (MTF) represented by a diagram where the spatial frequencies are on the abscissa.
- the value of the cutoff frequency is fc.
- the MTF comprises a plateau 1110 in the vicinity of zero frequency and a part decreasing rapidly towards the value fc.
- the optic represented by the diagram of FIG. 17b has an image spot 1114 of dimensions substantially greater than the image spot 1100, and its MTF has the same cutoff frequency fc as in the case of FIG. 17a.
- the variation of this MTF as a function of the spatial frequency is, however, different: the MTF decreases relatively regularly from the origin towards the cutoff frequency.
- the optics shown in FIG. 17b will provide more detail than the optics shown in FIG. 17a, despite the fact that its image spot is larger than in the case of FIG. 17a. The optics corresponding to FIG. 17b will therefore be chosen.
- a strong overlap between the three bands also reduces the differences in sharpness between the colors, thus reducing in particular the range of distances for which at least one of the three colors is clear.
- the accuracy of distance measurements according to the method depends in particular on the variation of the relative sharpness as a function of distance. This variation depends on the amount of chromatic aberration that can be achieved with the capture system (sensor and optics). However, the spectral range of visible light, and therefore of the light useful for a photograph, is relatively small: of the order of 400 nm to 700 nm. The variation of relative sharpness as a function of distance is therefore limited with a conventional Bayer sensor.
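By way of illustration, the distance estimate itself can be sketched as an inverse lookup on a calibration curve, assuming, as the method requires, that the relative sharpness between two channels varies monotonically with distance over the useful range; the sample values below are invented for illustration.

```python
# Calibration samples: object distance (mm) vs. blue/green relative
# sharpness ratio, measured at design time (illustrative values).
CAL_DIST  = [100, 200, 400, 800, 1600]
CAL_RATIO = [2.5, 1.8, 1.2, 0.8, 0.5]   # decreasing with distance

def distance_from_ratio(ratio: float) -> float:
    # Clamp outside the calibrated range, otherwise interpolate
    # linearly between the two bracketing calibration samples.
    if ratio >= CAL_RATIO[0]:
        return float(CAL_DIST[0])
    if ratio <= CAL_RATIO[-1]:
        return float(CAL_DIST[-1])
    for i in range(len(CAL_RATIO) - 1):
        if CAL_RATIO[i] >= ratio >= CAL_RATIO[i + 1]:
            t = (CAL_RATIO[i] - ratio) / (CAL_RATIO[i] - CAL_RATIO[i + 1])
            return CAL_DIST[i] + t * (CAL_DIST[i + 1] - CAL_DIST[i])
```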
- a photographic apparatus is thus obtained which makes it possible to provide more precise indications of the distance of the imaged objects, at each of the NxM pixels of the image.
- in FIG. 20.2, one starts from a conventional Bayer pattern, in which three R, G, B pixels are kept and a U pixel corresponding to a portion of a UV or infrared spectral band is provided.
- by infrared and/or ultraviolet is meant any part of the spectrum beyond or below the visible spectrum, including the near infrared, such as 700 to 800 nm or 700 to 900 nm, or the near ultraviolet, near 400 nm.
- This U-pixel is used to enhance the sharpness of visible colors as shown in the diagram of Figure 20.1.
- the optics can be designed so that, over a wide range of distances, the smallest of the spot diagram diameters (between the three colors) is below a first predetermined threshold and the largest of the spot diagram diameters between the three colors is below a second predetermined threshold.
- the two thresholds are determined as a function, for example, of the capabilities and constraints of the digital image processing on the one hand (such as the size of the filter "F" described above) and of the characteristics of the sensor on the other hand.
- the processing comprises a sharpness modification for each pixel of the zone Z' by means of a filter mixing the pixel values over a predetermined neighborhood of each pixel, the parameters of the filter being a function of the measured relative sharpness.
- An alternative embodiment of the invention consists in choosing or adapting the sharpness filters to the measured relative sharpnesses.
- the filter M can modify the value of the pixel P as a function of the values of the pixels on a neighborhood of the pixel P on the set of three colors.
- the filter M can be chosen as an operator performing the following operations:
- RA = RN + c_RG * M_RG(GN) + c_RR * M_RR(RN) + c_RB * M_RB(BN)
- the M_XY, with X and Y in {R, G, B}, represent filters, which can be chosen as linear zero-sum filters, such as high-pass filters.
- the c_XY, with X and Y in {R, G, B}, represent weights weighting the impact of each filter M_XY.
- This filtering example thus makes it possible to transfer the sharpness of the sharpest color onto the others.
- the high-pass filters M_XY will give values close to 0 when applied to the green and red colors, which are blurred in this example.
- GA will therefore be equal to GN plus c_GB * M_GB (BN), that is to say GN plus the high frequencies of blue.
- the green color thus inherits the sharpness of the sharp color (blue). The same applies to the red color.
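A minimal sketch of one such operator, taking the M_XY as zero-sum high-pass filters (identity minus a box average) and blue as the sharp color; the weights below are illustrative, not values from the text.

```python
import numpy as np

def box(img: np.ndarray, k: int = 3) -> np.ndarray:
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kern, mode="same"), 0, out)

def high_pass(ch: np.ndarray, k: int = 3) -> np.ndarray:
    # Linear zero-sum filter (delta kernel minus box kernel): extracts
    # the high frequencies of a channel, giving values close to 0 on
    # blurred channels.
    return ch - box(ch, k)

def improve_red(rn, gn, bn, c_rg=0.0, c_rr=0.0, c_rb=1.0):
    # RA = RN + c_RG*M_RG(GN) + c_RR*M_RR(RN) + c_RB*M_RB(BN)
    # With blue as the sharp color, c_RB dominates and the red
    # channel inherits blue's high frequencies.
    return rn + c_rg * high_pass(gn) + c_rr * high_pass(rn) + c_rb * high_pass(bn)
```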
- the filters M_XY and the coefficients c_XY can be adapted to the different possible values of the colors.
- An embodiment of such an adaptation, in the context of RGB images from a given capture device, is as follows:
- the association table between the relative sharpnesses considered and the sets of filters may include other inputs, such as the position of the zone Z' in the image field, or shooting parameters such as the focal length, aperture, focus distance, etc., of the optical system at the time of shooting. Indeed, the sharpness characteristics of a digital image usually also depend on these factors.
- for the sharpness correction of a digital image, the image field will first be cut into several zones Z' and the method will be applied to each of the zones.
- this division will preferably be performed according to the sharpness characteristics of the colors, so that within each zone the sharpness of the colors is relatively homogeneous.
- an automatic adaptation of the sharpness filter applied to the digital image, to the distance between the imaged scene and the capture device, is thus obtained. It should be noted that, thanks to the use of relative sharpness, this automatic adaptation to distance can be done without explicit knowledge of the distance.
- this embodiment of the method also allows the automatic adaptation of treatments aimed, for example, at correcting optical and/or sensor defects whose effects on the image depend on the distance between the imaged scene and the capture apparatus.
- the blur or loss of sharpness described in connection with the principle of the invention is one example; other optical and/or sensor defects, such as geometric distortion or vignetting, are further examples.
- in FIG. 19.1 there are shown an image 10 having a region R and two colors 195 and 196, a relative sharpness measurement 190 between the two colors 195 and 196 on the region R of the image 10, and an action 191 controlled according to the measured relative sharpness.
- the controlled action also depends on a mode 193 corresponding for example to a choice of the user of the device, and / or a characteristic of the capture device when shooting.
- in FIG. 19.2 there are shown an image 10 having a region R and two colors 195 and 196, a relative sharpness measurement 190 between the two colors 195 and 196 on the region R of the image 10, and a commanded action 191, as a function of the measured relative sharpness, comprising a processing of the image 10 and producing a processed image 192.
- the controlled action also depends on a mode 193 corresponding, for example, to a choice of the user of the apparatus, and / or a characteristic of the capture apparatus during shooting.
- in FIG. 19.3 there are shown an image 10 having a region R and two colors 195 and 196, a relative sharpness measurement 190 between the two colors 195 and 196 on the region R of the image 10, and a commanded action 191, according to the measured relative sharpness, comprising a processing of another image 194 and producing a processed image 198.
- the controlled action also depends on a mode 193 corresponding, for example, to a choice of the user of the device, and/or a characteristic of the capture apparatus during shooting.
- the commanded action consists in modifying the contrast and/or the brightness and/or the color of the image as a function of the relative sharpness between at least two colors on at least one region R of the image.
- the use of the relative sharpness between at least two colors on at least one region R of the image makes it possible, for example, to simulate the addition of localized lighting, for example a flash positioned anywhere in the scene, and/or, conversely, to reduce the effect of a flash or of lighting of various colors in the scene.
- the digital image is divided into regions according to the relative sharpness between at least two colors, so that each image region corresponds to a part of the scene lying within a given range of distances and oriented in a given direction.
- An indication of the direction can be obtained from the local variation of the relative sharpness in the image.
- An indication of the distance can be obtained from the relative sharpness as previously described.
- the three-dimensional geometry of the scene is reconstructed by measuring the distance at a large number of points of the image. A known image-synthesis technique (ray tracing or other) is then used to add lighting to the scene.
- lighting adapted to each main subject is added, to produce a "fill-in" effect simulating one or more flashes positioned facing or to the side of each subject.
- This operation can be performed automatically and independently for each subject. With known techniques, such per-subject addition of lighting is possible only with studio lighting.
- the flash power can be determined according to the nearest subject so as to illuminate it properly, the lighting of the other subjects being complemented by added simulated lighting.
- the color of the illumination can also be determined for each region by a known method of estimating the white balance and then making the color of the scene lighting uniform.
- the white balance is estimated globally, for lack of information on the three-dimensional geometry of the scene.
- in Figure 18.1 there are shown a sensor 2 producing a raw image 180, and a pre-processing 181, for example a white balance and/or black level compensation and/or a noise reduction, producing a pre-processed image 182.
- also shown are a relative sharpness measurement 190 controlling an action 191, corresponding to a processing using the pre-processed image 182 and the relative sharpness measurement 190 to produce a processed image 192.
- a downstream processing of the processed image 192 has been represented, corresponding, for example, to a demosaicing or other processing necessary to convert a raw image into a visible image.
- FIG. 18.2 shows a sensor 2 producing a raw image 180.
- also shown are a relative sharpness measurement 190 controlling an action 191, corresponding to a processing using the raw image 180 and the relative sharpness measurement 190 to produce a processed image 192.
- a downstream processing of the processed image 192 corresponding, for example, to a demosaicing or other processing necessary to convert a raw image into a visible image.
- the action implements a processing on a visible image.
- the invention applies to an apparatus having variable parameters at the time of capturing the digital image and having an influence on the sharpness of the colors including a zoom capture apparatus, and / or an optics with variable focus and / or a variable aperture.
- the sharpness curves 8.2 and 8.3 corresponding to the value of the variable parameters according to the digital image are then used.
- the invention makes it possible to restore focus digitally, without a moving group and instantaneously, which makes it possible to reduce the complexity of a zoom by removing at least one moving part.
- the relative sharpness between two colors may be variable, whereas this is not acceptable in known optics.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Color Television Image Signal Generators (AREA)
- Studio Devices (AREA)
- Facsimile Image Signal Circuits (AREA)
- Paper (AREA)
Abstract
Description
Claims
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2600185A CA2600185C (fr) | 2005-03-07 | 2006-03-06 | Procede pour commander une action, notamment une modification de nettete, a partir d'une image numerique en couleurs |
US11/817,977 US7920172B2 (en) | 2005-03-07 | 2006-03-06 | Method of controlling an action, such as a sharpness modification, using a colour digital image |
EP06726221A EP1856907A2 (fr) | 2005-03-07 | 2006-03-06 | Procédé pour commander une action, notamment une modification de netteté, à partir d'une image numérique en couleurs |
JP2008500243A JP5535476B2 (ja) | 2005-03-07 | 2006-03-06 | カラーデジタル画像を使用する、機能すなわち鮮鋭度の変更を活動化する方法 |
US12/820,965 US20110109749A1 (en) | 2005-03-07 | 2010-06-22 | Method for activating a function, namely an alteration of sharpness, using a colour digital image |
US12/820,951 US8212889B2 (en) | 2005-03-07 | 2010-06-22 | Method for activating a function, namely an alteration of sharpness, using a colour digital image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0550601 | 2005-03-07 | ||
FR0550601A FR2880958B1 (fr) | 2005-01-19 | 2005-03-07 | Procede d'amelioration de la nettete d'au moins une couleur d'une image numerique |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/817,977 A-371-Of-International US7920172B2 (en) | 2005-03-07 | 2006-03-06 | Method of controlling an action, such as a sharpness modification, using a colour digital image |
US12/820,951 Division US8212889B2 (en) | 2005-03-07 | 2010-06-22 | Method for activating a function, namely an alteration of sharpness, using a colour digital image |
US12/820,965 Division US20110109749A1 (en) | 2005-03-07 | 2010-06-22 | Method for activating a function, namely an alteration of sharpness, using a colour digital image |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2006095110A2 true WO2006095110A2 (fr) | 2006-09-14 |
WO2006095110A3 WO2006095110A3 (fr) | 2006-11-02 |
Family
ID=36632499
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2006/050197 WO2006095110A2 (fr) | 2005-03-07 | 2006-03-06 | Procédé pour commander une action, notamment une modification de netteté, à partir d'une image numérique en couleurs |
Country Status (7)
Country | Link |
---|---|
US (3) | US7920172B2 (fr) |
EP (1) | EP1856907A2 (fr) |
JP (3) | JP5535476B2 (fr) |
KR (1) | KR101265358B1 (fr) |
CN (2) | CN102984448B (fr) |
CA (4) | CA2834963C (fr) |
WO (1) | WO2006095110A2 (fr) |
Cited By (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007259159A (ja) * | 2006-03-24 | 2007-10-04 | Matsushita Electric Ind Co Ltd | 撮像装置 |
EP2268043A1 (fr) * | 2008-06-18 | 2010-12-29 | Panasonic Corporation | Dispositif de traitement d'images, dispositif, procédé et programme d'imagerie |
WO2011058236A1 (fr) | 2009-11-16 | 2011-05-19 | Dxo Labs | Systeme optique et procede de conception associe |
CN101321295B (zh) * | 2007-06-07 | 2011-07-13 | 株式会社东芝 | 摄像装置 |
CN102866495A (zh) * | 2011-05-13 | 2013-01-09 | 索尼公司 | 图像处理装置、图像处理方法及图像处理程序 |
US8379115B2 (en) | 2007-11-20 | 2013-02-19 | Motorola Mobility Llc | Image capture device with electronic focus |
FR2982678A1 (fr) * | 2011-11-14 | 2013-05-17 | Dxo Labs | Procede et systeme de capture de sequence d'images avec compensation des variations de grandissement |
US8462237B2 (en) | 2008-11-14 | 2013-06-11 | Kabushiki Kaisha Toshiba | Solid-state image pickup device which senses and processes light into primary color bands and an all wavelength band |
US8643748B2 (en) | 2007-11-20 | 2014-02-04 | Motorola Mobility Llc | Compact stationary lens optical zoom image capture system |
WO2014076836A1 (fr) * | 2012-11-19 | 2014-05-22 | 富士機械製造株式会社 | Appareil de montage de composant et appareil d'inspection de montage |
US8836825B2 (en) | 2011-06-23 | 2014-09-16 | Panasonic Corporation | Imaging apparatus |
US8988590B2 (en) | 2011-03-28 | 2015-03-24 | Intermec Ip Corp. | Two-dimensional imager with solid-state auto-focus |
US8994873B2 (en) | 2009-08-10 | 2015-03-31 | Dxo Labs | Image-capture system and method with two operating modes |
US9007368B2 (en) | 2012-05-07 | 2015-04-14 | Intermec Ip Corp. | Dimensioning system calibration systems and methods |
US9080856B2 (en) | 2013-03-13 | 2015-07-14 | Intermec Ip Corp. | Systems and methods for enhancing dimensioning, for example volume dimensioning |
US9118826B2 (en) | 2009-03-19 | 2015-08-25 | Digitaloptics Corporation | Dual sensor camera |
US9239950B2 (en) | 2013-07-01 | 2016-01-19 | Hand Held Products, Inc. | Dimensioning system |
US9282252B2 (en) | 2009-05-04 | 2016-03-08 | Digitaloptics Corporation | Dual lens digital zoom |
US9752864B2 (en) | 2014-10-21 | 2017-09-05 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
US9762793B2 (en) | 2014-10-21 | 2017-09-12 | Hand Held Products, Inc. | System and method for dimensioning |
US9779276B2 (en) | 2014-10-10 | 2017-10-03 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
US9779546B2 (en) | 2012-05-04 | 2017-10-03 | Intermec Ip Corp. | Volume dimensioning systems and methods |
US9786101B2 (en) | 2015-05-19 | 2017-10-10 | Hand Held Products, Inc. | Evaluating image values |
US9823059B2 (en) | 2014-08-06 | 2017-11-21 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
US9835486B2 (en) | 2015-07-07 | 2017-12-05 | Hand Held Products, Inc. | Mobile dimensioner apparatus for use in commerce |
US9841311B2 (en) | 2012-10-16 | 2017-12-12 | Hand Held Products, Inc. | Dimensioning system |
US9857167B2 (en) | 2015-06-23 | 2018-01-02 | Hand Held Products, Inc. | Dual-projector three-dimensional scanner |
US9897434B2 (en) | 2014-10-21 | 2018-02-20 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
US9939259B2 (en) | 2012-10-04 | 2018-04-10 | Hand Held Products, Inc. | Measuring object dimensions using mobile computer |
US9940721B2 (en) | 2016-06-10 | 2018-04-10 | Hand Held Products, Inc. | Scene change detection in a dimensioner |
US10007858B2 (en) | 2012-05-15 | 2018-06-26 | Honeywell International Inc. | Terminals and methods for dimensioning objects |
US10025314B2 (en) | 2016-01-27 | 2018-07-17 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
US10060729B2 (en) | 2014-10-21 | 2018-08-28 | Hand Held Products, Inc. | Handheld dimensioner with data-quality indication |
US10066982B2 (en) | 2015-06-16 | 2018-09-04 | Hand Held Products, Inc. | Calibrating a volume dimensioner |
US10094650B2 (en) | 2015-07-16 | 2018-10-09 | Hand Held Products, Inc. | Dimensioning and imaging items |
US10134120B2 (en) | 2014-10-10 | 2018-11-20 | Hand Held Products, Inc. | Image-stitching for dimensioning |
US10140724B2 (en) | 2009-01-12 | 2018-11-27 | Intermec Ip Corporation | Semi-automatic dimensioning with imager on a portable device |
US10163216B2 (en) | 2016-06-15 | 2018-12-25 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
US10203402B2 (en) | 2013-06-07 | 2019-02-12 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
US10225544B2 (en) | 2015-11-19 | 2019-03-05 | Hand Held Products, Inc. | High resolution dot pattern |
US10247547B2 (en) | 2015-06-23 | 2019-04-02 | Hand Held Products, Inc. | Optical pattern projector |
US10249030B2 (en) | 2015-10-30 | 2019-04-02 | Hand Held Products, Inc. | Image transformation for indicia reading |
US10321127B2 (en) | 2012-08-20 | 2019-06-11 | Intermec Ip Corp. | Volume dimensioning system calibration systems and methods |
US10339352B2 (en) | 2016-06-03 | 2019-07-02 | Hand Held Products, Inc. | Wearable metrological apparatus |
US10393506B2 (en) | 2015-07-15 | 2019-08-27 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
US10584962B2 (en) | 2018-05-01 | 2020-03-10 | Hand Held Products, Inc. | System and method for validating physical-item security |
US10733748B2 (en) | 2017-07-24 | 2020-08-04 | Hand Held Products, Inc. | Dual-pattern optical 3D dimensioning |
US10775165B2 (en) | 2014-10-10 | 2020-09-15 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
US10909708B2 (en) | 2016-12-09 | 2021-02-02 | Hand Held Products, Inc. | Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements |
US11029762B2 (en) | 2015-07-16 | 2021-06-08 | Hand Held Products, Inc. | Adjusting dimensioning results using augmented reality |
US11047672B2 (en) | 2017-03-28 | 2021-06-29 | Hand Held Products, Inc. | System for optically dimensioning |
US11639846B2 (en) | 2019-09-27 | 2023-05-02 | Honeywell International Inc. | Dual-pattern optical 3D dimensioning |
Families Citing this family (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100776805B1 (ko) * | 2006-09-29 | 2007-11-19 | 한국전자통신연구원 | 스테레오 비전 처리를 통해 지능형 서비스 로봇 시스템에서효율적인 영상 정보의 전송을 위한 장치 및 그 방법 |
KR100834577B1 (ko) * | 2006-12-07 | 2008-06-02 | 한국전자통신연구원 | 스테레오 비전 처리를 통해 목표물 검색 및 추종 방법, 및이를 적용한 가정용 지능형 서비스 로봇 장치 |
US20090102924A1 (en) * | 2007-05-21 | 2009-04-23 | Masten Jr James W | Rapidly Deployable, Remotely Observable Video Monitoring System |
DE102007031230B3 (de) * | 2007-07-04 | 2008-10-30 | Bundesdruckerei Gmbh | Dokumentenerfassungssystem und Dokumentenerfassungsverfahren |
JP5032911B2 (ja) * | 2007-07-31 | 2012-09-26 | キヤノン株式会社 | 画像処理装置及び画像処理方法 |
JP5298507B2 (ja) | 2007-11-12 | 2013-09-25 | セイコーエプソン株式会社 | 画像表示装置及び画像表示方法 |
KR101412752B1 (ko) * | 2007-11-26 | 2014-07-01 | 삼성전기주식회사 | 디지털 자동 초점 영상 생성 장치 및 방법 |
JP5171361B2 (ja) * | 2008-04-07 | 2013-03-27 | 株式会社日立製作所 | 撮像装置 |
JP5132401B2 (ja) * | 2008-04-16 | 2013-01-30 | キヤノン株式会社 | 画像処理装置及び画像処理方法 |
US8160355B1 (en) * | 2008-05-18 | 2012-04-17 | Pixim Israel Ltd. | Method, device and computer program product for performing white balancing of a digital image |
GB2463480A (en) * | 2008-09-12 | 2010-03-17 | Sharp Kk | Camera Having Large Depth of Field |
JP5158713B2 (ja) * | 2008-11-26 | 2013-03-06 | 京セラ株式会社 | 撮像装置および車載カメラシステム |
JP5300133B2 (ja) * | 2008-12-18 | 2013-09-25 | 株式会社ザクティ | 画像表示装置及び撮像装置 |
JP5213688B2 (ja) * | 2008-12-19 | 2013-06-19 | 三洋電機株式会社 | 撮像装置 |
US8379321B2 (en) * | 2009-03-05 | 2013-02-19 | Raytheon Canada Limited | Method and apparatus for accurate imaging with an extended depth of field |
JP2010257037A (ja) * | 2009-04-22 | 2010-11-11 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP2010288150A (ja) * | 2009-06-12 | 2010-12-24 | Toshiba Corp | 固体撮像装置 |
CN101938535B (zh) * | 2009-06-29 | 2014-01-15 | 鸿富锦精密工业(深圳)有限公司 | 电子设备 |
TWI451357B (zh) * | 2009-09-09 | 2014-09-01 | Himax Tech Ltd | 字型反鋸齒方法 |
WO2011052172A1 (fr) | 2009-10-27 | 2011-05-05 | パナソニック株式会社 | Dispositif d'imagerie et dispositif de mesure de distance utilisant ce dispositif |
US20110149021A1 (en) * | 2009-12-17 | 2011-06-23 | Samir Hulyalkar | Method and system for sharpness processing for 3d video |
US20110188116A1 (en) * | 2010-02-02 | 2011-08-04 | Nikolay Ledentsov Ledentsov | Device for generation of three-dimensional images |
JP5528173B2 (ja) * | 2010-03-31 | 2014-06-25 | キヤノン株式会社 | 画像処理装置、撮像装置および画像処理プログラム |
TWI495335B (zh) * | 2010-04-21 | 2015-08-01 | Hon Hai Prec Ind Co Ltd | 取像模組及其運作方法 |
JP2011229603A (ja) * | 2010-04-26 | 2011-11-17 | Fujifilm Corp | 内視鏡装置 |
JP2011229625A (ja) * | 2010-04-26 | 2011-11-17 | Fujifilm Corp | 内視鏡装置 |
JP5630105B2 (ja) * | 2010-07-05 | 2014-11-26 | 株式会社ニコン | 画像処理装置、撮像装置および画像処理プログラム |
JP5811635B2 (ja) * | 2011-03-07 | 2015-11-11 | 株式会社ニコン | 画像処理装置、撮像装置および画像処理プログラム |
US8736722B2 (en) * | 2010-07-15 | 2014-05-27 | Apple Inc. | Enhanced image capture sharpening |
JP5576739B2 (ja) | 2010-08-04 | 2014-08-20 | オリンパス株式会社 | 画像処理装置、画像処理方法、撮像装置及びプログラム |
JP5582935B2 (ja) | 2010-09-22 | 2014-09-03 | 富士フイルム株式会社 | 撮像モジュール |
US9225766B2 (en) * | 2010-10-29 | 2015-12-29 | Sears Brands, L.L.C. | Systems and methods for providing smart appliances |
US9697588B2 (en) | 2010-11-15 | 2017-07-04 | Intuitive Surgical Operations, Inc. | System and method for multi-resolution sharpness transport across color channels |
US9979941B2 (en) * | 2011-01-14 | 2018-05-22 | Sony Corporation | Imaging system using a lens unit with longitudinal chromatic aberrations and method of operating |
CN102158648B (zh) * | 2011-01-27 | 2014-09-10 | 明基电通有限公司 | 影像截取装置及影像处理方法 |
JP5806504B2 (ja) | 2011-05-17 | 2015-11-10 | オリンパス株式会社 | 撮像装置およびこれを備える顕微鏡システム |
US8711275B2 (en) * | 2011-05-31 | 2014-04-29 | Apple Inc. | Estimating optical characteristics of a camera component using sharpness sweep data |
US8749892B2 (en) | 2011-06-17 | 2014-06-10 | DigitalOptics Corporation Europe Limited | Auto-focus actuator for field curvature correction of zoom lenses |
US8953058B2 (en) * | 2011-06-29 | 2015-02-10 | Fotonation Limited | Axial chromatic aberration correction |
JP5847471B2 (ja) * | 2011-07-20 | 2016-01-20 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法および画像処理プログラム |
JP5265826B1 (ja) | 2011-09-29 | 2013-08-14 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
TWI528833B (zh) * | 2011-11-09 | 2016-04-01 | 鴻海精密工業股份有限公司 | 立體攝像裝置 |
JP5898481B2 (ja) * | 2011-12-13 | 2016-04-06 | キヤノン株式会社 | 撮像装置及び焦点検出方法 |
EP2677363A1 (fr) * | 2012-06-20 | 2013-12-25 | bioMérieux | Dispositif optique comprenant une caméra, un diaphragme et moyens d'éclairage |
JP6129309B2 (ja) * | 2012-07-12 | 2017-05-17 | デュアル・アパーチャー・インターナショナル・カンパニー・リミテッド | ジェスチャに基づくユーザインターフェース |
TWI451344B (zh) * | 2012-08-27 | 2014-09-01 | Pixart Imaging Inc | 手勢辨識系統及手勢辨識方法 |
JP5738904B2 (ja) * | 2013-01-28 | 2015-06-24 | オリンパス株式会社 | 画像処理装置、撮像装置、画像処理方法及びプログラム |
JP6086829B2 (ja) * | 2013-06-26 | 2017-03-01 | オリンパス株式会社 | 画像処理装置及び画像処理方法 |
US9464885B2 (en) | 2013-08-30 | 2016-10-11 | Hand Held Products, Inc. | System and method for package dimensioning |
WO2015059346A1 (fr) * | 2013-10-25 | 2015-04-30 | Nokia Technologies Oy | Appareil et procédé pour la création d'une carte de profondeur |
JP6256132B2 (ja) | 2014-03-14 | 2018-01-10 | 株式会社リコー | 撮像システム |
CA2942921C (fr) * | 2014-03-18 | 2019-07-30 | Integrated Medical Systems International, Inc. | Endoscope optiquement adaptatif |
KR101591172B1 (ko) | 2014-04-23 | 2016-02-03 | 주식회사 듀얼어퍼처인터네셔널 | 이미지 센서와 피사체 사이의 거리를 결정하는 방법 및 장치 |
EP2942940A1 (fr) * | 2014-05-06 | 2015-11-11 | Nokia Technologies OY | Procédé et appareil permettant de définir le contenu visible d'une image |
US9232132B1 (en) * | 2014-06-10 | 2016-01-05 | Gregory S. Tseytin | Light field image processing |
US9557166B2 (en) | 2014-10-21 | 2017-01-31 | Hand Held Products, Inc. | Dimensioning system with multipath interference mitigation |
US20160225150A1 (en) * | 2015-02-02 | 2016-08-04 | Capso Vision, Inc. | Method and Apparatus for Object Distance and Size Estimation based on Calibration Data of Lens Focus |
US10475361B2 (en) | 2015-02-02 | 2019-11-12 | Apple Inc. | Adjustable display illumination |
US10943333B2 (en) * | 2015-10-16 | 2021-03-09 | Capsovision Inc. | Method and apparatus of sharpening of gastrointestinal images based on depth information |
US11354783B2 (en) | 2015-10-16 | 2022-06-07 | Capsovision Inc. | Method and apparatus of sharpening of gastrointestinal images based on depth information |
US10624533B2 (en) | 2015-10-16 | 2020-04-21 | Capsovision Inc | Endoscope with images optimized based on depth map derived from structured light images |
US9715721B2 (en) | 2015-12-18 | 2017-07-25 | Sony Corporation | Focus detection |
US10277829B1 (en) * | 2016-08-12 | 2019-04-30 | Apple Inc. | Video capture in low-light conditions |
JP6801434B2 (ja) * | 2016-12-20 | 2020-12-16 | 富士通株式会社 | 生体画像処理装置、生体画像処理方法および生体画像処理プログラム |
CN110418719B (zh) | 2017-01-25 | 2022-01-04 | 康丽数码有限公司 | 染色合成织物上喷墨打印的油墨组、方法及图像 |
WO2019048492A1 (fr) * | 2017-09-08 | 2019-03-14 | Sony Corporation | Dispositif d'imagerie, procédé et programme de production d'images d'une scène |
CN107613284B (zh) * | 2017-10-31 | 2019-10-08 | 努比亚技术有限公司 | 一种图像处理方法、终端和计算机可读存储介质 |
CN108650462B (zh) * | 2018-05-14 | 2020-06-09 | Oppo广东移动通信有限公司 | 拍摄预览显示方法、装置、终端及存储介质 |
US10679024B2 (en) * | 2018-07-24 | 2020-06-09 | Cognex Corporation | System and method for auto-focusing a vision system camera on barcodes |
US11336840B2 (en) | 2020-09-02 | 2022-05-17 | Cisco Technology, Inc. | Matching foreground and virtual background during a video communication session |
CN114339187B (zh) * | 2020-09-30 | 2024-06-14 | 北京小米移动软件有限公司 | 图像处理方法、图像处理装置及存储介质 |
US11893668B2 (en) | 2021-03-31 | 2024-02-06 | Leica Camera Ag | Imaging system and method for generating a final digital image via applying a profile to image information |
US12254644B2 (en) | 2021-03-31 | 2025-03-18 | Leica Camera Ag | Imaging system and method |
CN114724000B (zh) * | 2022-06-09 | 2022-08-30 | 深圳精智达技术股份有限公司 | 一种屏拍图摩尔纹的处理方法、处理装置及处理设备 |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USH101H (en) * | 1984-10-01 | 1986-08-05 | The United States Of America As Represented By The Secretary Of The Army | Ultraviolet and infrared focal plane array |
JPS63247680A (ja) | 1987-04-02 | 1988-10-14 | Mitsubishi Electric Corp | 画像追尾装置 |
JPH01212981A (ja) * | 1988-02-20 | 1989-08-25 | Sanyo Electric Co Ltd | オートフォーカス装置 |
JPH01276884A (ja) * | 1988-04-27 | 1989-11-07 | Nec Corp | ビデオカメラの焦合装置 |
US5161107A (en) * | 1990-10-25 | 1992-11-03 | Mestech Creation Corporation | Traffic surveillance system |
JPH06138362A (ja) * | 1991-02-06 | 1994-05-20 | Sony Corp | オートフォーカス装置 |
GB9125954D0 (en) * | 1991-12-06 | 1992-02-05 | Vlsi Vision Ltd | Electronic camera |
US6292212B1 (en) * | 1994-12-23 | 2001-09-18 | Eastman Kodak Company | Electronic color infrared camera |
JP3960647B2 (ja) * | 1997-01-09 | 2007-08-15 | オリンパス株式会社 | 自動合焦装置 |
EP0878970A3 (fr) | 1997-05-16 | 1999-08-18 | Matsushita Electric Industrial Co., Ltd. | Système de mesure de l'erreur d'alignement et de l'aberration chromatique d'un capteur d'image pour une caméra vidéo |
US5973846A (en) * | 1998-11-30 | 1999-10-26 | Hewlett-Packard Company | Offset spectra lens system for a two spectra automatic focusing system |
JP2000299874A (ja) | 1999-04-12 | 2000-10-24 | Sony Corp | 信号処理装置及び方法並びに撮像装置及び方法 |
JP2000338385A (ja) * | 1999-05-28 | 2000-12-08 | Ricoh Co Ltd | 自動合焦装置およびその合焦方法 |
US6859229B1 (en) * | 1999-06-30 | 2005-02-22 | Canon Kabushiki Kaisha | Image pickup apparatus |
JP2001103358A (ja) * | 1999-09-30 | 2001-04-13 | Mitsubishi Electric Corp | 色収差補正装置 |
JP4696407B2 (ja) | 2001-06-20 | 2011-06-08 | 株式会社ニコン | 商品推奨システムおよび商品推奨方法 |
JP2003018407A (ja) * | 2001-07-02 | 2003-01-17 | Konica Corp | 画像処理方法及び画像処理装置 |
US20030063185A1 (en) * | 2001-09-28 | 2003-04-03 | Bell Cynthia S. | Three-dimensional imaging with complementary color filter arrays |
JP4126938B2 (ja) | 2002-03-22 | 2008-07-30 | セイコーエプソン株式会社 | 画像処理装置および画像出力装置 |
JP2004120487A (ja) * | 2002-09-27 | 2004-04-15 | Fuji Photo Film Co Ltd | 撮像装置 |
JP2004228662A (ja) * | 2003-01-20 | 2004-08-12 | Minolta Co Ltd | 撮像装置 |
JP4010254B2 (ja) * | 2003-02-06 | 2007-11-21 | ソニー株式会社 | 画像記録再生装置、画像撮影装置及び色収差補正方法 |
US20040165090A1 (en) | 2003-02-13 | 2004-08-26 | Alex Ning | Auto-focus (AF) lens and process |
US20040174446A1 (en) * | 2003-02-28 | 2004-09-09 | Tinku Acharya | Four-color mosaic pattern for depth and image capture |
US20040169748A1 (en) * | 2003-02-28 | 2004-09-02 | Tinku Acharya | Sub-sampled infrared sensor for use in a digital image capture device |
JP4378994B2 (ja) | 2003-04-30 | 2009-12-09 | ソニー株式会社 | 画像処理装置、画像処理方法ならびに撮像装置 |
FR2860089B1 (fr) * | 2003-09-23 | 2005-11-11 | Do Labs | Procede et systeme pour modifier une image numerique de maniere differenciee et quasi reguliere par pixel |
JP4665422B2 (ja) * | 2004-04-02 | 2011-04-06 | ソニー株式会社 | 撮像装置 |
JP4815807B2 (ja) | 2004-05-31 | 2011-11-16 | 株式会社ニコン | Rawデータから倍率色収差を検出する画像処理装置、画像処理プログラム、および電子カメラ |
US20060093234A1 (en) * | 2004-11-04 | 2006-05-04 | Silverstein D A | Reduction of blur in multi-channel images |
- 2006
- 2006-03-06 CA CA2834963A patent/CA2834963C/fr not_active Expired - Fee Related
- 2006-03-06 CA CA2835047A patent/CA2835047C/fr not_active Expired - Fee Related
- 2006-03-06 EP EP06726221A patent/EP1856907A2/fr not_active Withdrawn
- 2006-03-06 US US11/817,977 patent/US7920172B2/en not_active Expired - Fee Related
- 2006-03-06 WO PCT/FR2006/050197 patent/WO2006095110A2/fr active Application Filing
- 2006-03-06 CA CA2834883A patent/CA2834883C/fr not_active Expired - Fee Related
- 2006-03-06 KR KR1020077022875A patent/KR101265358B1/ko active Active
- 2006-03-06 CA CA2600185A patent/CA2600185C/fr not_active Expired - Fee Related
- 2006-03-06 CN CN201210177544.6A patent/CN102984448B/zh not_active Expired - Fee Related
- 2006-03-06 CN CNA2006800123908A patent/CN101204083A/zh active Pending
- 2006-03-06 JP JP2008500243A patent/JP5535476B2/ja not_active Expired - Fee Related
- 2010
- 2010-06-22 US US12/820,951 patent/US8212889B2/en not_active Expired - Fee Related
- 2010-06-22 US US12/820,965 patent/US20110109749A1/en not_active Abandoned
- 2013
- 2013-06-06 JP JP2013119919A patent/JP5633891B2/ja not_active Expired - Fee Related
- 2014
- 2014-07-29 JP JP2014153695A patent/JP6076300B2/ja not_active Expired - Fee Related
Cited By (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007259159A (ja) * | 2006-03-24 | 2007-10-04 | Matsushita Electric Ind Co Ltd | 撮像装置 |
CN101321295B (zh) * | 2007-06-07 | 2011-07-13 | 株式会社东芝 | 摄像装置 |
US8643748B2 (en) | 2007-11-20 | 2014-02-04 | Motorola Mobility Llc | Compact stationary lens optical zoom image capture system |
US8379115B2 (en) | 2007-11-20 | 2013-02-19 | Motorola Mobility Llc | Image capture device with electronic focus |
US7986352B2 (en) | 2008-06-18 | 2011-07-26 | Panasonic Corporation | Image generation system including a plurality of light receiving elements and for correcting image data using a spatial high frequency component, image generation method for correcting image data using a spatial high frequency component, and computer-readable recording medium having a program for performing the same |
EP2312858A1 (fr) * | 2008-06-18 | 2011-04-20 | Panasonic Corporation | Appareil de traitement d'images, appareil d'imagerie, procédé de traitement d'images et programme |
EP2268043A4 (fr) * | 2008-06-18 | 2010-12-29 | Panasonic Corp | Dispositif de traitement d'images, dispositif, procédé et programme d'imagerie |
EP2268043A1 (fr) * | 2008-06-18 | 2010-12-29 | Panasonic Corporation | Dispositif de traitement d'images, dispositif, procédé et programme d'imagerie |
US8462237B2 (en) | 2008-11-14 | 2013-06-11 | Kabushiki Kaisha Toshiba | Solid-state image pickup device which senses and processes light into primary color bands and an all wavelength band |
US10140724B2 (en) | 2009-01-12 | 2018-11-27 | Intermec Ip Corporation | Semi-automatic dimensioning with imager on a portable device |
US10845184B2 (en) | 2009-01-12 | 2020-11-24 | Intermec Ip Corporation | Semi-automatic dimensioning with imager on a portable device |
US9118826B2 (en) | 2009-03-19 | 2015-08-25 | Digitaloptics Corporation | Dual sensor camera |
US9282252B2 (en) | 2009-05-04 | 2016-03-08 | Digitaloptics Corporation | Dual lens digital zoom |
US8994873B2 (en) | 2009-08-10 | 2015-03-31 | Dxo Labs | Image-capture system and method with two operating modes |
WO2011058236A1 (fr) | 2009-11-16 | 2011-05-19 | Dxo Labs | Systeme optique et procede de conception associe |
US8988590B2 (en) | 2011-03-28 | 2015-03-24 | Intermec Ip Corp. | Two-dimensional imager with solid-state auto-focus |
US9253393B2 (en) | 2011-03-28 | 2016-02-02 | Intermec Ip, Corp. | Two-dimensional imager with solid-state auto-focus |
CN102866495A (zh) * | 2011-05-13 | 2013-01-09 | 索尼公司 | 图像处理装置、图像处理方法及图像处理程序 |
US8836825B2 (en) | 2011-06-23 | 2014-09-16 | Panasonic Corporation | Imaging apparatus |
US9407827B2 (en) | 2011-11-14 | 2016-08-02 | Dxo Labs | Method and system for capturing sequences of images with compensation for variations in magnification |
FR2982678A1 (fr) * | 2011-11-14 | 2013-05-17 | Dxo Labs | Procede et systeme de capture de sequence d'images avec compensation des variations de grandissement |
US10467806B2 (en) | 2012-05-04 | 2019-11-05 | Intermec Ip Corp. | Volume dimensioning systems and methods |
US9779546B2 (en) | 2012-05-04 | 2017-10-03 | Intermec Ip Corp. | Volume dimensioning systems and methods |
US9292969B2 (en) | 2012-05-07 | 2016-03-22 | Intermec Ip Corp. | Dimensioning system calibration systems and methods |
US9007368B2 (en) | 2012-05-07 | 2015-04-14 | Intermec Ip Corp. | Dimensioning system calibration systems and methods |
US10635922B2 (en) | 2012-05-15 | 2020-04-28 | Hand Held Products, Inc. | Terminals and methods for dimensioning objects |
US10007858B2 (en) | 2012-05-15 | 2018-06-26 | Honeywell International Inc. | Terminals and methods for dimensioning objects |
US10805603B2 (en) | 2012-08-20 | 2020-10-13 | Intermec Ip Corp. | Volume dimensioning system calibration systems and methods |
US10321127B2 (en) | 2012-08-20 | 2019-06-11 | Intermec Ip Corp. | Volume dimensioning system calibration systems and methods |
US9939259B2 (en) | 2012-10-04 | 2018-04-10 | Hand Held Products, Inc. | Measuring object dimensions using mobile computer |
US9841311B2 (en) | 2012-10-16 | 2017-12-12 | Hand Held Products, Inc. | Dimensioning system |
US10908013B2 (en) | 2012-10-16 | 2021-02-02 | Hand Held Products, Inc. | Dimensioning system |
WO2014076836A1 (fr) * | 2012-11-19 | 2014-05-22 | 富士機械製造株式会社 | Appareil de montage de composant et appareil d'inspection de montage |
JPWO2014076836A1 (ja) * | 2012-11-19 | 2017-01-05 | 富士機械製造株式会社 | 部品実装機および実装検査機 |
US9784566B2 (en) | 2013-03-13 | 2017-10-10 | Intermec Ip Corp. | Systems and methods for enhancing dimensioning |
US9080856B2 (en) | 2013-03-13 | 2015-07-14 | Intermec Ip Corp. | Systems and methods for enhancing dimensioning, for example volume dimensioning |
US10228452B2 (en) | 2013-06-07 | 2019-03-12 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
US10203402B2 (en) | 2013-06-07 | 2019-02-12 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
US9239950B2 (en) | 2013-07-01 | 2016-01-19 | Hand Held Products, Inc. | Dimensioning system |
US9823059B2 (en) | 2014-08-06 | 2017-11-21 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
US10240914B2 (en) | 2014-08-06 | 2019-03-26 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
US10810715B2 (en) | 2014-10-10 | 2020-10-20 | Hand Held Products, Inc | System and method for picking validation |
US9779276B2 (en) | 2014-10-10 | 2017-10-03 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
US10121039B2 (en) | 2014-10-10 | 2018-11-06 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
US10134120B2 (en) | 2014-10-10 | 2018-11-20 | Hand Held Products, Inc. | Image-stitching for dimensioning |
US10859375B2 (en) | 2014-10-10 | 2020-12-08 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
US10402956B2 (en) | 2014-10-10 | 2019-09-03 | Hand Held Products, Inc. | Image-stitching for dimensioning |
US10775165B2 (en) | 2014-10-10 | 2020-09-15 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
US9752864B2 (en) | 2014-10-21 | 2017-09-05 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
US10218964B2 (en) | 2014-10-21 | 2019-02-26 | Hand Held Products, Inc. | Dimensioning system with feedback |
US9762793B2 (en) | 2014-10-21 | 2017-09-12 | Hand Held Products, Inc. | System and method for dimensioning |
US10060729B2 (en) | 2014-10-21 | 2018-08-28 | Hand Held Products, Inc. | Handheld dimensioner with data-quality indication |
US9897434B2 (en) | 2014-10-21 | 2018-02-20 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
US10393508B2 (en) | 2014-10-21 | 2019-08-27 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
US11906280B2 (en) | 2015-05-19 | 2024-02-20 | Hand Held Products, Inc. | Evaluating image values |
US9786101B2 (en) | 2015-05-19 | 2017-10-10 | Hand Held Products, Inc. | Evaluating image values |
US11403887B2 (en) | 2015-05-19 | 2022-08-02 | Hand Held Products, Inc. | Evaluating image values |
US10593130B2 (en) | 2015-05-19 | 2020-03-17 | Hand Held Products, Inc. | Evaluating image values |
US10066982B2 (en) | 2015-06-16 | 2018-09-04 | Hand Held Products, Inc. | Calibrating a volume dimensioner |
US9857167B2 (en) | 2015-06-23 | 2018-01-02 | Hand Held Products, Inc. | Dual-projector three-dimensional scanner |
US10247547B2 (en) | 2015-06-23 | 2019-04-02 | Hand Held Products, Inc. | Optical pattern projector |
US9835486B2 (en) | 2015-07-07 | 2017-12-05 | Hand Held Products, Inc. | Mobile dimensioner apparatus for use in commerce |
US10612958B2 (en) | 2015-07-07 | 2020-04-07 | Hand Held Products, Inc. | Mobile dimensioner apparatus to mitigate unfair charging practices in commerce |
US10393506B2 (en) | 2015-07-15 | 2019-08-27 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
US11353319B2 (en) | 2015-07-15 | 2022-06-07 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
US11029762B2 (en) | 2015-07-16 | 2021-06-08 | Hand Held Products, Inc. | Adjusting dimensioning results using augmented reality |
US10094650B2 (en) | 2015-07-16 | 2018-10-09 | Hand Held Products, Inc. | Dimensioning and imaging items |
US10249030B2 (en) | 2015-10-30 | 2019-04-02 | Hand Held Products, Inc. | Image transformation for indicia reading |
US10225544B2 (en) | 2015-11-19 | 2019-03-05 | Hand Held Products, Inc. | High resolution dot pattern |
US10747227B2 (en) | 2016-01-27 | 2020-08-18 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
US10025314B2 (en) | 2016-01-27 | 2018-07-17 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
US10339352B2 (en) | 2016-06-03 | 2019-07-02 | Hand Held Products, Inc. | Wearable metrological apparatus |
US10872214B2 (en) | 2016-06-03 | 2020-12-22 | Hand Held Products, Inc. | Wearable metrological apparatus |
US9940721B2 (en) | 2016-06-10 | 2018-04-10 | Hand Held Products, Inc. | Scene change detection in a dimensioner |
US10417769B2 (en) | 2016-06-15 | 2019-09-17 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
US10163216B2 (en) | 2016-06-15 | 2018-12-25 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
US10909708B2 (en) | 2016-12-09 | 2021-02-02 | Hand Held Products, Inc. | Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements |
US11047672B2 (en) | 2017-03-28 | 2021-06-29 | Hand Held Products, Inc. | System for optically dimensioning |
US10733748B2 (en) | 2017-07-24 | 2020-08-04 | Hand Held Products, Inc. | Dual-pattern optical 3D dimensioning |
US10584962B2 (en) | 2018-05-01 | 2020-03-10 | Hand Held Products, Inc. | System and method for validating physical-item security |
US11639846B2 (en) | 2019-09-27 | 2023-05-02 | Honeywell International Inc. | Dual-pattern optical 3D dimensioning |
Also Published As
Publication number | Publication date |
---|---|
CA2600185C (fr) | 2016-04-26 |
KR20070121717A (ko) | 2007-12-27 |
CN102984448A (zh) | 2013-03-20 |
CA2834963C (fr) | 2017-04-18 |
CN102984448B (zh) | 2016-05-25 |
JP6076300B2 (ja) | 2017-02-08 |
CA2834963A1 (fr) | 2006-09-14 |
US7920172B2 (en) | 2011-04-05 |
WO2006095110A3 (fr) | 2006-11-02 |
US20080158377A1 (en) | 2008-07-03 |
JP5535476B2 (ja) | 2014-07-02 |
CA2835047C (fr) | 2017-04-18 |
CN101204083A (zh) | 2008-06-18 |
JP5633891B2 (ja) | 2014-12-03 |
JP2015019378A (ja) | 2015-01-29 |
EP1856907A2 (fr) | 2007-11-21 |
JP2008532449A (ja) | 2008-08-14 |
CA2834883C (fr) | 2018-01-23 |
US20110019065A1 (en) | 2011-01-27 |
US8212889B2 (en) | 2012-07-03 |
US20110109749A1 (en) | 2011-05-12 |
KR101265358B1 (ko) | 2013-05-21 |
JP2013214986A (ja) | 2013-10-17 |
CA2835047A1 (fr) | 2006-09-14 |
CA2834883A1 (fr) | 2006-09-14 |
CA2600185A1 (fr) | 2006-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2834963C (fr) | Procédé pour commander une action, notamment une modification de netteté, à partir d'une image numérique en couleurs | |
JP6935587B2 (ja) | 画像処理のための方法および装置 | |
EP1523730B1 (fr) | Procédé et système pour calculer une image transformée à partir d'une image numérique | |
EP2174289B1 (fr) | Procédé de traitement d'objet numérique et système associé. | |
JP5237978B2 (ja) | 撮像装置および撮像方法、ならびに前記撮像装置のための画像処理方法 | |
EP3657784B1 (fr) | Procédé d'estimation d'un défaut d'un système de capture d'images et systèmes associés | |
JP6838994B2 (ja) | 撮像装置、撮像装置の制御方法およびプログラム | |
WO2003007242A2 (fr) | Procédé et système pour produire des informations formatées liées aux défauts | |
FR2842976A1 (fr) | Dispositif et procédé pour fournir un zoom numérique de résolution améliorée dans un dispositif imageur électronique portatif | |
KR20190068618A (ko) | 단말기를 위한 촬영 방법 및 단말기 | |
WO2005125242A2 (fr) | Procédé d'amélioration de services relatifs à des données multimédia en téléphonie mobile | |
FR2996034A1 (fr) | Procédé pour créer des images à gamme dynamique étendue en imagerie fixe et vidéo, et dispositif d'imagerie implémentant le procédé. | |
KR100764414B1 (ko) | Psf 선택 모듈, 디지털 자동 초점 조절 장치 및 psf선택 방법 | |
FR2880958A1 (fr) | Procédé d'amélioration de la netteté d'au moins une couleur d'une image numérique | |
WO2023218072A1 (fr) | Procédé de correction globale d' une image, et système associe | |
FR2887346A1 (fr) | Procédé et dispositif d'amélioration d'une image numérique | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2600185 Country of ref document: CA Ref document number: 2006726221 Country of ref document: EP Ref document number: 3297/KOLNP/2007 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11817977 Country of ref document: US Ref document number: 2008500243 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020077022875 Country of ref document: KR |
|
NENP | Non-entry into the national phase |
Ref country code: RU |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: RU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200680012390.8 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2006726221 Country of ref document: EP |