US8593537B2 - Image sensing apparatus, control method thereof, and program for suppressing image deterioration caused by foreign substance - Google Patents
- Publication number
- US8593537B2 (application number US12/466,839)
- Authority
- US
- United States
- Prior art keywords
- image
- foreign substance
- lens
- information
- moving image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
- H04N23/811—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/65—Control of camera operation in relation to power supply
- H04N23/651—Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
Definitions
- the present invention relates to a technique of suppressing image deterioration caused by a foreign substance adhered to the surface of an optical low-pass filter or the like in an image sensing apparatus using an image sensor such as a CCD sensor or CMOS sensor.
- a foreign substance such as dust or a mote sometimes adheres to the optical system or to the surface of an image sensor cover glass or optical filter arranged in front of the image sensor (generically referred to as an image sensor optical component).
- when dust adheres to the image sensor optical component, it blocks light, and the image at the light-blocked portion is not captured, degrading the quality of the shot image.
- Such dust on the image sensor is generally adhered not to the surface of the image sensor but to the surface of the cover glass or optical filter.
- the imaging state of the dust changes depending on the aperture value and pupil position of the imaging lens. More specifically, when the aperture value is close to the full-aperture value, the dust image blurs, so even small adhered dust hardly matters. When the aperture value is large (the aperture is stopped down), a sharp dust image is formed, and even small dust adversely affects the entire image.
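The aperture dependence described above can be pictured with a simple geometric model: the light reaching each sensor pixel forms a cone whose width at the filter surface, a distance t in front of the sensor, is roughly t divided by the f-number, and this cone acts as a blur on the dust silhouette. The sketch below only illustrates that relationship; the filter distance and dust diameter are assumed values, not figures from the patent.

```python
def dust_shadow(dust_diameter_um, filter_distance_um, f_number):
    """Approximate the dust shadow cast on the image sensor.

    The light cone reaching a sensor pixel has a width of roughly
    filter_distance / f_number at the filter surface, which blurs
    the dust silhouette. Returns (shadow_diameter_um, contrast),
    where contrast near 1 means a sharp, dark shadow and contrast
    near 0 means the dust is washed out.
    """
    blur = filter_distance_um / f_number      # width of the light cone
    shadow = dust_diameter_um + blur          # blurred shadow extent
    # Fraction of the light cone the dust actually blocks (area ratio).
    contrast = min(1.0, (dust_diameter_um / blur) ** 2) if blur > 0 else 1.0
    return shadow, contrast

# 50 um dust on a low-pass filter assumed ~1.5 mm in front of the sensor:
wide_open = dust_shadow(50, 1500, 2.8)    # f/2.8: large blur, faint shadow
stopped_down = dust_shadow(50, 1500, 22)  # f/22: small blur, dark shadow
```

Running the two calls shows the effect the paragraph describes: the shadow at f/2.8 is wide but very low-contrast, while at f/22 it is compact and much darker.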
- a method of making dust less noticeable has also been proposed. According to this method, an image of only the dust on the image sensor is prepared in advance by shooting a white wall or the like with the lens set to a large aperture value, and this dust image is used in combination with a shot still image (see Japanese Patent Laid-Open No. 2004-222231).
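The preparation of such a dust map can be sketched as follows: a frame of a uniformly lit white wall shot at a large aperture value is thresholded, dark pixels are grouped into connected regions, and each region's center and radius are recorded. This is a hedged, toy illustration in plain Python; the threshold value and the flood-fill grouping are assumptions for illustration, not the detection procedure the patent specifies.

```python
def detect_dust(image, threshold=200):
    """Find dark spots (dust shadows) in a white-wall calibration frame.

    image: 2-D list of 0-255 luminance values.
    Returns a list of (center_x, center_y, radius) records, mirroring
    the position-and-size information the dust correction data stores.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if image[y][x] < threshold and not seen[y][x]:
                # Flood-fill one connected dark region.
                stack, pixels = [(x, y)], []
                while stack:
                    px, py = stack.pop()
                    if (0 <= px < w and 0 <= py < h and not seen[py][px]
                            and image[py][px] < threshold):
                        seen[py][px] = True
                        pixels.append((px, py))
                        stack.extend([(px + 1, py), (px - 1, py),
                                      (px, py + 1), (px, py - 1)])
                cx = sum(p[0] for p in pixels) / len(pixels)
                cy = sum(p[1] for p in pixels) / len(pixels)
                radius = max(((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5
                             for p in pixels) + 0.5
                regions.append((cx, cy, radius))
    return regions
```

On a mostly white frame with one small dark patch, the function returns a single record giving that patch's center and approximate radius.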
- the present invention is made to overcome the conventional drawbacks, and suppresses the influence, on a moving image, of a foreign substance such as dust adhered to a cover glass, filter, or the like arranged in front of an image sensor.
- an image sensing apparatus comprising an image sensing unit having an image sensor which photoelectrically converts an object image formed via an imaging lens, an optical member which is arranged in front of the image sensor, a foreign substance detection unit which detects, from a foreign substance detection image including an image of a foreign substance adhered to a surface of the optical member, foreign substance information serving as information including information on at least a position and size of the foreign substance, a recording unit which, when shooting a moving image, records moving image data generated based on image signals successively output from the image sensing unit, and records, in addition to the moving image data, lens information including the foreign substance information, information of an aperture value of the imaging lens, and information of a pupil position, and a lens information obtaining unit which, when the lens information is updated by operating the imaging lens by a user during moving image shooting, obtains the updated lens information, wherein when the lens information obtaining unit obtains the updated lens information, the recording unit records the updated lens information in addition to the moving image data.
- a method of controlling an image sensing apparatus including an image sensing unit having an image sensor which photoelectrically converts an object image formed via an imaging lens, and an optical member which is arranged in front of the image sensor, the method comprising a foreign substance detection step of detecting, from a foreign substance detection image including an image of a foreign substance adhered to a surface of the optical member, foreign substance information serving as information including information on at least a position and size of the foreign substance, a recording step of, when shooting a moving image, recording moving image data generated based on image signals successively output from the image sensing unit, and recording, in addition to the moving image data, lens information including the foreign substance information, information of an aperture value of the imaging lens, and information of a pupil position, and a lens information obtaining step of, when the lens information is updated by operating the imaging lens by a user during moving image shooting, obtaining the updated lens information, wherein in the recording step, when the updated lens information is obtained in the lens information obtaining step, the updated lens information is recorded in addition to the moving image data.
- an image sensing apparatus comprising an image sensing unit which photoelectrically converts an object image to generate an image signal, a foreign substance detection unit which detects, from a foreign substance detection image signal obtained by the image sensing unit, foreign substance information serving as information on at least a position and size of the foreign substance in an image sensing frame of the image sensing unit, a lens information obtaining unit which obtains lens information of a lens used to image an object, and a recording unit which, when shooting a moving image, records moving image data generated based on image signals successively output from the image sensing unit, and records, in addition to the moving image data, the foreign substance information detected by the foreign substance detection unit and the lens information obtained by the lens information obtaining unit, wherein the recording unit fragments the moving image data, records the fragments, adds lens information obtained by the lens information obtaining unit to each fragment, and records the lens information.
- a method of controlling an image sensing apparatus having an image sensing unit which photoelectrically converts an object image to generate an image signal comprising a foreign substance detection step of detecting, from a foreign substance detection image signal obtained by the image sensing unit, foreign substance information serving as information on at least a position and size of the foreign substance in an image sensing frame of the image sensing unit, a lens information obtaining step of obtaining lens information of a lens used to image an object, and a recording step of, when shooting a moving image, recording moving image data generated based on image signals successively output from the image sensing unit, and recording, in addition to the moving image data, the foreign substance information detected in the foreign substance detection step and the lens information obtained in the lens information obtaining step, wherein in the recording step, the moving image data is fragmented to record the fragments, and lens information obtained in the lens information obtaining step is added to each fragment and recorded.
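The fragmented recording described in these claims can be pictured as interleaving a small lens-information record with each chunk of moving image data, so that a later correction step knows the aperture value, pupil position, and dust positions in effect for every fragment. The sketch below is a schematic layout in plain Python; the field names and the JSON-plus-length encoding are assumptions for illustration, not the patent's actual file format.

```python
import json
import struct

def write_fragment(out, frame_bytes, lens_info):
    """Append one moving-image fragment preceded by its lens information.

    lens_info: dict with the data recorded per fragment, e.g.
    {"aperture": 8.0, "pupil_position": 95.0, "dust": [[120, 344, 4]]}.
    Layout per fragment (illustrative): 4-byte info length, info JSON,
    4-byte data length, raw frame data.
    """
    info = json.dumps(lens_info).encode()
    out.write(struct.pack(">I", len(info)))
    out.write(info)
    out.write(struct.pack(">I", len(frame_bytes)))
    out.write(frame_bytes)

def read_fragments(buf):
    """Yield (lens_info, frame_bytes) pairs back out of the stream."""
    pos = 0
    while pos < len(buf):
        n = struct.unpack_from(">I", buf, pos)[0]
        pos += 4
        info = json.loads(buf[pos:pos + n])
        pos += n
        m = struct.unpack_from(">I", buf, pos)[0]
        pos += 4
        yield info, buf[pos:pos + m]
        pos += m
```

When the user operates the lens mid-recording, the next fragment simply carries the updated lens information, which is the behavior the claims describe.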
- FIG. 1 is a perspective view showing the outer appearance of a lens-interchangeable single-lens reflex digital camera
- FIG. 2 is a vertical sectional view showing the internal structure of the lens-interchangeable single-lens reflex digital camera
- FIG. 3 is a block diagram showing the circuit arrangement of the lens-interchangeable single-lens reflex digital camera
- FIG. 4 is a flowchart for explaining dust detection processing
- FIG. 5 is a view showing an example of the data format of dust correction data
- FIG. 6 is a flowchart for explaining details of a dust region obtaining routine in step S 27 of FIG. 4 ;
- FIG. 7 is a view showing the processing unit of dust region determination processing in step S 62 of FIG. 6 ;
- FIG. 8 is a view showing an outline of calculating the size of a dust region in step S 63 of FIG. 6 ;
- FIG. 9 is a flowchart for explaining details of an image sensing processing routine in step S 24 of FIG. 4 ;
- FIG. 10 is a flowchart for explaining details of dust removal processing
- FIG. 11 is a flowchart for explaining details of an interpolation routine
- FIG. 12 is a view for explaining the concept of metadata and media data in the MP4 file format or a similar file format
- FIG. 13 is a view for explaining the concept of Fragmented Movie
- FIG. 14 is a flowchart of basic processing in the first embodiment
- FIG. 15 is a view showing an example of the data format of dust position correction data
- FIG. 16 is a chart showing moving image file fragmentation/generation processing in the first embodiment
- FIGS. 17A and 17B are schematic views of a basic file structure in the first embodiment
- FIGS. 18A and 18B are schematic views showing the second example of the file structure in the first embodiment
- FIGS. 19A and 19B are schematic views showing the third example of the file structure in the first embodiment
- FIGS. 20A and 20B are schematic views showing the fourth example of the file structure in the first embodiment
- FIG. 21 is a block diagram showing the schematic system configuration of an image processing apparatus
- FIG. 22 is a view showing an example of the GUI in the image processing apparatus
- FIG. 23 is a flowchart of basic processing in the second embodiment
- FIG. 24 is a flowchart showing a fragmentation method in zoom driving in the third embodiment
- FIG. 25 is a flowchart showing a fragmentation method in zoom driving in the fourth embodiment
- FIG. 26 is a flowchart showing a fragmentation method in zoom driving in the fifth embodiment
- FIG. 1 is a perspective view showing the outer appearance of a digital camera 120 common to all the embodiments of the present invention.
- FIG. 2 is a vertical sectional view of FIG. 1 .
- the top of a camera body 100 includes an accessory shoe 110 , an optical viewfinder 104 , an AE (Auto Exposure) lock button 111 , an AF distance measurement point selection button 113 , and a release button 114 for performing a shooting operation.
- the top of the camera body 100 also includes an electronic dial 411 , mode dial 60 , and external display 409 .
- the electronic dial 411 is a multifunctional signal input device for inputting a numerical value to the camera in combination with another operation button, or switching the shooting mode.
- the external display 409 is formed from a liquid crystal display, and displays shooting conditions (e.g., shutter speed, aperture value, and shooting mode), and other kinds of information.
- the rear side of the camera body 100 includes an LCD monitor 417 for displaying a shot image and various setup windows, a playback switch 66 for playing back a shot image on the LCD monitor 417 , a single shooting/continuous shooting switch 68 , a four-way selector switch 116 , a menu button 124 , and a power switch 72 .
- the single shooting/continuous shooting switch 68 can set a single shooting mode in which when the user presses a shutter switch SW 2 64 (to be described later), shooting of one frame is done and then the camera stands by, and a continuous shooting mode in which shooting continues while the user presses the shutter switch SW 2 64 .
- the four-way selector switch 116 includes four buttons arranged on the top, bottom, right, and left, and a SET button 117 arranged at the center. The user uses the four-way selector switch 116 to instruct the camera to select or execute a menu item or the like displayed on the LCD monitor 417 .
- the user uses the menu button 124 to display, on the LCD monitor 417 , a menu window for making various settings of the camera. For example, when selecting and setting the shooting mode, the user presses the menu button 124 , and operates the top, bottom, right, and left buttons of the four-way selector switch 116 to select a mode he wants. While the mode is selected, the user presses the SET button 117 , completing the setting.
- the LCD monitor 417 in the embodiment is of the transmissive type, so driving the LCD monitor alone does not make an image visible to the user.
- the LCD monitor 417 requires a backlight illumination unit 416 behind it, as shown in FIG. 2 .
- the LCD monitor 417 and backlight illumination unit 416 form an image display unit 28 , as shown in FIG. 3 .
- the image sensing apparatus mainly includes the camera body 100 and a lens-interchangeable lens unit 300 .
- reference numeral 401 denotes an imaging optical axis.
- the lens unit 300 includes an imaging lens 310 formed from a plurality of lenses, a stop 312 , and a lens mount 306 which mechanically connects the lens unit 300 to the camera body 100 .
- the lens unit 300 is detachable from the camera body 100 via the lens mount 306 .
- a mirror 130 is inserted in the imaging optical path, and is movable between a position (position shown in FIG. 2 , which will be called an inclined mirror position) where the mirror 130 guides object light traveling from the lens unit 300 to the optical viewfinder system, and a position (to be called a retraction position) where it retracts from the imaging optical path.
- the mirror 130 may also be a quick return mirror or half-mirror.
- object light guided from the mirror 130 to the optical viewfinder 104 forms an image on a focusing screen 204 .
- a condenser lens 205 improves the visibility of the viewfinder.
- a pentagonal roof prism 132 guides the object light having passed through the focusing screen 204 and condenser lens 205 to an eyepiece lens 208 for viewfinder observation and the optical viewfinder 104 .
- a second curtain 209 and first curtain 210 form a shutter.
- the second curtain 209 and first curtain 210 are opened to expose, for a necessary time, an image sensor 14 which is arranged behind them to photoelectrically convert an object image.
- An optical low-pass filter 418 is arranged in front of the image sensor 14 , and adjusts the spatial frequency of the object image to be formed on the image sensor 14 . Dust (a foreign substance) that adversely affects a shot image may adhere to the optical low-pass filter 418 . Such dust appears as a shadow in the object image formed on the image sensor 14 , degrading the quality of the shot image.
- a printed board 211 holds the image sensor 14 .
- a display board 215 , which is another printed board, is arranged behind the printed board 211 .
- the LCD monitor 417 and backlight illumination unit 416 are arranged on a surface of the display board 215 that is opposite to the printed board 211 .
- a recording medium 200 records image data.
- the camera uses a cell (portable power supply) 86 .
- the recording medium 200 and cell 86 are detachable from the camera body.
- FIG. 3 is a block diagram showing the circuit arrangement of the lens-interchangeable digital camera common to all the embodiments of the present invention.
- the arrangement of the lens unit 300 will be explained.
- the lens mount 306 incorporates various functions for electrically connecting the lens unit 300 to the camera body 100 .
- an interface 320 connects the lens unit 300 to the camera body 100 .
- a connector 322 electrically connects the lens unit 300 to the camera body 100 .
- the connector 322 also has a function of exchanging control signals, status signals, and data signals between the camera body 100 and the lens unit 300 and receiving currents of various voltages.
- the connector 322 may also communicate not only by telecommunication but also by optical communication or speech communication.
- a stop control unit 340 controls the stop 312 in cooperation with a shutter control unit 40 (to be described later) which controls a shutter 12 of the camera body 100 based on photometry information from a photometry control unit 46 .
- a focus control unit 342 controls focusing of the imaging lens 310 .
- a zoom control unit 344 controls zooming of the imaging lens 310 .
- a lens system control circuit 350 controls the overall lens unit 300 .
- the lens system control circuit 350 has a memory for storing constants, variables, and programs for operations.
- the lens system control circuit 350 also has a nonvolatile memory for holding identification information (e.g., a number unique to the lens unit 300 ), management information, functional information (e.g., a full-aperture value, minimum aperture value, and focal length), and current and past set values.
- a lens mount 106 mechanically connects the camera body 100 to the lens unit 300 .
- the shutter 12 includes the second curtain 209 and first curtain 210 .
- a light beam which has entered the imaging lens 310 is guided via the stop 312 serving as a light quantity restriction unit, the lens mounts 306 and 106 , the mirror 130 , and the shutter 12 , and forms an optical image on the image sensor 14 .
- An A/D converter 16 converts an analog signal output from the image sensor 14 into a digital signal.
- a timing generator 18 supplies clock signals and control signals to the image sensor 14 , the A/D converter 16 , and a D/A converter 26 .
- a memory control circuit 22 and system control circuit 50 control the timing generator 18 .
- An image processing circuit 20 executes predetermined pixel interpolation processing and color conversion processing for data from the A/D converter 16 or data from the memory control circuit 22 . If necessary, the image processing circuit 20 performs predetermined arithmetic processing using image data output from the A/D converter 16 . Based on the obtained arithmetic result, the system control circuit 50 can execute auto focus (AF) processing, auto exposure (AE) processing, and pre-electronic flash (EF) processing of the TTL (Through The Lens) scheme to control the shutter control unit 40 and a focus adjusting unit 42 . The image processing circuit 20 also executes predetermined arithmetic processing using image data output from the A/D converter 16 , and performs auto white balance (AWB) processing of the TTL scheme based on the obtained arithmetic result.
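As one concrete illustration of TTL AWB of the kind computed from the A/D converter output, the classic gray-world method scales the red and blue channels so that the frame's average color becomes neutral. This is a generic textbook algorithm offered for illustration only; the patent does not state which AWB algorithm the image processing circuit 20 actually uses.

```python
def gray_world_awb(pixels):
    """Gray-world white balance on a list of (r, g, b) tuples.

    Scales R and B so the frame's channel averages match the average
    of the green channel, the usual luminance reference; this drives
    the mean color of the frame toward neutral gray.
    """
    n = len(pixels)
    avg_r = sum(p[0] for p in pixels) / n
    avg_g = sum(p[1] for p in pixels) / n
    avg_b = sum(p[2] for p in pixels) / n
    gain_r = avg_g / avg_r if avg_r else 1.0
    gain_b = avg_g / avg_b if avg_b else 1.0
    return [(min(255.0, p[0] * gain_r), p[1], min(255.0, p[2] * gain_b))
            for p in pixels]
```

After balancing, the average red and blue values of the frame equal the average green value, removing a uniform color cast.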
- the focus adjusting unit 42 and photometry control unit 46 are provided for exclusive use.
- AF processing, AE processing, and EF processing may also be done using not the image processing circuit 20 but the focus adjusting unit 42 and photometry control unit 46 .
- AF processing, AE processing, and EF processing may also be performed first using the focus adjusting unit 42 and photometry control unit 46 and then using the image processing circuit 20 .
- the memory control circuit 22 controls the A/D converter 16 , the timing generator 18 , the image processing circuit 20 , an image display memory 24 , the D/A converter 26 , a memory 30 , and a compression/decompression circuit 32 .
- Image data output from the A/D converter 16 is written in the image display memory 24 or memory 30 via the image processing circuit 20 and memory control circuit 22 or via only the memory control circuit 22 .
- the image display unit 28 includes the LCD monitor 417 and backlight illumination unit 416 . Display image data written in the image display memory 24 is displayed on the image display unit 28 via the D/A converter 26 .
- the image display unit 28 sequentially displays sensed image data, implementing an electronic viewfinder (EVF) function.
- the image display unit 28 can arbitrarily turn on/off its display in accordance with an instruction from the system control circuit 50 . When display is OFF, the power consumption of the camera body 100 can be greatly reduced.
- the memory 30 stores shot still images and has a memory capacity large enough to store a predetermined number of still images. Even in continuous shooting or panoramic shooting, in which a plurality of still images are shot continuously, the memory 30 allows many images to be written to it at high speed. In moving image shooting, the memory 30 is used as a frame buffer for continuously writing images at a predetermined rate. The memory 30 is also available as the work area of the system control circuit 50 .
- the compression/decompression circuit 32 compresses/decompresses image data using a known compression method.
- the compression/decompression circuit 32 reads out an image from the memory 30 , compresses or decompresses it, and writes the processed data in the memory 30 again.
- the shutter control unit 40 controls the shutter 12 in cooperation with the stop control unit 340 which controls the stop 312 based on photometry information from the photometry control unit 46 .
- the focus adjusting unit 42 executes AF (Auto Focus) processing. According to the single-lens reflex method, a light beam which has entered the imaging lens 310 of the lens unit 300 is guided via the stop 312 , the lens mounts 306 and 106 , the mirror 130 , and a focus adjusting sub-mirror (not shown).
- the focus adjusting unit 42 detects the focus state of an image formed by the light beam as an optical image.
- the photometry control unit 46 executes AE (Auto Exposure) processing. According to the single-lens reflex method, a light beam which has entered the imaging lens 310 of the lens unit 300 is guided via the stop 312 , the lens mounts 306 and 106 , the mirror 130 , and a photometry sub-mirror (not shown). The photometry control unit 46 measures the exposure state of an image formed by the light beam as an optical image.
- An electronic flash 48 has an AF auxiliary light projecting function and an electronic flash control function.
- the photometry control unit 46 also has an EF (Electronic Flash control) processing function in cooperation with the electronic flash 48 .
- AF control may also be done using the measurement result of the focus adjusting unit 42 and an arithmetic result obtained by arithmetically processing image data from the A/D converter 16 by the image processing circuit 20 .
- Exposure control may also be performed using the measurement result of the photometry control unit 46 and an arithmetic result obtained by arithmetically processing image data from the A/D converter 16 by the image processing circuit 20 .
- the system control circuit 50 controls the overall camera body 100 and incorporates a known CPU.
- a memory 52 stores constants, variables, and programs for the operation of the system control circuit 50 .
- a notification unit 54 notifies the outside of operation states and messages using a text, image, and sound in accordance with execution of a program by the system control circuit 50 .
- the notification unit 54 is, e.g., a display unit such as an LCD or LED for providing a visual display, or a sound generation element for generating a notification by sound.
- the notification unit 54 includes one or a combination of them.
- when the notification unit 54 is a display unit, it is arranged at one or a plurality of positions near an operation unit 70 of the camera body 100 , like the external display 409 , where the user can easily see a notification.
- Some functions of the notification unit 54 are arranged in the optical viewfinder 104 .
- the image display unit 28 such as an LCD presents a display associated with shooting modes including single shooting/continuous shooting and self timer.
- the image display unit 28 also presents a display associated with recording including the compression ratio, the number of recording pixels, the number of recorded images, and the number of recordable images.
- the image display unit 28 presents a display associated with shooting conditions including the shutter speed, aperture value, exposure compensation, brightness correction, external flash light emission amount, and red eye mitigation.
- the image display unit 28 also displays macro shooting, buzzer setting, battery level, error message, information by a plurality of digits, and the attached/detached states of the recording medium 200 and a PC 210 .
- the image display unit 28 displays the attached/detached state of the lens unit 300 , communication I/F operation, date and time, and the connection state of an external computer.
- Some of the display contents of the notification unit 54 are indicated in the optical viewfinder 104 , which include in-focus, ready for shooting, camera shake warning, flash charge, flash charge completion, shutter speed, aperture value, exposure compensation, and recording medium write operation.
- a nonvolatile memory 56 is an electrically erasable/programmable memory such as an EEPROM, and stores programs (to be described later) and the like.
- Reference numerals 60 , 62 , 64 , 66 , 68 , and 70 denote operation units for inputting various kinds of operation instructions to the system control circuit 50 .
- Each operation unit includes one or a combination of a switch, dial, touch panel, pointing by line-of-sight detection, and voice recognition device.
- the mode dial switch 60 can selectively set a shooting mode such as an automatic shooting mode, programmed shooting mode, shutter speed priority shooting mode, stop priority shooting mode, manual shooting mode, or focal depth priority (depth) shooting mode.
- the mode dial switch 60 can also selectively set a shooting mode such as a portrait shooting mode, landscape shooting mode, closeup shooting mode, sports shooting mode, nightscape shooting mode, panoramic shooting mode, and moving image shooting mode.
- the shutter switch SW 1 62 is turned on by operating the release button 114 halfway (e.g., half stroke) to designate the start of an operation such as AF processing, AE processing, AWB processing, or EF processing.
- the shutter switch SW 2 64 is turned on by operating the release button 114 completely (e.g., full stroke) to designate the start of a series of processes including exposure, development, and recording.
- a signal read out from the image sensor 14 is written in the memory 30 via the A/D converter 16 and memory control circuit 22 .
- the development processing is done using calculation by the image processing circuit 20 or memory control circuit 22 .
- image data is read out from the memory 30 , compressed by the compression/decompression circuit 32 , and written in or transmitted to the recording medium 200 or PC 210 .
- the playback switch 66 designates the start of a playback operation of reading out an image shot in a shooting mode from the memory 30 , recording medium 200 , or PC 210 and displaying it on the image display unit 28 .
- the playback switch 66 can set a functional mode such as a playback mode, multiwindow playback/erase mode, or PC-connected mode.
- the single shooting/continuous shooting switch 68 can set a single shooting mode in which, when the user presses the shutter switch SW 2 64 , one frame is shot and the camera then stands by, and a continuous shooting mode in which shooting continues while the user holds down the shutter switch SW 2 64 .
- the operation unit 70 includes various buttons and a touch panel.
- the operation unit 70 includes a live view start/stop button, a movie recording start/stop button, the menu button 124 , the SET button 117 , a multiwindow playback/page feed button, an electronic flash setting button, a single shooting/continuous shooting/self timer switch button, the four-way selector switch 116 , the AE (Auto Exposure) lock button 111 , the AF distance measurement point selection button 113 , and the electronic dial 411 .
- the operation unit 70 includes a playback image move + (plus) button, playback image move ⁇ (minus) button, shooting image quality selection button, exposure compensation button, brightness correction button, external flash light emission amount setting button, and date/time setting button.
- when a rotary dial switch is used for the top, bottom, right, and left buttons of the four-way selector switch 116 , it allows the user to more easily select numerical values and functions.
- the operation unit 70 includes an image display ON/OFF switch for turning on/off the image display unit 28 , and a quick review ON/OFF switch for setting a quick review function of automatically playing back shot image data immediately after shooting.
- the operation unit 70 also includes a compression mode switch for selecting a compression ratio for JPEG compression, or a RAW mode to directly digitize a signal from the image sensor and record it on a recording medium.
- the operation unit 70 includes an AF mode setting switch capable of setting a one-shot AF mode or a servo AF mode. In the one-shot AF mode, the auto focus operation starts when the user presses the shutter switch SW 1 62 . Once an in-focus state is obtained, this state is kept held.
- the operation unit 70 also includes a setting switch capable of setting a dust information obtainment mode to sense a dust detection image and obtain dust information, as will be described later.
- the power switch 72 can selectively set the power ON or power OFF mode of the camera body 100 .
- the power switch 72 can also selectively set the power ON or power OFF mode of each of various accessories including the lens unit 300 , an external flash 112 , the recording medium 200 , and the PC 210 which are connected to the camera body 100 .
- a power supply control unit 80 includes a cell detection circuit, DC/DC converter, and switching circuit for switching a block to be energized.
- the power supply control unit 80 detects attachment/detachment of a cell, the type of cell, and the battery level.
- the power supply control unit 80 controls the DC/DC converter based on the detection result and an instruction from the system control circuit 50 .
- the power supply control unit 80 supplies a necessary voltage to the respective units including a recording medium for a necessary period.
- Reference numerals 82 and 84 denote connectors; and 86 , a power supply unit formed from a primary cell (e.g., alkaline cell or lithium cell), a secondary cell (e.g., an NiCd cell, NiMH cell, Li-ion cell, or Li-polymer cell), or an AC adapter.
- Reference numerals 90 and 94 denote interfaces with a PC and a recording medium such as a memory card or hard disk; and 92 and 96 , connectors to connect a PC and a recording medium such as a memory card or hard disk.
- a recording medium attachment detection circuit 98 detects whether the recording medium 200 and/or PC 210 is connected to the connectors 92 and/or 96 .
- the camera has two interfaces and two connectors to connect a recording medium.
- the numbers of interfaces and connectors to connect a recording medium are arbitrary, and the camera can have one or a plurality of interfaces or connectors. Interfaces and connectors of different standards may also be combined.
- Interfaces and connectors complying with various storage medium standards are available. Examples are a PCMCIA (Personal Computer Memory Card International Association) card, CF (Compact Flash®) card, and SD card.
- since the interfaces 90 and 94 and the connectors 92 and 96 comply with the standard of the PCMCIA card or CF® card, they can connect various kinds of communication cards.
- Examples of the communication cards are a LAN card, modem card, USB (Universal Serial Bus) card, and IEEE (Institute of Electrical and Electronic Engineers) 1394 card.
- a P1284 card, SCSI (Small Computer System Interface) card, and PHS are also usable.
- Various kinds of communication cards can be connected to transfer image data and management information associated with it to another computer or a peripheral device such as a printer.
- the optical viewfinder 104 can display an optical image formed by a light beam which has entered the imaging lens 310 and is guided via the stop 312 , lens mounts 306 and 106 , and mirrors 130 and 132 by the single-lens reflex method. Only with the optical viewfinder, the user can take a picture without using the electronic viewfinder function of the image display unit 28 .
- the optical viewfinder 104 displays some functions of the notification unit 54 such as the in-focus state, camera shake warning, flash charge, shutter speed, aperture value, and exposure compensation.
- the external flash 112 is attached via the accessory shoe 110 .
- An interface 121 connects the camera body 100 to the lens unit 300 in the lens mount 106 .
- a connector 122 electrically connects the camera body 100 to the lens unit 300 .
- a lens attachment detection unit (not shown) detects whether the lens unit 300 is attached to the lens mount 106 and connector 122 .
- the connector 122 also has a function of transmitting control signals, status signals, data signals, and the like between the camera body 100 and the lens unit 300 and also supplying currents of various voltages.
- the memory 30 of the camera body 100 stores various kinds of optical information (e.g., aperture value, zoom position, pupil distance, and focal length) of the lens unit 300 that are communicated via the connector 122 .
- the camera requests communication of the information; alternatively, the lens may communicate it every time the information is updated.
- the connector 122 may also communicate not only by telecommunication but also by optical communication or speech communication.
- the recording medium 200 is, e.g., a memory card or hard disk.
- the recording medium 200 includes a recording unit 202 formed from a semiconductor memory or magnetic disk, an interface 204 with the camera body 100 , and a connector 206 to connect the camera body 100 .
- the recording medium 200 can be a memory card (e.g., PCMCIA card or Compact Flash®), or a hard disk.
- the recording medium 200 may also be a micro DAT, a magnetooptical disk, an optical disk (e.g., CD-R or CD-RW), or a phase-change optical disk (e.g., DVD).
- the PC 210 includes a recording unit 212 formed from a magnetic disk (HD), an interface 214 with the camera body 100 , and a connector 216 to connect the camera body 100 .
- the interface 214 can be, e.g., a USB interface, but is not particularly limited.
- the camera shoots a dust detection image (foreign substance detection image) for obtaining dust information (foreign substance information) serving as information on the adhesion position and size of dust (foreign substance). Then, dust information is extracted from the dust detection image to generate dust data.
- the dust detection image is preferably obtained by shooting a surface as uniformly bright as possible. However, the uniformity need not be strict because it is desirable to easily shoot the image in a familiar place. For example, the embodiment assumes shooting a blue sky or white wall.
- the system control circuit 50 performs this processing by executing a dust detection processing program stored in the nonvolatile memory 56 .
- a dust detection image is shot.
- the user prepares for dust detection by setting the camera to direct the imaging optical axis 401 of the lens unit 300 to the exit surface of a surface light source or a surface with a uniform color, like a white wall.
- the user also prepares for dust detection by attaching a dust detection light unit (compact point light source attached instead of the lens) to the lens mount 106 .
- the light source of the light unit is, e.g., a white LED, and the size of the light emitting surface is desirably adjusted to comply with a predetermined aperture value (e.g., F 32 ).
- the embodiment will explain dust detection using a general imaging lens.
- the dust detection may also be done by attaching the light unit to the lens mount 106 .
- a dust detection image is an image with a uniform color.
- the system control circuit 50 sets the stop first.
- the imaging state of dust near the image sensor changes depending on the aperture value of the lens, and its position changes depending on the lens pupil position. For this reason, dust correction data needs to hold an aperture value and lens pupil position in detection, in addition to the position and size of dust.
- dust correction data need not always hold an aperture value if it is set to always use the same aperture value even for different lenses when creating dust correction data.
- similarly, dust correction data need not always hold the lens pupil position if the light unit is used or the use of only a specific lens is permitted.
- otherwise, the dust correction data needs to hold an aperture value and lens pupil position at the time of detection.
- the pupil position means a distance from the image sensing plane (focal plane) of the exit pupil.
- in this embodiment, F32 is designated (step S21).
- the system control circuit 50 causes the stop control unit 340 via the connector 122 to control the aperture blades of the lens unit 300 and set the stop to the aperture value designated in step S 21 (step S 22 ).
- the system control circuit 50 causes the focus control unit 342 to set the focus position to infinity (step S 23 ).
- the system control circuit 50 executes shooting in the dust detection mode (step S 24 ). Details of the image sensing processing routine in step S 24 will be explained with reference to FIG. 9 .
- the memory 30 stores the shot image data.
- the system control circuit 50 obtains an aperture value and lens pupil position in shooting (step S 25 ).
- the system control circuit 50 reads out, to the image processing circuit 20 , data corresponding to each pixel of the shot image stored in the memory 30 (step S 26 ).
- the image processing circuit 20 performs processing shown in FIG. 6 , obtaining the position and size of a pixel where dust exists (step S 27 ).
- the nonvolatile memory 56 registers the position and size of the pixel where dust exists, which have been obtained in step S 27 , and the aperture value and lens pupil position information which have been obtained in step S 25 (step S 28 ).
- the system control circuit 50 determines that the light unit has been used.
- the nonvolatile memory 56 registers predetermined lens pupil position information, and an aperture value calculated from the light source diameter of the light unit.
- in step S28, the system control circuit 50 compares the position of a defective pixel (pixel defect) recorded in advance in the nonvolatile memory 56 at the time of manufacture with the position of the readout pixel data, and determines whether the target pixel is defective.
- the nonvolatile memory 56 may also register only the position of a region determined not to have a pixel defect.
- FIG. 5 shows an example of the data format of dust correction data stored in the nonvolatile memory 56 .
- the dust correction data stores lens information, dust position, and size information obtained when a detection image was shot.
- an actual aperture value (F-number) used to shoot a detection image, and lens pupil position at that time are stored as lens information obtained when a detection image was shot.
- the number (integer value) of detected dust regions is stored in the storage area.
- concrete parameters of each dust region are repetitively stored by the number of dust regions.
- the parameters of each dust region are a set of three numerical values: the radius (e.g., 2 bytes) of dust, the x-coordinate (e.g., 2 bytes) of the center in the effective image region, and the y-coordinate (e.g., 2 bytes) of the center.
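As a sketch, the FIG. 5 layout described above could be packed and parsed as follows. The 2-byte little-endian field widths for the region count and lens fields, and the integer encoding of the F-number and pupil position, are assumptions for illustration; only the per-region field sizes (2 bytes each for radius, x, and y) are stated in the text.

```python
import struct

def pack_dust_correction_data(f_number, pupil_pos, regions):
    """Pack lens information plus a list of (radius, x, y) dust regions.

    Assumed layout: F-number, pupil position, and region count as
    2-byte values, then radius / x-coordinate / y-coordinate of each
    region's center as 2 bytes each, as in the format above.
    """
    out = struct.pack("<HHH", f_number, pupil_pos, len(regions))
    for radius, x, y in regions:
        out += struct.pack("<HHH", radius, x, y)
    return out

def unpack_dust_correction_data(buf):
    """Inverse of pack_dust_correction_data."""
    f_number, pupil_pos, n = struct.unpack_from("<HHH", buf, 0)
    regions = [struct.unpack_from("<HHH", buf, 6 + 6 * i) for i in range(n)]
    return f_number, pupil_pos, regions
```

Because each region costs a fixed 6 bytes under these assumptions, truncating to the capacity of the nonvolatile memory 56 amounts to keeping only a prefix of the sorted region list.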
- if the dust correction data size is limited by the capacity of the nonvolatile memory 56 or the like, data are stored preferentially from the start of the dust regions obtained in step S27. This is because dust regions are sorted in order from the most conspicuous dust in the dust region obtaining routine of step S27, which will be described later.
- readout image data is rasterized in the memory 30 , and processed for each predetermined block in order to cope with limb darkening arising from the lens or sensor characteristic.
- Limb darkening is a phenomenon in which the luminance at the periphery of the lens becomes lower than that at the center. It is known that limb darkening can be reduced to a certain degree by setting the lens to a large aperture value. However, even if the lens is set to a large aperture value, dust at the periphery may not be accurately detected depending on the lens when the position of dust in a shot image is determined based on a predetermined threshold value. From this, the influence of limb darkening is reduced by dividing an image into blocks.
- the dust detection result may change between blocks when the threshold value changes between them. To prevent this, blocks are made to overlap each other. A pixel determined to have dust in either block of the overlapping region is handled as a dust region.
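The overlapping block division might be sketched as below; the block size and overlap width are illustrative assumptions, not values from the text.

```python
def overlapping_blocks(width, height, block=256, overlap=32):
    """Yield (x0, y0, x1, y1) block bounds that overlap by `overlap`
    pixels, so a dust pixel near a block boundary falls into more than
    one block and is treated as dust if either block's threshold
    flags it."""
    step = block - overlap
    for y0 in range(0, height, step):
        for x0 in range(0, width, step):
            yield (x0, y0, min(x0 + block, width), min(y0 + block, height))
```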
- Determination of a dust region in a block is executed according to the processing sequence shown in FIG. 6 .
- a maximum luminance Lmax and average luminance Lave in each block are calculated.
- a pixel whose luminance does not exceed the threshold value is determined as a dust pixel (step S 61 ).
- a maximum value Xmax and minimum value Xmin of the horizontal coordinates of pixels falling within a dust region, and a maximum value Ymax and minimum value Ymin of their vertical coordinates are obtained.
- FIG. 8 shows the relationship between Xmax, Xmin, Ymax, Ymin, and ri.
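A minimal sketch of the per-block determination follows. Two details are assumptions for illustration: the threshold uses the same 0.6/0.4 luminance weighting that the text later gives for threshold T2, and ri is taken as half the diagonal of the bounding box defined by Xmax, Xmin, Ymax, and Ymin (the relationship FIG. 8 is said to show).

```python
import math

def find_dust_region(block):
    """Flag low-luminance (dust) pixels in a block, given as a list of
    rows of luminance values, and return (cx, cy, ri) for the bounding
    box of the flagged pixels, or None if nothing is flagged.
    The 0.6/0.4 weighting and the half-diagonal radius are assumptions."""
    flat = [v for row in block for v in row]
    lmax, lave = max(flat), sum(flat) / len(flat)
    thresh = lave * 0.6 + lmax * 0.4
    xs, ys = [], []
    for y, row in enumerate(block):
        for x, v in enumerate(row):
            if v <= thresh:            # darker than surroundings -> dust
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    xmin, xmax, ymin, ymax = min(xs), max(xs), min(ys), max(ys)
    ri = math.hypot(xmax - xmin, ymax - ymin) / 2
    return ((xmin + xmax) // 2, (ymin + ymax) // 2, ri)
```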
- in step S64, the average luminance value of each dust region is calculated.
- the size of dust correction data is sometimes limited by the capacity of the nonvolatile memory 56 or the like.
- pieces of dust position information are sorted by the size or average luminance value of the dust region (step S 65 ).
- pieces of dust position information are sorted in descending order of ri. If all dust regions have the same ri, they are sorted in ascending order of the average luminance value. As a result, noticeable dust can be preferentially registered in dust correction data.
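The sort order described above can be expressed directly:

```python
def sort_dust_regions(regions):
    """Sort dust regions so the most conspicuous come first:
    descending radius ri, then, for equal ri, ascending average
    luminance (darker dust of the same size is more noticeable)."""
    return sorted(regions, key=lambda r: (-r["ri"], r["ave_lum"]))
```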
- Di represents a sorted dust region
- Ri represents the radius of the dust region Di.
- a large dust region may degrade the image quality if it undergoes interpolation processing later. It is desirable to correct such a large dust region finally.
- the system control circuit 50 performs this processing by executing an image sensing processing program stored in the nonvolatile memory 56 .
- the system control circuit 50 operates the mirror 130 shown in FIG. 3 to flip it up and retract it from the imaging optical path in step S 201 .
- in step S202, the image sensor 14 starts accumulating charges.
- in step S203, the shutter 12 shown in FIG. 3 travels to perform exposure.
- in step S204, the charge accumulation of the image sensor 14 ends.
- in step S205, an image signal is read out from the image sensor 14 , and image data processed by the A/D converter 16 and image processing circuit 20 is temporarily stored in the memory 30 .
- in step S206, readout of all image signals from the image sensor ends.
- in step S207, the mirror 130 flips down and returns to the inclined mirror position. Then, a series of image sensing operations ends.
- in step S208, the system control circuit 50 determines whether the shooting mode is still image shooting or dust detection image shooting. If the shooting mode is still image shooting, the process advances to step S209 to record the shot still image on the recording medium 200 .
- the first embodiment is directed to a method of performing image processing to correct an image quality degraded by dust when shooting a moving image. Prior to a description of moving image processing, still image processing will be explained.
- a still image file to undergo dust removal processing is designated and loaded into an apparatus (which may be the image processing circuit 20 in the camera or an image processing apparatus outside the camera) for performing dust removal processing (step S 1801 ).
- the apparatus for performing dust removal processing obtains dust correction data created in step S 65 of FIG. 6 (step S 1802 ).
- Ri represents the size of dust at the coordinates Di calculated in step S65 of FIG. 6 .
- an aperture value f 2 and lens pupil position L 2 in shooting are obtained.
- Di is converted by the following equation.
- the unit is a pixel, and “+3” for Ri′ means a margin.
- step S 1806 dust in a region defined by the coordinates Di′ and radius Ri′ is detected, and if necessary, interpolation processing is applied. Details of the interpolation processing will be described later.
- step S 1807 it is determined whether all coordinates have undergone the dust removal processing. If it is determined that all coordinates have been processed, the process ends. If it is determined that all coordinates have not been processed, the process returns to step S 1806 .
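The loop of steps S1801-S1807 might be sketched as follows. The coordinate/radius conversion equation itself is not reproduced in this excerpt, so `convert_region` below is a hypothetical placeholder that applies only the "+3" pixel margin the text mentions for Ri′; the real conversion also rescales using the aperture values f1, f2 and pupil positions L1, L2.

```python
def remove_dust(image, dust_data, f2, l2):
    """Sketch of the dust removal loop. `dust_data` holds (Di, Ri)
    pairs plus the aperture f1 and pupil position l1 recorded when the
    detection image was shot; f2 and l2 are the values at shooting
    time (step S1804 of the text)."""
    def convert_region(di, ri, f1, l1):
        # Hypothetical placeholder: the patent's conversion equation is
        # elided here; only the "+3" pixel margin for Ri' is applied.
        return di, ri + 3

    for di, ri in dust_data["regions"]:
        di2, ri2 = convert_region(di, ri, dust_data["f1"], dust_data["l1"])
        interpolate_if_dust(image, di2, ri2)   # step S1806
    return image

def interpolate_if_dust(image, center, radius):
    """Stand-in for the interpolation routine of FIG. 11; here it just
    records which regions would be examined and corrected."""
    image.setdefault("corrected", []).append((center, radius))
```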
- FIG. 11 is a flowchart showing the sequence of the interpolation routine.
- in step S1901, determination of the dust region is done.
- the dust region is a region which satisfies all the following conditions:
- (1) a region darker than a threshold value T2 given by T2 = Yave × 0.6 + Ymax × 0.4
- (3) a region whose radius value, calculated by the same method as in step S63 of FIG. 6 , is equal to or larger than x1 pixels and smaller than x2 pixels, in an isolated region which is selected based on condition (1) and formed from low-luminance pixels.
- x1 represents three pixels, and x2 represents 30 pixels.
- condition (4) may also be eased. For example, when the region of interest contains the coordinates of a range of ⁇ 3 pixels from the coordinates Di in both the X and Y directions, it is determined as a dust region.
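The dust-region test can be sketched as below. Conditions not fully reproduced in this excerpt are paraphrased; the eased form of condition (4) (a point within ±3 pixels of the registered coordinates Di in both X and Y) is used, and x1 = 3, x2 = 30 as stated above.

```python
def dust_threshold(y_ave, y_max):
    """Condition (1): T2 = Yave x 0.6 + Ymax x 0.4, the luminance
    threshold below which pixels belong to a candidate dust region."""
    return y_ave * 0.6 + y_max * 0.4

def is_dust_region(radius, region_coords, di, x1=3, x2=30):
    """Condition (3) and the eased condition (4): radius in [x1, x2)
    pixels, and the region contains a point within +/-3 pixels of the
    registered dust coordinates Di in both the X and Y directions."""
    contains_di = any(abs(x - di[0]) <= 3 and abs(y - di[1]) <= 3
                      for x, y in region_coords)
    return x1 <= radius < x2 and contains_di
```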
- if such a region exists in step S1902, the process advances to step S1903 to perform dust region interpolation. If no such region exists, the process ends.
- the dust region interpolation processing executed in step S 1903 adopts a known defective region interpolation method.
- An example of the known defective region interpolation method is pattern replacement disclosed in Japanese Patent Laid-Open No. 2001-223894.
- a defective region is specified using infrared light.
- a dust region detected in step S 1901 is handled as a defective region, and interpolated by pattern replacement using normal surrounding pixels. For a pixel which cannot be interpolated by pattern replacement, p normal pixels are selected sequentially from one closest to the pixel to be interpolated in image data having undergone pattern correction, and the target pixel is interpolated using the average color of them.
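The fallback averaging step for pixels that pattern replacement cannot fix might look like this sketch; the pixel-map representation and the default value of p are illustrative assumptions.

```python
def fallback_interpolate(pixels, target, normal, p=4):
    """For a pixel that pattern replacement could not interpolate,
    select the p normal pixels nearest to the target and use the
    average of their colors, as described above.
    `pixels` maps (x, y) -> (r, g, b); `normal` lists usable coords."""
    nearest = sorted(normal,
                     key=lambda c: (c[0] - target[0]) ** 2 +
                                   (c[1] - target[1]) ** 2)[:p]
    n = len(nearest)
    return tuple(sum(pixels[c][i] for c in nearest) / n for i in range(3))
```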
- MP4 is a moving image file format used to record moving image data in recent digital cameras, digital video cameras, and the like.
- the MP4 file format (see ISO/IEC 14496-14; “Information technology—Coding of audio-visual objects—Part 14: MP4 file format”; ISO/IEC; 2003-11-24) is extended from a general-purpose file format “ISO Base Media File Format” (see ISO/IEC 14496-12; “Information technology—Coding of audio-visual objects—Part 12: ISO base media file format”; ISO/IEC; 2004-01-23).
- the MP4 file format aims at recording files of moving image/audio contents data such as MPEG data standardized by ISO/IEC JTC1/SC29/WG11 (International Organization for Standardization/International Electrotechnical Commission).
- the first embodiment is applicable not only to MP4 but also to another similar file format.
- ISO has established standards “Motion JPEG 2000 file format” (ISO/IEC 15444-3) and “AVC file format” (ISO/IEC 14496-15) as file format standards having the same basic structure as that of MP4.
- FIG. 12 is a conceptual view for explaining the data structure of the MP4 file format.
- An MP4 file 1001 contains metadata (header information) 1002 representing the physical position, temporal position, and characteristic information of video and audio data, and media data 1003 representing the entities of encoded video and audio data.
- metadata 1002 typically contains a video track 1004 for logically handling entire moving image data, and an audio track 1005 for logically handling entire audio data.
- the video track 1004 and audio track 1005 have almost the same configuration contents. More specifically, respective tracks record various kinds of metadata information of actual media data. The contents are slightly different in accordance with the characteristic of media data.
- Data contained in the video track 1004 include, for example, configuration information of a so-called decoder for decoding encoded data, and information on the rectangular size of a moving image.
- the data include an offset 1006 representing a position in a file where media data is actually recorded, and a sample size 1007 representing the size of each frame data (also called a picture) of media data.
- the video track 1004 also records a time stamp 1008 representing the decoding time of each frame data.
- the media data 1003 records the entities of moving image data and audio data in a data structure “chunk” which successively records one or more “samples” representing the basic unit of encoded data.
- the chunk includes a video chunk 1009 containing media data of a moving image, and an audio chunk 1010 containing media data of audio data in accordance with the track of the metadata 1002 .
- the video chunk 1009 and audio chunk 1010 are alternately recorded (interleaved), but the recording positions and order are not limited to those shown in FIG. 12 .
- the recording positions and order shown in FIG. 12 are merely an example of a general recording format.
- this interleave arrangement can improve the accessibility of data recorded in a file because moving image data and audio data to be played back almost simultaneously are arranged at close positions. Thus, the interleave arrangement is very popular.
- the chunk contains one or more samples of each media data.
- the video chunk 1009 successively records video samples (frames) 1011 .
- each video sample (frame) 1011 corresponds to one frame data (picture) of video data.
- Each track and each chunk are associated as follows.
- information contained in the video track 1004 includes information on each video chunk 1009 contained in the media data 1003 .
- the offset 1006 is formed from a table of information representing the relative position of the video chunk 1009 in a corresponding file. By looking up each entry of the table, the position of an actual video chunk can be specified regardless of where the video chunk is recorded.
- the sample size 1007 describes, in a table, the sizes of respective samples, i.e., video frames contained in a plurality of chunks.
- the video track 1004 also describes information on the number of samples contained in each chunk. From this information, samples contained in each video chunk 1009 can be obtained accurately.
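Using these tables, the position of a given sample can be resolved as in this sketch, which assumes for brevity a uniform number of samples per chunk (the MP4 format actually lets it vary from chunk to chunk).

```python
def locate_sample(chunk_offsets, samples_per_chunk, sample_sizes, n):
    """Return (file_offset, size) of sample n (0-based) from the
    metadata tables described above: the chunk offset table, the
    samples-per-chunk count, and the per-sample size table."""
    chunk = n // samples_per_chunk
    first_in_chunk = chunk * samples_per_chunk
    offset = chunk_offsets[chunk]
    for i in range(first_in_chunk, n):
        offset += sample_sizes[i]      # skip earlier samples in the chunk
    return offset, sample_sizes[n]
```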
- the time stamp 1008 records the decoding time of each sample in a table as the difference between samples.
- a so-called time stamp of each sample can be obtained by calculating the accumulated time.
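Recovering absolute time stamps from the difference table is a simple accumulation, sketched here (the first sample is assumed to decode at time 0):

```python
def decode_times(deltas):
    """The time stamp 1008 stores per-sample decoding-time differences;
    absolute time stamps are obtained by accumulating them."""
    times, t = [], 0
    for d in deltas:
        times.append(t)
        t += d
    return times
```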
- the relationship between the track and the chunk is defined so that it is also similarly established between even the audio track 1005 and the audio chunk 1010 .
- the metadata 1002 and media data 1003 can provide encoded data in a necessary unit from an arbitrary position together with additional information such as the time stamp. For descriptive convenience, not all pieces of standardized recording information have been described. Details of the definition contents of the standard can be acquired from a corresponding section of ISO/IEC 14496.
- Data recorded in a file is held in a different type of BOX in accordance with the type of data.
- the metadata 1002 is recorded as a movie BOX ‘moov’ which stores metadata information of whole contents. Information on the above-described chunk and sample is also recorded as BOX having a unique identifier in moov for each track.
- the MP4 file format not only records all metadata in moov, but also permits dividing metadata into a plurality of areas in time series and recording them. This format is called “Fragmented Movie”.
- FIG. 13 shows the file structure of the fragmented movie format.
- the fragmented movie format allows dividing media data and metadata of contents by an arbitrary time. “Fragments” are recorded from the start of a file in time series.
- moov 1101 represents metadata of the first fragment, and holds information on data contained in mdat 1102 .
- moof 1103 subsequent to the mdat 1102 represents metadata of the second fragment, and holds information on mdat 1104 .
- subsequent fragments are recorded in the same manner.
- a Movie Extends Box (‘mvex’) 1105 representing the presence of fragments needs to be added to the moov 1101 .
- Information contained in the mvex 1105 is, e.g., the duration (time length) of whole contents including all fragments.
- a variety of attributes associated with media data are held as a metadata area separately from the media data. Thus, desired sample data can be easily accessed regardless of how to physically store media data.
- the moving image file format used to record moving image data and audio data in the first embodiment is the MP4 fragmented movie format as shown in FIG. 13 .
- a method of associating the above-described dust correction data with the video sample (frame) 1011 in moving image recording will be explained.
- the method according to the first embodiment is also applicable to standards which adopt file formats and architectures similar to those defined in MP4, such as the standards “Motion JPEG 2000 file format” (ISO/IEC 15444-3) and “AVC file format” (ISO/IEC 14496-15), and a 3GPP (3rd Generation Partnership Project) file format serving as a moving image file which is constrained on the premise of the use on wireless terminals including third-generation cell phones (see 3GPP TS 26.244 “Technical Specification Group Services and System Aspects Transparent end-to-end packet switched streaming service (PSS); 3GPP file format (3GP) (Release 6)” 3rd Generation Partnership Project; 2003-02-28).
- FIG. 14 is a flowchart showing processing to associate dust correction data with the frame 1011 and record a moving image.
- the system control circuit 50 performs this processing by executing a moving image recording processing program stored in the nonvolatile memory 56 .
- the nonvolatile memory 56 stores dust correction data.
- the memory 30 has already stored an aperture value (F-number) and lens pupil position as lens information of a lens attached at the start of moving image shooting.
- the lens information is copied to the memory 52 at the start of moving image recording.
- the system control circuit 50 obtains the lens information by communicating with the lens unit 300 .
- the user needs to change the shooting mode from a still image shooting mode to a moving image shooting mode using the menu button 124 or mode dial 60 .
- the system control circuit 50 flips up the mirror 130 to retract it from the imaging optical path.
- the system control circuit 50 opens the shutter 12 to expose the image sensor 14 to object light.
- Image data obtained by exposure are successively written at a predetermined rate in the memory 30 serving as a frame buffer.
- the LCD monitor 417 functions as an electronic viewfinder (EVF) to sequentially display the written image data.
- the operation unit 70 detects whether the user has pressed the moving image recording start button (e.g., he has pressed the SET button 117 in the moving image shooting mode). If so, moving image shooting starts to sequentially record image data on the recording medium 200 in the MP4 file format.
- FIG. 15 shows an example of the data format of the created dust position correction data.
- the dust position correction data stores an aperture value and lens pupil position information serving as lens information of a lens used in moving image shooting, and the dust correction data shown in FIG. 5 .
- the memory 52 stores the created dust position correction data.
- in step S1203, the dust position correction data stored in the memory 52 is read and written in the moov of metadata of the current fragment, like dust position correction data 1502 in FIG. 17A .
- the system control circuit 50 functions as an information recording unit and fragment information storage unit. Note that the data structure in FIGS. 17A and 17B will be described later.
- in step S1204, moving image shooting, image processing, and compression processing are performed, and moving image data is written in mdat of the current fragment (step S 1205 ).
- the system control circuit 50 functions as a fragment recording unit.
- in step S1206, it is determined whether the user has requested the end of moving image recording, i.e., has pressed a moving image recording stop button (e.g., the SET button 117 during moving image recording). If the user has requested the end of moving image recording, the process ends (step S 1210 ). If not, it is checked whether lens information has been updated (step S 1207 ). The lens information is updated upon a change of the lens pupil position when the user operates the lens to zoom in/out the object image, or upon a change of the aperture value made by the user with an operation member such as the electronic dial 411 .
- the zoom control unit 344 notifies the system control circuit 50 of a change of the pupil position via the connectors 322 and 122 . Also, the system control circuit 50 is notified of a change of the aperture value as signals of many switches including the electronic dial 411 . In this case, the zoom control unit 344 and operation unit 70 function as a lens information update notification unit.
- upon receiving the notification, the system control circuit 50 functions as a lens information obtaining unit.
- the system control circuit 50 stores the notified current lens information in the memory 30 , and rewrites it over lens information stored in the memory 52 .
- When the photometry control unit 46 detects an abrupt change in the brightness of the object, it notifies the system control circuit 50 . The system control circuit 50 then causes the stop control unit 340 to drive and control the aperture blades.
- the system control circuit 50 obtains notified lens information.
- the zoom control unit 344 and focus control unit 342 notify the system control circuit 50 of this, and the system control circuit 50 obtains lens information.
- The lens information stored in the memory 52 overwrites the aperture value and lens pupil position that were used when the dust position correction data was created ( FIG. 15 ) (step S 1208 ).
- During the write, moof, serving as the metadata BOX of a new fragment, and mdat, serving as its media data BOX, are appended after the current fragment, and the write position is updated to the newly created fragment (step S 1209 ).
- the system control circuit 50 functions as a fragment creation unit and fragment change control unit. Thereafter, the process returns to step S 1203 to write the dust position correction data updated in step S 1208 in moof of metadata of the added fragment, like dust position correction data 1503 in FIG. 17A .
- If no update of the lens information is detected in step S 1207 , moving image shooting, image processing, and compression processing are performed (step S 1204 ) without fragmentation, and the moving image data is written in mdat of the current fragment (step S 1205 ).
- The series of processes (steps S 1203 , S 1204 , S 1205 , S 1206 , S 1207 , S 1208 , and S 1209 ) is repeated until the user issues an end request.
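As a rough sketch, the recording loop of steps S1203 to S1210 can be written as follows. The `camera` and `mp4` interfaces and every method name here are hypothetical stand-ins for illustration, not APIs from this document:

```python
def record_moving_image(camera, mp4):
    """Sketch of the recording loop of FIG. 14 (steps S1203-S1210)."""
    while True:
        # S1203: write dust position correction data into the metadata BOX
        # (moov for the first fragment, moof for fragments added later)
        mp4.write_metadata(camera.dust_position_correction_data())
        while True:
            frame = camera.shoot_process_compress()   # S1204
            mp4.write_mdat(frame)                     # S1205
            if camera.stop_requested():               # S1206
                return                                # S1210: end of recording
            if camera.lens_info_updated():            # S1207
                camera.overwrite_lens_info()          # S1208
                mp4.new_fragment()                    # S1209: append moof + mdat
                break                                 # back to S1203
```

Each lens-information update thus produces exactly one new fragment, whose metadata BOX receives the updated dust position correction data on the next pass of the outer loop.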
- a moving image file created upon receiving an end request (step S 1210 ) records various kinds of metadata information in moov and moof of respective fragments, mvex necessary for the fragment format, and media data in mdat so as to be compatible with the standard.
- In the above description, dust position correction data is recorded for each fragment. However, since dust correction data itself does not change during moving image shooting, dust position correction data may also be recorded only in moov of the first metadata in the format of FIG. 15 . In that case, only the aperture value and lens pupil position during shooting, which do change during moving image shooting, are recorded in moof of the metadata after fragmentation.
- FIG. 16 is a chart showing an example of fragmentation of a generated moving image file. Recording starts at time 1301 and stops at time 1304 . Fragmentation events upon detecting changes of lens information such as zoom-in/out and a change of the aperture value occur at time 1302 and time 1303 . A first fragment 1305 stores dust position correction data including lens information, and moving image data from the recording start time 1301 to the time 1302 when the first fragmentation event occurs.
- a second fragment 1306 is generated as a new fragment.
- the second fragment 1306 stores dust position correction data including lens information, and moving image data from the first fragmentation event generation time 1302 to the second fragmentation event generation time 1303 .
- a third fragment 1307 is generated as a new fragment.
- the third fragment 1307 stores dust position correction data including lens information, and moving image data from the second fragmentation event generation time 1303 to the time 1304 when the user requests the stop of recording.
- In this way, one moving image file is created with a plurality of fragments, such as the first fragment 1305 , second fragment 1306 , and third fragment 1307 , each generated when a change of the lens information is detected.
- Alternatively, a new moving image file may be created in step S 1209 instead of a new fragment. In this case, dust position correction data is always added to moov of the metadata, and a plurality of moving image files have been generated by the time of the end request (step S 1210 ).
- FIGS. 17A and 17B are schematic views for explaining the data structure of the MP4 file format in the first embodiment. They show the case in which the two fragmentation events shown in FIG. 16 (the time 1302 and time 1303 ) occur, changing the lens information and generating three fragments in one moving image file.
- the dust position correction data 1502 is added to the video track 1004 in order to associate dust correction data with each frame of a moving image.
- the MP4 file format permits recording data unique to a system by using an extended BOX with a type ‘uuid’, or using User Data Box (‘udta’).
- uuid 1501 is set in the video track of moov or moof of each fragment to write dust position correction data as unique data, as shown in FIG. 17A .
- the dust position correction data is stored in association with each frame until lens information is updated.
- The MP4 file format also permits not only recording ‘uuid’ in the video tracks of moov and moof, but also recording it in parallel with media data and metadata, like ‘uuid’ 2001 in FIG. 18A . Dust position correction data may also be recorded as shown in FIGS. 18A and 18B .
- ‘uuid’ 2101 may also be set at the end of a moving image file.
- dust position correction data 2102 , 2103 , and 2104 corresponding to the first, second, and third fragments are described in time series.
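As an illustrative sketch (not code from this document), a ‘uuid’ extension BOX of the kind described above could be assembled like this. The all-zero UUID is a placeholder for whatever 16-byte vendor UUID the system actually registers:

```python
import struct

# Placeholder usertype; a real system would use its own registered 16-byte UUID.
DUST_DATA_UUID = bytes(16)

def make_uuid_box(payload: bytes) -> bytes:
    """Build a 'uuid' BOX: 4-byte big-endian size (covering the whole BOX,
    size field included), 4-byte type 'uuid', a 16-byte usertype UUID,
    then the vendor-specific payload (e.g. dust position correction data)."""
    size = 4 + 4 + 16 + len(payload)
    return struct.pack(">I4s", size, b"uuid") + DUST_DATA_UUID + payload
```

The resulting byte string can be appended in parallel with the moov/moof and mdat BOXes, or embedded inside a video track, as the format permits.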
- Dust position correction data may also be stored as a separate file, like a dust position correction data file 2201 in FIG. 20A .
- In this case, the dust position correction data file and the MP4 file 1001 need to have the same file name with different extensions, as shown in FIGS. 20A and 20B , or the MP4 file 1001 needs to describe the name of the dust position correction data file as unique data in udta. Needless to say, udta can also record the dust position correction data itself.
- FIG. 21 is a block diagram showing the schematic system configuration of the image processing apparatus.
- a CPU 1601 controls the overall system, and executes a program stored in a primary storage 1602 .
- the primary storage 1602 is mainly a memory.
- the primary storage 1602 loads a program from a secondary storage 1603 , and stores it.
- the secondary storage 1603 is, e.g., a hard disk. In general, the primary storage is smaller in capacity than the secondary storage.
- the secondary storage stores programs, data, and the like which cannot be completely stored in the primary storage.
- the secondary storage also stores data which need to be stored for a long time.
- The secondary storage 1603 stores programs; when a program is executed, it is loaded into the primary storage 1602 and executed by the CPU 1601 .
- An input device 1604 includes a mouse and keyboard used to control the system, and a card reader, scanner, and film scanner necessary to input image data.
- An output device 1605 is, e.g., a monitor or printer.
- The apparatus can take various other arrangements, but they are not the gist of the present invention and a description thereof will be omitted.
- the image processing apparatus incorporates an operating system capable of parallel-executing a plurality of programs.
- the user can use a GUI (Graphical User Interface) to operate a program running on the apparatus.
- FIG. 22 is a view showing the GUI of an image edit program in the image processing apparatus.
- the window has a close button 1700 and title bar 1701 .
- the user ends the program by pressing the close button.
- the user designates a moving image file to be corrected by dragging and dropping it to an image display area 1702 .
- the title bar 1701 displays the file name.
- The image display area 1702 displays the first frames 2301 of the respective fragments side by side as thumbnails.
- When the user selects one of these thumbnails, the image display area 1702 displays all frames in that fragment, including the first frame, side by side as thumbnails.
- When the user then selects a frame, the frame to be corrected is displayed fitted to the image display area 1702 .
- When execution is instructed, dust removal processing (to be described later) is executed, and the image display area 1702 displays the processed image.
- When the user presses a step execution button 1704 , one step of the dust removal processing (to be described later) is executed, and the image display area 1702 displays the processed image.
- When the user presses a save button 1705 , the target frame is replaced with the processed one, and the resultant moving image file is saved.
- the method of designating a frame to be corrected by dust removal processing is not limited to this.
- all frames may also be displayed first as thumbnails, like the first frames 2301 , to prompt the user to select a frame to be corrected.
- the user may also designate a fragment while fragments are displayed as thumbnails. In this case, all frames in the designated fragment are automatically extracted one by one. As frames to be corrected, the extracted frames sequentially undergo dust removal processing.
- the user may designate a moving image file. Also in this case, all frames are automatically extracted one by one. As frames to be corrected, the extracted frames sequentially undergo dust removal processing.
- In step S 1801 , the dust position correction data 1502 added to the fragment containing the designated frame to be corrected is obtained.
- In step S 1802 , dust correction data is extracted from the obtained dust position correction data 1502 , and the processing in step S 1803 is performed.
- In step S 1804 , the aperture value and lens pupil position at the time of shooting are obtained from the dust position correction data, and step S 1805 is executed based on this information.
- In step S 1806 , correction processing is repeated until dust removal is completed (step S 1807 ).
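The per-frame flow of steps S1801 to S1807 can be sketched as below. The dictionary layout and the `convert`/`interpolate` callables are hypothetical stand-ins for the step-S1805 parameter conversion and the pixel interpolation, not structures defined in this document:

```python
def remove_dust_from_frame(frame, dpc, convert, interpolate):
    """Apply dust removal to one frame using its fragment's dust position
    correction data (dpc), looping over every recorded dust region."""
    dust_list = dpc["dust_correction_data"]                # S1802: extract dust data
    aperture = dpc["aperture_value"]                       # S1804: shooting-time
    pupil = dpc["pupil_position"]                          #        lens information
    for dust in dust_list:                                 # S1806/S1807: until done
        center, radius = convert(dust, aperture, pupil)    # S1805: convert parameters
        frame = interpolate(frame, center, radius)         # correction processing
    return frame
```

Because the loop touches only the regions named by the dust position correction data, the rest of the frame passes through untouched, which is what keeps the detection-error probability low.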
- the dust removal processing using a separately prepared image processing apparatus has been described, but the dust removal processing may also be done within the digital camera body.
- the system control circuit 50 performs the same processing as that shown in the flowchart of FIG. 10 by executing a dust removal processing program stored in the nonvolatile memory 56 .
- the system control circuit 50 reads out still image data stored in the memory 30 to the image processing circuit 20 .
- the image processing circuit 20 performs the processing shown in FIG. 10 , and executes dust pixel interpolation processing.
- the recording medium 200 records the interpolation processing result as a new moving image file.
- Dust position correction data including lens information and dust position information during shooting is attached to each fragment.
- Dust position correction data is compact data formed from the dust position, size, and conversion data (aperture value and lens pupil position information), and does not excessively increase the size of the metadata BOXes moov and moof. Interpolation processing is done only for regions containing pixels designated by the dust position correction data, so the probability of detection errors can be greatly decreased.
- a moving image file is fragmented every time lens information is updated upon a lens operation during moving image shooting.
- the second embodiment will explain a method of fragmenting a moving image file in accordance with the change amount of lens information.
- FIG. 23 is a flowchart showing control to fragment a moving image file in accordance with the change amount of lens information.
- the same step numbers as those in FIG. 14 denote the same operations as those in FIG. 14 , and a difference from FIG. 14 will be mainly explained.
- First, a new file is created (step S 1201 ), and dust position correction data is created.
- Then, the memory 52 stores, as P, the lens pupil position in the current lens information (step S 2301 ).
- the file stores the created dust position correction data (step S 1203 ).
- Next, moving image shooting, image processing, and compression processing are performed (step S 1204 ), and the moving image is stored in the file (step S 1205 ).
- In step S 1206 , it is checked whether the user has requested the end of moving image recording. If not, it is checked whether the lens information has been updated (step S 1207 ).
- If it has, the lens pupil position in the updated lens information is compared with P stored in step S 2301 to check whether the change amount of the lens pupil position is equal to or larger than a predetermined value P 0 , i.e., whether change amount ≥ P 0 (step S 2302 ).
- P 0 is an arbitrary value, or is set so that the center coordinates Di′ do not differ greatly between the pupil position P and the updated pupil position when the center coordinates of dust are calculated using equation (1) in step S 1804 of FIG. 10 .
- If the change amount ≥ P 0 , the center coordinates Di′ change greatly when calculated using P in step S 1804 . In the dust region determination (step S 1901 in FIG. 11 ), none of the conditions is then satisfied, and it is determined that dust does not exist even though it actually does. To prevent this, the lens information in the dust position correction data is overwritten to set the current lens pupil position as P (step S 2303 ).
- Then, in step S 1209 , the fragment position is updated, and the process returns to step S 1203 . If the change amount < P 0 , no fragmentation is executed, and the process returns to step S 1204 to perform moving image shooting, image processing, and compression processing.
- In this example, the change amount of the lens pupil position is used for the determination.
- Alternatively, the change amount of the aperture value, or a combination of these two change amounts, may be used.
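The step-S2302 decision amounts to a simple threshold test on the change amount. The following sketch treats the compared quantity generically (pupil position, aperture value, or either of them), which is an illustrative choice, not a structure given in this document:

```python
def should_fragment(stored, current, threshold):
    """Return True when the lens-information change amount reaches the
    predetermined value (step S2302), e.g. |pupil_position - P| >= P0."""
    return abs(current - stored) >= threshold
```

Only when this returns True is a new fragment created and the stored reference value P overwritten with the current one (step S2303); otherwise recording continues in the current fragment.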
- the moving image file is fragmented in accordance with the change amount of lens information. This can prevent an increase in the size of a moving image file and deterioration of file accessibility in playback that are caused by unnecessary file fragmentation.
- a system control circuit 50 performs this processing by executing a moving image shooting processing program stored in a nonvolatile memory 56 .
- the user needs to change the shooting mode from a still image shooting mode to a moving image shooting mode using a mode dial 60 or the like.
- When the moving image shooting routine starts, the system control circuit 50 operates the quick return mirror 130 shown in FIG. 3 to flip it up and retract it from the imaging optical path.
- the system control circuit 50 opens a shutter 12 to expose an image sensor 14 to object light. Image data obtained by exposing the image sensor 14 are successively written at a predetermined rate in a memory 30 serving as a frame buffer.
- An LCD monitor 417 functions as an electronic viewfinder (EVF) to sequentially display the written image data.
- Then, it is detected whether the user has pressed the moving image recording start button (e.g., the SET button 117 in the moving image shooting mode). If so, moving image shooting starts, and image data are sequentially recorded on the recording medium 200 in the MP4 file format.
- the dust position correction data stores an aperture value and lens pupil position information serving as lens information of a lens used in moving image shooting, and the dust correction data shown in FIG. 5 .
- a memory 52 stores the created dust position correction data.
- the dust position correction data stored in the memory 52 is read and written in the moov of metadata of the current fragment.
- FIG. 24 is a flowchart showing an operation when a lens unit 300 is driven to zoom during moving image shooting.
- When the system control circuit 50 detects that the zoom control unit 344 has started zoom driving during moving image shooting, it performs the following processing.
- Upon detecting the start of zoom driving, the system control circuit 50 newly creates a fragment (step S 1401 ).
- the shot moving image data is fragmented to record the fragments.
- the system control circuit 50 exposes the image sensor 14 to perform moving image shooting processing.
- the memory 30 stores the generated moving image data.
- An image processing circuit 20 performs image processing sequentially for respective frames of the moving image data, and the memory 30 records them (step S 1402 ).
- the system control circuit 50 receives, from the zoom control unit 344 , information representing whether the lens is during zoom driving.
- the system control circuit 50 determines whether the lens unit 300 is during zoom driving (zoom operation) (step S 1403 ).
- If the system control circuit 50 determines in step S 1403 that the lens unit 300 is during zoom driving, it obtains lens information (step S 1404 ).
- the lens information includes an aperture value and pupil position.
- the system control circuit 50 determines whether the lens information of the current frame that has been obtained in step S 1404 has changed from that of a previous frame (step S 1405 ).
- If the system control circuit 50 determines in step S 1405 that the lens information has changed, it records the lens information of the current frame in moof of the metadata of the current fragment (step S 1406 ).
- If the system control circuit 50 determines in step S 1405 that the lens information has not changed, it performs moving image shooting, image processing, and compression processing without fragmentation, and writes the moving image data in mdat of the current fragment.
- In this manner, changed lens information is additionally written in moof of the metadata of one fragment, together with information, such as a frame count, representing the range of frames corresponding to the same lens information.
- The information representing the range of frames corresponding to the same lens information is not limited to the number of frames; any other information may be used as long as it can associate lens information with the corresponding frames.
- If the system control circuit 50 determines in step S 1403 that the lens unit 300 is not during zoom driving, it performs fragmentation to newly create a fragment, and ends the operation during zoom driving.
- The series of processes (steps S 1402 to S 1406 ) is repeated until the system control circuit 50 determines that zoom driving has ended.
- lens information corresponding to each frame is read out from moof of the fragment to perform dust removal.
- the third embodiment provides the following effects.
- Since dust correction data is attached to an image in the above-described manner, there is no need to pay attention to the correspondence between dust correction image data and shot image data.
- Dust correction data is compact data formed from the position, size, and conversion data (aperture value and lens pupil position information), and does not excessively increase the size of shot image data. Interpolation processing is done for only a region containing pixels designated by dust correction data, so the probability of detection errors can greatly decrease.
- lens information is obtained when recording each frame of a moving image during zoom driving of the lens. If the lens information changes, it is recorded.
- lens information is recorded at only the start and end of zoom driving of the lens.
- FIG. 25 is a flowchart showing moving image recording processing during zoom driving in the fourth embodiment.
- When the system control circuit 50 detects that the zoom control unit 344 has started zoom driving during moving image shooting, it performs the following processing.
- Upon detecting that zoom driving has started, the system control circuit 50 newly creates a fragment (step S 1501 ).
- Next, the system control circuit 50 obtains lens information, and records it in moof of the metadata of the current fragment as the lens information at the start of zoom driving (step S 1502 ).
- the lens information includes an aperture value and pupil position.
- the system control circuit 50 exposes an image sensor 14 to perform moving image shooting processing.
- a memory 30 stores the shot moving image.
- An image processing circuit 20 performs image processing sequentially for respective frames of the shot moving image, and the memory 30 records them (step S 1503 ).
- the system control circuit 50 receives, from the zoom control unit 344 , information representing whether the lens is during zoom driving.
- the system control circuit 50 determines whether the lens is during zoom driving (step S 1504 ).
- the processes in steps S 1503 and S 1504 are repeated until the system control circuit 50 determines that the lens is not during zoom driving.
- If the system control circuit 50 determines in step S 1504 that the lens is not during zoom driving, i.e., that zoom driving has ended, it obtains lens information (step S 1505 ), and records the obtained lens information in moof of the metadata of the current fragment as the lens information at the end of zoom driving.
- the lens information includes an aperture value and pupil position.
- the system control circuit 50 performs fragmentation to newly create a fragment, and ends the sequence during zoom driving (step S 1506 ).
- When converting the dust correction parameters in step S 1805 of the dust removal processing of FIG. 10 , the pieces of lens information at the start and end of zoom driving are read out from moof of the fragment. For intermediate frames during zoom driving, the lens information is interpolated based on the difference between these two pieces of lens information, and dust removal is then performed.
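The fourth embodiment's interpolation for intermediate frames can be sketched as linear interpolation between the recorded start-of-zoom and end-of-zoom values. Linearity is an assumption here; the document only says the interpolation is based on the difference between the two pieces of lens information:

```python
def interpolate_lens_info(start, end, frame_index, frame_count):
    """Estimate (aperture_value, pupil_position) for frame `frame_index`
    of `frame_count` frames spanning zoom start (index 0) to zoom end
    (index frame_count - 1), by linear interpolation."""
    if frame_count <= 1:
        return start
    t = frame_index / (frame_count - 1)   # 0.0 at zoom start, 1.0 at zoom end
    return tuple(s + (e - s) * t for s, e in zip(start, end))
```

For example, the middle frame of a three-frame zoom from (f/4, pupil 100) to (f/8, pupil 60) would be assigned (f/6, pupil 80) before the dust correction parameters are converted.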
- the fourth embodiment can achieve almost the same effects as those of the third embodiment.
- the fourth embodiment can reduce the data amount because lens information is recorded at only the start and end of zoom driving.
- lens information is recorded together with a moving image at a predetermined frame interval during zoom driving of a lens.
- FIG. 26 is a flowchart showing moving image recording processing during zoom driving in the fifth embodiment.
- When the system control circuit 50 detects that the zoom control unit 344 has started zoom driving during moving image shooting, it performs the following processing.
- Upon detecting that zoom driving has started, the system control circuit 50 newly creates a fragment (step S 1601 ).
- the system control circuit 50 obtains lens information, and records it in moof of metadata of the current fragment (step S 1602 ).
- the lens information includes an aperture value and pupil position.
- the system control circuit 50 starts counting frame intervals. More specifically, the system control circuit 50 substitutes “1” into the count value (step S 1603 ).
- the system control circuit 50 exposes an image sensor 14 to perform moving image shooting processing.
- a memory 30 stores the shot moving image.
- An image processing circuit 20 performs image processing sequentially for respective frames of the shot moving image, and the memory 30 records them (step S 1604 ).
- the system control circuit 50 increments, by one, the count value for counting frame intervals (step S 1605 ).
- the system control circuit 50 receives, from the zoom control unit 344 , information representing whether the lens is during zoom driving.
- the system control circuit 50 determines whether the lens is during zoom driving (step S 1606 ).
- The series of processes (steps S 1602 to S 1607 ) is repeated until the system control circuit 50 determines that the lens is not during zoom driving.
- In step S 1607 , the system control circuit 50 determines whether the count value has reached a predetermined frame count (10 frames in FIG. 26 ). The series of processes (steps S 1604 to S 1607 ) is repeated until the count value reaches the predetermined frame count. When it does, the system control circuit 50 records the lens information again in step S 1602 , restarts counting in step S 1603 , and performs the series of processes (steps S 1604 to S 1607 ).
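The counting in steps S1603 to S1607 amounts to re-recording lens information on the first frame of the zoom and then on every N-th frame thereafter. A sketch follows; zero-based frame numbering is an illustrative choice:

```python
def lens_info_frames(total_frames, interval=10):
    """Indices of frames at which lens information is recorded during
    zoom driving in the fifth embodiment: the first frame, then every
    `interval`-th frame (10 frames in the FIG. 26 example)."""
    return [i for i in range(total_frames) if i % interval == 0]
```

Frames between these indices carry no lens information of their own and are handled by interpolation at dust removal time.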
- If the system control circuit 50 determines in step S 1606 that the lens is not during zoom driving, i.e., that zoom driving has ended, it performs fragmentation to newly create a fragment in step S 1608 , and ends the sequence during zoom driving.
- When converting the dust correction parameters in step S 1805 of the dust removal processing of FIG. 10 , the pieces of lens information recorded at the predetermined frame intervals are read out from moof of the fragment. For intermediate frames for which no lens information is recorded, the lens information is interpolated based on the difference between the preceding and succeeding pieces of lens information, and dust removal is then performed.
- the fifth embodiment can attain almost the same effects as those of the third embodiment.
- the fifth embodiment can reduce the data amount because lens information is recorded at predetermined frame intervals.
- The functions of the above-described embodiments can also be implemented by supplying, to a system or apparatus, a storage medium (or recording medium) which stores software program codes that implement them.
- The computer (or the CPU or MPU) of the system or apparatus reads out and executes the program codes stored in the storage medium.
- the program codes read out from the storage medium implement the functions of the above-described embodiments.
- the storage medium that stores the program codes constitutes the present invention.
- the functions of the above-described embodiments are implemented by executing the readout program codes by the computer.
- the present invention also includes a case wherein the operating system (OS) or the like running on the computer executes part or all of actual processing on the basis of the instructions of the program codes, thereby implementing the functions of the above-described embodiments.
- the present invention also includes the following case. More specifically, the program codes read out from the storage medium are written in the memory of a function expansion card inserted into the computer or the memory of a function expansion unit connected to the computer.
- the CPU of the function expansion card or function expansion unit executes part or all of actual processing on the basis of the instructions of the program codes, thereby implementing the functions of the above-described embodiments.
- the storage medium stores program codes corresponding to the above-described procedures.
Abstract
Description
T1 = Lave × 0.6 + Lmax × 0.4

ri = √[{(Xmax − Xmin)/2}² + {(Ymax − Ymin)/2}²]

Di′(x, y) = (L2 × (L1 − H) × d / ((L2 − H) × L1)) × Di(x, y)

Ri′ = (Ri × f1/f2 + 3) (1)

where d is the distance from the image center to the coordinates Di, and H is the distance from the surface of the

T2 = Yave × 0.6 + Ymax × 0.4
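Read literally, the Di′ and Ri′ formulas above can be evaluated as follows. The roles assigned to the symbols (L1, f1 at dust detection time; L2, f2 at shooting time; H the distance of the dust from the sensor surface) are assumptions for illustration, and the code is a literal transcription of the equations as printed:

```python
import math

def convert_dust(di, ri, l1, f1, l2, f2, h):
    """Evaluate Di'(x, y) and Ri' (equation (1)) as written above.

    di: (x, y) dust center relative to the image center
    ri: dust radius before conversion
    l1, l2: lens pupil positions; f1, f2: aperture values; h: dust distance
    """
    x, y = di
    d = math.hypot(x, y)                           # distance from image center to Di
    scale = l2 * (l1 - h) * d / ((l2 - h) * l1)    # factor in the Di' formula
    di_prime = (scale * x, scale * y)              # Di'(x, y)
    ri_prime = ri * f1 / f2 + 3                    # Ri' per equation (1)
    return di_prime, ri_prime
```

Stopping the lens down between detection and shooting (f2 < f1) enlarges Ri′, and the constant 3 adds a fixed margin to the corrected region.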
- Size: the size of the entire BOX including the size field itself.
- Type: a 4-byte type identifier representing the type of BOX. In general, the type identifier is made up of four alphanumeric characters.
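A minimal reader for the Size/Type header described above can be sketched as follows; it ignores the 64-bit largesize extension that the full standard also allows:

```python
import io
import struct

def read_box_header(stream):
    """Read one BOX header: a 4-byte big-endian Size (covering the whole
    BOX, size field included) followed by a 4-byte Type identifier,
    which is generally four alphanumeric characters."""
    raw = stream.read(8)
    if len(raw) < 8:
        return None                  # end of stream / truncated BOX
    size, box_type = struct.unpack(">I4s", raw)
    return size, box_type.decode("ascii", errors="replace")
```

Reading the header of a 16-byte `moov` BOX from an in-memory stream, for example, yields `(16, 'moov')`; a parser would then skip `size - 8` bytes of payload to reach the next BOX.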
Claims (4)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/061,279 US9013608B2 (en) | 2008-06-05 | 2013-10-23 | Image sensing apparatus comprising foreign substance detection control method thereof, and program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008148319A JP5014262B2 (en) | 2008-06-05 | 2008-06-05 | Imaging apparatus, control method thereof, and program |
JP2008-148319 | 2008-06-05 | ||
JP2008174954A JP5241348B2 (en) | 2008-07-03 | 2008-07-03 | Imaging apparatus, control method thereof, and program |
JP2008-174954 | 2008-07-03 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/061,279 Division US9013608B2 (en) | 2008-06-05 | 2013-10-23 | Image sensing apparatus comprising foreign substance detection control method thereof, and program |
Publications (2)
Publication Number | Publication Date |
---|---|
US20090303339A1 US20090303339A1 (en) | 2009-12-10 |
US8593537B2 true US8593537B2 (en) | 2013-11-26 |
Family
ID=41399946
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/466,839 Expired - Fee Related US8593537B2 (en) | 2008-06-05 | 2009-05-15 | Image sensing apparatus, control method thereof, and program for suppressing image deterioration caused by foreign substance |
US14/061,279 Expired - Fee Related US9013608B2 (en) | 2008-06-05 | 2013-10-23 | Image sensing apparatus comprising foreign substance detection control method thereof, and program |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/061,279 Expired - Fee Related US9013608B2 (en) | 2008-06-05 | 2013-10-23 | Image sensing apparatus comprising foreign substance detection control method thereof, and program |
Country Status (1)
Country | Link |
---|---|
US (2) | US8593537B2 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004222231A (en) | 2002-12-27 | 2004-08-05 | Nikon Corp | Image processing apparatus and image processing program |
JP2005328279A (en) | 2004-05-13 | 2005-11-24 | Canon Inc | Recording device |
US20060115177A1 (en) | 2002-12-27 | 2006-06-01 | Nikon Corporation | Image processing device and image processing program |
WO2007032145A1 (en) | 2005-09-13 | 2007-03-22 | Sony Corporation | Imaging device and recording method |
US20070159551A1 (en) * | 2006-01-12 | 2007-07-12 | Takuya Kotani | Image capturing apparatus, control method thereof, and program |
JP2007189467A (en) | 2006-01-12 | 2007-07-26 | Canon Inc | Image processing apparatus, image processing method, and program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100197609B1 (en) * | 1996-07-27 | 1999-06-15 | 윤종용 | How to perform zoom function of video camera and its device |
US20050168485A1 (en) * | 2004-01-29 | 2005-08-04 | Nattress Thomas G. | System for combining a sequence of images with computer-generated 3D graphics |
KR101156115B1 (en) * | 2005-05-18 | 2012-06-21 | 삼성전자주식회사 | Method for controlling digital image processing apparatus |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004222231A (en) | 2002-12-27 | 2004-08-05 | Nikon Corp | Image processing apparatus and image processing program |
US20060115177A1 (en) | 2002-12-27 | 2006-06-01 | Nikon Corporation | Image processing device and image processing program |
JP2005328279A (en) | 2004-05-13 | 2005-11-24 | Canon Inc | Recording device |
WO2007032145A1 (en) | 2005-09-13 | 2007-03-22 | Sony Corporation | Imaging device and recording method |
US20070159551A1 (en) * | 2006-01-12 | 2007-07-12 | Takuya Kotani | Image capturing apparatus, control method thereof, and program |
CN101001318A (en) | 2006-01-12 | 2007-07-18 | 佳能株式会社 | Image capturing apparatus and control method thereof |
JP2007189467A (en) | 2006-01-12 | 2007-07-26 | Canon Inc | Image processing apparatus, image processing method, and program |
Non-Patent Citations (2)
Title |
---|
The above foreign patent document was cited in the Jun. 25, 2012 Japanese Office Action (enclosed without an English translation) issued in Japanese Patent Application No. 2008-174854. |
The above foreign patent documents were cited in the Feb. 24, 2011 Chinese Office Action (enclosed with an English translation) issued in Chinese Patent Application No. 200910145963.1. |
Also Published As
Publication number | Publication date |
---|---|
US20140226045A1 (en) | 2014-08-14 |
US9013608B2 (en) | 2015-04-21 |
US20090303339A1 (en) | 2009-12-10 |
Similar Documents
Publication | Title |
---|---|
US9013608B2 (en) | Image sensing apparatus comprising foreign substance detection, control method thereof, and program |
CN101686331B (en) | Camera device and control method thereof |
US7796169B2 (en) | Image processing apparatus for correcting captured image |
US8335399B2 (en) | Image processing apparatus, control method therefor, and program |
US7706674B2 (en) | Device and method for controlling flash |
JP5276308B2 (en) | Imaging apparatus and control method thereof |
US8531561B2 (en) | Image processing apparatus, method for controlling image processing apparatus, and storage medium |
US20100194963A1 (en) | Display control apparatus, image capturing apparatus, display control method, and program |
US20070279512A1 (en) | Imaging apparatus |
JP5014262B2 (en) | Imaging apparatus, control method thereof, and program |
JP2002209135A (en) | Digital image pickup device and recording medium |
JP5063372B2 (en) | Image processing apparatus, control method, and program |
JP2006311340A (en) | Image display device and digital camera |
JP3943470B2 (en) | Digital camera |
JP5094665B2 (en) | Imaging apparatus, control method thereof, and program |
JP5241348B2 (en) | Imaging apparatus, control method thereof, and program |
US7233357B1 (en) | Image pickup apparatus with image evaluation |
US10284783B2 (en) | Imaging apparatus and control method of imaging apparatus |
US8902124B2 (en) | Digital image signal processing apparatus for displaying different images respectively on display units and method of controlling the same |
JP4948011B2 (en) | Imaging apparatus, control method therefor, computer program, and storage medium |
JP2007336522A (en) | Imaging apparatus |
JP2006217510A (en) | Image display device |
JP2016046611A (en) | Imaging apparatus, control method therefor, and program |
JP2006101408A (en) | Imaging device |
JP2014150362A (en) | Image pickup apparatus |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUDO, KEISUKE;GYOTOKU, TAKASHI;REEL/FRAME:023199/0188. Effective date: 20090512 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
FPAY | Fee payment | Year of fee payment: 4 |
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee | Effective date: 20211126 |