US20190191064A1 - Imaging device, display system, and imaging system
- Publication number
- US20190191064A1 (application Ser. No. 16/190,716)
- Authority
- US
- United States
- Prior art keywords
- image
- region
- resolution
- view angle
- imaging device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H04N5/2254—
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/001—Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
- G02B13/0015—Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design
- G02B13/002—Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design having at least one aspherical surface
- G02B13/0045—Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design having at least one aspherical surface having five or more lenses
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/26—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/04—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H04N5/23293—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8066—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring rearward traffic
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
-
- H04N5/374—
Definitions
- the present disclosure relates to an imaging device that captures an image of a subject, and a display system and an imaging system that use the imaging device.
- Japanese Patent Publication No. 2014-126918 discloses a camera module that is mounted on a vehicle to output an image around the vehicle to a display unit.
- the camera module disclosed in Japanese Patent Publication No. 2014-126918 (PTL1) includes an imaging unit that captures a surrounding environment of the vehicle, an image generation unit that processes a first image captured by the imaging unit to generate a second image, and a control unit that outputs the second image to the display unit when the vehicle moves forward or stops and outputs the first image to the display unit when the vehicle reverses.
- the control unit recognizes moving forward, stopping, and reversing of the vehicle based on images captured by the imaging unit.
- the present disclosure provides an imaging device that can obtain an image that has high resolution in a region at its center part, which is important for sensing, while achieving a wide view angle.
- An imaging device includes an image sensor that includes an imaging surface on which a plurality of pixels are two-dimensionally arranged and that generates image data from a subject image formed on the imaging surface and an optical system that images a subject in a predetermined range of a vertical view angle and in a predetermined range of a horizontal view angle on the imaging surface.
- a number of pixels used for capturing the subject image included in a unit view angle is defined as resolution.
- the imaging surface includes a first region including an intersection with an optical axis and a second region different from the first region.
- the optical system forms the subject image on the imaging surface so as to cause resolution of the first region to be higher than resolution of the second region.
- a relationship between a vertical view angle and resolution is different from a relationship between a horizontal view angle and resolution in the subject image.
- a display system includes the imaging device and a display that displays an image based on the image data generated by the imaging device.
- An imaging system includes the imaging device and a control device that analyzes the image data generated by the imaging device.
- the present disclosure provides an imaging device that can obtain an image that has high resolution in a region at its center part, which is important for sensing, while achieving a wide view angle.
- FIG. 1 shows a configuration of an imaging system, which is mounted on an automotive vehicle, according to a first exemplary embodiment of the present disclosure
- FIG. 2 shows a configuration of an image processing device in the imaging system
- FIG. 3 shows a configuration of an imaging device in the imaging system
- FIG. 4 is an explanatory diagram of functions of the imaging device
- FIG. 5 is an explanatory diagram of a vertical view angle of the imaging device
- FIG. 6 is an explanatory diagram of problems solved by the present disclosure.
- FIG. 7 is an explanatory diagram of concept of means for solving the problems
- FIG. 8 shows an example of a configuration of an optical system in the imaging device
- FIG. 9 is an explanatory diagram of a distribution of resolution of an image formed by the optical system of the imaging device.
- FIG. 10 is an explanatory diagram of the resolution of an image formed on an image sensor by the optical system of the imaging device
- FIG. 11 shows resolution (angle resolution) characteristics of a free-form lens used for the optical system of the imaging device
- FIG. 12(A) is an explanatory diagram of a view angle and a magnification ratio of a fisheye lens
- FIG. 12(B) is an explanatory diagram of the view angle and the magnification ratio of the optical system including a free-form lens;
- FIG. 13 is an explanatory diagram of the optical system and the image sensor in the imaging device according to the first exemplary embodiment and of a captured image formed by the optical system and the image sensor;
- FIG. 14 is an explanatory diagram of an optical system and an image sensor in an imaging device according to a second exemplary embodiment and of a captured image formed by the optical system and the image sensor;
- FIG. 15 shows resolution (angle resolution) characteristics of the optical system in the imaging device according to the second exemplary embodiment.
- FIG. 16 shows a pixel density with respect to a view angle of the image sensor in the imaging device according to the second exemplary embodiment.
- FIG. 1 shows an example of using an imaging device according to a first exemplary embodiment of the present disclosure as a rear camera for an automobile (an example of a moving body).
- imaging device 10 captures a subject behind a vehicle to generate image data.
- Imaging device 10 is mounted on vehicle 100 so as to capture a scene behind the vehicle.
- Vehicle 100 includes control device 20 that processes the image data from imaging device 10 , display 30 that displays an image based on the image data processed by control device 20 , and control target 60 that is controlled by control device 20 .
- Imaging device 10 and control device 20 constitute an imaging system.
- Imaging device 10 and display 30 constitute a display system.
- Control target 60 is at least one of a brake, an accelerator, and an alarm, for example.
- Display 30 includes a display device such as a liquid crystal display panel or an organic electro luminescence (EL) display and a drive circuit for driving the display device.
- Display 30 is an electronic room mirror, an in-vehicle display, or the like and is capable of displaying various information (maps, route guides, radio station selections, various settings, and the like).
- Display 30 also displays an image of a scene behind the vehicle captured by imaging device 10 (hereinafter, “rear view image”) when vehicle 100 reverses. As a driver checks the rear view image when reversing vehicle 100 , the driver can grasp a situation behind the vehicle and safely reverse the vehicle.
- Control device 20 receives image data from imaging device 10 .
- Control device 20 analyzes the received image data (that is, performs image analysis on the received image data). As a result of image analysis, control device 20 recognizes an object (a person, an automobile, or other obstacles) behind the vehicle and controls control target 60 as needed. Control device 20 also performs predetermined image processing on the image data from imaging device 10 to generate image data to be displayed on display 30 .
- FIG. 2 is a block diagram of a configuration of control device 20 .
- Control device 20 includes first interface 23 (for example, a circuit) that inputs image data from imaging device 10 , controller 21 that performs image processing and image analysis on the input image data, and data storage unit 29 that stores data and the like.
- Control device 20 also includes second interface 25 (for example, a circuit) that transmits the image data generated to display 30 and third interface 27 (for example, a circuit) that transmits a control signal for controlling control target 60 to control target 60 .
- Controller 21 includes a central processing unit (CPU) and a random access memory (RAM). As controller 21 performs programs stored in data storage unit 29 , various functions are achieved. Controller 21 may include a dedicated hardware circuit designed so as to achieve desired functions. In other words, controller 21 may include the CPU, a micro processing unit (MPU), a field-programmable gate array (FPGA), a digital signal processor (DSP), or an application specific integrated circuit (ASIC), for example.
- Data storage unit 29 is a recording medium such as a hard disk device, a solid state drive (SSD), or a semiconductor memory. Data storage unit 29 stores programs performed by controller 21 , data, and the like.
- FIG. 3 is a block diagram of a configuration of imaging device 10 .
- Imaging device 10 is a camera that captures a subject to generate image data.
- Imaging device 10 includes optical system 11 , image sensor 12 that captures a subject image generated by receiving light through optical system 11 to generate an image signal, signal processing circuit 13 that performs predetermined image processing (for example, gamma correction and distortion correction) on the image signal, and interface 14 (a circuit) that outputs the image signal processed by signal processing circuit 13 to an external apparatus.
- Optical system 11 is an optical element for forming an image on an imaging surface of image sensor 12 .
- Optical system 11 includes a lens, a diaphragm, and a filter, for example.
- Image sensor 12 is an imaging element that converts an optical signal into an electric signal.
- a plurality of pixels are two-dimensionally arranged on the imaging surface of image sensor 12 at equal intervals.
- Image sensor 12 is a charge coupled device (CCD) image sensor, a complementary metal oxide semiconductor (CMOS) image sensor, or an n-channel metal-oxide semiconductor (NMOS) image sensor, for example.
- FIGS. 4 and 5 are explanatory diagrams of a range of a subject region that can be captured by imaging device 10 .
- FIG. 4 shows a capturable range in a horizontal direction whereas FIG. 5 shows a capturable range in a vertical direction.
- Imaging device 10 can capture a range of 200° in the horizontal direction and a range of 180° in the vertical direction.
- Imaging device 10 according to the present exemplary embodiment can perform capturing at a very wide view angle.
- Imaging device 10 is mounted on a rear part of vehicle 100 (for example, a rear bumper) at a predetermined depression angle so as to face slightly vertically downward. It is thus possible to detect a subject behind the vehicle more reliably.
- As shown in FIG. 5, in a case where person 210 is behind vehicle 100 at a close distance (for example, at a distance of 0.5 m) from vehicle 100, if the capturable range of view angle in the vertical direction is small, a part including a face of person 210 is outside the range of view angle. Consequently, the face of person 210 is not included in a captured image and the person may not be detected in image analysis based on the captured image.
- imaging device 10 is mounted on vehicle 100 at the depression angle, and thus if the person is present near vehicle 100 , the person's face is hardly captured. It is thus considered to increase the vertical view angle of imaging device 10 for the purpose of including the part including the person's face in the range of view angle more reliably. If the vertical view angle is simply increased, however, the following problem may occur.
- as shown in part (A) of FIG. 6, an upper part of person 210 is outside the range of view angle at a vertical view angle of 150°. It is thus considered that the vertical view angle is increased by 30° to 180° as shown in part (B) of FIG. 6 so that the upper part of person 210 is included in the range of view angle.
- the upper part of person 210 above the shoulder is thus included in the range of view angle and the face of person 210 is included in a captured image.
- a captured image shown in part (B) of FIG. 6 is equal in size to a captured image shown in part (A) of FIG. 6 , but includes a wider subject range in the vertical direction.
- the vertical size of person 220 far away from vehicle 100 in the captured image shown in part (B) of FIG. 6 is less than that in the captured image shown in part (A) of FIG. 6 .
- in this case, resolution of an image of a face of person 220 is insufficient, and when control device 20 performs image analysis, the face of person 220 cannot be detected.
- in a case where person 220 is a child, this problem is more serious because the face of the child is small.
- the inventors of the present disclosure have devised an optical system that forms a subject image on image sensor 12 so as to obtain sufficient resolution of an image region of interest (for example, a center part) while increasing the vertical view angle as a whole. That is to say, the inventors of the present disclosure have devised an optical system that forms an image shown in part (B) of FIG. 7 on image sensor 12 .
- An image shown in part (A) of FIG. 7 is the same as the image shown in part (B) of FIG. 6 and is obtained by being imaged at a uniform magnification ratio.
- in the image shown in part (B) of FIG. 7, the vertical view angle is 180° as in the image shown in part (A) of FIG. 7, and at the same time, an image in the center part of the image is more magnified in the vertical direction and images in upper and lower end portions (in a range of view angle of 30°) are more compressed in the vertical direction as compared to the image shown in part (A) of FIG. 7. It is thus possible to obtain an image that has high resolution in a region of the center part of interest while achieving a wide view angle in the vertical direction, thus solving the problems during sensing. A configuration of optical system 11 with such optical characteristics will be specifically described below.
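The nonuniform projection described above (center magnified, upper and lower 30° bands compressed) can be sketched as a piecewise-linear mapping from incidence angle to image height. The split point and sensor share used below (central ±60° imaged onto 80% of the sensor height) are illustrative assumptions, not values from the disclosure:

```python
def image_height(theta_deg, half_angle=90.0, center=60.0, center_share=0.8):
    """Map a vertical incidence angle (degrees from optical axis Z) to a
    normalized image height in [-1, 1] on the imaging surface.

    Piecewise-linear sketch of part (B) of FIG. 7: the central +/-60 deg
    are imaged onto 80% of the sensor height, and the outer 30 deg bands
    are compressed into the remaining 20%. All constants are assumed."""
    sign = 1.0 if theta_deg >= 0 else -1.0
    t = abs(theta_deg)
    if t <= center:
        h = (t / center) * center_share
    else:
        h = center_share + (t - center) / (half_angle - center) * (1.0 - center_share)
    return sign * h

# Angular magnification (image height per degree) near the axis vs the edge:
central_gain = image_height(1.0) - image_height(0.0)   # steep slope at center
edge_gain = image_height(90.0) - image_height(89.0)    # shallow slope at edge
```

With these assumed numbers the center is imaged about twice as densely as the periphery, while the full 180° vertical range still fits on the sensor.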
- FIG. 8 shows an example of a specific configuration of optical system 11 .
- FIG. 8 schematically shows a cross-section when optical system 11 is virtually cut by a vertical plane including optical axis Z (a plane whose normal is the horizontal direction of the image sensor).
- Optical axis Z is a virtual line that passes through a center of the imaging surface of image sensor 12 and orthogonally intersects the imaging surface.
- if optical system 11 includes, for example, a mirror or a prism that reflects light, its optical axis is bent by reflection.
- optical system 11 includes a plurality of lenses and diaphragm 115 .
- optical system 11 includes free-form lenses 111 , 112 .
- the free-form lens is a lens in which a surface for refracting light to form an image has a non-arc shape and is not rotationally symmetric.
- in the present disclosure, a cylindrical lens is defined as a type of arc lens. That is to say, the cylindrical lens is defined as a lens different from the free-form lens.
- the free-form lens has the non-arc shape that is not a part of a perfect circle, does not depend on a horizontal axis or a vertical axis, and has a diagonal surface shape that can be freely set.
- the free-form lens is, for example, a lens represented by an XY polynomial surface. Materials of the free-form lens include, but are not particularly limited to, glass, resin, and the like. Examples of a method of manufacturing the free-form lens include, but are not particularly limited to, a method of molding the free-form lens using a mold such as a metal mold.
- a set of free-form lenses 111, 112 constitutes a lens that can cause the magnification ratio of an image to be formed to vary depending on the view angle.
- Free-form lenses 111 , 112 are designed such that, as shown in part (B) of FIG. 7 , in a captured image formed on the imaging surface of image sensor 12 through optical system 11 , an image formed in the region at the center part has a large magnification ratio and an image formed in the upper and lower end regions has a small magnification ratio.
- FIG. 9 shows the resolution of captured image 300 formed on the imaging surface of image sensor 12 by optical system 11 .
- the resolution of an image in region R 1 at the center part is high (dense) whereas the resolution of an image in region R 2 outside region R 1 is low (sparse) as compared to region R 1 .
- Optical axis Z of optical system 11 passes through a center of region R 1 . That is to say, the center of region R 1 is an intersection with optical axis Z.
- the image of region R 1 has high resolution, and thus it is possible to increase precision of sensing by using the image of region R 1 .
- Optical system 11 is designed such that the resolution changes not discontinuously but continuously and monotonically in region R 1 and region R 2.
- the image resolution is defined as the number of pixels in image sensor 12 used to capture an image with a unit view angle, the image being formed on image sensor 12 through optical system 11 .
- the resolution is thus defined by the following formula: resolution = (number of pixels used for capturing the subject image)/(view angle of that image).
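This definition is a simple ratio of pixel count to view angle. A minimal sketch with hypothetical pixel counts (not values from the disclosure):

```python
def resolution(num_pixels, view_angle_deg):
    """Resolution as defined in the disclosure: the number of pixels of
    image sensor 12 used to capture the part of the subject image that
    falls within a unit view angle (pixels per degree here)."""
    return num_pixels / view_angle_deg

# Hypothetical example: a sensor column of 1080 pixels covering 180 deg
# vertically averages 6 px/deg; if 800 of those pixels are devoted to a
# central 100 deg region, that region is denser than the average.
average = resolution(1080, 180.0)
central = resolution(800, 100.0)
```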
- FIG. 10 is a schematic explanatory diagram of an imaging state on image sensor 12 when virtually cut by a vertical plane including optical axis Z (a plane orthogonally intersecting a horizontal plane).
- a subject image in first region r 1 in a range of view angle θx in a vertical direction including optical axis Z and a subject image in second region r 2 having identical view angle θx are formed onto image sensor 12 through optical system 11.
- Region r 1 is a part of a subject region corresponding to region R 1 in a captured image
- region r 2 is a part of the subject region corresponding to region R 2 in the captured image.
- Optical system 11 is designed such that magnification ratio (M 1 ) of region R 1 at the center part is high whereas magnification ratio (M 2 ) of region R 2 including upper and lower end portions is low in the captured image.
- when image sensor 12 captures a subject in first region r 1 including optical axis Z through optical system 11, the image in first region r 1 is formed on the imaging surface of image sensor 12 while being magnified with magnification ratio M 1, as shown in FIG. 10.
- when image sensor 12 captures a subject in second region r 2 apart from optical axis Z, the image is formed on the imaging surface of image sensor 12 while being magnified with magnification ratio M 2, which is lower than magnification ratio M 1 at the center part. Therefore, length L 2 of the image in second region r 2 on the imaging surface is less than length L 1 of the image in first region r 1.
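The comparison of lengths L 1 and L 2 follows directly: two subject regions spanning the identical view angle θx occupy image lengths proportional to the magnification ratio applied to each. The pixels-per-degree figures below are assumptions for illustration:

```python
def image_length(view_angle_deg, magnification_px_per_deg):
    """Length (in pixels) occupied on the imaging surface by a subject
    region spanning `view_angle_deg`, given the angular magnification
    the optical system applies in that region (illustrative units)."""
    return view_angle_deg * magnification_px_per_deg

theta_x = 10.0                         # identical view angle for r1 and r2
len_r1 = image_length(theta_x, 12.0)   # central region r1, ratio M1 (assumed)
len_r2 = image_length(theta_x, 5.0)    # peripheral region r2, ratio M2 < M1
```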
- FIG. 11 shows resolution (angle resolution) characteristics with respect to a half view angle (a view angle determined by using optical axis Z as a reference) in optical system 11 according to the present exemplary embodiment.
- FIG. 11 shows the resolution characteristics with respect to a vertical view angle and a horizontal view angle.
- in the vertical direction, the resolution of a region with a small view angle is higher than the resolution of a region with a large view angle (for example, with a half view angle of 25° or larger).
- in the horizontal direction, the resolution of the region with a small view angle is likewise higher than the resolution of the region with a large view angle (for example, with a half view angle of 25° or larger).
- Free-form lenses 111 and 112 are designed so as to have the optical characteristics described above, and the range in which the horizontal view angle and the vertical view angle are each from 0° to 50°, both inclusive, is defined as first region R 1.
- the resolution of the image formed in region R 1 at the center part can thus be higher (that is to say, denser) than the resolution of the image formed in region R 2 other than region R 1 , as shown in FIG. 9 .
- the relationship between the vertical view angle and the resolution is different from the relationship between the horizontal view angle and the resolution as shown in FIG. 11. That is to say, the resolution characteristics in the vertical direction are different from those in the horizontal direction in the common range of view angle (for example, the range of half view angle from 0° to 90°, both inclusive, in FIG. 11).
- the average of the vertical resolution can be higher than the average of the horizontal resolution in a subject image of region R 1 , for example.
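The direction-dependent characteristics can be modeled as two distinct resolution-versus-half-view-angle curves, one per axis, in the spirit of FIG. 11. The linear falloffs and constants below are assumptions chosen only so that the vertical average over region R 1 exceeds the horizontal average:

```python
def angular_resolution(half_angle_deg, axis):
    """Illustrative angle-resolution curves: resolution (px/deg) falls
    off with half view angle, with a different profile for each axis,
    so the vertical and horizontal relationships differ."""
    if axis == "vertical":
        return 10.0 - 0.06 * half_angle_deg
    if axis == "horizontal":
        return 9.0 - 0.07 * half_angle_deg
    raise ValueError("axis must be 'vertical' or 'horizontal'")

# Average resolution over region R1 (half view angle 0..25 deg, assumed):
angles = range(0, 26)
avg_v = sum(angular_resolution(a, "vertical") for a in angles) / len(angles)
avg_h = sum(angular_resolution(a, "horizontal") for a in angles) / len(angles)
```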
- FIG. 12 is a comparison explanatory diagram of capturable ranges of view angle for a fisheye lens and for optical system 11 according to the present exemplary embodiment.
- Part (A) of FIG. 12 is an explanatory diagram of the capturable range of view angle of a fisheye lens.
- Part (B) of FIG. 12 is an explanatory diagram of the capturable range of view angle of optical system 11 including a free-form lens according to the present exemplary embodiment.
- Region R 10 indicates a region particularly required for sensing behind a vehicle.
- with the fisheye lens as well, the resolution of region R 1 at the center part can be higher than that of peripheral region R 1 a.
- the view angle of the fisheye lens is limited by an aspect ratio of an image sensor, however.
- when the aspect ratio (H:V) of the image sensor is 4:3 and the horizontal view angle is 200°, the vertical view angle is limited to 150°. It is thus impossible to capture a subject in a region with a vertical view angle larger than 150°.
- in central region R 1, where a far-off subject behind the vehicle is possibly captured while the size of the subject is still small, the subject cannot be magnified any more. As a result, the precision of sensing in region R 1 is degraded.
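The aspect-ratio constraint can be checked numerically: for a fisheye projection in which image height is proportional to angle, the vertical view angle is fixed by the horizontal one and the sensor's aspect ratio, and cannot be chosen independently.

```python
def fisheye_vertical_limit(horizontal_view_deg, aspect_h=4.0, aspect_v=3.0):
    """Vertical view angle reachable by an equidistant fisheye (image
    height proportional to angle) on a sensor with the given aspect
    ratio; it is tied to the horizontal view angle by that ratio."""
    return horizontal_view_deg * aspect_v / aspect_h
```

For the 4:3, 200° case in the text this gives 150°, short of the 180° vertical view angle the free-form optical system achieves.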
- optical system 11 includes a free-form lens.
- the horizontal view angle and the vertical view angle are thus set independently.
- the horizontal view angle can be set to 200° and the vertical view angle can be set to 180°, which is larger than 150°, without being limited by the aspect ratio of the image sensor, as shown in part (B) of FIG. 12. It is thus possible to capture a subject with a wider range of view angle.
- a magnification ratio can be freely set based on the view angle.
- the magnification ratio of region R 1 where a far-off subject is possibly captured behind the vehicle can thus be larger than that of other regions.
- the magnification ratio can be set independently for each of the vertical and horizontal view angles. For example, if the vertical view angle is increased, the magnification ratio in the vertical direction is reduced as a whole. In part (B) of FIG. 12, however, not only can the vertical view angle be increased as a whole but also the average magnification ratio of region R 1 in the vertical direction can be increased more than that of other regions. It is thus possible to increase the precision of sensing in region R 1.
- the magnification ratio of optical system 11 in the horizontal direction can be kept so that the resolution of peripheral region R 1 a is kept. It is thus possible to keep the precision of sensing in a desired region.
- the expression of different resolution in this exemplary embodiment means a difference in resolution that is generated by a combination of an optical system (for example, an optical system including an ordinary rotationally symmetric spherical lens and an aspherical lens) and a planar image sensor.
- optical system 11 (free-form lenses 111 , 112 ) according to the present exemplary embodiment can form an image in which the resolution (that is to say, magnification ratio) of view angle of the center part is higher than the resolution of a peripheral part. It is thus possible to obtain an image that has high resolution in the region at the center part, which is important for sensing, while a wide view angle is achieved as a whole.
- the relationship between the vertical view angle and the resolution is different from the relationship between the horizontal view angle and the resolution in the present exemplary embodiment. It is thus possible to independently set the vertical size and the horizontal size of a region with high resolution.
- the location and shape of the region with high resolution can be set freely.
- operations of imaging device 10 and control device 20 configured as described above will be described below.
- Imaging device 10 captures an image behind vehicle 100 while vehicle 100 is standing or reversing, generates image data, and transmits the image data to control device 20 .
- Control device 20 receives the image data via first interface 23 .
- Controller 21 of control device 20 performs predetermined image processing on the image data received and generates image data for display.
- Controller 21 transmits the image data for display via second interface 25 to display 30 .
- Display 30 displays an image based on the image data received from control device 20 . A driver of vehicle 100 checks the image displayed on display 30 , thus grasping the situation behind the vehicle (for example, whether a child is present behind the vehicle).
- Controller 21 of control device 20 also performs image analysis (sensing) on the image data received from imaging device 10 to acquire various information about the situation behind vehicle 100 .
- controller 21 can determine whether a person is present behind the vehicle through image analysis (face detection). When a person is present behind the vehicle, controller 21 can output a predetermined signal.
- Controller 21 can also determine whether an obstacle is present behind the vehicle through image analysis. When an obstacle is present behind the vehicle, controller 21 can output a predetermined signal.
- Controller 21 generates a control signal of control target 60 based on a result of image analysis and transmits the control signal via the third interface to control target 60 . Control target 60 is thus controlled based on the situation behind the vehicle. Examples of control target 60 include a brake, an accelerator, and an alarm.
- controller 21 may control a brake of vehicle 100 to stop reversing vehicle 100 .
- controller 21 may control an accelerator and the brake so as not to start to reverse vehicle 100 .
- controller 21 may control an alarm to output an alarm sound or an alarm message to the person or the obstacle which is detected.
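The control flow described in the points above can be sketched concretely. In the sketch below, all function and signal names are assumptions introduced for illustration; they are not part of the disclosure, which only states that controller 21 generates a control signal for control target 60 based on the image-analysis result.

```python
# Illustrative sketch (function and signal names are assumptions, not from
# the patent) of how controller 21 might map sensing results to control
# targets such as the brake, the accelerator, and the alarm.

def control_actions(person_detected, obstacle_detected, reversing):
    """Return the control-target commands for one image-analysis cycle."""
    hazard = person_detected or obstacle_detected
    actions = []
    if hazard and reversing:
        actions.append("brake")             # stop reversing vehicle 100
    if hazard and not reversing:
        actions.append("hold_accelerator")  # do not start to reverse
    if hazard:
        actions.append("alarm")             # warn the detected person / driver
    return actions

assert control_actions(True, False, True) == ["brake", "alarm"]
assert control_actions(False, True, False) == ["hold_accelerator", "alarm"]
assert control_actions(False, False, True) == []
```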
- Imaging device 10 has a wide view angle in the vertical direction, and thus if a person is present behind vehicle 100 at a close distance as shown in FIG. 5 , it is possible to capture an image including the upper part of the person (that is to say, a part including the face) as shown in part (B) of FIG. 7 . Controller 21 can thus detect the person more reliably using face detection through image analysis. Additionally, imaging device 10 according to the present exemplary embodiment can generate an image that has high resolution in the center part of a capturing range, which is important for sensing. If a small child is present in the central region of the capturing range behind the vehicle, for example, the child is captured while being magnified and thus it is possible to obtain image data that is sufficient in size for image analysis.
- with imaging device 10 , if the vertical view angle is increased, the vertical magnification ratio of the image in the important center part can be further increased. It is thus possible to capture even a small child, who is a detection target, with high resolution and to recognize the child more reliably.
- imaging device 10 can perform capturing over a wide range of view angles and capture an image of a subject in the center part of the capturing range with high resolution. It is thus possible to increase the precision of image analysis using a captured image.
- imaging device 10 includes image sensor 12 , which has an imaging surface on which a plurality of pixels are two-dimensionally arranged and which generates image data from a subject image formed on the imaging surface, and optical system 11 , which images a subject in a predetermined range of vertical view angle and in a predetermined range of horizontal view angle on the imaging surface of image sensor 12 .
- the number of pixels used for capturing a subject image included in a unit view angle is defined as resolution.
- the imaging surface includes a first region (for example, region R 1 shown in FIG. 9 ) including an intersection with optical axis Z and a second region different from the first region (for example, region R 2 shown in FIG. 9 ).
- Optical system 11 forms a subject image on the imaging surface so that resolution of first region R 1 is higher than resolution of second region R 2 .
- the relationship between the vertical view angle and the resolution is different from the relationship between the horizontal view angle and the resolution in the subject image formed on the imaging surface.
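The notion of resolution used here, the number of pixels spent per unit view angle, can be illustrated numerically. Both projection functions below are invented purely for illustration; they are not the actual lens data of the disclosure.

```python
# Illustrative sketch of "resolution" as defined above: pixels per unit view
# angle.  The projection functions are made-up examples, not real lens data.

def equidistant(theta_deg, f=10.0):
    """Ordinary fisheye-style mapping: image height grows linearly with angle."""
    return f * theta_deg  # pixels from the image centre

def center_weighted(theta_deg):
    """Toy mapping that spends more pixels per degree near the optical axis."""
    return 14.0 * theta_deg - 0.05 * theta_deg ** 2

def resolution(projection, theta_deg, d=0.5):
    """Local resolution in pixels per degree via a central finite difference."""
    return (projection(theta_deg + d) - projection(theta_deg - d)) / (2 * d)

# Near the axis (first region R1) the centre-weighted mapping is denser...
assert resolution(center_weighted, 5.0) > resolution(equidistant, 5.0)
# ...and it pays for that at the periphery (second region R2).
assert resolution(center_weighted, 80.0) < resolution(equidistant, 80.0)
```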
- the resolution of a region at the center part can be higher than the resolution of a region at a peripheral part. It is thus possible to capture an image that has high resolution in the center part required for sensing or the like while a wide view angle is achieved as a whole. It is thus possible to improve the precision of analysis of the captured image.
- vertical resolution characteristics and horizontal resolution characteristics of the subject image formed on the imaging surface vary independently, and thus it is possible to freely set the horizontal view angle and the vertical view angle of the optical system regardless of the aspect ratio of image sensor 12 . That is to say, the ratio of the vertical view angle to the horizontal view angle for optical system 11 does not need to correspond to the aspect ratio of the imaging surface. It is thus possible to freely set the view angle.
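A small arithmetic sketch of this point, with an assumed sensor size, shows why decoupling the vertical and horizontal mappings frees the view-angle ratio from the aspect ratio. The pixel counts are assumptions; the 200° and 180° view angles come from the description of FIGS. 4 and 5.

```python
# Numeric illustration (sensor size assumed): why independent vertical and
# horizontal mappings free the view-angle ratio from the sensor aspect ratio.

W, H = 1280, 720                 # assumed imaging surface, aspect ratio 16:9
h_angle, v_angle = 200.0, 180.0  # horizontal / vertical view angles

# A single shared degrees-to-pixels mapping would force the view-angle ratio
# to equal the aspect ratio, but 16:9 does not equal 200:180 (= 10:9):
assert W / H != h_angle / v_angle

# With independent mappings, each axis simply gets its own resolution budget:
assert W / h_angle == 6.4   # average pixels per degree, horizontal
assert H / v_angle == 4.0   # average pixels per degree, vertical
```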
- the average vertical resolution is higher than the average horizontal resolution. It is thus possible to obtain an image that is more magnified in the vertical direction in first region R 1 .
- This image is useful for detecting a subject extending in the vertical direction, for example.
- by intentionally reducing the horizontal resolution to be less than the vertical resolution, it is possible to keep the vertical resolution of peripheral region R 1 a , which is far away from vehicle 100 , thus keeping the precision of sensing (see FIG. 12 ).
- Optical system 11 includes free-form lenses 111 , 112 . It is thus possible to freely design how the magnification ratio varies with the view angle in the optical system. As free-form lenses 111 , 112 are used, the resolution can vary asymmetrically with respect to both the horizontal axis and the vertical axis of the imaging surface. Moreover, as free-form lenses 111 , 112 are used, the shape of these lenses does not depend on the horizontal axis or the vertical axis, and the shape of a surface in a diagonal direction can be freely set. It is thus possible to freely set the resolution of the imaging surface in the diagonal direction (a region other than a region on the vertical axis and the horizontal axis).
- Imaging device 10 may be mounted on the rear of vehicle 100 to capture a subject behind vehicle 100 .
- imaging device 10 is used for checking safety when the vehicle reverses.
- Imaging device 10 and display 30 that displays an image based on image data generated by imaging device 10 may constitute a display system.
- Imaging device 10 and control device 20 that analyzes image data generated by the imaging device may constitute an imaging system.
- Optical system 11 is configured by including a free-form lens for the purpose of obtaining a captured image that has high resolution in a central region and low resolution in a peripheral region in the first exemplary embodiment. However, it is not necessary to configure optical system 11 using the free-form lens.
- the captured image can be achieved by modifying a pixel distribution of image sensor 12 b with an ordinary rotationally symmetric optical system.
- a configuration of imaging device 10 b that includes an optical system without a free-form lens will be described below.
- FIG. 13 is an explanatory diagram of a pixel distribution of image sensor 12 in imaging device 10 according to the first exemplary embodiment and of a resolution distribution of captured image 300 captured by a combination of optical system 11 and image sensor 12 .
- FIG. 14 is an explanatory diagram of a pixel distribution of image sensor 12 b in imaging device 10 b according to the second exemplary embodiment and of a resolution distribution of captured image 300 b captured by a combination of optical system 11 b and image sensor 12 b.
- a plurality of pixels are two-dimensionally arranged on an imaging surface of image sensor 12 at equal intervals in the first exemplary embodiment.
- An image that has high resolution in the central region and low resolution in the peripheral region is formed on the imaging surface of image sensor 12 by using free-form lenses 111 , 112 of optical system 11 . It is thus possible to obtain captured image 300 whose resolution is high in the center part and becomes lower toward the peripheral edge.
- optical system 11 uses free-form lenses 111 , 112 , an image is formed on the imaging surface so that the relationship between the vertical view angle and the resolution is different from the relationship between the horizontal view angle and the resolution. As a result, it is possible to freely set the location, shape, and size of a region that is important for sensing.
- imaging device 10 b includes optical system 11 b that is a rotationally symmetric lens and image sensor 12 b with a specific pixel distribution.
- optical system 11 b is a lens that has resolution (angle resolution) characteristics shown in FIG. 15 with respect to vertical resolution and horizontal resolution when an image is formed on an image sensor with a uniform pixel distribution. That is to say, optical system 11 b is configured such that in the range of view angle that is common to the vertical direction and the horizontal direction (for example, in a range of half view angle from 0° to 75°, both inclusive, in FIG. 15 ), the relationship between the vertical view angle and the resolution (that is to say, magnification ratio) is the same as the relationship between the horizontal view angle and the resolution (that is to say, magnification ratio).
- image sensor 12 b has such a pixel distribution that the center part (the part corresponding to region R 1 ) has high pixel density and the region other than region R 1 (the part corresponding to region R 2 ) has low pixel density.
- image sensor 12 b is set such that in the range of view angle that is common to the vertical direction and the horizontal direction (for example, in a range of half view angle from 0° to 90°, both inclusive, in FIG. 16 ), the relationship between the vertical view angle and the pixel density is different from the relationship between the horizontal view angle and the pixel density.
- imaging device 10 b includes image sensor 12 b , which has an imaging surface on which a plurality of pixels are two-dimensionally arranged and which generates image data from a subject image formed on the imaging surface, and optical system 11 b , which images a subject in a predetermined range of vertical view angle and in a predetermined range of horizontal view angle on the imaging surface of image sensor 12 b.
- the imaging surface includes a first region (for example, a region corresponding to region R 1 ) including an intersection with an optical axis and a second region different from the first region (for example, a region corresponding to region R 2 ).
- the imaging surface has such a pixel distribution that the pixel density of the first region is higher than the pixel density of the second region.
- the imaging surface is set such that the vertical pixel density is different from the horizontal pixel density.
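The pixel-density idea of this second embodiment can be sketched with made-up numbers: the optics stay rotationally symmetric, and the non-uniformity moves into the sensor. All angle thresholds and density values below are assumptions for illustration only.

```python
# Sketch of the second embodiment's idea (illustrative numbers): pixel density
# (pixels per degree of view angle) is higher near the centre of the sensor,
# and the vertical and horizontal density-versus-angle profiles differ.

def pixel_density(theta_deg, axis):
    """Toy pixels-per-degree profile; the two axes deliberately differ."""
    if axis == "vertical":
        return 40.0 if theta_deg <= 30 else 12.0  # dense band covering region R1
    return 30.0 if theta_deg <= 20 else 12.0      # narrower dense band horizontally

# Pixel density of the first region exceeds that of the second region...
assert pixel_density(10, "vertical") > pixel_density(60, "vertical")
assert pixel_density(10, "horizontal") > pixel_density(60, "horizontal")
# ...and the density-versus-angle relationship differs between the axes:
assert pixel_density(25, "vertical") != pixel_density(25, "horizontal")
```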
- the resolution of region R 1 at the center part in a captured image can be higher than the resolution of region R 2 other than region R 1 , as in the first exemplary embodiment. It is thus possible to capture an image that has high resolution in the center part required for sensing or the like while a wide view angle is achieved as a whole. It is thus possible to improve the precision of analysis of the captured image. Additionally, as the vertical resolution characteristics of an image formed on the imaging surface are different from the horizontal resolution characteristics thereof, it is possible to freely set the horizontal view angle and the vertical view angle of the optical system regardless of the aspect ratio of image sensor 12 b .
- the ratio of the vertical view angle to the horizontal view angle does not need to correspond to the aspect ratio of the imaging surface. It is thus possible to increase the whole view angle regardless of the resolution of a region that is important for sensing.
- the average of the vertical resolution of a first subject image, which is a portion of the subject image formed in the first region, may be higher than the average of the horizontal resolution of the first subject image. It is thus possible to obtain an image that is more magnified in the vertical direction in first region R 1 , for example. This image is useful for detecting a subject extending in the vertical direction, for example.
- imaging device 10 b may be mounted on the rear of vehicle 100 to capture a subject behind vehicle 100 .
- imaging device 10 b is used for checking safety when the vehicle reverses.
- imaging device 10 b and display 30 that displays an image based on image data generated by imaging device 10 b may constitute a display system.
- imaging device 10 b and control device 20 that analyzes image data generated by the imaging device may constitute an imaging system.
- first and second exemplary embodiments have been described above as examples of the technique disclosed in the present application.
- the technique according to the present disclosure is not limited to the first and second exemplary embodiments, but is applicable to other exemplary embodiments including appropriate modifications, replacements, additions, omissions, and the like.
- new exemplary embodiments can be made by combining constituent elements described in the first and second exemplary embodiments. Hence, other exemplary embodiments are described below.
- the electronic room mirror and the in-vehicle display are exemplified as the display.
- the type of the display is not limited to these electronic room mirror and in-vehicle display.
- An idea of the present disclosure can be adapted to display systems that use various types of displays according to uses (for example, a head-up display).
- imaging devices 10 , 10 b are disposed to capture a scene behind the vehicle in the exemplary embodiments described above, imaging devices 10 , 10 b may be disposed to capture a front scene or a side scene of the vehicle.
- imaging devices 10 , 10 b perform the gamma correction and the distortion correction on images in the exemplary embodiments described above, control device 20 may perform these processing.
- imaging devices 10 , 10 b may perform the gamma correction, and control device 20 may perform the distortion correction.
- vehicle 100 of an automobile has been described as an example of a moving body in the exemplary embodiments described above, the moving body is not limited to the automobile.
- the imaging device according to the exemplary embodiments described above may be used for other moving bodies including a railway vehicle, a vessel, an airplane, a robot, a robot arm, a drone, an agricultural machine such as a combine, and a construction machine such as a crane.
- the imaging device may also be used for a monitoring camera.
- the view angle, the resolution and the like described in the exemplary embodiments are only examples and may be appropriately set based on a target (an object or an event) to be subjected to image analysis.
- although the optical system is configured by including the free-form lens in the first exemplary embodiment described above, other types of lenses whose magnification ratio (that is, resolution) can be freely designed according to a view angle may be used instead of the free-form lens.
- constituent elements described in the accompanying drawings and the detailed description may not only include constituent elements that are essential for solving the problems, but may also include constituent elements that are not essential for solving the problems in order to illustrate the technique. It should not be therefore determined that the unessential constituent elements in the accompanying drawings and the detailed description are essential only based on the fact that these constituent elements are included in the drawings and the description.
- one imaging device can provide images with a plurality of view angles including a high resolution image, and the system is applicable to various uses (an imaging system or a display system in a moving body, or the like).
Description
- The present disclosure relates to an imaging device that captures an image of a subject, and a display system and an imaging system that use the imaging device.
- Unexamined Japanese Patent Publication No. 2014-126918 (PTL1) discloses a camera module that is mounted on a vehicle to output an image around the vehicle to a display unit. The camera module disclosed in PTL1 includes an imaging unit that captures a surrounding environment of the vehicle, an image generation unit that processes a first image captured by the imaging unit to generate a second image, and a control unit that outputs the second image to the display unit when the vehicle moves forward or stops and outputs the first image to the display unit when the vehicle reverses. The control unit recognizes moving forward, stopping, and reversing of the vehicle based on images captured by the imaging unit.
- When an image behind a vehicle is captured, it is desirable to capture the image at a wide view angle for the purpose of obtaining as much information as possible. As the view angle is increased, however, the number of pixels per unit area of an image is reduced. Consequently, it may be impossible to obtain required image quality in an important part of an image that is required for sensing. To solve such a problem, it is necessary to capture images using two imaging devices, that is, an imaging device capable of capturing images at a wide view angle and an imaging device capable of capturing images of the important part for sensing with high image quality. This results in a large-scale device and thus increases cost.
- The present disclosure provides an imaging device that can obtain an image that has high resolution in a region at its center part, which is important for sensing, while achieving a wide view angle.
- An imaging device according to a first aspect of the present disclosure includes an image sensor that includes an imaging surface on which a plurality of pixels are two-dimensionally arranged and that generates image data from a subject image formed on the imaging surface and an optical system that images a subject in a predetermined range of a vertical view angle and in a predetermined range of a horizontal view angle on the imaging surface. A number of pixels used for capturing the subject image included in a unit view angle is defined as resolution. The imaging surface includes a first region including an intersection with an optical axis and a second region different from the first region. The optical system forms the subject image on the imaging surface so as to cause resolution of the first region to be higher than resolution of the second region. A relationship between a vertical view angle and resolution is different from a relationship between a horizontal view angle and resolution in the subject image.
- A display system according to the present disclosure includes the imaging device and a display that displays an image based on the image data generated by the imaging device.
- An imaging system according to the present disclosure includes the imaging device and a control device that analyzes the image data generated by the imaging device.
- The present disclosure provides an imaging device that can obtain an image that has high resolution in a region at its center part, which is important for sensing, while achieving a wide view angle.
-
FIG. 1 shows a configuration of an imaging system, which is mounted on an automotive vehicle, according to a first exemplary embodiment of the present disclosure; -
FIG. 2 shows a configuration of an image processing device in the imaging system; -
FIG. 3 shows a configuration of an imaging device in the imaging system; -
FIG. 4 is an explanatory diagram of functions of the imaging device; -
FIG. 5 is an explanatory diagram of a vertical view angle of the imaging device; -
FIG. 6 is an explanatory diagram of problems solved by the present disclosure; -
FIG. 7 is an explanatory diagram of concept of means for solving the problems; -
FIG. 8 shows an example of a configuration of an optical system in the imaging device; -
FIG. 9 is an explanatory diagram of a distribution of resolution of an image formed by the optical system of the imaging device; -
FIG. 10 is an explanatory diagram of the resolution of an image formed on an image sensor by the optical system of the imaging device; -
FIG. 11 shows resolution (angle resolution) characteristics of a free-form lens used for the optical system of the imaging device; -
FIG. 12(A) is an explanatory diagram of a view angle and a magnification ratio of a fisheye lens; -
FIG. 12(B) is an explanatory diagram of the view angle and the magnification ratio of the optical system including a free-form lens; -
FIG. 13 is an explanatory diagram of the optical system and the image sensor in the imaging device according to the first exemplary embodiment and of a captured image formed by the optical system and the image sensor; -
FIG. 14 is an explanatory diagram of an optical system and an image sensor in an imaging device according to a second exemplary embodiment and of a captured image formed by the optical system and the image sensor; -
FIG. 15 shows resolution (angle resolution) characteristics of the optical system in the imaging device according to the second exemplary embodiment; and -
FIG. 16 shows a pixel density with respect to a view angle of the image sensor in the imaging device according to the second exemplary embodiment. - Exemplary embodiments will be described in detail below with reference to the drawings as appropriate. However, descriptions in more detail than necessary may be omitted. For example, detailed descriptions of well-known matters and duplicate descriptions of substantially identical configurations may be omitted. This is to avoid unnecessary redundancy in the following description, and to facilitate understanding by those skilled in the art.
- Here, the inventors of the present disclosure provide the accompanying drawings and the following description such that those skilled in the art can fully understand the present disclosure, and therefore, they do not intend to restrict the subject matters of claims by the accompanying drawings and the following description.
-
FIG. 1 shows an example of using an imaging device according to a first exemplary embodiment of the present disclosure as a rear camera for an automobile (an example of a moving body). In the example of FIG. 1 , imaging device 10 captures a subject behind a vehicle to generate image data. Imaging device 10 is mounted on vehicle 100 so as to capture a scene behind the vehicle. Vehicle 100 includes control device 20 that processes the image data from imaging device 10 , display 30 that displays an image based on the image data processed by control device 20 , and control target 60 that is controlled by control device 20 . Imaging device 10 and control device 20 constitute an imaging system. Imaging device 10 and display 30 constitute a display system. Control target 60 is at least one of a brake, an accelerator, and an alarm, for example. -
Display 30 includes a display device such as a liquid crystal display panel or an organic electro luminescence (EL) display and a drive circuit for driving the display device. Display 30 is an electronic room mirror, an in-vehicle display, or the like and is capable of displaying various information (maps, route guides, radio station selections, various settings, and the like). Display 30 also displays an image of a scene behind the vehicle captured by imaging device 10 (hereinafter, "rear view image") when vehicle 100 reverses. As a driver checks the rear view image when reversing vehicle 100 , the driver can grasp a situation behind the vehicle and safely reverse the vehicle. -
Control device 20 receives image data from imaging device 10 . Control device 20 analyzes the received image data (that is, performs image analysis on the received image data). As a result of image analysis, control device 20 recognizes an object (a person, an automobile, or other obstacles) behind the vehicle and controls control target 60 as needed. Control device 20 also performs predetermined image processing on the image data from imaging device 10 to generate image data to be displayed on display 30 . - Configurations of imaging device 10 and control device 20 will be specifically described below. -
FIG. 2 is a block diagram of a configuration of control device 20 . Control device 20 includes first interface 23 (for example, a circuit) that inputs image data from imaging device 10 , controller 21 that performs image processing and image analysis on the input image data, and data storage unit 29 that stores data and the like. Control device 20 also includes second interface 25 (for example, a circuit) that transmits the image data generated to display 30 and third interface 27 (for example, a circuit) that transmits a control signal for controlling control target 60 to control target 60 . -
Controller 21 includes a central processing unit (CPU) and a random access memory (RAM). As controller 21 executes programs stored in data storage unit 29 , various functions are achieved. Controller 21 may include a dedicated hardware circuit designed to achieve desired functions. In other words, controller 21 may include the CPU, a micro processing unit (MPU), a field-programmable gate array (FPGA), a digital signal processor (DSP), or an application specific integrated circuit (ASIC), for example. -
Data storage unit 29 is a recording medium such as a hard disk device, a solid state drive (SSD), or a semiconductor memory. Data storage unit 29 stores programs executed by controller 21 , data, and the like. -
FIG. 3 is a block diagram of a configuration of imaging device 10 . Imaging device 10 is a camera that captures a subject to generate image data. Imaging device 10 includes optical system 11 , image sensor 12 that captures a subject image generated by receiving light through optical system 11 to generate an image signal, signal processing circuit 13 that performs predetermined image processing (for example, gamma correction and distortion correction) on the image signal, and interface 14 (a circuit) that outputs the image signal processed by signal processing circuit 13 to an external apparatus. -
Optical system 11 is an optical element for forming an image on an imaging surface of image sensor 12 . Optical system 11 includes a lens, a diaphragm, and a filter, for example. Optical system 11 will be described later in detail. Image sensor 12 is an imaging element that converts an optical signal into an electric signal. A plurality of pixels are two-dimensionally arranged on the imaging surface of image sensor 12 at equal intervals. Image sensor 12 is a charge coupled device (CCD) image sensor, a complementary metal oxide semiconductor (CMOS) image sensor, or an n-channel metal-oxide semiconductor (NMOS) image sensor, for example. -
FIGS. 4 and 5 are explanatory diagrams of a range of a subject region that can be captured by imaging device 10 . FIG. 4 shows a capturable range in a horizontal direction whereas FIG. 5 shows a capturable range in a vertical direction. Imaging device 10 can capture a range of 200° in the horizontal direction and a range of 180° in the vertical direction. Imaging device 10 according to the present exemplary embodiment can perform capturing at a very wide view angle. Imaging device 10 is mounted on the rear of vehicle 100 (for example, a rear bumper) at a predetermined depression angle so as to face slightly vertically downward. It is thus possible to detect a subject behind the vehicle more reliably. - A description will be given of problems when an image is captured at a view angle that is large in the vertical direction. As shown in FIG. 5 , in a case where person 210 is behind vehicle 100 at a close distance (for example, at a distance of 0.5 m) from vehicle 100 , if the capturable range of view angle in the vertical direction is small, a part including a face of person 210 is outside the range of view angle. Consequently, the face of person 210 is not included in a captured image and the person may not be detected in image analysis based on the captured image. In particular, imaging device 10 is mounted on vehicle 100 at the depression angle, and thus if the person is present near vehicle 100 , the person's face is hardly captured. It is thus considered to increase the vertical view angle of imaging device 10 for the purpose of including the part including the person's face in the range of view angle more reliably. If the vertical view angle is simply increased, however, the following problem may occur. - For example, as shown in part (A) of FIG. 6 , an upper part of person 210 is outside the range of view angle at a vertical view angle of 150°. It is thus considered that the vertical view angle is increased by 30° to 180° as shown in part (B) of FIG. 6 so that the upper part of person 210 is included in the range of view angle. The upper part of person 210 above the shoulder is thus included in the range of view angle and the face of person 210 is included in a captured image. In this case, the captured image shown in part (B) of FIG. 6 is equal in size to the captured image shown in part (A) of FIG. 6 , but includes a wider subject range in the vertical direction. The vertical size of each object included in the captured image shown in part (B) of FIG. 6 is thus less than that included in the captured image shown in part (A) of FIG. 6 . For example, the vertical size of person 220 far away from vehicle 100 in the captured image shown in part (B) of FIG. 6 is less than that in the captured image shown in part (A) of FIG. 6 . As a result, resolution of an image of a face of person 220 is insufficient and if control device 20 performs image analysis, the face of person 220 cannot be detected. In particular, when person 220 is a small child, this problem is more serious because the face of the child is small. -
image sensor 12 so as to obtain sufficient resolution of an image region of interest (for example, a center part) while increasing the vertical view angle as a whole. That is to say, the inventors of the present disclosure have devised an optical system that forms an image shown in part (B) ofFIG. 7 onimage sensor 12. An image shown in part (A) ofFIG. 7 is the same as the image shown in part (B) ofFIG. 6 and is obtained by being imaged at a uniform magnification ratio. In the image shown in part (B) ofFIG. 7 , the vertical view angle is 180° as in the image shown in part (A) of -
FIG. 7 and the same time, an image in the center part of the image is more magnified in the vertical direction and images in upper and lower end portions (in a range of view angle of 30°) are more compressed in the vertical direction as compared to the image shown in part (A) ofFIG. 7 . It is thus possible to obtain an image that has high resolution in a region of the center part of interest while achieving a wide view angle in the vertical direction, thus solving the problems during sensing. A configuration ofoptical system 11 with such optical characteristics will be specifically described below. -
FIG. 8 shows an example of a specific configuration of optical system 11. FIG. 8 schematically shows a cross-section when optical system 11 is virtually cut by a vertical plane including optical axis Z (a plane whose normal is the horizontal direction of the image sensor). Optical axis Z is a virtual line that passes through the center of the imaging surface of image sensor 12 and orthogonally intersects the imaging surface. When optical system 11 includes, for example, a mirror or a prism that reflects light, its optical axis is bent by the reflection. As shown in FIG. 8, optical system 11 includes a plurality of lenses and diaphragm 115. In particular, optical system 11 includes free-form lenses 111, 112. - The free-form lens is a lens in which a surface for refracting light to form an image has a non-arc shape and is not rotationally symmetric. In the present disclosure, a cylindrical lens is also defined as a type of arc lens. That is to say, the cylindrical lens is defined as a lens different from the free-form lens in the present disclosure. The free-form lens has a non-arc shape that is not a part of a perfect circle, does not depend on a horizontal axis or a vertical axis, and has a surface shape, including in diagonal directions, that can be freely set. The free-form lens is, for example, a lens represented by an XY polynomial surface. Materials of the free-form lens include, but are not particularly limited to, glass, resin, and the like. Examples of a method of manufacturing the free-form lens include, but are not particularly limited to, a method of molding the free-form lens using a mold such as a metal mold.
- A set of free-form lenses 111, 112 constitutes a lens that can cause the magnification ratio of an image to be formed to vary depending on the view angle. Free-form lenses 111, 112 are designed such that, as shown in part (B) of FIG. 7, in a captured image formed on the imaging surface of image sensor 12 through optical system 11, an image formed in the region at the center part has a large magnification ratio and an image formed in the upper and lower end regions has a small magnification ratio. -
FIG. 9 shows the resolution of captured image 300 formed on the imaging surface of image sensor 12 by optical system 11. As shown in FIG. 9, in captured image 300, the resolution of the image in region R1 at the center part is high (dense), whereas the resolution of the image in region R2 outside region R1 is low (sparse) as compared to region R1. Optical axis Z of optical system 11 passes through the center of region R1. That is to say, the center of region R1 is an intersection with optical axis Z. The image of region R1 has high resolution, and thus it is possible to increase the precision of sensing by using the image of region R1. Optical system 11 is designed such that the resolution changes not discontinuously but continuously and monotonically across region R1 and region R2. - The image resolution is defined as the number of pixels in
image sensor 12 used to capture an image with a unit view angle, the image being formed on image sensor 12 through optical system 11. The resolution is defined by the following formula. -
Resolution = (number of pixels required to capture an image with a predetermined view angle)/(predetermined view angle) (1) - The resolution of an image formed by
optical system 11 is specifically described with reference to FIG. 10. FIG. 10 is a schematic explanatory diagram of an imaging state on image sensor 12 when virtually cut by a vertical plane including optical axis Z (a plane orthogonally intersecting a horizontal plane). As shown in FIG. 10, it is considered that a subject image in first region r1, in a range of view angle θx in a vertical direction including optical axis Z, and a subject image in second region r2, having identical view angle θx, are formed onto image sensor 12 through optical system 11. Region r1 is a part of the subject region corresponding to region R1 in a captured image, and region r2 is a part of the subject region corresponding to region R2 in the captured image. -
Optical system 11 is designed such that magnification ratio (M1) of region R1 at the center part is high whereas magnification ratio (M2) of region R2 including the upper and lower end portions is low in the captured image. When image sensor 12 captures a subject in first region r1 including optical axis Z through optical system 11, the image of first region r1 is formed on the imaging surface of image sensor 12 while being magnified with magnification ratio M1, as shown in FIG. 10. When image sensor 12 captures a subject in second region r2 apart from optical axis Z, the image is formed on the imaging surface of image sensor 12 while being magnified with magnification ratio M2, which is lower than magnification ratio M1 at the center part. Therefore, length L2 of the image of second region r2 on the imaging surface is less than length L1 of the image of first region r1. - Pixels are two-dimensionally arranged on
image sensor 12 at equal intervals. Consequently, as the vertical length of an image increases, the number of pixels required to capture the image also increases. In other words, number N1 of pixels required to capture the image in first region r1 having length L1 is larger than number N2 of pixels required to capture the image in second region r2 having length L2 (<L1). The view angle of first region r1 and the view angle of second region r2 are equal (θx). Accordingly, the resolution of the image for first region r1 (=N1/θx) (the number of pixels per unit view angle) is higher than the resolution of the image for second region r2 (=N2/θx). -
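The chain just described, from magnification ratio to image length to pixel count to formula (1), can be sketched numerically. All concrete values below (pixel pitch, magnifications, subject extent) are hypothetical illustrations, not parameters of the embodiment:

```python
def resolution(num_pixels: float, view_angle_deg: float) -> float:
    """Formula (1): pixels required to capture a predetermined view
    angle, divided by that view angle (pixels per degree)."""
    return num_pixels / view_angle_deg

def pixels_spanned(image_length_mm: float, pixel_pitch_mm: float) -> float:
    """Pixels covered by an image of the given length on a sensor
    whose pixels are arranged at equal intervals."""
    return image_length_mm / pixel_pitch_mm

# Hypothetical numbers: the same subject extent seen under the same
# view angle theta_x, but imaged with different magnification ratios.
theta_x = 30.0          # identical view angle for regions r1 and r2
pitch = 0.003           # 3-micron pixel pitch (illustrative)
subject_extent = 100.0  # same subject extent in r1 and r2 (mm)
m1, m2 = 0.02, 0.01     # M1 > M2: center magnified more than edges

l1 = subject_extent * m1  # image length L1 for region r1
l2 = subject_extent * m2  # image length L2 for region r2 (< L1)

n1 = pixels_spanned(l1, pitch)  # N1 pixels for region r1
n2 = pixels_spanned(l2, pitch)  # N2 pixels for region r2

# N1/theta_x > N2/theta_x: region R1 ends up with the higher resolution.
print(resolution(n1, theta_x) > resolution(n2, theta_x))  # True
```

Because θx cancels out of the comparison, the resolution ordering follows the magnification ordering directly, which is exactly the design freedom the free-form lenses provide.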
FIG. 11 shows resolution (angle resolution) characteristics with respect to a half view angle (a view angle determined by using optical axis Z as a reference) in optical system 11 according to the present exemplary embodiment. FIG. 11 shows the resolution characteristics with respect to a vertical view angle and a horizontal view angle. - As shown in
FIG. 11, in the vertical direction, the resolution of a region with a small view angle (for example, with a half view angle ranging from 0° to 25°, both inclusive) is higher than the resolution of a region with a large view angle (for example, with a half view angle of 25° or larger). In the horizontal direction, the resolution of the region with a small view angle (for example, with a half view angle ranging from 0° to 25°, both inclusive) is likewise higher than the resolution of the region with a large view angle (for example, with a half view angle of 25° or larger). - Free-form lenses 111, 112 are designed to have these resolution characteristics. In captured image 300 generated by image sensor 12, the resolution of the image formed in region R1 at the center part can thus be higher (that is to say, denser) than the resolution of the image formed in region R2 other than region R1, as shown in FIG. 9. - In the present exemplary embodiment, the relationship between the vertical view angle and the resolution is different from the relationship between the horizontal view angle and the resolution as shown in
FIG. 11. That is to say, the resolution characteristics in the vertical direction are different from those in the horizontal direction in the common range of view angle (for example, the range of half view angle from 0° to 90°, both inclusive, in FIG. 11). As optical system 11 is designed so as to have such optical characteristics, the average of the vertical resolution can be higher than the average of the horizontal resolution in a subject image of region R1, for example. -
FIG. 12 is a comparison explanatory diagram of capturable ranges of view angle for a fisheye lens and for optical system 11 according to the present exemplary embodiment. Part (A) of FIG. 12 is an explanatory diagram of the capturable range of view angle of a fisheye lens. Part (B) of FIG. 12 is an explanatory diagram of the capturable range of view angle of optical system 11 including a free-form lens according to the present exemplary embodiment. Region R10 indicates a region particularly required for sensing behind a vehicle. - As shown in part (A) of
FIG. 12, the resolution of region R1 at the center part can be higher than that of peripheral region R1a with the fisheye lens. The view angle of the fisheye lens is limited by the aspect ratio of the image sensor, however. For example, when the aspect ratio (H:V) of the image sensor is 4:3 and the horizontal view angle is 200°, the vertical view angle is limited to 150°. It is thus impossible to capture a subject in a region with a vertical view angle larger than 150°. Additionally, in central region R1, where a far-off subject behind the vehicle is possibly captured, the subject appears small and cannot be magnified any further. As a result, the precision of sensing in region R1 is degraded. - Meanwhile,
optical system 11 according to the present exemplary embodiment includes a free-form lens. The horizontal view angle and the vertical view angle can thus be set independently. For this reason, the horizontal view angle can be set to 200° and the vertical view angle can be set to 180°, larger than 150°, without being limited by the aspect ratio of the image sensor, as shown in part (B) of FIG. 12. It is thus possible to capture a subject with a wider range of view angle. - As
optical system 11 includes the free-form lens, the magnification ratio can be freely set based on the view angle. The magnification ratio of region R1, where a far-off subject behind the vehicle is possibly captured, can thus be larger than that of other regions. Additionally, the magnification ratio can be set independently for each of the vertical and horizontal view angles. For example, if the vertical view angle is increased, the magnification ratio in the vertical direction is reduced as a whole. In part (B) of FIG. 12, however, not only can the vertical view angle be increased as a whole, but the average magnification ratio of region R1 in the vertical direction can also be increased more than that of other regions. It is thus possible to increase the precision of sensing in region R1. The magnification ratio of optical system 11 in the horizontal direction can be kept so that the resolution of peripheral region R1a is kept. It is thus possible to keep the precision of sensing in a desired region. - Note that the expression "different resolution" in this exemplary embodiment means a difference in resolution generated by a combination of an optical system (for example, an optical system including an ordinary rotationally symmetric spherical lens and an aspherical lens) and a planar image sensor.
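The fisheye limitation discussed above is simple proportionality: for a rotationally symmetric lens whose image height scales with angle (an assumption of this sketch, roughly an equidistant projection), the capturable vertical view angle is tied to the horizontal one through the sensor aspect ratio. The free-form design removes that coupling:

```python
def fisheye_vertical_angle(horizontal_deg: float, aspect_h: float, aspect_v: float) -> float:
    """Vertical view angle reachable by a rotationally symmetric lens
    whose image height is proportional to angle: the captured angles
    follow the sensor aspect ratio (H:V)."""
    return horizontal_deg * aspect_v / aspect_h

# The 4:3 sensor with a 200-degree horizontal view angle from the text:
# the vertical view angle is limited to 150 degrees.
print(fisheye_vertical_angle(200, 4, 3))  # 150.0

# A free-form lens decouples the two axes, so 200 degrees horizontal can
# coexist with 180 degrees vertical, as in part (B) of FIG. 12.
free_form_angles = {"horizontal": 200, "vertical": 180}
print(free_form_angles["vertical"] > fisheye_vertical_angle(200, 4, 3))  # True
```

The dictionary of free-form angles is a placeholder for the independently chosen design values, not an output of any formula.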
- As described above, optical system 11 (free-form lenses 111, 112) according to the present exemplary embodiment can form an image in which the resolution (that is to say, magnification ratio) at the view angle of the center part is higher than the resolution of a peripheral part. It is thus possible to obtain an image that has high resolution in the region at the center part, which is important for sensing, while a wide view angle is achieved as a whole. In a common range of view angle, the relationship between the vertical view angle and the resolution is different from the relationship between the horizontal view angle and the resolution in the present exemplary embodiment. It is thus possible to independently set the vertical size and the horizontal size of a region with high resolution. In addition, the location and shape of the region with high resolution can be set freely. - Operations of
imaging device 10 and control device 20 configured as described above will be described below. -
Imaging device 10 captures an image behind vehicle 100 while vehicle 100 is stationary or reversing, generates image data, and transmits the image data to control device 20. -
Control device 20 receives the image data via first interface 23. Controller 21 of control device 20 performs predetermined image processing on the received image data and generates image data for display. Controller 21 transmits the image data for display via second interface 25 to display 30. Display 30 displays an image based on the image data received from control device 20. A driver of vehicle 100 checks the image displayed on display 30, thus grasping the situation behind the vehicle (for example, whether a child is present behind the vehicle). -
Controller 21 of control device 20 also performs image analysis (sensing) on the image data received from imaging device 10 to acquire various information about the situation behind vehicle 100. For example, controller 21 can determine whether a person is present behind the vehicle through image analysis (face detection). That is to say, when a person is present behind the vehicle, controller 21 can output a predetermined signal. Controller 21 can also determine whether an obstacle is present behind the vehicle through image analysis. That is to say, when an obstacle is present behind the vehicle, controller 21 can output a predetermined signal. Controller 21 generates a control signal for control target 60 based on a result of the image analysis and transmits the control signal via the third interface to control target 60. Control target 60 is thus controlled based on the situation behind the vehicle. Examples of control target 60 include a brake, an accelerator, and an alarm. - For example, when it is detected during reversing of
vehicle 100 that a person or an obstacle is present behind the vehicle by performing image analysis on the image data received from imaging device 10, controller 21 may control a brake of vehicle 100 to stop reversing vehicle 100. Alternatively, when it is detected before the start of reversing of the vehicle that a person or an obstacle is present behind the vehicle, controller 21 may control an accelerator and the brake so as not to start reversing vehicle 100. Alternatively, when it is detected that a person or an obstacle is present behind the vehicle, controller 21 may control an alarm to output an alarm sound or an alarm message toward the person or the obstacle which is detected. -
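The three cases above map a detection result to actions on control targets. A minimal sketch follows; the function and action names are hypothetical, since the embodiment does not specify an API for controller 21 or control target 60:

```python
def control_actions(hazard_detected: bool, reversing: bool, about_to_reverse: bool) -> list:
    """Map an image-analysis detection to actions on control targets,
    following the three cases described for controller 21:
    stop while reversing, hold before reversing, and alarm."""
    if not hazard_detected:
        return []
    actions = ["sound_alarm"]           # alarm toward the detected person/obstacle
    if reversing:
        actions.append("apply_brake")   # stop the reversing vehicle
    elif about_to_reverse:
        actions.append("hold_vehicle")  # control accelerator and brake so reversing does not start
    return actions

print(control_actions(True, reversing=True, about_to_reverse=False))
# ['sound_alarm', 'apply_brake']
```

The sketch treats the alarm as common to all three cases, which is one reading of the text; an implementation could equally issue it only when no braking intervention occurs.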
Imaging device 10 according to the present exemplary embodiment has a wide view angle in the vertical direction, and thus, if a person is present behind vehicle 100 at a close distance as shown in FIG. 5, it is possible to capture an image including the upper part of the person (that is to say, a part including the face) as shown in part (B) of FIG. 7. Controller 21 can thus detect the person more reliably using face detection through image analysis. Additionally, imaging device 10 according to the present exemplary embodiment can generate an image that has high resolution in the center part of the capturing range, which is important for sensing. If a small child is present in the central region of the capturing range behind the vehicle, for example, the child is captured while being magnified, and thus it is possible to obtain image data that is sufficient in size for image analysis. Consequently, it is possible to recognize even a small child. Moreover, in imaging device 10 according to the present exemplary embodiment, even if the vertical view angle is increased, the vertical magnification ratio of the image in the important center part can be further increased. It is thus possible to capture even a small child, which is a target, with high resolution and recognize the small child more reliably. - As described above,
imaging device 10 according to the present exemplary embodiment can perform capturing with a wide range of view angle and capture an image of a subject in the center part of the capturing range with high resolution. It is thus possible to increase the precision of image analysis using a captured image. - As described above,
imaging device 10 according to the present exemplary embodiment includes image sensor 12, which has an imaging surface on which a plurality of pixels are two-dimensionally arranged and which generates image data from a subject image formed on the imaging surface, and optical system 11, which images a subject in a predetermined range of vertical view angle and in a predetermined range of horizontal view angle on the imaging surface of image sensor 12. The number of pixels used for capturing a subject image included in a unit view angle is defined as resolution. The imaging surface includes a first region (for example, region R1 shown in FIG. 9) including an intersection with optical axis Z and a second region different from the first region (for example, region R2 shown in FIG. 9). Optical system 11 forms a subject image on the imaging surface so that the resolution of first region R1 is higher than the resolution of second region R2. The relationship between the vertical view angle and the resolution is different from the relationship between the horizontal view angle and the resolution in the subject image formed on the imaging surface. - With the configuration described above, in a captured image, the resolution of a region at the center part can be higher than the resolution of a region at a peripheral part. It is thus possible to capture an image that has high resolution in the center part required for sensing or the like while a wide view angle is achieved as a whole. It is thus possible to improve the precision of analysis of the captured image.
- Additionally, vertical resolution characteristics and horizontal resolution characteristics of the subject image formed on the imaging surface vary independently, and thus it is possible to freely set the horizontal view angle and the vertical view angle of the optical system regardless of the aspect ratio of
image sensor 12. That is to say, the ratio of the vertical view angle to the horizontal view angle foroptical system 11 does not need to correspond to the aspect ratio of the imaging surface. It is thus possible to freely set the view angle. - In an image formed in first region R1 of the imaging surface, the average vertical resolution is higher than the average horizontal resolution. It is thus possible to obtain an image that is more magnified in the vertical direction in first region R1. This image is useful for detecting a subject extending in the vertical direction, for example. By intentionally reducing the horizontal resolution to be less than the vertical resolution, it is possible to keep the vertical resolution of peripheral region R1 a far away from
vehicle 100, thus keeping the precision of sensing (seeFIG. 12 ). -
Optical system 11 includes free-form lenses 111, 112, which realize the resolution characteristics described above. -
Imaging device 10 may be mounted on the rear of vehicle 100 to capture a subject behind vehicle 100. For example, imaging device 10 is used for checking safety when the vehicle reverses. -
Imaging device 10 and display 30 that displays an image based on image data generated by imaging device 10 may constitute a display system. -
Imaging device 10 and control device 20 that analyzes image data generated by the imaging device may constitute an imaging system. -
Optical system 11 is configured by including a free-form lens for the purpose of obtaining a captured image that has high resolution in a central region and low resolution in a peripheral region in the first exemplary embodiment. However, it is not necessary to configure optical system 11 using the free-form lens. The same kind of captured image can be achieved by modifying the pixel distribution of image sensor 12b combined with an ordinary rotationally symmetric optical system. A configuration of imaging device 10b that includes an optical system without a free-form lens will be described below. -
FIG. 13 is an explanatory diagram of the pixel distribution of image sensor 12 in imaging device 10 according to the first exemplary embodiment and of the resolution distribution of captured image 300 captured by the combination of optical system 11 and image sensor 12. FIG. 14 is an explanatory diagram of the pixel distribution of image sensor 12b in imaging device 10b according to the second exemplary embodiment and of the resolution distribution of captured image 300b captured by the combination of optical system 11b and image sensor 12b. - As shown in
FIG. 13, a plurality of pixels are two-dimensionally arranged on the imaging surface of image sensor 12 at equal intervals in the first exemplary embodiment. An image that has high resolution in the central region and low resolution in the peripheral region is formed on the imaging surface of image sensor 12 by using free-form lenses 111, 112 of optical system 11. It is thus possible to obtain captured image 300, whose resolution is high in the center part and becomes lower toward the peripheral edge, as optical system 11 uses free-form lenses 111, 112. - Meanwhile, in the present exemplary embodiment,
imaging device 10b includes optical system 11b, which is a rotationally symmetric lens, and image sensor 12b, which has a specific pixel distribution. - That is to say,
optical system 11b according to the second exemplary embodiment is a lens that has the resolution (angle resolution) characteristics shown in FIG. 15 with respect to vertical resolution and horizontal resolution when an image is formed on an image sensor with a uniform pixel distribution. That is to say, optical system 11b is configured such that, in the range of view angle that is common to the vertical direction and the horizontal direction (for example, in a range of half view angle from 0° to 75°, both inclusive, in FIG. 15), the relationship between the vertical view angle and the resolution (that is to say, magnification ratio) is the same as the relationship between the horizontal view angle and the resolution (that is to say, magnification ratio). - Meanwhile, as shown in
FIG. 16, image sensor 12b according to the second exemplary embodiment has such a pixel distribution that the center part (the part corresponding to region R1) has high pixel density and the region other than region R1 (the part corresponding to region R2) has low pixel density. Moreover, image sensor 12b is set such that, in the range of view angle that is common to the vertical direction and the horizontal direction (for example, in a range of half view angle from 0° to 90°, both inclusive, in FIG. 16), the relationship between the vertical view angle and the pixel density is different from the relationship between the horizontal view angle and the pixel density. - By using the combination of
optical system 11b and image sensor 12b described above, it is possible to obtain captured image 300b, which has high resolution in region R1 at the center part and low resolution in region R2 other than region R1, as in the first exemplary embodiment. - As described above,
imaging device 10b according to the present exemplary embodiment includes image sensor 12b, which has an imaging surface on which a plurality of pixels are two-dimensionally arranged and which generates image data from a subject image formed on the imaging surface, and optical system 11b, which images a subject in a predetermined range of vertical view angle and in a predetermined range of horizontal view angle on the imaging surface of image sensor 12b. The imaging surface includes a first region (for example, a region corresponding to region R1) including an intersection with an optical axis and a second region different from the first region (for example, a region corresponding to region R2). The imaging surface has such a pixel distribution that the pixel density of the first region is higher than the pixel density of the second region. In addition, the imaging surface is set such that the vertical pixel density is different from the horizontal pixel density. - With the configuration described above, the resolution of region R1 at the center part in a captured image can be higher than the resolution of region R2 other than region R1, as in the first exemplary embodiment. It is thus possible to capture an image that has high resolution in the center part required for sensing or the like while a wide view angle is achieved as a whole. It is thus possible to improve the precision of analysis of the captured image. Additionally, as the vertical resolution characteristics of an image formed on the imaging surface are different from the horizontal resolution characteristics thereof, it is possible to freely set the horizontal view angle and the vertical view angle of the optical system regardless of the aspect ratio of
image sensor 12b. That is to say, as in the first exemplary embodiment, the ratio of the vertical view angle to the horizontal view angle does not need to correspond to the aspect ratio of the imaging surface. It is thus possible to increase the whole view angle regardless of the resolution of a region that is important for sensing. - As in the first exemplary embodiment, the average of the vertical resolution of a first subject image, which is formed in the first region, may be higher than the average of the horizontal resolution of the first subject image. It is thus possible to obtain an image that is more magnified in the vertical direction in first region R1, for example. This image is useful for detecting a subject extending in the vertical direction.
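Since optical system 11b images with uniform characteristics, the resolution of captured image 300b tracks the pixel density of image sensor 12b alone. A sketch of that relationship follows; the density function is invented purely for illustration and is not taken from FIG. 16:

```python
def pixel_density(half_angle_deg: float, vertical: bool) -> float:
    """Hypothetical pixel density (pixels per degree) for image sensor 12b:
    dense at the center, sparse toward the edges, and deliberately
    different in the vertical and horizontal directions."""
    base, falloff = (40.0, 0.3) if vertical else (30.0, 0.2)
    return max(base - falloff * half_angle_deg, 5.0)

def captured_resolution(half_angle_deg: float, vertical: bool) -> float:
    """With a uniformly imaging (rotationally symmetric) optical system,
    captured-image resolution simply follows the sensor's pixel density."""
    return pixel_density(half_angle_deg, vertical)

# Center (region R1) resolves more finely than the edges (region R2) ...
print(captured_resolution(0, vertical=True) > captured_resolution(60, vertical=True))  # True
# ... and vertical density differs from horizontal at the same view angle.
print(captured_resolution(10, vertical=True) != captured_resolution(10, vertical=False))  # True
```

The design choice this illustrates is where the non-uniformity lives: the first embodiment puts it in the lens (magnification versus angle), the second puts it in the sensor (density versus position), and either way formula (1) yields the same high-center, low-edge resolution profile.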
- As in the first exemplary embodiment,
imaging device 10b may be mounted on the rear of vehicle 100 to capture a subject behind vehicle 100. For example, imaging device 10b is used for checking safety when the vehicle reverses. - As in the first exemplary embodiment,
imaging device 10b and display 30 that displays an image based on image data generated by imaging device 10b may constitute a display system. - As in the first exemplary embodiment,
imaging device 10b and control device 20 that analyzes image data generated by the imaging device may constitute an imaging system. - The first and second exemplary embodiments have been described above as examples of the technique disclosed in the present application. However, the technique according to the present disclosure is not limited to the first and second exemplary embodiments, but is applicable to other exemplary embodiments including appropriate modifications, replacements, additions, omissions, and the like. In addition, new exemplary embodiments can be made by combining constituent elements described in the first and second exemplary embodiments. Hence, other exemplary embodiments are described below.
- In the exemplary embodiments described above, the electronic room mirror and the in-vehicle display are exemplified as the display. However, the type of the display is not limited to the electronic room mirror and the in-vehicle display. The idea of the present disclosure can be adapted to display systems that use various types of displays according to their uses (for example, a head-up display).
- While
imaging devices imaging devices - While
imaging devices control device 20 may perform these processing. Alternatively,imaging devices control device 20 may perform the distortion correction. - While
vehicle 100, which is an automobile, has been described as an example of a moving body in the exemplary embodiments described above, the moving body is not limited to the automobile. The imaging device according to the exemplary embodiments described above may be used for other moving bodies, including a railway vehicle, a vessel, an airplane, a robot, a robot arm, a drone, an agricultural machine such as a combine, and a construction machine such as a crane. The imaging device may also be used for a monitoring camera. - The view angle, the resolution, and the like described in the exemplary embodiments are only examples and may be set appropriately based on a target (an object or an event) to be subjected to image analysis.
- While the optical system is configured by including the free-form lens in the first exemplary embodiment described above, other types of lenses whose magnification ratio (that is, resolution) can be freely designed according to the view angle may be used instead of the free-form lens.
- The exemplary embodiments have been described as examples of the technique in the present disclosure. The accompanying drawings and the detailed description have been provided for this purpose.
- Accordingly, the constituent elements described in the accompanying drawings and the detailed description may include not only constituent elements that are essential for solving the problems but also constituent elements that are not essential for solving the problems, in order to illustrate the technique. It should therefore not be concluded that these unessential constituent elements are essential merely because they are included in the drawings and the description.
- The above exemplary embodiments are provided to exemplify the technique according to the present disclosure, and thus various changes, replacements, additions, omissions, and the like can be made within the scope of the claims and equivalents thereof.
- According to the system of the present disclosure, one imaging device can provide images with a plurality of view angles, including a high-resolution image, and the system is applicable to various uses (an imaging system or a display system in a moving body, or the like).
Claims (8)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-242758 | 2017-12-19 | ||
JP2017242758 | 2017-12-19 | ||
JP2018171030A JP7170167B2 (en) | 2017-12-19 | 2018-09-13 | Imaging device, display system, and imaging system |
JP2018-171030 | 2018-09-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190191064A1 true US20190191064A1 (en) | 2019-06-20 |
US10623618B2 US10623618B2 (en) | 2020-04-14 |
Family
ID=64661091
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/190,716 Active US10623618B2 (en) | 2017-12-19 | 2018-11-14 | Imaging device, display system, and imaging system |
Country Status (2)
Country | Link |
---|---|
US (1) | US10623618B2 (en) |
EP (1) | EP3501898B1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10909668B1 (en) * | 2019-07-31 | 2021-02-02 | Nxp Usa, Inc. | Adaptive sub-tiles for distortion correction in vision-based assistance systems and methods |
WO2021180679A1 (en) * | 2020-03-13 | 2021-09-16 | Valeo Schalter Und Sensoren Gmbh | Determining a current focus area of a camera image on the basis of the position of the vehicle camera on the vehicle and on the basis of a current motion parameter |
US20210304708A1 (en) * | 2020-03-26 | 2021-09-30 | Sumitomo Heavy Industries Construction Cranes Co., Ltd. | Periphery display device for work machine |
US20220171275A1 (en) * | 2020-11-30 | 2022-06-02 | Toyota Jidosha Kabushiki Kaisha | Image pickup system and image pickup device |
US11507097B2 (en) * | 2018-02-05 | 2022-11-22 | Pixart Imaging Inc. | Control apparatus for auto clean machine and auto clean machine control method |
WO2023022908A1 (en) * | 2021-08-16 | 2023-02-23 | Gentex Corporation | Dual-function imaging system |
US20230367098A1 (en) * | 2022-05-16 | 2023-11-16 | GM Global Technology Operations LLC | Methods and systems for automated dynamic lens utilization |
US20240025343A1 (en) * | 2022-07-21 | 2024-01-25 | GM Global Technology Operations LLC | Rearview displays for vehicles |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170264843A1 (en) * | 2014-09-09 | 2017-09-14 | Beijing Zhigu Tech Co., Ltd. | Light field capture control methods and apparatuses, light field capture devices |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6252517A (en) | 1985-08-30 | 1987-03-07 | Mitsubishi Electric Corp | Solid-state image pickup device |
JP4727340B2 (en) | 2005-08-02 | 2011-07-20 | パナソニック株式会社 | In-vehicle imaging device and in-vehicle camera system |
US7839446B2 (en) | 2005-08-30 | 2010-11-23 | Olympus Corporation | Image capturing apparatus and image display apparatus including imparting distortion to a captured image |
JP2007067677A (en) | 2005-08-30 | 2007-03-15 | Olympus Corp | Image display apparatus |
EP2150437B1 (en) | 2007-04-30 | 2014-06-18 | Mobileye Technologies Limited | Rear obstruction detection |
WO2013067083A1 (en) | 2011-11-02 | 2013-05-10 | Magna Electronics, Inc. | Vehicle vision system with asymmetric anamorphic lens |
JP5884439B2 (en) | 2011-11-24 | 2016-03-15 | アイシン精機株式会社 | Image generation device for vehicle periphery monitoring |
JP6054738B2 (en) | 2012-12-25 | 2016-12-27 | 京セラ株式会社 | Camera module, camera system, and image display method |
DE102013221882A1 (en) | 2013-10-28 | 2015-04-30 | Conti Temic Microelectronic Gmbh | Camera with non-uniform angular resolution for a vehicle |
JP2016126254A (en) | 2015-01-07 | 2016-07-11 | コニカミノルタ株式会社 | Imaging lens, imaging apparatus, and projection device |
- 2018-11-14 US US16/190,716 patent/US10623618B2/en active Active
- 2018-12-06 EP EP18210796.1A patent/EP3501898B1/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170264843A1 (en) * | 2014-09-09 | 2017-09-14 | Beijing Zhigu Tech Co., Ltd. | Light field capture control methods and apparatuses, light field capture devices |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11880205B2 (en) | 2018-02-05 | 2024-01-23 | Pixart Imaging Inc. | Auto clean machine and auto clean machine control method |
US11507097B2 (en) * | 2018-02-05 | 2022-11-22 | Pixart Imaging Inc. | Control apparatus for auto clean machine and auto clean machine control method |
US12181896B2 (en) | 2018-02-05 | 2024-12-31 | Pixart Imaging Inc. | Auto clean machine and auto clean machine control method |
US20210035271A1 (en) * | 2019-07-31 | 2021-02-04 | Nxp Usa, Inc. | Adaptive Sub-Tiles For Distortion Correction In Vision-Based Assistance Systems And Methods |
US10909668B1 (en) * | 2019-07-31 | 2021-02-02 | Nxp Usa, Inc. | Adaptive sub-tiles for distortion correction in vision-based assistance systems and methods |
WO2021180679A1 (en) * | 2020-03-13 | 2021-09-16 | Valeo Schalter Und Sensoren Gmbh | Determining a current focus area of a camera image on the basis of the position of the vehicle camera on the vehicle and on the basis of a current motion parameter |
US20210304708A1 (en) * | 2020-03-26 | 2021-09-30 | Sumitomo Heavy Industries Construction Cranes Co., Ltd. | Periphery display device for work machine |
US11735145B2 (en) * | 2020-03-26 | 2023-08-22 | Sumitomo Heavy Industries Construction Cranes Co., Ltd. | Periphery display device for work machine |
US20220171275A1 (en) * | 2020-11-30 | 2022-06-02 | Toyota Jidosha Kabushiki Kaisha | Image pickup system and image pickup device |
US11760275B2 (en) * | 2020-11-30 | 2023-09-19 | Toyota Jidosha Kabushiki Kaisha | Image pickup system and image pickup device |
WO2023022908A1 (en) * | 2021-08-16 | 2023-02-23 | Gentex Corporation | Dual-function imaging system |
US11912202B2 (en) | 2021-08-16 | 2024-02-27 | Gentex Corporation | Dual-function imaging system |
US20230367098A1 (en) * | 2022-05-16 | 2023-11-16 | GM Global Technology Operations LLC | Methods and systems for automated dynamic lens utilization |
US20240025343A1 (en) * | 2022-07-21 | 2024-01-25 | GM Global Technology Operations LLC | Rearview displays for vehicles |
US12139073B2 (en) * | 2022-07-21 | 2024-11-12 | GM Global Technology Operations LLC | Rearview displays for vehicles |
Also Published As
Publication number | Publication date |
---|---|
EP3501898B1 (en) | 2021-06-16 |
US10623618B2 (en) | 2020-04-14 |
EP3501898A1 (en) | 2019-06-26 |
Similar Documents
Publication | Title |
---|---|
US11310461B2 (en) | Imaging apparatus, imaging system, and display system |
US10623618B2 (en) | Imaging device, display system, and imaging system |
US10447948B2 (en) | Imaging system and display system |
WO2015182457A1 (en) | Vehicle exterior observation device, and imaging device |
US20180137629A1 (en) | Processing apparatus, imaging apparatus and automatic control system |
WO2018207393A1 (en) | Image pickup system and display system |
US10489666B2 (en) | Imaging device and imaging system |
JP7170167B2 (en) | Imaging device, display system, and imaging system |
EP3404911A1 (en) | Imaging system and moving body control system |
JP7170168B2 (en) | Imaging device and imaging system |
US20240114253A1 (en) | Movable apparatus and installation method for imaging device |
CN117774832A (en) | Method for installing movable equipment and camera device |
WO2020129398A1 (en) | Observation apparatus |
US20230098424A1 (en) | Image processing system, mobile object, image processing method, and storage medium |
US20230007190A1 (en) | Imaging apparatus and imaging system |
WO2019187221A1 (en) | Lens system, imaging device, and imaging system |
JP2023057644A (en) | Imaging apparatus, image processing system, movable body, control method and program of image processing system |
US20240314455A1 (en) | Movable apparatus, image processing apparatus, storage medium, and installation method for imaging apparatus |
US20240177492A1 (en) | Image processing system, image processing method, and storage medium |
US20230328195A1 (en) | Image processing device that can measure distance to object, movable apparatus, image processing method, and storage medium |
US20250074306A1 (en) | Imaging system, movable unit, and imaging method |
US20200065987A1 (en) | Signal processing apparatus, moving body, and stereo camera |
US20170094150A1 (en) | Image capture system and focusing method thereof |
JP2020160334A (en) | Imaging apparatus and imaging system |
JP2006323270A (en) | Camera device and focus adjustment method for camera device |
Legal Events
Code | Title | Description |
---|---|---|
FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment | Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AIHARA, MASAYUKI;YATSURI, SHIGENORI;MATSUMURA, YOSHIO;AND OTHERS;SIGNING DATES FROM 20181022 TO 20181027;REEL/FRAME:048787/0867 |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |