
CN111756974B - Image sensors and electronics

Info

Publication number: CN111756974B (also published as CN111756974A)
Application number: CN202010637147.7A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: units, row, column, filter, pixel
Inventors: 程祥, 王迎磊, 张玮
Applicant and current assignee: Shenzhen Goodix Technology Co Ltd
Legal status: Active (granted)

Classifications

    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
        • H04N 23/50 Constructional details
            • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
            • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
        • H04N 23/80 Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

An image sensor and an electronic device are provided, which can improve the performance of the image sensor. The image sensor includes: a filter unit array including a plurality of filter unit groups, each of which includes 4×4 filter units, where each row, each column and each diagonal of the 4×4 filter units includes 2 white filter units and 2 color filter units; and a pixel unit array including a plurality of pixel units, where the pixel unit array is located below the filter unit array and the pixel units in the pixel unit array correspond one-to-one with the filter units in the filter unit array. With this solution, the white filter units and the color filter units are evenly distributed in the filter unit array, which not only facilitates subsequent image-processing algorithms but is also conducive to image color restoration and to avoiding color moiré fringes, and can further improve image quality parameters such as SNR and resolution.

Description

Image sensor and electronic device
The present application claims priority to the Chinese patent application No. 202010410639.2, entitled "Image sensor and electronic device," filed with the Chinese Patent Office on May 15, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of sensors, and more particularly, to an image sensor and an electronic device.
Background
An image sensor is an electronic device that converts an optical image into a digital signal, and typically includes a pixel array composed of a plurality of pixel cells, each pixel cell in the pixel array being used to form a pixel value in the image. In order to enable the image sensor to collect a color image, a Color Filter (CF) may be disposed over the pixel unit so that the pixel unit may receive a light signal of a specific color, forming a pixel value corresponding to the light signal of the specific color.
However, disposing a color filter reduces the amount of light received by each pixel unit, resulting in a lower signal-to-noise ratio (SNR) and degraded image quality. Moreover, if the image sensor is applied to a mobile device, the size of the image sensor is limited and so is the photosensitive area of the pixel array, which further limits the image quality in a low-illumination environment.
Disclosure of Invention
The embodiments of the application provide an image sensor and an electronic device, aiming to solve the problem of poor quality of images acquired by an image sensor in a low-light environment.
In a first aspect, an image sensor is provided, comprising a filter unit array including a plurality of filter unit groups, each filter unit group of the plurality of filter unit groups including 4×4 filter units, wherein each row, each column, and each diagonal line of the 4×4 filter units includes 2 white filter units and 2 color filter units;
and a pixel unit array including a plurality of pixel units, where the pixel unit array is located below the filter unit array and the pixel units in the pixel unit array are in one-to-one correspondence with the filter units in the filter unit array.
According to the solution of the embodiments of the application, each row, each column and each diagonal of every filter unit group includes 2 white filter units and 2 color filter units, so the filter unit array formed by these filter unit groups also contains both types of filter units, white and color, in every row, every column and every diagonal direction; the white filter units and the color filter units are thus evenly distributed across the filter unit array. This facilitates subsequent image-processing algorithms, aids image color restoration, reduces color loss, avoids color moiré fringes, and allows a more accurate full-resolution gray-scale image to be obtained, thereby further improving image quality parameters such as SNR and resolution.
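As an illustrative aside (not part of the original disclosure), the following Python sketch checks the stated constraint for a candidate 4×4 filter unit group: every row, every column and both diagonals must contain exactly 2 white filter units (and therefore 2 color filter units). The function name and letter encoding are assumptions made for illustration; the example pattern corresponds to the arrangement later described with reference to fig. 5.

```python
# Minimal sketch (assumption): 'W' marks a white filter unit; 'R', 'G', 'B' mark color filter units.
def satisfies_2w2c(group):
    """Return True if every row, column and both diagonals of a 4x4 group
    contain exactly 2 white ('W') filter units."""
    assert len(group) == 4 and all(len(row) == 4 for row in group)
    lines = [list(row) for row in group]                           # 4 rows
    lines += [[group[r][c] for r in range(4)] for c in range(4)]   # 4 columns
    lines += [[group[i][i] for i in range(4)],                     # main diagonal
              [group[i][3 - i] for i in range(4)]]                 # anti-diagonal
    return all(sum(1 for u in line if u == "W") == 2 for line in lines)

# Example group (rows written as strings), matching the fig. 5 arrangement described below:
example = ["GWBW",
           "RWGW",
           "WRWG",
           "WGWB"]
print(satisfies_2w2c(example))  # True
```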
In some possible embodiments, the filter unit group includes a first color filter unit, a second color filter unit, and a third color filter unit of different colors, and the number of the first color filter units is equal to the sum of the number of the second color filter units and the number of the third color filter units.
In some possible embodiments, in the filter unit group, the number of the second color filter units is equal to the number of the third color filter units.
In some possible embodiments, in the filter unit group, 2 color filter units in each row are different in color, and 2 color filter units in each column are different in color.
In some possible embodiments, 2 white filter cells are arranged at intervals from 2 color filter cells in each row of the filter cell group, and 2 white filter cells are arranged consecutively and 2 color filter cells are arranged consecutively in each column of the filter cell group.
In some possible embodiments, the color filter units of the same color in the filter unit group are arranged in a central symmetry manner.
In some possible embodiments, 2 target filter unit sets are arranged in the middle two rows of the filter unit group, and the target filter unit sets comprise 2 color filter units with the same color and arranged at the common vertex angle.
In some possible embodiments, in the filter unit group, the white filter units are located in the first row second column, first row fourth column, second row second column, second row fourth column, third row first column, third row third column, fourth row first column and fourth row third column, respectively; the second color filter units are located in the second row first column and the third row second column; the first color filter units are located in the first row first column, second row third column, third row fourth column and fourth row second column; and the third color filter units are located in the first row third column and the fourth row fourth column.
In some possible embodiments, in each row of the filter unit group, 2 white filter units are arranged at intervals from 2 color filter units; in two spaced-apart columns of the filter unit group, 2 white filter units are arranged consecutively and 2 color filter units are arranged separately; and in the other two spaced-apart columns, 2 white filter units are arranged separately and 2 color filter units are arranged consecutively.
In some possible embodiments, the middle two columns of the filter unit group are provided with 2 target filter unit sets, and the target filter unit sets comprise 2 color filter units with the same color and arranged at the common vertex angle.
In some possible embodiments, the filter unit group is provided with 4 target filter unit sets, and the target filter unit sets include 2 color filter units of the same color arranged at a common vertex angle.
In some possible embodiments, in the filter unit group, the white filter units are located in the first row second column, first row fourth column, second row first column, second row third column, third row first column, third row third column, fourth row second column and fourth row fourth column, respectively; the second color filter units are located in the first row third column and the second row fourth column; the first color filter units are located in the first row first column, second row second column, third row fourth column and fourth row third column; and the third color filter units are located in the third row second column and the fourth row first column.
In some possible embodiments, the first color filter unit, the second color filter unit and the third color filter unit are configured to pass three colors of optical signals, respectively, where the three colors of optical signal bands cover the visible light band.
In some possible embodiments, the colors of the first color filter unit, the second color filter unit and the third color filter unit are respectively three colors selected from red, green, blue, cyan, magenta and yellow.
In some possible embodiments, the first color filter unit is a green color filter unit, the second color filter unit and the third color filter unit are a red color filter unit and a blue color filter unit, respectively.
In some possible embodiments, the image sensor further includes a micro lens array including a plurality of micro lenses, the micro lens array being located above the filter unit array and used for converging the optical signals returned by the photographed object onto the filter unit array, where the micro lenses in the micro lens array are in one-to-one correspondence with the filter units in the filter unit array.
In some possible embodiments, the pixel values of the white pixel units in the pixel array are used for generating first image data of a shooting object, the pixel values of the color pixel units in the pixel array are used for generating second image data of the shooting object, and the first image data and the second image data are used for synthesizing a target image of the shooting object, wherein the white pixel units are pixel units corresponding to the white filter units, and the color pixel units are pixel units corresponding to the color filter units.
In some possible embodiments, the pixel values of the color pixel units in the pixel array are used to generate an intermediate image through interpolation processing, and the intermediate image is used to generate the second image data in Bayer format through re-mosaic (remosaic) processing.
In some possible embodiments, among the 2×2 pixel values of the intermediate image, 2 pixel values are original pixel values of the color pixel unit, and the other 2 pixel values are pixel values obtained through interpolation processing.
In some possible embodiments, the first image data and the second image data have the same resolution.
In some possible embodiments, the image sensor is a complementary metal oxide semiconductor CMOS image sensor, or a charge coupled device CCD image sensor.
In a second aspect, an electronic device is provided, including the image sensor in the first aspect or any possible implementation manner of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of an image sensor according to an embodiment of the present application.
Fig. 2 is a schematic top view of another image sensor according to an embodiment of the present application.
FIG. 3 is a schematic cross-sectional view of the image sensor of FIG. 2 taken along the direction A-A'.
Fig. 4 is another cross-sectional schematic view of the image sensor of fig. 2 along A-A'.
Fig. 5 is a schematic arrangement diagram of filter units in a filter unit array according to an embodiment of the present application.
Fig. 6 to 14 are schematic arrangement diagrams of filter units in several filter unit groups according to an embodiment of the present application.
Fig. 15 is a schematic flowchart of an image processing method provided in an embodiment of the present application.
Fig. 16 is an image schematic diagram of the image processing method in fig. 15.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
It should be understood that the specific examples herein are intended merely to facilitate a better understanding of the embodiments of the application by those skilled in the art and are not intended to limit the scope of the embodiments of the application.
It should also be understood that the various embodiments described in this specification may be implemented alone or in combination, and that the present embodiments are not limited in this regard.
The technical solution of the embodiment of the present application may be applied to various image sensors, such as a complementary metal oxide semiconductor (complementary metal oxide semiconductor, CMOS) image sensor (CMOS image sensor, CIS), or a charge coupled device (charge coupled device, CCD) image sensor, but the embodiment of the present application is not limited thereto.
As a common application scenario, the image sensor provided by the embodiment of the application can be applied to smart phones, cameras, tablet computers and other mobile terminals or other terminal devices with imaging functions.
Fig. 1 shows a schematic structure of an image sensor. As shown in fig. 1, the image sensor 100 includes a pixel array 110, a row selection circuit 120, a column selection circuit 130, a control circuit 140, and analog-to-digital conversion (analog to digital converter, ADC) circuits 150, a front-end signal processing circuit 160, and a back-end signal processing circuit 170.
Specifically, as shown in fig. 1, the plurality of square pixel units in the pixel unit array 110 are arranged in M rows × N columns, where M and N are positive integers. Generally, the row direction of the M rows and the column direction of the N columns are perpendicular to each other in the plane of the pixel unit array 110. In some cases, for convenience of description, two mutually perpendicular directions in one plane, such as the row direction and the column direction in the present application, may be referred to as the horizontal direction and the vertical direction.
In the pixel cell array 110 shown in fig. 1, either side of each square pixel cell is parallel or perpendicular to the row or column direction.
Optionally, devices such as a photodiode and a field effect switch tube may be included in the pixel unit, for receiving an optical signal and converting the optical signal into a corresponding electrical signal.
Alternatively, if the image sensor needs to acquire a color image, a color filter array (CFA) may be disposed above the pixel unit array 110, with a color filter unit disposed above each pixel unit. For ease of description, a pixel unit with a color filter unit disposed above it is hereinafter also referred to as a color pixel unit; for example, a pixel unit with a red filter unit above it is referred to as a red pixel unit (denoted by R in fig. 1), a pixel unit with a green filter unit above it is referred to as a green pixel unit (denoted by G in fig. 1), and a pixel unit with a blue filter unit above it is referred to as a blue pixel unit (denoted by B in fig. 1).
Currently, most image sensor CFAs use the Bayer format based on the three RGB primary colors. For example, as shown in fig. 1, if a Bayer-format CFA is disposed above the pixel unit array 110, the pixel unit array 110 uses 2×2 pixel units as a basic unit, and each basic unit includes 1 red pixel unit, 1 blue pixel unit and 2 green pixel units, where the 2 green pixel units are disposed adjacent to each other at a common vertex angle.
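For reference, the 2×2 Bayer basic unit described above can be written out as a small array and tiled to form the conventional CFA; this snippet is an illustration only, and the variable names are not from the patent.

```python
import numpy as np

# 2x2 Bayer basic unit: 1 red, 1 blue and 2 green pixel units,
# the two greens sharing a common vertex angle (diagonally adjacent).
bayer_unit = np.array([["R", "G"],
                       ["G", "B"]])

# Tiling the basic unit covers an M x N pixel unit array (4 x 4 shown here).
bayer_cfa = np.tile(bayer_unit, (2, 2))
print(bayer_cfa)
```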
The row selection circuit 120 is connected to each row of pixel cells in the pixel cell array 110 through M row control lines, and may be used to turn on and off each pixel cell in each row of pixel cells. For example, the row selection circuit 120 is connected to the gate of the field effect switch tube of each pixel unit in the first row in the pixel unit array 110 through a row control line, and controls the operation state of the photodiode by turning on or off the field effect switch tube. Wherein, M row control lines are parallel to the horizontal direction.
The column selection circuit 130 is connected to each column of pixel cells in the pixel cell array 110 through N column control lines, and may be used to select the signal value output of each pixel cell in each column. For example, the column selection circuit 130 is connected to the source of the field-effect switch transistor of each pixel cell in the first column of the pixel cell array 110 through a column control line, and controls the output of the electrical signal obtained from the photodiode's conversion. The N column control lines are parallel to the vertical direction.
The control circuit 140 is connected to the row selection circuit 120 and the column selection circuit 130, and is configured to provide timing for the row selection circuit 120 and the column selection circuit 130, control the row selection circuit 120 and the column selection circuit 130 to select a pixel unit in the pixel unit array 110, and output a pixel value of the pixel unit.
Optionally, after the row selection circuit 120, the column selection circuit 130 and the control circuit 140 cooperate with outputting the pixel values generated by the pixel unit array 110, the pixel values of the pixel unit array 110 are transmitted to the ADC circuit 150 for analog-to-digital conversion, and the analog pixel values are converted into digital pixel values to form a digital image, so that the subsequent signal processing circuit 160 can conveniently perform image processing to output the optimized color image.
Alternatively, the signal processing circuit 160 may include, but is not limited to, an image signal processor (ISP) for linearizing a digital image, removing dead spots, removing noise, color correction, demosaicing (demosaic), automatic exposure control (AEC), automatic gain control (AGC), automatic white balancing (AWB), and the like.
For the image sensor 100 using the Bayer-format CFA, the red pixel unit can only receive the red light signal, the green pixel unit can only receive the green light signal, and the blue pixel unit can only receive the blue light signal. The intensity of the light signal received by each pixel unit is therefore small, so the SNR of the image is low, which affects the image quality.
In addition, for an image sensor with a Bayer-format CFA, the high-frequency luminance information and the chromaticity information in an image are likely to overlap, so color aliasing is likely to occur and color moiré is likely to appear.
To address the above problems, the present application provides an image sensor in which white filter units are added to the CFA, so that some pixel units in the pixel unit array receive color light signals while the others receive white light signals, which increases the intensity of the light signals received by the latter. On this basis, the pixel values of the pixel units in the pixel unit array are processed so that, while the color information of the image is preserved, image quality parameters such as SNR and resolution are improved and an optimized color image is obtained.
Fig. 2 is a schematic top view of an image sensor 200 according to an embodiment of the present application, and fig. 3 is a schematic cross-sectional view of the image sensor 200 along A-A'.
As shown in fig. 2 and 3, the image sensor 200 includes:
a filter cell array 210 including a plurality of filter cell groups 211, each filter cell group of the plurality of filter cell groups 211 including 4×4 filter cells;
In the 4×4 filter units, 2 white filter units and 2 color filter units are included on each row, each column, and each diagonal.
The pixel unit array 220 is located below the filter unit array 210, and includes a plurality of pixel units, where the plurality of pixel units in the pixel unit array 220 are in one-to-one correspondence with the plurality of filter units in the filter unit array 210.
In one possible embodiment, as shown in fig. 3, the plurality of filter units in the filter unit array 210 may be disposed on the upper surfaces of the plurality of pixel units in the pixel unit array 220, and in another possible embodiment, the plurality of filter units in the filter unit array 210 may also be disposed above the plurality of pixel units in the pixel unit array 220 in a suspended manner.
Further, as shown in fig. 3, as an example, each filter unit in the filter unit array 210 is disposed directly above its corresponding pixel unit in the pixel unit array 220; in other words, the center of each filter unit coincides with the center of its corresponding pixel unit in the vertical direction. Alternatively, each filter unit in the filter unit array 210 may be disposed obliquely above its corresponding pixel unit in the pixel unit array 220, in which case each pixel unit in the pixel unit array 220 may receive an optical signal from an oblique direction; the specific position of the filter unit array 210 is not limited in the embodiments of the present application.
The pixel units corresponding to the color filter units in the pixel unit array 220 are used for receiving the color light signals passing through the color filter units and correspondingly outputting color pixel values, and the pixel units corresponding to the white filter units in the pixel unit array 220 are used for receiving the white light signals passing through the white filter units and correspondingly outputting white pixel values; the color light signals and the white light signals are jointly used for generating a target image of the photographed object. For example, a pixel unit corresponding to a red filter unit receives a red light signal, and the pixel value it outputs may be referred to as a red pixel value; a pixel unit corresponding to a white filter unit receives a white light signal, and the pixel value it outputs may be referred to as a white pixel value.
Further, as shown in fig. 4, fig. 4 shows a schematic cross-sectional view of another image sensor 200 along the A-A' direction.
As shown in fig. 4, the image sensor 200 includes, in addition to the above-described filter cell array 210 and pixel cell array 220:
a micro lens array 230, disposed above the filter unit array 210, for converging the optical signals returned by the photographed object onto the filter unit array 210 and reducing crosstalk of optical signals between adjacent pixel units, where the micro lens array 230 includes a plurality of micro lenses that are in one-to-one correspondence with the filter units in the filter unit array 210 and with the pixel units in the pixel unit array 220.
In some embodiments, the pixel structure in the image sensor may be referred to as an on chip microlens (OCL) pixel structure.
Optionally, as shown in fig. 4, in the image sensor 200, a dielectric layer 240 may be further included between the filter unit array 210 and the pixel unit array 220 for connecting the pixel unit array 220 and the filter unit array 210.
In addition, the filter cell array 210 may further include a dielectric 215 and a reflective grid 216 positioned around the filter cell array to reflect the light signal incident at a large angle, preventing the loss of the light signal.
The pixel cell array 220 may include a semiconductor substrate 221 and a photosensor 222, wherein the photosensor 222 is located in the semiconductor substrate 221, and the photosensor 222 includes, but is not limited to, a Photodiode (PD). Optionally, the pixel cell array 220 may further include an isolation region 223 between two photosensitive elements 222 to prevent electrical signal interference between adjacent two photosensitive elements.
It will be appreciated that the image sensor 200 may include other stacked structures, such as at least one metal interconnect layer, to electrically connect a plurality of pixel cells in a pixel cell array, etc., in addition to the basic structure shown in fig. 4, and the embodiment of the present application is not limited thereto.
In addition, the top view of fig. 2 also shows the practical arrangement of the filter unit array 210.
As shown in fig. 2, in the embodiment of the present application, each filter unit in one filter unit group 211 is a quadrangular filter unit, for example, each filter unit may be a square filter unit, and 16 square filter units form a square filter unit group. In the filter unit group 211, the number of color filter units (shown as a shaded block in the figure) and the number of white filter units (shown as a blank block in the figure) are equal, that is, 8 color filter units and 8 white filter units are included. In the plane of the image sensor, one filter unit group includes two types of filter units, namely a white filter unit and a color filter unit, in a horizontal direction (row direction), a vertical direction (column direction), and a 45 ° direction (diagonal direction).
In the present application, the white filter unit refers to a filter or a filter material for transmitting white light, and in some embodiments, the white filter unit may be a transparent material or an air gap for transmitting all light signals including white light in the environment. In particular, the white light may be a mixture of colored light. For example, light of three primary colors in the spectrum, blue, red and green, may be mixed in a proportion to produce white light, or the mixture of all visible light in the spectrum may be white light.
Correspondingly, the color filter unit refers to a filter or a filter material for transmitting color light. In particular, the colored light may be a light signal in any band range of the visible spectrum, for example, a red filter unit may be used to transmit red light, which may be a light signal in the visible spectrum having a wavelength range between 620nm and 750 nm. Similarly, color filter units of other colors are also used to transmit light signals of the corresponding colors.
According to the solution of the embodiments of the application, each row, each column and each diagonal of every filter unit group includes 2 white filter units and 2 color filter units, so the filter unit array formed by these filter unit groups also contains both types of filter units, white and color, in every row, every column and every diagonal direction; the white filter units and the color filter units are thus evenly distributed across the filter unit array. This facilitates subsequent image-processing algorithms, aids image color restoration, reduces color loss, avoids color moiré fringes, and allows a more accurate full-resolution gray-scale image to be obtained, thereby further improving image quality parameters such as SNR and resolution.
Alternatively, in some embodiments, the filter unit group may include filter units of three colors (a first color filter unit, a second color filter unit and a third color filter unit). For example, these may be filter units of the three primary colors red, green and blue (RGB), filter units of the three complementary colors cyan, magenta and yellow (CMY), or a mixed combination of primary-color and complementary-color filter units.
In the case that the filter unit group includes color filter units of three colors, the color light signals of the three colors passing through the color filter units of three colors may cover a visible light band, and the specific colors of the color filter units are not limited in the embodiment of the present application.
For convenience of description, the case where the filter unit group 211 includes color filter units of the three colors red, green and blue is taken as an example hereinafter; when the filter unit group includes color filter units of other colors, the following technical solutions apply by analogy.
Fig. 5 shows a schematic arrangement diagram of the filter units in the filter unit group 211.
As shown in fig. 5, the filter unit group 211 includes the white filter units 201, the red filter units 202, the green filter units 203 and the blue filter units 204, and optionally, in one filter unit group 211, the ratio of the number of the white filter units 201, the number of the green filter units 203, the number of the red filter units 202 and the number of the blue filter units 204 is 4:2:1:1.
Alternatively, in the filter unit group 211, the white filter units in each row are arranged at intervals from the color filter units, 2 white filter units in each column are arranged consecutively, and 2 color filter units are arranged consecutively.
As an example, as shown in fig. 5, the 2 white filter units 201 in each of the 1st and 2nd rows are disposed at intervals, in the 2nd and 4th columns, and the 2 white filter units 201 in each of the 3rd and 4th rows are disposed at intervals, in the 1st and 3rd columns. Correspondingly, the 2 white filter units 201 in each of the 1st and 3rd columns are disposed consecutively, in the 3rd and 4th rows, and the 2 white filter units 201 in each of the 2nd and 4th columns are disposed consecutively, in the 1st and 2nd rows.
Alternatively, the colors of 2 color filter units in each row and 2 color filter units in each column are different.
For example, in fig. 5, the 2 color filter units in each of the 1st and 4th rows are a green filter unit 203 and a blue filter unit 204, and the 2 color filter units in each of the 2nd and 3rd rows are a red filter unit 202 and a green filter unit 203. Correspondingly, the 2 color filter units in each of the 1st and 2nd columns are a green filter unit 203 and a red filter unit 202, and the 2 color filter units in each of the 3rd and 4th columns are a blue filter unit 204 and a green filter unit 203.
Specifically, in the filter unit group shown in fig. 5, the 2 red filter units 202 may be located in the 2nd row 1st column and the 3rd row 2nd column, respectively, and may be denoted R21 and R32;
the 4 green filter units 203 may be located in the 1st row 1st column, the 2nd row 3rd column, the 3rd row 4th column and the 4th row 2nd column, respectively, and may be denoted G11, G23, G34 and G42;
the 2 blue filter units 204 may be located in the 1st row 3rd column and the 4th row 4th column, respectively, and may be denoted B13 and B44.
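To make these positions easier to follow, the short sketch below (illustrative only; the array encoding is an assumption) builds the fig. 5 filter unit group from the coordinates listed above and confirms the 4:2:1:1 ratio of white, green, red and blue filter units mentioned earlier.

```python
import numpy as np

# Build the 4x4 filter unit group of fig. 5 from the positions given in the text
# (1-based row/column indices in the text, converted to 0-based here).
group = np.full((4, 4), "W", dtype="<U1")            # start with white everywhere
for r, c in [(2, 1), (3, 2)]:                        # red filter units: R21, R32
    group[r - 1, c - 1] = "R"
for r, c in [(1, 1), (2, 3), (3, 4), (4, 2)]:        # green filter units: G11, G23, G34, G42
    group[r - 1, c - 1] = "G"
for r, c in [(1, 3), (4, 4)]:                        # blue filter units: B13, B44
    group[r - 1, c - 1] = "B"

print(group)
counts = {color: int((group == color).sum()) for color in "WGRB"}
print(counts)  # {'W': 8, 'G': 4, 'R': 2, 'B': 2}, i.e. a 4:2:1:1 ratio
```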
Alternatively, based on the unchanged positions of the white filter units 201 in fig. 5, the positions of the color filter units of different colors in fig. 5 may be changed according to the above-mentioned requirement that the colors of the 2 color filter units in each row and the 2 color filter units in each column are different.
For example, based on the filter unit group in fig. 5, the positions of the white filter unit and the green filter unit are kept unchanged, and the positions of the red filter unit and the blue filter unit therein are changed.
Optionally, in some embodiments, the filter unit group includes at least two target filter unit sets, and each of the target filter unit sets includes 2 color filter units of the same color disposed at a common vertex angle.
In the case where the above condition is satisfied, fig. 6 shows a schematic arrangement diagram of filter units in several transformed filter unit groups.
In the various filter unit groups shown in fig. 5 and 6, the two target filter unit sets are located in the middle two rows, where one target filter unit set consists of 2 green filter units sharing a common vertex angle, and the other consists of 2 blue filter units or 2 red filter units sharing a common vertex angle.
For another example, based on the filter unit group in fig. 5, only the white filter unit positions are kept unchanged, and the positions of the green filter unit, the red filter unit, and the blue filter unit therein are changed.
Alternatively, in other embodiments, color filter units of the same color in the filter unit group are arranged in a central symmetry manner.
In the case where the above condition is satisfied, fig. 7 shows a schematic arrangement diagram of filter units in several transformed filter unit groups.
In the above-mentioned multiple filter unit groups shown in fig. 5 to 7, the filter unit group includes at least two target filter unit sets, or color filter units with the same color are arranged in a central symmetry manner, so that the subsequent image pixel interpolation processing is facilitated, more original image color information can be retained, the accuracy of color recovery is increased, and a specific pixel interpolation process will be described in detail below.
It is understood that for each of the filter element groups in the embodiments of the application described above, the filter element groups after geometric transformation, such as rotation, are also within the scope of the present application.
For example, as for the filter cell group shown in fig. 5 to 7, after rotating it clockwise by 90 °, various filter cell groups shown in fig. 8 can be formed.
It is understood that the filter unit groups in fig. 5 to 7 and the filter unit groups in fig. 8 may be equivalent to the same filter unit groups, and the difference is that the arrangement rule of the filter units in the row direction in fig. 5 to 7 is the same as the arrangement rule of the filter units in the column direction in fig. 8, and correspondingly, the arrangement rule of the filter units in the column direction in fig. 5 to 7 is the same as the arrangement rule of the filter units in the row direction in fig. 8.
As shown in fig. 8, 2 white filter units are arranged at intervals from 2 color filter units in each column of the filter unit group, and 2 white filter units are arranged continuously and 2 color filter units are arranged continuously in each row of the filter unit group.
It should be further understood that, in addition to the filter unit set shown in fig. 8, any one of the filter unit sets shown in fig. 5 to 7 is rotated by other angles, or the filter unit sets formed by other geometric transformations are also included in the protection scope of the present application, which is not shown here.
The filter unit groups of the above embodiments are all obtained by conversion based on fig. 5, and the filter unit groups of the present application may be the filter unit groups 211 shown by several dashed boxes in fig. 9 to 11, in addition to the above-listed structures.
It is understood that the filter unit group 211 in fig. 9 to 11 may form the same filter unit array in the central area as the filter unit group 211 in fig. 2, and the difference is only that the arrangement form of the filter units at the outermost periphery of the filter unit array 210 is different, and thus, the filter unit arrays formed by the filter unit group 211 in fig. 2, 9 to 11 may be equivalent to the same filter unit array.
Therefore, in the present application, any 4×4 filter unit in the filter unit array 210 in fig. 2 may be divided into one filter unit group, and the filter unit group in any case of division is within the scope of the present application.
Similarly, the filter unit array formed by any one of the filter unit groups is also within the scope of the present application.
Referring to the filter cell group shown in fig. 9, it can be seen from a comparison of fig. 9 with fig. 5 that the positions of the white filter cells and the color filter cells in fig. 5 are interchanged, i.e., the form of the filter cell group shown in fig. 9 is formed.
Similar to the above embodiment, the positions of the color filter units of different colors in fig. 9 may be changed based on the unchanged positions of the white filter units in fig. 9, and a plurality of filter unit groups as shown in fig. 12 may be obtained.
For example, as shown in fig. 12 (a) to (c), two target filter cell sets are included in the several filter cell groups. As shown in fig. 12 (d) to (g), among the several filter cell groups, color filter cells of the same color are arranged in a central symmetry.
Further, after the filter cell groups shown in (a) to (g) of fig. 9 and 12 are rotated clockwise by 90 °, the filter cell groups shown in (a ') to (h') of fig. 12 are formed.
Similarly, it is within the scope of the present application to perform other rotational transformations on the various filter element sets of fig. 9 and 12, or to geometrically transform the resulting filter element set structure.
Referring to the filter unit groups shown in fig. 10 and 11, in these two embodiments, in each row of the filter unit group, 2 white filter units are arranged at intervals from 2 color filter units; in two spaced-apart columns of the filter unit group, 2 white filter units are arranged consecutively and 2 color filter units are arranged separately; and in the other two spaced-apart columns, 2 white filter units are arranged separately and 2 color filter units are arranged consecutively.
Comparing fig. 10 and 11, the filter unit group of fig. 11 can be obtained by exchanging the white filter unit and the color filter unit of fig. 10.
Based on the unchanged positions of the white filter units in fig. 10, the positions of the color filter units of different colors in fig. 10 can be changed, and a plurality of filter unit groups as shown in fig. 13 can be obtained.
Also, based on the unchanged positions of the white filter units in fig. 11, the positions of the color filter units of different colors in fig. 11 may be changed, and a plurality of filter unit groups as shown in fig. 14 may be obtained.
As shown in fig. 10, fig. 13 (a) to (c), and fig. 14 (d) to (g), among the several filter cell groups, color filter cells are arranged as four target filter cell sets.
As shown in fig. 11, fig. 13 (d) to (g), and fig. 14 (a) to (c), the several filter cell groups include two target filter cell sets, and one green filter cell set is included in the two target cell sets.
Further, after the filter cell groups shown in (a) to (g) of fig. 10 and 13 are rotated clockwise by 90 °, the filter cell groups shown in (a ') to (h') of fig. 13 are formed. After the filter cell groups shown in fig. 11 and (a) to (g) of fig. 14 are rotated clockwise by 90 °, the filter cell groups shown in (a ') to (h') of fig. 14 are formed.
Similarly, it is within the scope of the present application to provide a filter element array structure that is obtained by performing other rotational or geometric transformations on the various filter element arrays of fig. 10, 11, 13 and 14.
Specifically, as shown in graphs (a') to (h') in fig. 13 and graphs (a') to (h') in fig. 14, in each column of the filter unit group, 2 white filter units are arranged at intervals from 2 color filter units; correspondingly, in two spaced-apart rows of the filter unit group, 2 white filter units are arranged consecutively and 2 color filter units are arranged separately, and in the other two spaced-apart rows, 2 white filter units are arranged separately and 2 color filter units are arranged consecutively.
The basic structure of the image sensor 200 of the present application and the arrangement of the plurality of filter unit groups 211 therein are described above with reference to fig. 2 to 14, and the image processing method for the image sensor 200 of the present application is described below with reference to fig. 15 and 16.
Fig. 15 shows a schematic flow chart of an image processing method. Fig. 16 shows an image schematic diagram through the image processing method in fig. 15.
As shown in fig. 15, the image processing method 10 includes:
S110: sub-sample the image formed by the pixel unit array to obtain a first sample map including the color pixel values and a second sample map including the white pixel values.
As an example, the 1# map in fig. 16 is an image generated by one 4×4 block of pixel units in the pixel unit array, these pixel units corresponding to the filter unit group shown in fig. 10 above.
In the 1# map of fig. 16, the white pixel value 101 is the pixel value generated after a white pixel unit receives the white light signal, a white filter unit 201 being disposed above it; the red pixel value 102 is the pixel value generated after a red pixel unit receives the red light signal, a red filter unit 202 being disposed above it; the green pixel value 103 is the pixel value generated after a green pixel unit receives the green light signal, a green filter unit 203 being disposed above it; and the blue pixel value 104 is the pixel value generated after a blue pixel unit receives the blue light signal, a blue filter unit 204 being disposed above it.
The 2# plot in fig. 16 is a first sample plot after sub-sampling the red pixel value, the green pixel value, and the blue pixel value in the 1# plot, and the 3# plot in fig. 16 is a second sample plot after sub-sampling the white pixel value in the 1# plot. In the first sample map and the second sample map, the relative positional relationship of the pixel values coincides with the relative positional relationship of the pixel values in the original image 1# map.
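A minimal sketch of step S110, under the assumption that the raw mosaic image and the CFA layout are available as NumPy arrays (the function and variable names are illustrative, not from the patent): color pixel values go to the first sample map and white pixel values to the second, with the remaining positions left empty (NaN) so that the relative positional relationships are preserved.

```python
import numpy as np

def subsample(raw, cfa):
    """Split the raw mosaic image into a first sample map (color pixel values)
    and a second sample map (white pixel values).

    raw: HxW array of pixel values read out from the pixel unit array.
    cfa: HxW array of 'W', 'R', 'G' or 'B' giving the filter unit above each pixel unit.
    """
    raw = raw.astype(np.float64)
    color_map = np.where(cfa != "W", raw, np.nan)   # first sample map (2# map)
    white_map = np.where(cfa == "W", raw, np.nan)   # second sample map (3# map)
    return color_map, white_map
```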
S120: interpolate the first sample map and the second sample map to obtain a first image and a second image.
As an example, the 4# map in fig. 16 is the first image obtained by interpolating the first sample map (2# map). Specifically, the pixel value in the i-th row and j-th column of the 4# map may be denoted Xij, where X represents the color of the pixel value; for example, the green pixel value in the 1st row and 1st column at the upper left corner may be denoted G11. Then G12 and G21 may be interpolated from G11 and G22, R14 and R23 may be interpolated from R13 and R24, B31 and B42 may be interpolated from B32 and B41, and G33 and G44 may be interpolated from G34 and G43, so the first image is obtained by interpolation from the pixel values in the first sample map.
It can be seen that in the first image, half of the pixel values (shown by the dashed frame in the figure) retain the color information of the original image, which is more conducive to color restoration in subsequent image processing and improves image quality. The interpolation process is also simple, which simplifies the algorithms in the image-processing pipeline and improves image-processing efficiency.
The 5# map in fig. 16 is the second image obtained by interpolating the second sample map (3# map); any existing interpolation algorithm may be used, which is not limited in the embodiments of the present application. It should be understood that the second sample map and the second image are not used to characterize the color information of the image but to enhance its brightness. In the embodiments of the application, the white pixel values are evenly distributed in the second sample map, being retained in the horizontal, vertical and 45° oblique directions, so a full-resolution second image can be obtained more accurately, improving the resolution and quality of the final image.
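The following sketch illustrates the interpolation of the first sample map described above. It assumes the layout of fig. 16, where in every 2×2 block the two original same-color pixel values lie on one diagonal (for example G11 and G22) and the two missing values (G12 and G21) are filled from them; simple averaging is used here purely as an assumption, since the patent only states that the missing values are interpolated from the original ones.

```python
import numpy as np

def interpolate_first_image(color_map):
    """Fill the empty positions of the first sample map block by block (2x2).

    color_map: HxW array with NaN at positions that held white pixel values.
    In each 2x2 block the two original values share one color; the two NaN
    positions are filled with their average (an assumed interpolation rule).
    """
    first_image = color_map.copy()
    h, w = first_image.shape
    for r in range(0, h, 2):
        for c in range(0, w, 2):
            block = first_image[r:r + 2, c:c + 2]   # view into the image
            fill = np.nanmean(block)                # average of the two original values
            block[np.isnan(block)] = fill
    return first_image
```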
S130: perform mosaic rearrangement (remosaic) processing on the first image to obtain a third image.
As an example, after the 4# map in fig. 16 is subjected to the re-mosaic (remosaic) process, the third image shown in the 6# map is obtained; it can be seen that the third image is a Bayer-format data image. In the embodiments of the present application, any existing rearrangement method may be used for the mosaic rearrangement, which is not particularly limited.
The third image after the mosaic rearrangement is in the Bayer format that is most commonly used in the current image-processing field, so it is compatible with more image signal processors (ISPs); the image sensor can therefore be adapted to more ISPs and is suitable for more application scenarios.
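As one possible, deliberately simple realisation of the re-mosaic step, the sketch below rearranges the first image into Bayer (RGGB) order by taking, at each output position, the nearest pixel value of the required color. This nearest-neighbor rule is an assumption for illustration only; the patent allows any existing rearrangement method.

```python
import numpy as np

def remosaic_to_bayer(first_image, first_cfa):
    """Rearrange the interpolated color image (first image) into a Bayer-format third image.

    first_image: HxW array, each pixel holding one color value after interpolation.
    first_cfa:   HxW array of 'R', 'G' or 'B' giving the color of each value in first_image.
    """
    h, w = first_image.shape
    bayer = np.array([["R", "G"], ["G", "B"]])
    third_image = np.empty_like(first_image)
    coords = {color: np.argwhere(first_cfa == color) for color in "RGB"}
    for r in range(h):
        for c in range(w):
            target = bayer[r % 2, c % 2]
            if first_cfa[r, c] == target:           # value already has the required color
                third_image[r, c] = first_image[r, c]
            else:                                   # take the nearest value of that color
                pts = coords[target]
                dist = np.abs(pts[:, 0] - r) + np.abs(pts[:, 1] - c)
                rr, cc = pts[np.argmin(dist)]
                third_image[r, c] = first_image[rr, cc]
    return third_image
```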
S140: fuse the third image and the second image to obtain an optimized color image.
Optionally, the resolution of the third image is the same as the resolution of the second image.
As an example, the 5# image and the 6# image in fig. 16 are fused to obtain the optimized color image 7#. The fused color image better preserves the brightness information while guaranteeing the color information, so the image quality under low-illumination conditions can be effectively improved. Meanwhile, the luminance and chrominance information of the color image is unlikely to overlap in the frequency domain, so moiré fringes are unlikely to be generated.
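Finally, one common way to fuse a luminance image with a color image is to blend them in the luminance domain; the sketch below is given only as an assumption of how step S140 could be realised (the patent does not fix a particular fusion algorithm) and assumes the Bayer third image has already been demosaiced to an RGB image, for example by an ISP.

```python
import numpy as np

def fuse(rgb, white, alpha=0.5):
    """Fuse a demosaiced RGB image (from the third image) with the full-resolution
    white image (second image) by blending their luminance.

    alpha controls how much of the white (luminance) image is mixed in; the
    BT.601 luminance weights and the blending rule are assumptions for illustration.
    """
    rgb = rgb.astype(np.float64)
    white = white.astype(np.float64)
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    y_fused = (1.0 - alpha) * y + alpha * white
    gain = y_fused / np.maximum(y, 1e-6)            # keep chromaticity, adjust brightness
    fused = np.clip(rgb * gain[..., None], 0.0, 255.0)
    return fused.astype(np.uint8)
```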
It will be appreciated that the above-described image processing method 10 may be implemented by a processor or processing circuit, in other words, optionally, in the above-described image sensor, a processing unit may be further included, which is configured to implement the above-described image processing method 10.
In addition to the image sensor 200 provided in the embodiments of the application, the present application also provides an electronic device, which may include the image sensor 200 in any of the embodiments described above.
The electronic device may be any electronic device having an image-capturing function; for example, it may be a mobile terminal such as a mobile phone or a computer, a photographing device such as a camera or a video camera, or an automatic teller machine (ATM), etc. The application scenario of the electronic device is not limited in the present application.
Alternatively, the processing unit for executing the image processing method 10 may not be located in the image sensor 200, but may be located in a processing unit in an electronic device where the image sensor 200 is located, for example, if the electronic device is a mobile phone, the processing unit may be an image signal processing unit in a processor in the mobile phone, or the processing unit may also be a separate image signal processing chip in the mobile phone, and the specific hardware form of the processing unit is not limited in the embodiments of the present application.
It should be appreciated that the processor or processing unit of embodiments of the present application may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be implemented by integrated logic circuits of hardware in a processor or by instructions in software form. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The methods, steps and logic blocks disclosed in the embodiments of the present application may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, or another storage medium well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method.
It will be appreciated that the image sensor of embodiments of the present application may also include memory, which may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The storage medium includes a U disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (22)

1. An image sensor, characterized by comprising:
a filter unit array, comprising a plurality of filter unit groups, wherein each filter unit group of the plurality of filter unit groups comprises 4×4 filter units, and each row, each column and each diagonal of the 4×4 filter units comprises 2 white filter units and 2 color filter units; and
a pixel unit array, comprising a plurality of pixel units, wherein the pixel unit array is located below the filter unit array, the pixel units in the pixel unit array are in one-to-one correspondence with the filter units in the filter unit array, and each filter unit in the filter unit array is disposed directly above its corresponding pixel unit in the pixel unit array.
2. The image sensor of claim 1, wherein the set of filter elements includes first, second, and third color filter elements of different colors, the number of first color filter elements being equal to the sum of the number of second and third color filter elements.
3. The image sensor of claim 2, wherein the number of second color filter units is equal to the number of third color filter units in the filter unit group.
4. The image sensor of claim 3, wherein in the filter cell group, 2 color filter cells in each row are different in color, and 2 color filter cells in each column are different in color.
5. The image sensor of claim 4, wherein, in each row of the filter unit group, the 2 white filter units and the 2 color filter units are arranged alternately;
in each column of the filter unit group, the 2 white filter units are arranged consecutively, and the 2 color filter units are arranged consecutively.
6. The image sensor of claim 5, wherein the color filter units of the same color in the filter unit group are arranged with central symmetry.
7. The image sensor of claim 5, wherein 2 target filter unit sets are arranged in the two middle rows of the filter unit group, each target filter unit set including 2 color filter units of the same color that share a common vertex.
8. The image sensor of claim 7, wherein, in the filter unit group, the white filter units are located in the first row second column, the first row fourth column, the second row second column, the second row fourth column, the third row first column, the third row third column, the fourth row first column and the fourth row third column, respectively;
the second color filter units are located in the second row first column and the third row second column, respectively;
the first color filter units are located in the first row first column, the second row third column, the third row fourth column and the fourth row second column, respectively;
the third color filter units are located in the first row third column and the fourth row fourth column, respectively.
9. The image sensor of claim 4, wherein, in each row of the filter unit group, the 2 white filter units and the 2 color filter units are arranged alternately;
in two spaced-apart columns of the filter unit group, the 2 white filter units are arranged apart from each other, and the 2 color filter units are arranged consecutively.
10. The image sensor of claim 9, wherein 2 target filter unit sets are arranged in the two middle columns of the filter unit group, each target filter unit set including 2 color filter units of the same color that share a common vertex.
11. The image sensor of claim 9, wherein 4 target filter unit sets are provided in the filter unit group, each target filter unit set including 2 color filter units of the same color that share a common vertex.
12. The image sensor of claim 11, wherein, in the filter unit group, the white filter units are located in the first row second column, the first row fourth column, the second row first column, the second row third column, the third row first column, the third row third column, the fourth row second column and the fourth row fourth column, respectively;
the second color filter units are located in the first row third column and the second row fourth column, respectively;
the first color filter units are located in the first row first column, the second row second column, the third row fourth column and the fourth row third column, respectively;
the third color filter units are located in the third row second column and the fourth row first column, respectively.
13. The image sensor of any one of claims 2 to 12, wherein the first color filter units, the second color filter units and the third color filter units are configured to pass optical signals of three colors, respectively, and the wavebands of the optical signals of the three colors together cover the visible light band.
14. The image sensor of claim 13, wherein the colors of the first color filter unit, the second color filter unit and the third color filter unit are respectively three of red, green, blue, cyan, magenta and yellow.
15. The image sensor of claim 14, wherein the first color filter unit is a green filter unit, the second color filter unit and the third color filter unit are a red filter unit and a blue filter unit, respectively.
16. The image sensor according to any one of claims 1 to 12, characterized in that the image sensor further comprises:
a micro lens array comprising a plurality of micro lenses, wherein the micro lens array is located above the filter unit array and is configured to converge optical signals returned by a photographic subject onto the filter unit array, and the plurality of micro lenses in the micro lens array are in one-to-one correspondence with the plurality of filter units in the filter unit array.
17. The image sensor of any one of claims 1 to 12, wherein pixel values of color pixel units in the pixel unit array are used to generate first image data of a photographic subject, pixel values of white pixel units in the pixel unit array are used to generate second image data of the photographic subject, and the first image data and the second image data are used to synthesize a target image of the photographic subject;
wherein the white pixel units are pixel units corresponding to the white filter units, and the color pixel units are pixel units corresponding to the color filter units.
18. The image sensor of claim 17, wherein the pixel values of the color pixel units in the pixel unit array are used to generate an intermediate image by interpolation processing, and the intermediate image is used to generate the first image data in Bayer format by demosaicing processing.
19. The image sensor of claim 18, wherein, in each 2×2 block of pixel values of the intermediate image, 2 pixel values are original pixel values of the color pixel units, and the other 2 pixel values are obtained by interpolation processing.
20. The image sensor of claim 17, wherein the first image data and the second image data have the same resolution.
21. The image sensor of any one of claims 1 to 12, wherein the image sensor is a complementary metal oxide semiconductor (CMOS) image sensor or a charge-coupled device (CCD) image sensor.
22. An electronic device, comprising:
the image sensor of any one of claims 1 to 21.
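
To make the recited arrangements concrete, the following is a minimal illustrative sketch (not part of the patent text) that lays out the 4×4 filter unit groups of claims 8 and 12 as arrays and checks the claim-1 constraint that every row, column and diagonal contains exactly 2 white and 2 color filter units. It assumes, per claim 15, that the first, second and third colors are green, red and blue; the array names and the checking function are illustrative only.

```python
# Illustrative sketch only; array names and the check function are not part of
# the claims. Assumes green/red/blue as the first/second/third colors (claim 15).
import numpy as np

W, G, R, B = "W", "G", "R", "B"  # white, first, second, third color filter units

# 4x4 filter unit group recited in claim 8 (claim rows/columns are 1-indexed).
GROUP_CLAIM_8 = np.array([
    [G, W, B, W],
    [R, W, G, W],
    [W, R, W, G],
    [W, G, W, B],
])

# 4x4 filter unit group recited in claim 12.
GROUP_CLAIM_12 = np.array([
    [G, W, R, W],
    [W, G, W, R],
    [W, B, W, G],
    [B, W, G, W],
])

def satisfies_claim_1(group: np.ndarray) -> bool:
    """Every row, column and diagonal must hold exactly 2 white and 2 color units."""
    lines = [group[i, :] for i in range(4)]                    # rows
    lines += [group[:, j] for j in range(4)]                   # columns
    lines += [group.diagonal(), np.fliplr(group).diagonal()]   # both diagonals
    return all(int(np.count_nonzero(line == W)) == 2 for line in lines)

assert satisfies_claim_1(GROUP_CLAIM_8)
assert satisfies_claim_1(GROUP_CLAIM_12)
```

In these arrays the per-row alternation of white and color units (claims 5 and 9) and the diagonally adjacent same-color pairs (claims 7 and 11) can be read off directly.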
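
Claims 17 to 19 describe splitting the readout into white-pixel data and color-pixel data, with the color pixels interpolated into an intermediate image before demosaicing. The sketch below illustrates that split under stated assumptions: the nearest-neighbour averaging used to fill white positions and the function name split_color_and_white are placeholders for illustration, since the claims do not fix a particular interpolation kernel or demosaicing algorithm.

```python
# Illustrative sketch of the processing split in claims 17-19. The averaging
# kernel below is an assumption; the claims only require that, in the
# intermediate image, half of each 2x2 block is original and half interpolated.
import numpy as np

def split_color_and_white(raw: np.ndarray, pattern: np.ndarray):
    """raw: H x W sensor readout; pattern: H x W labels 'W'/'R'/'G'/'B'.

    Returns (intermediate, white_plane): the color intermediate image with
    white positions filled by interpolation, and the white-pixel plane
    (the second image data of claim 17).
    """
    is_white = (pattern == "W")
    white_plane = np.where(is_white, raw.astype(float), 0.0)  # white pixel values

    # Keep original color pixel values, interpolate the white positions
    # from the available horizontal/vertical color neighbours (claims 18/19).
    color_plane = np.where(is_white, np.nan, raw.astype(float))
    intermediate = color_plane.copy()
    h, w = raw.shape
    for y, x in zip(*np.nonzero(is_white)):
        neighbours = [color_plane[ny, nx]
                      for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                      if 0 <= ny < h and 0 <= nx < w and not np.isnan(color_plane[ny, nx])]
        intermediate[y, x] = float(np.mean(neighbours)) if neighbours else 0.0
    return intermediate, white_plane
```

A subsequent demosaicing step (claim 18) would then convert the intermediate image into Bayer-format first image data, and the first and second image data would be fused into the target image at the same resolution (claims 17 and 20).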
CN202010637147.7A 2020-05-15 2020-07-03 Image sensors and electronics Active CN111756974B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010410639 2020-05-15
CN2020104106392 2020-05-15

Publications (2)

Publication Number Publication Date
CN111756974A CN111756974A (en) 2020-10-09
CN111756974B true CN111756974B (en) 2025-03-18

Family

ID=72202804

Family Applications (11)

Application Number Title Priority Date Filing Date
CN202021303789.5U Active CN212752379U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297709.XU Active CN212435794U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297708.5U Active CN212435793U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010635332.2A Active CN111756972B (en) 2020-05-15 2020-07-03 Image sensors and electronics
CN202010636571.XA Pending CN111756973A (en) 2020-05-15 2020-07-03 Image Sensors and Electronics
CN202010637147.7A Active CN111756974B (en) 2020-05-15 2020-07-03 Image sensors and electronics
CN202010708333.5A Active CN111614886B (en) 2020-05-15 2020-07-22 Image sensor and electronic device
CN202021508422.7U Active CN212785522U (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202010724146.6A Pending CN111654615A (en) 2020-05-15 2020-07-24 Image Sensors and Electronics
CN202010724148.5A Active CN111629140B (en) 2020-05-15 2020-07-24 Image sensors and electronics
CN202021510460.6U Active CN212752389U (en) 2020-05-15 2020-07-24 Image sensor and electronic device

Family Applications Before (5)

Application Number Title Priority Date Filing Date
CN202021303789.5U Active CN212752379U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297709.XU Active CN212435794U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202021297708.5U Active CN212435793U (en) 2020-05-15 2020-07-03 Image sensor and electronic device
CN202010635332.2A Active CN111756972B (en) 2020-05-15 2020-07-03 Image sensors and electronics
CN202010636571.XA Pending CN111756973A (en) 2020-05-15 2020-07-03 Image Sensors and Electronics

Family Applications After (5)

Application Number Title Priority Date Filing Date
CN202010708333.5A Active CN111614886B (en) 2020-05-15 2020-07-22 Image sensor and electronic device
CN202021508422.7U Active CN212785522U (en) 2020-05-15 2020-07-24 Image sensor and electronic device
CN202010724146.6A Pending CN111654615A (en) 2020-05-15 2020-07-24 Image Sensors and Electronics
CN202010724148.5A Active CN111629140B (en) 2020-05-15 2020-07-24 Image sensors and electronics
CN202021510460.6U Active CN212752389U (en) 2020-05-15 2020-07-24 Image sensor and electronic device

Country Status (2)

Country Link
CN (11) CN212752379U (en)
WO (1) WO2021227250A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN212752379U (en) * 2020-05-15 2021-03-19 深圳市汇顶科技股份有限公司 Image sensor and electronic device
CN112235494B (en) * 2020-10-15 2022-05-20 Oppo广东移动通信有限公司 Image sensor, control method, imaging device, terminal, and readable storage medium
CN112312097B (en) * 2020-10-29 2023-01-24 维沃移动通信有限公司 Sensor with a sensor element
CN114584725A (en) * 2020-11-30 2022-06-03 华为技术有限公司 Image sensor and imaging device
CN114650343A (en) * 2020-12-15 2022-06-21 超聚变数字技术有限公司 Image sensor and imaging device
CN112822466A (en) * 2020-12-28 2021-05-18 维沃移动通信有限公司 Image sensor, camera module and electronic equipment
CN113037980A (en) * 2021-03-23 2021-06-25 北京灵汐科技有限公司 Pixel sensing array and vision sensor
CN115225832A (en) * 2021-04-21 2022-10-21 海信集团控股股份有限公司 An image acquisition device and image encryption processing method, device and medium
CN113540138B (en) * 2021-06-03 2024-03-12 奥比中光科技集团股份有限公司 Multispectral image sensor and imaging module thereof
CN113676652B (en) * 2021-08-25 2023-05-26 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113676651B (en) * 2021-08-25 2023-05-26 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113852797A (en) * 2021-09-24 2021-12-28 昆山丘钛微电子科技股份有限公司 Color filter arrays, image sensors, and camera modules
KR20230046816A (en) * 2021-09-30 2023-04-06 에스케이하이닉스 주식회사 Image sensing device
CN114125318B (en) * 2021-11-12 2024-08-02 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, image generation method and device
CN114125240A (en) * 2021-11-30 2022-03-01 维沃移动通信有限公司 Image sensor, camera module, electronic device and shooting method
CN114363486B (en) * 2021-12-14 2024-08-02 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, image generation method and device
CN114157795B (en) * 2021-12-14 2024-08-16 Oppo广东移动通信有限公司 Image sensor, camera module, electronic device, image generation method and device
CN114494706B (en) * 2022-01-24 2025-10-03 惠州同为数码科技有限公司 Image sensor signal-to-noise ratio evaluation method based on specific flat area
CN115022562A (en) * 2022-05-25 2022-09-06 Oppo广东移动通信有限公司 Image Sensors, Cameras and Electronics
CN118412399A (en) * 2022-05-31 2024-07-30 深圳市聚飞光电股份有限公司 A photoelectric sensor and packaging method thereof
CN115696078B (en) * 2022-08-01 2023-09-01 荣耀终端有限公司 Color filter arrays, image sensors, camera modules and electronics

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN212752379U (en) * 2020-05-15 2021-03-19 深圳市汇顶科技股份有限公司 Image sensor and electronic device

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7978240B2 (en) * 2005-10-03 2011-07-12 Konica Minolta Photo Imaging, Inc. Enhancing image quality imaging unit and image sensor
KR100772910B1 (en) * 2006-06-26 2007-11-05 삼성전기주식회사 Digital camera module
KR100967651B1 (en) * 2008-03-12 2010-07-07 주식회사 동부하이텍 CMOS image sensor device and its formation method
JP4626706B2 (en) * 2008-12-08 2011-02-09 ソニー株式会社 Solid-state imaging device, signal processing method for solid-state imaging device, and imaging device
TWI422020B (en) * 2008-12-08 2014-01-01 Sony Corp Solid-state imaging device
JP4683121B2 (en) * 2008-12-08 2011-05-11 ソニー株式会社 Solid-state imaging device, signal processing method for solid-state imaging device, and imaging device
KR20110075397A (en) * 2009-12-28 2011-07-06 주식회사 동부하이텍 How to improve the sensitivity of the image sensor
CN104412581B (en) * 2012-07-06 2016-04-13 富士胶片株式会社 Color image sensor and camera head
CN104412580B (en) * 2012-07-06 2016-04-06 富士胶片株式会社 Color image sensor and camera head
JP6012375B2 (en) * 2012-09-28 2016-10-25 株式会社メガチップス Pixel interpolation processing device, imaging device, program, and integrated circuit
JP2015008343A (en) * 2013-06-24 2015-01-15 コニカミノルタ株式会社 Imaging device, and method for forming imaging image
US9692992B2 (en) * 2013-07-01 2017-06-27 Omnivision Technologies, Inc. Color and infrared filter array patterns to reduce color aliasing
JP5884847B2 (en) * 2014-03-12 2016-03-15 ソニー株式会社 Solid-state imaging device, signal processing method for solid-state imaging device, and imaging device
CN104241309B (en) * 2014-09-19 2018-01-02 上海集成电路研发中心有限公司 A kind of CMOS image pixel array for simulating random pixel effect
US9479745B2 (en) * 2014-09-19 2016-10-25 Omnivision Technologies, Inc. Color filter array with reference pixel to reduce spectral crosstalk
TWI552594B (en) * 2014-10-27 2016-10-01 聯詠科技股份有限公司 Color filter array for image sensing device and manufacturing method thereof
CN104581100A (en) * 2015-02-12 2015-04-29 张李静 Color filter array and image processing method
CN104735327B (en) * 2015-04-08 2019-07-26 联想(北京)有限公司 Imaging device and imaging method
CN105282529B (en) * 2015-10-22 2018-01-16 浙江宇视科技有限公司 A kind of digital wide dynamic approach and device based on RAW spaces
CN105578071B (en) * 2015-12-18 2018-03-20 广东欧珀移动通信有限公司 imaging method of image sensor, imaging device and electronic device
CN105578078B (en) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 Imaging sensor, imaging device, mobile terminal and imaging method
CN105516697B (en) * 2015-12-18 2018-04-17 广东欧珀移动通信有限公司 Image sensor, imaging device, mobile terminal and imaging method
JP6461429B2 (en) * 2015-12-18 2019-01-30 広東欧珀移動通信有限公司 Image sensor, control method, and electronic apparatus
CN105516700B (en) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 Imaging method, imaging device and the electronic installation of imaging sensor
CN105430359B (en) * 2015-12-18 2018-07-10 广东欧珀移动通信有限公司 Imaging method, imaging sensor, imaging device and electronic device
JP2017175500A (en) * 2016-03-25 2017-09-28 学校法人成蹊学園 Color image pickup method, color image interpolation processing method, and imaging apparatus
CN107105140B (en) * 2017-04-28 2020-01-24 Oppo广东移动通信有限公司 Dual-core focus image sensor, focus control method and imaging device thereof
CN108305883A (en) * 2018-01-30 2018-07-20 德淮半导体有限公司 Imaging sensor
JP7349806B2 (en) * 2018-03-28 2023-09-25 ブラックマジック デザイン ピーティーワイ リミテッド Image processing method and filter array
CN109003995A (en) * 2018-08-10 2018-12-14 德淮半导体有限公司 Imaging sensor, electronic device and its manufacturing method
CN109905681B (en) * 2019-02-01 2021-07-16 华为技术有限公司 Image sensor, method for acquiring image data therefrom, and imaging device
CN110649057B (en) * 2019-09-30 2021-03-05 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
CN110649056B (en) * 2019-09-30 2022-02-18 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN212752379U (en) * 2020-05-15 2021-03-19 深圳市汇顶科技股份有限公司 Image sensor and electronic device

Also Published As

Publication number Publication date
CN111629140A (en) 2020-09-04
CN111756972A (en) 2020-10-09
CN212435793U (en) 2021-01-29
CN111654615A (en) 2020-09-11
CN111756974A (en) 2020-10-09
CN111629140B (en) 2025-03-18
CN111756973A (en) 2020-10-09
CN212435794U (en) 2021-01-29
CN212785522U (en) 2021-03-23
WO2021227250A1 (en) 2021-11-18
CN212752379U (en) 2021-03-19
CN212752389U (en) 2021-03-19
CN111756972B (en) 2025-03-18
CN111614886B (en) 2021-10-19
CN111614886A (en) 2020-09-01

Similar Documents

Publication Publication Date Title
CN111756974B (en) Image sensors and electronics
CN111757006B (en) Image acquisition method, camera assembly and mobile terminal
EP1908302B1 (en) Image sensor with improved lightsensitivity
JP5149279B2 (en) Image sensor with improved light sensitivity
EP1977614B1 (en) Image sensor with improved light sensitivity
CN101006731B (en) Image processing device, image processing method, and imaging device
CN111314592B (en) Image processing method, camera assembly and mobile terminal
JP2010512049A (en) Processing images with color and panchromatic pixels
US9332199B2 (en) Imaging device, image processing device, and image processing method
US9185375B2 (en) Color imaging element and imaging device
JP2009027488A (en) Imaging circuit and imaging device
US20140022446A1 (en) Color imaging element, imaging device, and storage medium storing an imaging program
JP7612986B2 (en) Image sensor and imaging device
CN210143059U (en) Image sensor integrated circuit, image sensor, and imaging system
WO2022007215A1 (en) Image acquisition method, camera assembly, and mobile terminal
WO2013100094A1 (en) Imaging device, method for controlling imaging device, and control program
JP5624228B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND CONTROL PROGRAM
CN114080795A (en) Image sensor and electronic device
US20220279108A1 (en) Image sensor and mobile terminal
WO2023015425A1 (en) Pixel array, image sensor, and electronic device without demosaicing and methods of operation thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant