CN100492395C - Fingerprint feature fast matching method, device and application thereof
- Publication number: CN100492395C
- Authority: CN (China)
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Abstract
Description
Technical Field
The present invention relates to a method for fast matching of fingerprint features that uses the global feature information of a fingerprint (namely the center point of the fingerprint image) together with its local minutiae information (namely the bifurcation points and endpoints of the fingerprint ridges). It also relates to a device for implementing this fast fingerprint feature matching method, and to a smart card that uses this fast fingerprint feature matching device to perform fingerprint feature matching. The invention belongs to the fields of pattern recognition and smart card technology.
Background Art
Smart card technology is widely recognized as an effective technical means for enterprises and governments to solve identification and security problems.
At present, smart card technology has advanced to the stage of using biometric features for identification. Beyond its combination with public key infrastructure and Java(TM) technology, combining smart card technology with biometric identification yields one of the strongest forms of identity verification. As a portable, multifunctional identification and communication medium with the highest level of security, a smart card employing biometric identification is therefore regarded as the best solution for reliable personal authentication, both online and offline. Biometric identification here refers to technology that collects and analyzes unique characteristics of the human body, such as fingerprint, retinal and voice patterns, for identity verification.
Among biometric identification technologies, fingerprint identification was developed earliest; computers have been used to process fingerprints since the 1960s. After continuous development, fingerprint recognition is now a relatively mature technology, and many related technical achievements have been published. For example, the invention patent application with publication number CN1735908 discloses a fingerprint matching device and method, a recording medium, and a program capable of quickly performing fingerprint matching with a small amount of data. A CPU detects the center point C, bifurcation points P1 to P8 and endpoints Q1 to Q10 of the fingerprint image, and generates a triangle W1 by connecting the bifurcation point P1 closest to the center point C with two bifurcation points P2 and P3 near it. The area S1 of the triangle and the lengths of its three sides are stored in flash memory. In addition, the CPU computes the positions of bifurcation point P1 and the endpoint Q1 near it, of bifurcation point P2 and the endpoint Q3 near it, and of bifurcation point P3 and the endpoint Q3 near it, and stores these positions in the flash memory as the registered template. When matching is performed, the CPU judges whether the area and side lengths of the triangle and the positions of the endpoints obtained from the matching image agree with the registered template.
The invention patent application with publication number CN1716276 also discloses a fingerprint identification algorithm and system, characterized by: an input method for fingerprint feature information, and a storage method for the registered fingerprint feature information entered by that input method and for the feature information of fingerprints entered later; a similarity calculation method that computes the degree of similarity of two fingerprints from the feature information of the two input fingerprints; a first position calculation method, which uses the similarity calculation to obtain a rough estimate of the difference in position between the registered fingerprint stored by the fingerprint storage method and the input fingerprint; a first rotation calculation method, which, after correcting the positional difference computed by the first position calculation method, uses the similarity calculation to obtain a rough estimate of the rotational difference between the registered fingerprint and the input fingerprint; a second position calculation method, which, starting from the rough estimates of the positional and rotational differences obtained by the first position and rotation calculation methods and using the similarity calculation, compares the stored position of the registered fingerprint with the position of the input fingerprint, narrows the estimate down to a smaller range and carries out a detailed comparison; a second rotation calculation method, which likewise narrows the rough estimate of the rotational difference between the registered and input fingerprints, obtained by the first rotation calculation method, down to a smaller range and carries out a detailed comparison; the positional and rotational differences computed by the second position and second rotation calculation methods are used to correct the position and rotation of the registered fingerprint relative to the input fingerprint, the degree of similarity is then computed again with the similarity calculation method, and from the result it is judged whether the registered fingerprint and the input fingerprint belong to the same person; finally, the judgment result is output.
From a technical point of view, fingerprint recognition is a typical pattern recognition problem consisting mainly of two parts: fingerprint feature extraction and feature matching. At present, most fingerprint matching algorithms compare minutiae and fall into two broad categories: algorithms that use a center point and algorithms that do not. Algorithms without a center point rely on certain redundant attributes of the minutiae to assist matching and need no center point during matching. Their advantage is that they place low demands on the positioning of the captured fingerprint image, but their comparison efficiency is low, which makes them unsuitable for time-critical applications and even less suitable for running inside a smart card, where both computing resources and computing power are limited. Algorithms with a center point use the center point to help align and match the fingerprint images; their advantage is very fast comparison, but their drawback is that they require fairly accurate center point localization, and if the center point is not located accurately enough, the comparison often fails.
Using fingerprint recognition in a smart card requires the characteristics of the smart card itself to be taken fully into account. Compared with a computer, the storage space inside a smart card is limited, so the fingerprint feature data cannot be too large; in addition, the computing power of the processor inside a smart card is often limited, so the fingerprint matching algorithm must be simple and efficient and able to decide quickly whether two fingerprints match. However, existing fingerprint matching algorithms often ignore the constraints imposed by these characteristics, so an algorithm that works well on a computer will not necessarily give satisfactory results on a smart card. As far as the applicant is aware, fingerprint matching algorithms specifically optimized for the characteristics of smart cards are still rare.
Summary of the Invention
The first object of the present invention is to provide a fast fingerprint feature matching method optimized specifically for the limited computing and storage capabilities of smart cards. The method uses both the global features and the minutiae of the fingerprint and can match fingerprint features quickly and accurately.
The second object of the present invention is to provide a device for implementing this fast fingerprint feature matching method.
The third object of the present invention is to provide a smart card that uses the above fast fingerprint feature matching device to perform fingerprint feature matching.
To achieve the above objects, the present invention adopts the following technical solution:
A fast fingerprint feature matching method, which performs the matching computation in a polar coordinate system, comprising the following steps:
(1) detecting the center point of the fingerprint image;
(2) if a center point is found, performing minutiae matching with the center point of the fingerprint image as the reference corresponding point; if no center point can be found, going to step (9);
(3) is the matching successful?
(4) if the matching is successful, judging the two fingerprints to be the same; if not, taking the center point as the reference center, searching the template and input feature point sets for minutiae pairs that can serve as matching reference corresponding points, and recording them;
(5) matching the minutiae of the fingerprints with the recorded minutiae pairs as reference corresponding points;
(6) is the matching successful?
(7) if the matching is successful, judging the two fingerprints to be the same; if not, going to step (9);
(9) performing minutiae matching with template and input feature point pairs not used in the preceding steps as reference corresponding points;
(10) is the matching successful?
(11) if the matching is successful, judging the two fingerprints to be the same; if not, judging them to be different fingerprints.
In step (1), the center point is determined by the following step W:
W1) dividing the input fingerprint image into blocks of size W×W, where W is an integer;
W2) computing the gradients Gx and Gy of every pixel in each block;
W3) computing the local dominant orientation of each block;
W4) converting the orientation field into a continuous vector field and correcting the ridge orientations with a low-pass filter;
W5) computing the orientation field of the fingerprint image;
W6) computing the Poincare index of a point; if the Poincare index of the point is 1/2, the point is a center point.
When there are several candidate center points, the final center point position is obtained with the k-means clustering algorithm.
After the block containing the center point has been determined, that block is further divided into smaller blocks and the position of the center point is recomputed with the method described in step W.
The minutiae matching is carried out in two stages: first, the local neighborhood structure information of a pair of reference points within the feature point sets is used for a coarse match of the local structure information; after the coarse match passes, a global precise match is performed with that pair of reference points as corresponding points.
The minutiae matching method comprises the following steps:
1) taking a template minutia and an input minutia as reference minutiae, transforming the neighborhood minutiae or global minutiae of the input point set and the template point set into a polar coordinate system;
2) sorting the template minutiae and the input minutiae in polar coordinates in order of increasing polar angle and concatenating them into strings;
3) matching the strings with the adaptive bounding box method, finding and recording the matching score, and using the size of the matching score as the basis for deciding whether to carry out the global precise match or whether the matching succeeded;
4) finding the maximum of all the matching scores and taking it as the matching score between the input minutiae set and the template minutiae set; if the matching score is higher than a preset threshold, the input image and the template image are considered to come from the same fingerprint, otherwise they are considered to come from different fingerprints.
The size of the adaptive bounding box is expressed by radius_size and angle_size, and the radius_size and angle_size of a template minutia with polar radius r are computed as:
r_size = α1 + β1·r
a_size = α2 - β2·r
where α1, β1, α2 and β2 are preset empirical parameters, all greater than zero, r is the polar radius of the template minutia, and r_small, r_large, a_small and a_large are the lower and upper bounds of radius_size and angle_size respectively; their values are set in advance by experiment.
The matching score Ms is obtained by a formula (reproduced only as an image in the original) in which nb_pair is the number of matched minutiae pairs, match_error is the accumulated matching error of the matched pairs, Mc and Nc are the numbers of feature points of the template and the input fingerprint image in their common region, and α and β are predetermined weighting coefficients.
Here the value of α is 2 and the value of β is 3.
A fast fingerprint feature matching device, comprising:
a center point judging unit for finding and judging the center point in a fingerprint;
a matching corresponding point selection unit for finding, among the template and input feature points, template and input feature point pairs that may serve as matching corresponding points, and recording them;
a matching judging unit for performing the matching computation on the feature points; and
a polar coordinate conversion unit for converting the spatial positions of the feature points and the center point into polar coordinates.
First, the center point judging unit finds and judges the center point in the fingerprint. If a center point exists, the matching judging unit performs the matching computation with the center point as the corresponding point and obtains a matching result. If no center point can be found, the matching corresponding point selection unit and the matching judging unit take other, not yet considered, template and input feature point pairs as reference corresponding points and perform minutiae matching on the two fingerprints to obtain a matching result. If the matching computation based on the center points yields a mismatch, the matching corresponding point selection unit takes the two center points as a reference and searches the template and input feature points for feature point pairs that may serve as matching corresponding points; with these feature point pairs as the reference corresponding points of the matching algorithm, the matching judging unit matches the template and input feature points and obtains a matching result.
Before the matching judging unit performs the matching computation, the polar coordinate conversion unit converts the spatial positions of the center point and the feature points into polar coordinates.
A smart card having a microprocessor, a memory and a communication circuit, characterized in that:
the smart card further comprises the fast fingerprint feature matching device described above.
A method for identity authentication using the above smart card, in which the user inputs a fingerprint through the fingerprint acquisition device on a card reader; after the fingerprint sensor has captured the fingerprint data, the data are submitted to a feature extraction module and the fingerprint features are passed to the smart card; the smart card matches the input fingerprint features against the stored fingerprint features, and if the matching succeeds the identity authentication is completed; the method is characterized in that:
the smart card performs the fingerprint feature matching with the fast fingerprint feature matching method described above.
The fast fingerprint feature matching method provided by the present invention has the following advantages:
(1) because the feature comparison is guided by the center point information, the rotation and translation between the two fingerprint images can be determined quickly, which overcomes the low efficiency of matching algorithms without a center point;
(2) when the minutiae of the two fingerprints are compared with respect to a given pair of reference points, a coarse match of the local neighborhood is performed first and a global precise match afterwards, which greatly speeds up the fingerprint comparison;
(3) because the matching score takes into account the effect of the matching errors of the minutiae on the matching result, a more accurate matching result is obtained.
A smart card adopting the above fast fingerprint feature matching method can further improve its security, reliability and ease of use, thereby providing a feasible route for the integration of biometric identification technology and smart card application technology.
Brief Description of the Drawings
The present invention is further described below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a schematic diagram of the discrete representation of a closed curve.
Fig. 2 shows the table used to determine the direction of a singular point.
Fig. 3 is a schematic diagram of a local feature vector.
Fig. 4 shows an example of a bounding box of variable size.
Fig. 5 is the basic flow chart of the fast fingerprint feature matching method provided by the present invention.
Fig. 6 is a logical structure diagram of a typical smart card.
Fig. 7 shows the typical fingerprint feature matching procedure of a smart card with a built-in fast fingerprint feature matching device.
Detailed Description of the Embodiments
For ease of understanding, the specific implementation of the fast fingerprint feature matching method provided by the present invention and the device that implements it are described first, and the application of the method in a smart card is then described.
The fast fingerprint feature matching method mainly comprises the following technical elements: center point detection in the fingerprint image, fingerprint image registration, coarse minutiae matching based on local structure information, precise minutiae matching based on global structure information, feature matching based on singular point information, and computation of the matching score. They are introduced one by one below.
1. Center point detection in the fingerprint image
The computation of the center point is based on the orientation field. To obtain the orientation field of the fingerprint image accurately, the following steps are used:
1) Divide the input fingerprint image into blocks of size W×W, where W is an integer.
2) Compute the gradients Gx and Gy of every pixel in each block.
3) Compute the local dominant orientation of each block:
where Gx and Gy are the gradients in the x and y directions respectively, W is the width of the block used to estimate the orientation field, and θ(i,j) is the dominant orientation of the block containing point (i,j) (the defining equation (1) is given as an image in the original).
4) Because of noise and broken ridges and valleys, the estimated ridge orientation θ(i,j) is not always correct. In a neighborhood containing no singular point the local ridge orientation varies slowly, so a low-pass filter can be used to correct incorrect ridge orientations. To do this, the orientation field first has to be converted into a continuous vector field:
φx(i,j) = cos(2θ(i,j))    (2)
φy(i,j) = sin(2θ(i,j))    (3)
where φx and φy are the x and y components of the vector field. The low-pass filtering (equation (4), given as an image in the original) uses a two-dimensional low-pass filter h whose integral is 1 and whose size is wφ×wφ. This smoothing operation is performed block-wise; its default size is 5×5.
5) Compute the local orientation at (i,j) from the smoothed vector field (equation (5), given as an image in the original).
In this way the orientation field of the fingerprint image is obtained.
6) Computation of singular points
Let O be the orientation field of the fingerprint image. The Poincare index at point (i,j) is computed as the accumulated change of the orientation field along a closed curve of Nψ pixels centered on the given point, where ψx(i) and ψy(i) are the x and y coordinates of the i-th point on the closed curve (the exact formula is given as an image in the original). If the Poincare index is 1/2, the given point (i,j) is determined to be a center point.
In the present invention, the closed curve is taken on a 5×5 grid. As shown in Fig. 1, the Poincare index is evaluated along the curve D1D2...D12D1.
This yields several adjacent candidate center points; the final center point position is then obtained with the k-means clustering algorithm.
In some fingerprint images false center points may exist because of noise and other factors. To eliminate false center points, the Poincare value of the inner curve d1d2...d8d1 around the same point is also computed. A candidate is accepted as a true center point only when the Poincare values (1/2) on the 3×3 and the 5×5 curve agree.
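A minimal sketch of this center point detection pipeline is given below, assuming a grayscale fingerprint image stored as a NumPy array. Several equations of this section appear only as images in the original, so the block orientation is reconstructed here with the standard gradient-based least-squares estimate implied by the quantities Gx, Gy and W described above, and the closed curve is approximated by the full perimeter of the 5×5 (and 3×3) block neighborhood rather than the exact 12-point curve of Fig. 1; the block size, the filter size, the tolerance and the omitted k-means clustering step are likewise illustrative choices, not the patent's parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def orientation_field(img, W=16, filt=5):
    """Block-wise ridge orientation with low-pass smoothing (steps 1 to 5)."""
    gy, gx = np.gradient(img.astype(float))          # per-pixel gradients Gy, Gx
    bh, bw = img.shape[0] // W, img.shape[1] // W
    theta = np.zeros((bh, bw))
    for i in range(bh):
        for j in range(bw):
            Gx = gx[i*W:(i+1)*W, j*W:(j+1)*W]
            Gy = gy[i*W:(i+1)*W, j*W:(j+1)*W]
            # standard least-squares estimate of the dominant block orientation
            theta[i, j] = 0.5 * np.arctan2(2.0 * (Gx * Gy).sum(),
                                           (Gx**2 - Gy**2).sum())
    # continuous vector field plus low-pass smoothing (cf. equations (2)-(4))
    phi_x = uniform_filter(np.cos(2 * theta), size=filt)
    phi_y = uniform_filter(np.sin(2 * theta), size=filt)
    return 0.5 * np.arctan2(phi_y, phi_x)            # smoothed orientation field

def square_perimeter(radius):
    """Closed curve of block offsets around (0, 0) at the given Chebyshev radius."""
    pts = [(-radius, dj) for dj in range(-radius, radius + 1)]
    pts += [(di, radius) for di in range(-radius + 1, radius + 1)]
    pts += [(radius, dj) for dj in range(radius - 1, -radius - 1, -1)]
    pts += [(di, -radius) for di in range(radius - 1, -radius, -1)]
    return pts

def poincare_index(field, i, j, radius):
    """Accumulated orientation change around a closed curve of blocks (step 6)."""
    angles = [field[i + di, j + dj] for di, dj in square_perimeter(radius)]
    total = 0.0
    for a, b in zip(angles, angles[1:] + angles[:1]):
        d = b - a
        while d > np.pi / 2:                          # wrap into (-pi/2, pi/2]
            d -= np.pi
        while d <= -np.pi / 2:
            d += np.pi
        total += d
    return total / (2.0 * np.pi)                      # about 1/2 at a core point

def candidate_centers(field, tol=0.1):
    """Blocks whose 5x5 and 3x3 Poincare indices both indicate a core point."""
    cands = []
    for i in range(2, field.shape[0] - 2):
        for j in range(2, field.shape[1] - 2):
            if (abs(poincare_index(field, i, j, 2) - 0.5) < tol and
                    abs(poincare_index(field, i, j, 1) - 0.5) < tol):
                cands.append((i, j))
    return cands
```

The candidates returned by candidate_centers would then be merged (the patent uses k-means) and refined on a finer block grid as described below.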
For each center point, its direction is determined as follows: in the 5×5 orientation field around the point, the sum of the differences between the block orientations and each of the eight directions (i.e. 0, 1, ..., 7 in Fig. 2) is computed, and the direction with the smallest sum is taken as the direction of the center point, as shown in Fig. 2. During this computation two directions may give the same sum, in which case the average of the two directions is taken as the direction of the center point.
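A sketch of this direction estimate follows; the original only labels the eight candidate directions 0 to 7, so the assumption here is that they are multiples of π/8 covering the orientation range [0, π).

```python
import numpy as np

def center_direction(field, i, j):
    """Direction of a center point from the 5x5 orientation window around it."""
    window = field[i - 2:i + 3, j - 2:j + 3].ravel()
    candidates = np.arange(8) * np.pi / 8.0           # assumed quantization of directions 0..7
    sums = np.empty(8)
    for k, c in enumerate(candidates):
        d = np.abs(window - c)
        d = np.minimum(d, np.pi - d)                  # orientation differences wrap at pi
        sums[k] = d.sum()
    best = np.flatnonzero(sums == sums.min())
    return candidates[best].mean()                    # average when two directions tie
```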
The center point of the fingerprint image provides information that can be exploited when matching the minutiae, but its usefulness depends on the accuracy of center point detection. The detection method given above only yields a block-based center point position; if these positions are to serve as corresponding points of two fingerprints, this accuracy is clearly insufficient. To obtain a more precise position, a refinement step is also needed: the position of the center point is recomputed in the local neighborhood of the position found above, using a smaller block size. Because this computation is carried out over a very small image region, it can be done very quickly.
2. Fingerprint image registration
For the template fingerprint and the input fingerprint to be matched, all that is available are the coordinate positions, directions and types of the minutiae and the center point detected from the fingerprint images. Since the correspondence between the two fingerprints is not known in advance, a suitable transformation that brings them into correspondence must first be found; this process is fingerprint registration. A pair of feature points (one from the template fingerprint and one from the input fingerprint) is used as reference points, and the coordinate translation and direction rotation between them are used to construct a similarity transformation that maps the input fingerprint feature points into the coordinate space of the template fingerprint feature points. Because the nonlinear deformation of a fingerprint image is often radial, being relatively large within some region and then spreading outwards nonlinearly, it can be described better in polar coordinates. Moreover, in a polar coordinate system the translation between the input image and the template image need not be considered: this translation is fixed, i.e. the translation between any other pair of corresponding points is the same as that between the reference points, so when the coordinates of another pair of corresponding points are expressed in polar coordinates relative to the reference points, the translation cancels out. In addition, the rotation between the two images is clearly easier to handle in polar coordinates than in Cartesian coordinates. For these reasons the present invention performs minutiae matching in a polar coordinate system.
Even if the input fingerprint and the template fingerprint come from the same finger, deformations such as translation, rotation and scale change will still exist between them. Before the two images are matched, the deformation parameters between them must be estimated and used to align the two images. Since the two fingerprint images are usually captured with the same device, the scale factor between them can be assumed to be 1. In addition, the translation between the two images can be ignored in polar coordinates. Hence only the rotation parameter between the input image and the template image needs to be estimated.
Let {P1, P2, ..., PM} denote the M minutiae of the template fingerprint and {Q1, Q2, ..., QN} the N minutiae of the input fingerprint.
To convert the minutiae into a polar coordinate system, one reference point is chosen in the template minutiae set and one in the input minutiae set to serve as the origin of the corresponding polar coordinate system, and the polar coordinates of all other minutiae relative to the reference point are computed. If a center point exists in both fingerprints, the center points are used first as the reference point pair; if one or both fingerprints have no center point, the correspondence between the template minutiae set and the input minutiae set cannot be known, and all possible reference point pairs are considered.
For every point Pi (1 ≤ i ≤ M) in the template point set and every point Qj (1 ≤ j ≤ N) in the input point set, if they are minutiae of the same type, the Euclidean distance between them and the rotation angle rotate[i][j] of the direction of Pi relative to the direction of Qj are computed. If these two quantities satisfy certain threshold conditions (for example, a distance of no more than 80 and an angle difference of no more than 45), the two points are taken as a pair of reference points for the subsequent coarse and precise matching. Otherwise, other possible reference point pairs are examined.
To align the input image with the template image in the polar coordinate system, all other input minutiae and template minutiae are converted into the polar coordinate system relative to the reference points Pi and Qj respectively, and the angle rotate[i][j] is then added to the polar angle of every input minutia. That is, the input minutiae and the template minutiae are converted into the polar coordinate system relative to the reference points Pi and Qj by a transformation (given as an image in the original)
in which (xi, yi, θt)T are the coordinates of the minutia to be converted, (xr, yr, θr)T are the coordinates of the reference minutia, and (ri, ei, θi)T is the representation of the minutia in polar coordinates (ri is the polar radius, ei the polar angle, and θi the angle between the direction of the minutia and the direction of the reference minutia). Then the angle rotate[i][j] is added to ei for every input minutia.
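The conversion of a minutia into polar coordinates relative to a reference minutia, and the alignment by rotate[i][j], can be sketched as follows. The transformation equation itself appears only as an image in the original, so the formulas below are simply the usual polar conversion implied by the variable descriptions above; the Minutia record layout is an assumption made for illustration.

```python
import math
from typing import NamedTuple

class Minutia(NamedTuple):          # illustrative layout, not the patent's encoding
    x: float
    y: float
    theta: float                    # minutia direction in radians
    kind: int                       # 0 = ridge ending, 1 = bifurcation

def wrap(a):
    """Wrap an angle into (-pi, pi]."""
    while a <= -math.pi:
        a += 2 * math.pi
    while a > math.pi:
        a -= 2 * math.pi
    return a

def to_polar(p, ref):
    """Polar representation (r, e, theta) of minutia p relative to reference ref."""
    dx, dy = p.x - ref.x, p.y - ref.y
    r = math.hypot(dx, dy)                        # polar radius r_i
    e = wrap(math.atan2(dy, dx) - ref.theta)      # polar angle e_i, measured from the reference direction
    theta = wrap(p.theta - ref.theta)             # direction relative to the reference minutia
    return r, e, theta

def register_input(input_minutiae, q_ref, rotate_ij):
    """Transform the input minutiae relative to Q_j and add the rotation rotate[i][j]."""
    return [(r, wrap(e + rotate_ij), t)
            for r, e, t in (to_polar(p, q_ref) for p in input_minutiae)]
```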
3. Coarse minutiae matching based on local structure information
For a given pair of reference points, their local neighborhood structure information within the feature point sets is first used to decide whether they can act as true corresponding points for a global precise match. Only reference point pairs that pass this coarse match of local structure information are used as corresponding points for the real global precise match; this greatly reduces the amount of computation in minutiae matching and at the same time improves the accuracy of the matching algorithm and the efficiency of the computation.
For a feature point Pk = (xk, yk, φk)T, a feature vector is built from the m feature points nearest to it (m typically takes a value of 6 to 9) together with Pk itself, and this feature vector is taken as the local structure information; this local feature is invariant under rotation and translation. Fig. 3 illustrates the case m = 2, whose feature vector is described as follows:
Flk = ((dki, θki, φki)T, (dkj, θkj, φkj)T)    (15)
where dki is the distance between the two feature points, θki is the angle between the direction of the line connecting Pk and Pi and the direction of feature point Pk, and φki is the angle between the directions of the two feature points. The above computation is essentially a polar coordinate transformation with the feature point Pk as the origin.
For any two reference feature points, after their respective local feature vectors have been computed, the point matching algorithm described below is used to determine how their neighborhood minutiae match. If the number of matched neighborhood minutiae exceeds a certain threshold, the two points are used as corresponding points for the global precise match; otherwise, other reference point pairs are examined.
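A sketch of building the local structure vector of equation (15) and of the coarse neighborhood check follows; it reuses wrap and the Minutia layout from the previous sketch. The patent performs the neighborhood comparison with the adaptive bounding box matcher of Section 5, so the fixed tolerances and the minimum pair count used here are only illustrative simplifications.

```python
import math

def local_structure(k, points, m=6):
    """Local feature vector of points[k]: (d, theta, phi) for its m nearest neighbors."""
    pk = points[k]
    neighbors = sorted((p for i, p in enumerate(points) if i != k),
                       key=lambda p: math.hypot(p.x - pk.x, p.y - pk.y))[:m]
    feats = []
    for p in neighbors:
        d = math.hypot(p.x - pk.x, p.y - pk.y)                    # d_ki
        t = wrap(math.atan2(p.y - pk.y, p.x - pk.x) - pk.theta)   # theta_ki
        phi = wrap(p.theta - pk.theta)                            # phi_ki
        feats.append((d, t, phi))
    return feats

def coarse_match(k, template_pts, j, input_pts,
                 d_tol=10.0, a_tol=math.radians(20), min_pairs=3):
    """Coarse local check: count neighbor entries that agree within the tolerances."""
    ft = local_structure(k, template_pts)
    fi = local_structure(j, input_pts)
    matched, used = 0, set()
    for d1, t1, p1 in ft:
        for idx, (d2, t2, p2) in enumerate(fi):
            if idx in used:
                continue
            if (abs(d1 - d2) < d_tol and abs(wrap(t1 - t2)) < a_tol
                    and abs(wrap(p1 - p2)) < a_tol):
                matched += 1
                used.add(idx)
                break
    return matched >= min_pairs      # pass -> proceed to the global precise match
```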
4. Precise minutiae matching based on global structure information
After a reference point pair Pb (b = b1, b2) has passed the coarse match of neighborhood structure information, these two feature points are used as the corresponding reference points for the further global match, and every feature point Pk of the image other than the nearest neighbors is transformed relative to the corresponding point Pb.
Fgk denotes the vector obtained by this polar coordinate transformation of feature point Pk relative to the corresponding point Pb. All the vectors Fgk, together with the already computed local feature vector of Pb, form the global feature vector. The minutiae matching algorithm described below is applied to the global feature vectors to obtain the global matching result.
5. Minutiae matching
Both the coarse matching based on local structure information and the precise matching based on global information described above involve matching minutiae feature vectors in the polar coordinate system. The minutiae matching algorithm of the present invention is as follows:
1) Take the template minutia Pi and the input minutia Qj as reference minutiae, and use the method described above to transform the neighborhood minutiae or the global minutiae of the input point set and the template point set into the polar coordinate system.
2) Sort the template minutiae and the input minutiae in polar coordinates in order of increasing polar angle and concatenate them into strings, where M and N are the numbers of minutiae in the template and input feature vectors respectively.
3) Use the adaptive bounding box method described below to match the two strings, find and record the matching score, and use the size of the matching score as the basis for deciding whether to carry out the global precise match or whether the matching succeeded.
4) Find the maximum of all the matching scores and take it as the matching score between the input minutiae set and the template minutiae set. If this matching score is higher than a preset threshold, the input image and the template image are considered to come from the same fingerprint; otherwise they are considered to come from different fingerprints.
The adaptive bounding box and its size are shown in Fig. 4; a bounding box is a box placed on a template minutia. The size of the bounding box is expressed by radius_size and angle_size, whose values change with the polar radius of the minutia. The radius_size and angle_size of a template minutia with polar radius r are computed with the following formulas.
r_size = α1 + β1·r    (21)
a_size = α2 - β2·r    (23)
where α1, β1, α2 and β2 are preset empirical parameters, all greater than zero, r is the polar radius of the template minutia, and r_small, r_large, a_small and a_large are the lower and upper bounds of radius_size and angle_size respectively; their values are also set in advance by experiment.
An adaptively sized bounding box, rather than one of fixed size, is used to make the algorithm more robust to nonlinear deformation. Nonlinear deformation is generally large within a particular region and then spreads outwards nonlinearly. When the polar radius of a minutia is small, a small deformation can cause a large change of the polar angle while the change of the polar radius remains small; in this case the angle_size of the bounding box should be relatively large and the radius_size relatively small. Conversely, when the polar radius of a minutia is large, a small change of the polar angle causes a large change of the minutia position, while the deformation of the polar radius can be regarded as the accumulation of the deformation of the whole region between the minutia and the reference minutia; in this case the angle_size of the bounding box should be relatively small and the radius_size relatively large.
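The adaptive box sizes of equations (21) and (23) can be sketched as below. The clamping against r_small, r_large, a_small and a_large corresponds to the bounds mentioned in the text (the clamping equations themselves are not reproduced), and all numeric defaults are placeholders rather than the patent's experimentally determined values.

```python
def box_size(r, a1=8.0, b1=0.05, a2=30.0, b2=0.1,
             r_small=8.0, r_large=20.0, a_small=5.0, a_large=30.0):
    """Adaptive bounding box for a template minutia with polar radius r.

    radius_size grows with r (equation (21)) and angle_size shrinks with r
    (equation (23)); both are clamped to preset lower and upper bounds.
    Angular sizes are expressed in degrees here.
    """
    r_size = min(max(a1 + b1 * r, r_small), r_large)
    a_size = min(max(a2 - b2 * r, a_small), a_large)
    return r_size, a_size
```

A minutia far from the reference point thus receives a wide radial tolerance but a narrow angular one, in line with the deformation argument above.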
The algorithm for matching the two strings is described as follows:
1) Use equations (20) to (23) to determine the size of the bounding box of every template minutia. Set nb_pair[i][j] = 0 (nb_pair[i][j] denotes the number of matched minutiae pairs when feature matching is performed with the template minutia Pi and the input minutia Qj as reference minutiae).
2) Execute the following loop:
While 1 ≤ m ≤ M do
    While 1 ≤ n ≤ N do
        if template minutia m and input minutia n satisfy condition1, then
            nb_pair[i][j] = nb_pair[i][j] + 1;
        End if
        Increase n;
    End while
    Increase m;
End while
In the above procedure, condition1 is defined by an expression (given as an image in the original) in terms of r_low[m], r_high[m], θ_low[m] and θ_high[m], the lower and upper limits of the polar radius and polar angle error of the bounding box of template minutia m. condition1 is the condition under which template minutia m and input minutia n are regarded as a matched pair: the input minutia n must lie inside the bounding box of template minutia m, and the difference between the directions of the two minutiae must be smaller than ε (e.g. ε = 30).
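A sketch of the bounding-box matching loop follows, operating on minutiae already expressed in polar coordinates (r, e, theta). Since the condition1 expression is given only as an image in the original, the test below encodes its verbal description: the input minutia must fall inside the box in radius and polar angle, and the direction difference must stay below ε. box_size and wrap are the helpers from the earlier sketches; the value of ε and the greedy one-to-one pairing are illustrative choices.

```python
import math

def count_matches(template_polar, input_polar, eps=math.radians(30)):
    """Count matched minutiae pairs between two polar strings (nb_pair[i][j])."""
    nb_pair, used = 0, set()
    for rm, em, tm in template_polar:                      # template minutia m
        r_size, a_size = box_size(rm)                      # adaptive limits for this minutia
        a_size = math.radians(a_size)
        for idx, (rn, en, tn) in enumerate(input_polar):   # input minutia n
            if idx in used:
                continue
            # condition1 (verbal form): n lies inside m's bounding box and the
            # directions of the two minutiae differ by less than eps
            if (abs(rn - rm) <= r_size and
                    abs(wrap(en - em)) <= a_size and
                    abs(wrap(tn - tm)) < eps):
                nb_pair += 1
                used.add(idx)
                break
    return nb_pair
```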
6. Computation of the matching score
After the above matching process has been completed with two minutiae (one from the template fingerprint, one from the input fingerprint) as reference, let the number of matched minutiae pairs be nb_pair and the accumulated matching error of the matched pairs be match_error. The matching score is then computed with a formula that is given as an image in the original,
where Mc and Nc are the numbers of feature points of the template and the input fingerprint image in their common region, match_error is the weighted average of the matching errors of the matched minutiae pairs (the error terms appearing in equations (25) to (27)), and α and β are predetermined weighting coefficients (typical values 2.0 and 3.0 respectively). Comparing this matching score with a preset matching threshold yields the decision of whether the fingerprints match.
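The scoring formula itself is reproduced only as an image in the original, so the function below is no more than a placeholder built from the quantities the text says the score depends on: it rewards the number of matched pairs relative to the feature points in the common region and penalizes the accumulated matching error with the weights α and β. The exact functional form should be taken from the patent drawing, not from this sketch.

```python
def match_score(nb_pair, match_error, mc, nc, alpha=2.0, beta=3.0):
    """Placeholder matching score; NOT the patent's exact formula."""
    if mc == 0 or nc == 0:
        return 0.0
    coverage = nb_pair / max(mc, nc)       # fraction of common-region minutiae matched
    return alpha * coverage - beta * match_error

def same_fingerprint(score, threshold=0.6):
    """Final decision against a preset threshold (the threshold value is illustrative)."""
    return score >= threshold
```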
7. Feature matching based on center point information
After center point detection in the fingerprint image, the fingerprint feature information consists of two parts: the local minutiae information, namely the bifurcation points and endpoints of the fingerprint ridges, and the global feature information of the fingerprint, namely its center point. With the center point information available, the running efficiency of the matching algorithm can therefore be improved. In summary, the feature point matching algorithm of the present invention proceeds as shown in Fig. 5, in the following steps (a sketch of this flow is given below):
1) If both the template fingerprint and the input fingerprint have a center point, perform feature point matching with the center points of the two fingerprints as corresponding points, using the feature point matching algorithm described in Section 5 above; otherwise go to step 4.
2) If the feature point matching with the center points as corresponding points returns a consistent result, the feature matching algorithm ends. Otherwise, with the two center points as a reference, search the template and input feature points for pairs that may serve as matching corresponding points (it suffices to enlarge the bounding boxes of the template minutiae and repeat the global minutiae matching) and record them in the set Point Pair.
3) Use each feature point pair obtained in step 2 as a reference corresponding point of the matching algorithm and perform feature point matching on the template and input minutiae; as soon as a consistent result is found, end the feature matching algorithm.
4) Use other, not yet used, template and input minutiae pairs as reference corresponding points and perform minutiae matching on the two fingerprints; as soon as a consistent result is found, end the feature matching algorithm.
Considering the response time of fingerprint comparison inside the card, the method provided by the present invention returns after a sufficient number of reference corresponding point pairs have been considered and matched, regardless of whether the matching was successful.
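The overall flow of Fig. 5 can be sketched as follows. match_with_reference stands for the complete coarse-plus-precise matching of Sections 3 to 5 with a given reference pair, and candidate_pairs for the collection of the Point Pair set with enlarged bounding boxes; both are passed in as callables because the sketch does not reimplement them, and the fingerprint objects are assumed to expose a center point, a minutiae list and a minutia type.

```python
def match_fingerprints(template, inp, match_with_reference, candidate_pairs,
                       max_pairs=32):
    """Top-level flow: center points first, then recorded pairs, then the rest."""
    tried = set()
    if template.center is not None and inp.center is not None:
        if match_with_reference(template, inp, template.center, inp.center):
            return True                                   # step 1: centers as corresponding points
        for pt, pi in candidate_pairs(template, inp):     # step 2: the Point Pair set
            tried.add((pt, pi))
            if match_with_reference(template, inp, pt, pi):
                return True                               # step 3
    count = 0
    for pt in template.minutiae:                          # step 4: remaining reference pairs
        for pi in inp.minutiae:
            if (pt, pi) in tried or pt.kind != pi.kind:
                continue
            if match_with_reference(template, inp, pt, pi):
                return True
            count += 1
            if count >= max_pairs:                        # bounded response time inside the card
                return False
    return False
```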
The fast fingerprint feature matching method described above is optimized specifically for the environment inside a smart card, where storage and computing power are limited. Because the guidance provided by the center point is fully exploited during matching, the two fingerprint images can be registered quickly, which greatly improves the time efficiency of the algorithm. At the same time, the minutiae matching method based on adaptive bounding boxes effectively overcomes the influence of the nonlinear deformation of fingerprint images on the accuracy of the matching algorithm. For a given pair of reference points, a coarse match of the local neighborhood structure information is performed first and a precise match of the global detail information afterwards, which greatly reduces the amount of computation in the minutiae matching process while improving the accuracy of the matching algorithm and the efficiency of the computation.
The software implementing this method can be built as a fast fingerprint feature matching device in firmware form and integrated into a smart card. The device comprises a center point judging unit for finding and judging the center point in a fingerprint; a matching corresponding point selection unit for finding, among the template and input feature points, pairs that may serve as matching corresponding points and recording them; a matching judging unit for performing the matching computation on the feature points; and a polar coordinate conversion unit for converting the spatial positions of the feature points and the center point into polar coordinates. During a fast fingerprint feature matching operation, the center point judging unit first finds and judges the center point in the fingerprint. If a center point exists, the matching judging unit performs the matching computation with the center point as the corresponding point and obtains a matching result. If no center point can be found, the matching corresponding point selection unit and the matching judging unit take other, not yet considered, template and input feature point pairs as reference corresponding points and perform minutiae matching on the two fingerprints to obtain the matching result. If the matching computation based on the center points yields a mismatch, the matching corresponding point selection unit takes the two center points as a reference and searches the template and input feature points for feature point pairs that may serve as matching corresponding points; using these pairs as the reference corresponding points of the matching algorithm, the matching judging unit matches the template and input feature points and obtains the matching result. Before the matching judging unit performs the matching computation, the polar coordinate conversion unit converts the spatial positions of the center point and the feature points into polar coordinates; the matching computation is carried out on the polar coordinate data, which reduces the amount of computation and yields the result faster.
The fast fingerprint feature matching device described above is mainly intended for use in smart cards. Fig. 6 shows the internal logical structure of a typical contactless smart card. The microprocessor chip used is an MPU with logic control, management and encryption/decryption functions; the memory includes ROM, RAM, EEPROM and so on. RFC is the radio frequency transceiver circuit, which handles read/write communication with the card and its power supply; CAU is the cryptographic coprocessor; SL is the security logic. The device can of course equally well be used in a contact smart card.
The memory capacity of different types of smart cards ranges from 512 bytes to 16 KB, with different partitioning options. A smart card implementing the method of the present invention should generally have a capacity of at least 1 KB, because fingerprint feature data have to be stored. In the present invention, if the coordinate model of the feature points is used and only the two types of feature points, ridge endings and bifurcation points, are considered, the encrypted feature data of one fingerprint do not exceed 256 bytes and can easily be stored in the storage area of the smart card.
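As a rough illustration of the 256-byte figure (the patent does not give the exact record layout, so the field widths below are assumptions), a coordinate-model template of a few dozen minutiae plus the center point fits comfortably into that budget:

```python
# Hypothetical packing of one fingerprint template; field widths are assumptions.
BYTES_PER_MINUTIA = 2 + 2 + 1 + 1   # x (2 B), y (2 B), direction (1 B), type (1 B)
MAX_MINUTIAE = 40                   # assumed upper bound on minutiae kept per fingerprint
CENTER_RECORD = 5                   # center point: x, y (2 B each) plus direction (1 B)

template_bytes = MAX_MINUTIAE * BYTES_PER_MINUTIA + CENTER_RECORD
print(template_bytes)               # 245 bytes, below the 256-byte budget
```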
对于应用本发明所述方法进行指纹特征快速匹配的智能卡而言,可以将上述的指纹特征快速匹配装置内置于智能卡的微处理器中,也可以独立设置。图7介绍了内置指纹特征快速匹配装置的智能卡进行指纹特征匹配的典型过程。用户使用带有指纹识别的读卡器,通过读卡器上的指纹采集设备输入其指纹,指纹传感器采集到指纹数据后,提交给特征提取模块,提取本次指纹特征,并把指纹特征传递到智能卡;智能卡对输入的指纹特征与保存的指纹特征进行匹配,若成功,则允许用户进行后续的操作,如果匹配不成功,则提示用户重新输入指纹数据。一般连续多次不成功可以自动锁死智能卡。For the smart card that uses the method of the present invention for fast matching of fingerprint features, the above-mentioned fast fingerprint feature matching device can be built into the microprocessor of the smart card, or can be set independently. Fig. 7 introduces a typical process of fingerprint feature matching for a smart card with a built-in fingerprint feature fast matching device. The user uses a card reader with fingerprint recognition to input his fingerprint through the fingerprint collection device on the card reader. After the fingerprint sensor collects the fingerprint data, it submits it to the feature extraction module to extract the fingerprint feature of this time and transfer the fingerprint feature to the Smart card: The smart card matches the input fingerprint feature with the saved fingerprint feature, if successful, allows the user to perform subsequent operations, if the match is unsuccessful, prompts the user to re-input the fingerprint data. Generally, the smart card can be automatically locked after repeated unsuccessful attempts.
The above scheme still has a security weakness: if an attacker obtains the user's fingerprint information through other channels, then once he also obtains the user's smart card he can send the previously acquired fingerprint information to the card and complete the matching and authentication process. There are two ways to address this weakness. One is to integrate the fingerprint sensor, the card reader, and the smart card into a single device, so that the card accepts fingerprint data only from that device; this approach is relatively costly. The other is to integrate the reader and the fingerprint sensor, have the smart card and the reader authenticate each other before exchanging data, and transmit the data in encrypted form; the drawback here is that the technical solution is more complex.
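To make the second approach concrete, the C sketch below outlines the card side of a generic challenge-response mutual authentication. The key length, MAC width, and the mac_fn primitive are assumptions standing in for whatever shared-key algorithm the card's cryptographic coprocessor actually provides; this is not the patent's protocol.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical MAC primitive supplied by the card's coprocessor (CAU). */
typedef void (*mac_fn)(const uint8_t key[16], const uint8_t *msg, size_t len,
                       uint8_t out[8]);

/* Card-side handshake before fingerprint data is accepted: the reader must prove
   knowledge of the shared key by answering the card's challenge, and the card
   answers the reader's challenge in turn. */
bool card_mutual_auth(mac_fn mac, const uint8_t shared_key[16],
                      const uint8_t card_challenge[8],
                      const uint8_t reader_response[8],
                      const uint8_t reader_challenge[8],
                      uint8_t card_response_out[8])
{
    uint8_t expected[8];
    mac(shared_key, card_challenge, 8, expected);   /* what a genuine reader should send */
    if (memcmp(expected, reader_response, 8) != 0)
        return false;                               /* reader not trusted: refuse data */
    mac(shared_key, reader_challenge, 8, card_response_out);  /* prove the card's identity */
    return true;
}
```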
The above is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to it. Any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be determined by the claims.
Claims (12)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CNB2006100658774A CN100492395C (en) | 2006-03-28 | 2006-03-28 | Fingerprint feature fast matching method, device and application thereof |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CNB2006100658774A CN100492395C (en) | 2006-03-28 | 2006-03-28 | Fingerprint feature fast matching method, device and application thereof |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN1831847A CN1831847A (en) | 2006-09-13 |
| CN100492395C true CN100492395C (en) | 2009-05-27 |
Family
ID=36994135
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CNB2006100658774A Expired - Fee Related CN100492395C (en) | 2006-03-28 | 2006-03-28 | Fingerprint feature fast matching method, device and application thereof |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN100492395C (en) |
Families Citing this family (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101782965B (en) * | 2010-02-11 | 2012-05-23 | 上海点佰趣信息科技有限公司 | Method for treating deformed fingerprint image |
| CN102147817B (en) * | 2011-04-27 | 2014-01-15 | 杭州晟元芯片技术有限公司 | Fingerprint singular point quick searching method |
| US9965607B2 (en) | 2012-06-29 | 2018-05-08 | Apple Inc. | Expedited biometric validation |
| GB2507539A (en) * | 2012-11-02 | 2014-05-07 | Zwipe As | Matching sets of minutiae using local neighbourhoods |
| US9928355B2 (en) | 2013-09-09 | 2018-03-27 | Apple Inc. | Background enrollment and authentication of a user |
| CN104462920B (en) * | 2014-12-08 | 2017-10-31 | 上海斐讯数据通信技术有限公司 | Startup method, starter and the mobile terminal of scene mode |
| CN104573651B (en) * | 2014-12-31 | 2018-02-13 | 北京天诚盛业科技有限公司 | Fingerprint identification method and device |
| CN105447437B (en) * | 2015-02-13 | 2017-05-03 | 比亚迪股份有限公司 | fingerprint identification method and device |
| US10032062B2 (en) * | 2015-04-15 | 2018-07-24 | Samsung Electronics Co., Ltd. | Method and apparatus for recognizing fingerprint |
| CN104820983B (en) * | 2015-04-23 | 2018-11-23 | 清华大学 | A kind of image matching method |
| CN105608409B (en) * | 2015-07-16 | 2019-01-11 | 宇龙计算机通信科技(深圳)有限公司 | The method and device of fingerprint recognition |
| CN106447839A (en) * | 2016-08-26 | 2017-02-22 | 合肥若涵信智能工程有限公司 | Intelligent fingerprint lock |
| CN106934361B (en) * | 2017-03-06 | 2021-01-05 | 苏州佳世达光电有限公司 | Identification method and electronic equipment |
| CN111310712B (en) * | 2020-03-04 | 2024-02-13 | 杭州晟元数据安全技术股份有限公司 | Quick searching method based on fingerprint word bag characteristics |
| CN112115848B (en) * | 2020-09-16 | 2022-09-06 | 哈尔滨理工大学 | Incomplete area processing method for low-quality fingerprint images |
| CN114360228A (en) * | 2022-03-21 | 2022-04-15 | 广州市致胜电器制造有限公司 | Linkage alarm device for fingerprint lock authentication failure |
Non-Patent Citations (3)
| Title |
|---|
| Structure Matching Algorithm of Fingerprint Minutiae Based on Core Point. ZHANG Wei-Wei, WANG Sen, WANG Yang-Sheng. Acta Automatica Sinica, Vol. 29, No. 6. 2003 * |
| A Fingerprint Matching Algorithm Based on Structure Matching. LUO Xiang, ZHUANG Lian-Sheng, ZHANG Yun-Chao, ZHUANG Zhen-Quan. Computer Engineering and Applications. 2004 * |
| Fingerprint Recognition. SONG Qiang. Master's thesis, Jilin University. 2004 * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN1831847A (en) | 2006-09-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN100492395C (en) | Fingerprint feature fast matching method, device and application thereof | |
| CN100447806C (en) | Method, device and use for matching two-stage mixed-fingerprint characteristics | |
| Yin et al. | Contactless fingerprint recognition based on global minutia topology and loose genetic algorithm | |
| Zhang et al. | Combining global and minutia deep features for partial high-resolution fingerprint matching | |
| US20100080425A1 (en) | Minutiae-based template synthesis and matching | |
| Uz et al. | Minutiae-based template synthesis and matching for fingerprint authentication | |
| CN102479328B (en) | Identity verification device and method based on biological characteristics | |
| Sheng et al. | A memetic fingerprint matching algorithm | |
| Kumar et al. | Integrating shape and texture for hand verification | |
| Doublet et al. | Contact less hand recognition using shape and texture features | |
| Wang et al. | Fingerprint matching using orientationcodes and polylines | |
| Jea et al. | Security and matching of partial fingerprint recognition systems | |
| Kaushik et al. | A new hybrid approch for palm print recognition in PCA based palm print recognition system | |
| Conti et al. | An advanced technique for user identification using partial fingerprint | |
| Tong et al. | Local relative location error descriptor-based fingerprint minutiae matching | |
| CN110956468B (en) | Fingerprint payment system | |
| Sheng et al. | Consensus fingerprint matching with genetically optimised approach | |
| Dale et al. | A single sensor hand geometry and palm texture fusion for person identification | |
| Uz et al. | Minutiae-based template synthesis and matching using hierarchical delaunay triangulations | |
| EP3093793A1 (en) | Fingerprint identification method and device using same | |
| Shen et al. | Coding 3D Gabor features for hyperspectral palmprint recognition | |
| KR20160101248A (en) | Portable secure authentication apparatus using fingerprint | |
| Khanyile et al. | Similarity score computation for minutiae-based fingerprint recognition | |
| Fons et al. | Hardware-software co-design of a fingerprint matcher on card | |
| Khazaei et al. | Fingerprint matching and classification using an onion layer algorithm of computational geometry |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| C14 | Grant of patent or utility model | ||
| GR01 | Patent grant | ||
| C17 | Cessation of patent right | ||
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20090527; Termination date: 20110328 |