US7565034B2 - Determination of a navigation window in an optical navigation system - Google Patents
Determination of a navigation window in an optical navigation system
- Publication number
- US7565034B2 (application US10/870,192)
- Authority
- US
- United States
- Prior art keywords
- per
- image data
- column
- row
- sums
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0383—Signal control means within the pointing device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
Definitions
- Optical navigation upon arbitrary surfaces produces motion signals indicative of relative movement along the directions of coordinate axes, and is becoming increasingly prevalent. It is used, for instance, in optical computer mice and fingertip tracking devices to replace conventional mice and trackballs for the position control of screen pointers in windowed user interfaces for computer systems. It has many advantages, among which are the lack of moving parts that accumulate dirt and suffer the mechanical wear and tear of use. Another advantage of an optical mouse is that it does not need a mouse pad, since it is generally capable of navigating upon arbitrary surfaces, so long as they are not optically featureless.
- Optical navigation operates by tracking the relative displacement between two images.
- a surface is illuminated and a two-dimensional view of a portion of the surface is focused upon an array of photodetectors.
- the output of the photodetectors is digitized and stored as a reference image in a corresponding array of memory.
- a brief time later a sample image is captured using the same process. If there has been no motion between the image capture events, then the sample image and the reference image are identical (or very nearly so). That is, the image features of the reference image data and the sample image data appear to match up.
- correlation typically involves a two-dimensional cross-correlation between the reference image data and the sample image data.
- a two-dimensional cross-correlation between the reference image data and the sample image data compares intensity values of the image data on a pixel-by-pixel basis to determine relative displacement between the two sets of image data.
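- as an informal illustration (not taken from the patent), the pixel-by-pixel comparison of two equally sized frames can be expressed as a normalized cross-correlation coefficient; the NumPy-based sketch below assumes 8-bit intensity frames and uses an illustrative function name.

```python
import numpy as np

def correlation_coefficient(ref: np.ndarray, sample: np.ndarray) -> float:
    """Pixel-by-pixel similarity between two equally sized intensity frames.

    Returns a normalized cross-correlation coefficient: values near 1.0
    indicate that the image features of the two frames line up.
    """
    r = ref.astype(np.float64) - ref.mean()
    s = sample.astype(np.float64) - sample.mean()
    denom = np.sqrt((r ** 2).sum() * (s ** 2).sum())
    if denom == 0.0:  # featureless (flat) frames carry no tracking information
        return 0.0
    return float((r * s).sum() / denom)
```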
- the image features that are relied upon to determine relative displacement are produced by illuminating a surface. If the illumination of the surface is not evenly distributed or the illumination source is not properly aligned, tracking errors may result. In particular, a misaligned illumination source can cause borders of the image data to appear dark and therefore lack sufficient contrast to support the feature matching process. Further, if the illumination drops off suddenly at some point within the image data, the contrast in brightness may appear as an image feature (e.g., an edge), which can severely degrade tracking efficiency. The false detection of an edge can be especially detrimental in image tracking algorithms that rely on edge detection.
- a technique for reducing navigation errors that are caused by uneven illumination involves using only the portion of the image data that is properly illuminated in the correlation process.
- the portion of the image data that is used for optical navigation, referred to herein as the “navigation window,” is established by summing the image data on a per-slice basis, comparing the per-slice sums to a pre-established intensity threshold, and defining boundaries of the navigation window in response to the comparison. Boundaries are set at the points where the per-slice sums equal the pre-established intensity threshold.
- the correlation process is responsive to actual illumination conditions such that the portions of the image data that are not properly illuminated are not used in the tracking process. Accordingly, portions of the image data that are most likely to cause navigation errors are “trimmed” from the set of image data that is used in the correlation process.
- FIG. 1 depicts an image sensor that is formed by an array of photodetectors.
- FIG. 2 depicts a set of reference image data that is obtained with the image sensor of FIG. 1 relative to the image sensor's photodetector array.
- FIG. 3 depicts a set of sample image data that is obtained with the image sensor of FIG. 1 relative to the image sensor's photodetector array.
- FIG. 4 depicts the reference and sample image data from FIGS. 2 and 3 aligned such that the T-shaped image features of the two sets of image data match.
- FIG. 5 depicts image data that is divided into columns relative to a graph of the per-column sums of the intensity values of the image data, where the columns correspond to the columns of photodetectors in the photodetector array.
- FIG. 6 depicts image data that is divided into rows relative to a graph of the per-row sums of the intensity values of the image data, where the rows correspond to the rows of photodetectors in the photodetector array.
- FIG. 7 depicts the image data and the graph of the per-column sums as described with reference to FIG. 5 relative to a column threshold and a navigation boundary in accordance with an embodiment of the invention.
- FIG. 8 depicts the image data and the graph of the per-row sums as described with reference to FIG. 6 relative to a row threshold and a navigation boundary in accordance with an embodiment of the invention.
- FIG. 9 depicts the navigation window that results when vertical and horizontal boundaries are established as described above with reference to FIGS. 7 and 8 in accordance with an embodiment of the invention.
- FIG. 10 depicts the navigation window in accordance with an embodiment of the invention that results when no edges of the image data are properly illuminated.
- FIG. 11 depicts a process flow diagram of a method for optical navigation in accordance with an embodiment of the invention.
- FIG. 12 depicts a process flow diagram of another method for optical navigation in accordance with an embodiment of the invention.
- FIG. 13 depicts an example of an optical navigation system in accordance with an embodiment of the invention relative to a surface that is used for navigation.
- FIG. 1 depicts an image sensor 10 that is formed by an array of photodetectors 12 , where each of the individual photodetectors is often referred to as a “pixel.”
- the photodetector array is formed in columns 20 and rows 22 of photodetectors. For description purposes, the columns are parallel to the y-axis and the rows are parallel to the x-axis as indicated in FIG. 1 .
- the photodetector array of FIG. 1 includes a 16×16 array of photodetectors; however, it should be noted that this is for example purposes only. Actual photodetector arrays used in optical navigation systems may range, for example, from 20×20 to 30×30 arrays, although other array sizes are possible.
- images obtained by the photodetector array are stored as digital image data. In optical navigation applications, the image data is often stored in memory arrays that correspond to the photodetector array.
- FIG. 2 depicts a first set of image data 24 that is obtained with the image sensor of FIG. 1 relative to the image sensor's photodetector array.
- the image data includes a T-shaped image feature 26 .
- a T-shaped image feature is depicted for description purposes, the image data could include any combination of random or non-random image features.
- the first set of image data is referred to as the “reference image data.” In this example, the reference image data is obtained at some time, t1.
- FIG. 3 depicts sample image data 28 that is obtained with the image sensor of FIG. 1 relative to the image sensor's pixel array.
- the T-shaped image feature 26 has moved relative to the photodetector array in comparison to the T-shaped image feature in FIG. 2 .
- the movement of the T-shaped image feature is caused by movement that occurs between the image sensor and the imaged surface between image capture events.
- the relative movement between the image sensor and the imaged surface can be caused by movement of the image sensor relative to a stationary imaged surface, movement of an imaged surface relative to the stationary image sensor, or by movement of both the image sensor and the imaged surface.
- image data is captured by the image sensor at a rate of 1,500 images per second.
- Cross-correlation is used to determine the relative displacement between the reference image data 24 and the sample image data 28.
- the cross-correlation process tries to find the best match between the reference image data and the sample image data to determine relative displacement in the x and y directions. The best match is found by matching image features in the two sets of image data.
- cross-correlation of digital image data involves “moving” the reference image data to different positions relative to the sample image data and calculating a cross-correlation coefficient at each different position. The location with the highest cross-correlation coefficient indicates the closest correspondence between the reference and sample image data.
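- a brute-force version of this search is sketched below for illustration only; the ±4 pixel search range and the function name are assumptions, not values taken from the patent.

```python
import numpy as np

def estimate_displacement(ref: np.ndarray, sample: np.ndarray,
                          max_shift: int = 4) -> tuple[int, int]:
    """Try every (dx, dy) offset within +/-max_shift and keep the offset whose
    overlapping regions of the reference and sample frames correlate best."""
    rows, cols = ref.shape
    best_score, best_dx, best_dy = -np.inf, 0, 0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # overlapping regions of the two frames for this trial offset
            r = ref[max(0, -dy):rows + min(0, -dy), max(0, -dx):cols + min(0, -dx)]
            s = sample[max(0, dy):rows + min(0, dy), max(0, dx):cols + min(0, dx)]
            r = r - r.mean()
            s = s - s.mean()
            denom = np.sqrt((r ** 2).sum() * (s ** 2).sum())
            score = (r * s).sum() / denom if denom else 0.0
            if score > best_score:
                best_score, best_dx, best_dy = score, dx, dy
    return best_dx, best_dy  # (delta_x, delta_y)
```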
- the reference image data is periodically changed to account for the displacement.
- FIG. 4 depicts the reference and sample image data 24 and 28 aligned such that the T-shaped image features 26 of the two sets of image data match.
- the relative displacement between the reference image data and the sample image data is evident as depicted in FIG. 4 and can be easily calculated.
- the relative displacement between the reference image data and the sample image data can be characterized in terms of displacement in the y-direction (Δy) and displacement in the x-direction (Δx).
- image data includes discernable image features.
- the quality of the features captured in the image data degrades when the illumination intensity of the detected features is poor. That is, if the imaged surface is not adequately illuminated with a light source that is aligned to reflect light onto the image sensor, image features will not contain enough contrast to enable reliable image tracking. Although steps are taken to provide the proper illumination, the illumination is not always perfect.
- navigation errors that are caused by uneven illumination are reduced by using only the portion of the image data that is properly illuminated in the correlation process.
- the portion of the image data that is used for optical navigation, referred to herein as the “navigation window,” is established by summing the image data on a per-slice basis, comparing the per-slice sums to a pre-established intensity threshold, and defining boundaries of the navigation window in response to the comparison. In particular, boundaries are set at the points where the per-slice sums equal the pre-established intensity threshold.
- the correlation process is responsive to actual illumination conditions such that the portions of the image data that are not properly illuminated are not used in the tracking process. Accordingly, portions of the image data that are most likely to cause navigation errors are “trimmed” from the set of image data that is used in the correlation process.
- defining the boundaries of the navigation window involves summing the image data on a per-slice basis.
- per-slice basis refers to a group of photodetectors and their associated output signals, which is defined by a linear path of photodetectors that runs between opposite edges of a photodetector array.
- a slice of photodetectors will be either an entire column of photodetectors or an entire row of photodetectors although this is not a requirement.
- the optical navigation technique is described in terms of photodetector columns, per-column sums, photodetector rows, and per-row sums.
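- for illustration, the per-column and per-row sums can each be computed in a single step when the image data is held in a two-dimensional array; the sketch below uses NumPy and a random placeholder frame in place of real sensor output.

```python
import numpy as np

# image data: rows of the photodetector array along axis 0, columns along axis 1
image = np.random.randint(0, 256, size=(30, 30), dtype=np.uint8)  # placeholder frame

per_column_sums = image.sum(axis=0)  # one intensity sum per column of photodetectors
per_row_sums = image.sum(axis=1)     # one intensity sum per row of photodetectors
```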
- FIG. 5 depicts image data 40 that is divided into columns 20 relative to a graph of the per-column sums of the intensity values 42 of the image data, where the columns correspond to the columns of photodetectors in the photodetector array.
- the image data can be any image data that is obtained by the photodetector array. In the embodiment of FIG. 5 , the image data is obtained from an imaged surface that includes random features. The horizontal lines that represent the individual pixels of the photodetector array and the corresponding image data are not shown in FIG. 5 to highlight the per-column summing operation.
- FIG. 6 depicts the image data 40 divided into rows 22 relative to a graph of the per-row sums of the intensity values 44 of the image data, where the rows correspond to the rows of photodetectors in the photodetector array.
- the vertical lines that represent the individual pixels of the photodetector array and the corresponding image data are not shown in FIG. 6 to highlight the per-row summing operation.
- an illumination threshold is established that represents the minimum illumination that is acceptable within the navigation window.
- the threshold is defined in terms of a minimum intensity value sum for an entire column or row of image data. That is, the threshold represents the minimum intensity value that should be maintained on a per-column or per-row basis for the sum total of intensity values over an entire column or row of image data.
- the threshold can be the same for the columns and the rows or it can be specific to the columns and the rows.
- the boundaries of the navigation window are established at the points where an intensity sum equals the corresponding threshold.
- each of the summed values is compared to the respective column or row threshold and the column and row boundaries are set at the point where the intensity sums equal the respective thresholds.
- the portions of the image data having intensity value sums that are below the respective thresholds are trimmed from the image data and are not used for image tracking.
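- a minimal sketch of this boundary rule is shown below, assuming the adequately illuminated slices form a single contiguous band; the function name and the error handling are illustrative, not part of the patent.

```python
import numpy as np

def window_bounds(slice_sums: np.ndarray, threshold: float) -> tuple[int, int]:
    """Return the (first, last) slice indices whose intensity sums meet the
    threshold; slices outside this range are trimmed from the image data."""
    bright = np.flatnonzero(slice_sums >= threshold)
    if bright.size == 0:
        raise ValueError("no slice meets the illumination threshold")
    return int(bright[0]), int(bright[-1])
```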
- FIG. 7 depicts the image data 40 and the graph of the per-column sums 42 as described with reference to FIG. 5 relative to a column threshold 46 .
- FIG. 7 also includes the vertical navigation window boundaries 48 that are defined in response to the comparison of the per-column sums to the column threshold.
- the column threshold is established at an intensity value identified as “τ.” Boundaries of the navigation window are established at the point, or points, where the per-column sums are equal to the column threshold. If the per-column sum is above the threshold at the edge of the image data, then the respective boundary is established at the edge of the image data.
- the vertical dashed lines 48 that project from the graph into the image data represent the x-axis boundaries of the navigation window. The image data that lies between the boundaries meets the minimum illumination requirements on a per-column basis.
- FIG. 8 depicts the image data 40 and the graph of the per-row sums 44 as described with reference to FIG. 6 relative to a row threshold 50 .
- FIG. 8 also includes the horizontal navigation window boundaries 52 that are defined in response to the comparison of the per-row sums to the row threshold.
- the row threshold is established at an intensity value identified as “σ.” Boundaries of the navigation window are established at the point, or points, where the per-row sums are equal to the row threshold. If the per-row sum is above the threshold at the edge of the image data, then the respective boundary is established at the edge of the image data.
- the horizontal dashed lines 52 that project from the graph into the image data represent the y-axis boundaries of the navigation window. The image data that lies between the boundaries meets the minimum illumination requirements on a per-row basis.
- FIG. 9 depicts the navigation window 54 that results when vertical and horizontal boundaries 48 and 52 are established as described above with reference to FIGS. 7 and 8 .
- the vertical boundaries (min_x and max_x) and the horizontal boundaries (min_y and max_y) combine to define the navigation window. Only the image data that falls within the navigation window is used in the tracking process.
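- combining the two comparisons yields the navigation window in one short routine; the sketch below is illustrative only, with tau and sigma standing in for the column and row thresholds described above and assuming at least one column and one row meet them.

```python
import numpy as np

def navigation_window(image: np.ndarray, tau: float, sigma: float) -> np.ndarray:
    """Trim the image data to the region that meets the column threshold (tau)
    and the row threshold (sigma); only this window is used for correlation."""
    cols = np.flatnonzero(image.sum(axis=0) >= tau)    # per-column sums vs. tau
    rows = np.flatnonzero(image.sum(axis=1) >= sigma)  # per-row sums vs. sigma
    min_x, max_x = int(cols[0]), int(cols[-1])
    min_y, max_y = int(rows[0]), int(rows[-1])
    return image[min_y:max_y + 1, min_x:max_x + 1]
```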
- the image data has proper illumination at some of its edges. For example, the top edge and the right edge have intensity sums that are above the respective row and column thresholds, so no trimming is necessary at those edges.
- FIG. 10 depicts the navigation window 54 that results when no edges of the image data are properly illuminated.
- the portions of the image data on both the left and right sides are below the column threshold and therefore trimming of both the left and right edges of the image data is necessary.
- the portions of the image data on the top and the bottom sides are below the row threshold and therefore trimming of both the top and bottom edges of the image data is required.
- FIG. 11 depicts a process flow diagram of a method for optical navigation that corresponds to the above-described techniques.
- image data is obtained.
- the processing involved in defining the x and y axis window boundaries can be performed in parallel processes and therefore the process flow diagram diverges into separate paths.
- the image data is summed on a per-column basis.
- the per-column sums are compared with the column threshold, ⁇ .
- the x-axis window boundaries are established in response to the comparison.
- the image data is summed on a per-row basis.
- the per-row sums are compared with the row threshold, ⁇ .
- the y-axis window boundaries are established in response to the comparison.
- FIG. 12 depicts a process flow diagram of another method for optical navigation that corresponds to the above-described techniques.
- a threshold is established.
- image data is obtained.
- the image data is summed on a per-slice basis to create per-slice sums.
- the per-slice sums are compared to the threshold.
- a boundary of a navigation window is defined in response to the comparison.
- new image data is constantly acquired (e.g., at a rate of 1,500 images per second) and the window boundaries are dynamically adjusted in response to the newly acquired image data.
- the system is responsive to changes in illumination conditions.
- the navigation window is established as part of a start-up process and modified at periodic intervals that are longer than the image capture interval.
- the same window boundaries can be used for both the reference image data and the sample image data as depicted in FIGS. 2-4 .
- the window boundaries can be determined for each set of image data.
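- one way to express this trade-off in software is sketched below; it is purely illustrative, and the loop structure, the `correlate` callback (for example, a displacement search such as the one sketched earlier), the threshold names, and the update interval of 100 frames are all assumptions rather than values from the patent.

```python
import numpy as np

def track(frames, correlate, tau: float, sigma: float, update_every: int = 100):
    """Illustrative loop: the navigation window is re-derived only every
    `update_every` frames; the same window then crops both the reference
    image data and the sample image data before they are correlated."""
    bounds = None
    reference = None
    for i, frame in enumerate(frames):
        if bounds is None or i % update_every == 0:
            cols = np.flatnonzero(frame.sum(axis=0) >= tau)    # columns meeting tau
            rows = np.flatnonzero(frame.sum(axis=1) >= sigma)  # rows meeting sigma
            bounds = (rows[0], rows[-1], cols[0], cols[-1])
        min_y, max_y, min_x, max_x = bounds
        window = frame[min_y:max_y + 1, min_x:max_x + 1]
        if reference is not None:
            yield correlate(reference, window)  # e.g. (delta_x, delta_y)
        reference = window
```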
- FIG. 13 depicts an example of an optical navigation system 90 relative to a surface 92 that is used for navigation.
- the optical navigation system includes a light source 94 , optics 96 , an image sensor 98 , and a processing unit 100 .
- the light source (e.g., a light emitting diode) illuminates the surface, and the optics focus light reflected from the surface onto the image sensor.
- the image sensor detects the received light and outputs image data to the processing unit. For example, the image sensor captures reference image data and at some later time, sample image data.
- the processing unit processes the image data as described above to determine the boundaries of the navigation window and to determine the relative displacement between image data sets.
- the processing unit includes pre-processing logic 102 and boundary logic 104 .
- the pre-processing logic performs the summing operations as described above and the boundary logic defines the boundaries of the navigation window as described above.
- the functional elements of the processing unit may be implemented in hardware, software, firmware, or any combination thereof. In an embodiment, the pre-processing logic is implemented in hardware while the boundary logic is implemented in firmware.
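- a software analogue of this partitioning is sketched below for illustration; the class names mirror the labels in FIG. 13, but the split described above is between hardware and firmware, and the thresholds tau and sigma are assumed parameters.

```python
import numpy as np

class PreProcessingLogic:
    """Stands in for the summing stage: reduces a frame to per-column and
    per-row intensity sums."""
    def sums(self, frame: np.ndarray):
        return frame.sum(axis=0), frame.sum(axis=1)

class BoundaryLogic:
    """Stands in for the boundary stage: turns per-slice sums into navigation
    window boundaries using the column (tau) and row (sigma) thresholds."""
    def __init__(self, tau: float, sigma: float):
        self.tau, self.sigma = tau, sigma

    def boundaries(self, col_sums: np.ndarray, row_sums: np.ndarray):
        cols = np.flatnonzero(col_sums >= self.tau)
        rows = np.flatnonzero(row_sums >= self.sigma)
        return int(cols[0]), int(cols[-1]), int(rows[0]), int(rows[-1])
```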
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Analysis (AREA)
Claims (19)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/870,192 US7565034B2 (en) | 2004-06-17 | 2004-06-17 | Determination of a navigation window in an optical navigation system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/870,192 US7565034B2 (en) | 2004-06-17 | 2004-06-17 | Determination of a navigation window in an optical navigation system |
Publications (2)
Publication Number | Publication Date |
---|---|
US20050281466A1 US20050281466A1 (en) | 2005-12-22 |
US7565034B2 true US7565034B2 (en) | 2009-07-21 |
Family
ID=35480631
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/870,192 Active 2026-06-26 US7565034B2 (en) | 2004-06-17 | 2004-06-17 | Determination of a navigation window in an optical navigation system |
Country Status (1)
Country | Link |
---|---|
US (1) | US7565034B2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100624366B1 (en) * | 2005-06-29 | 2006-09-15 | LG.Philips LCD Co., Ltd. | Display device and dynamic gamma application method |
TWI450154B (en) * | 2010-09-29 | 2014-08-21 | Pixart Imaging Inc | Optical touch system and object detection method therefor |
US10962592B2 (en) * | 2018-09-07 | 2021-03-30 | Globalfoundries Singapore Pte. Ltd. | Defect localization in embedded memory |
Patent Citations (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2645158A (en) * | 1950-12-08 | 1953-07-14 | Standard Mirror Company | Adjustable antiglare rearvision mirror |
US4162481A (en) * | 1976-12-22 | 1979-07-24 | Recognition Equipment Incorporated | Adaptive correlator for video processing |
US4326258A (en) * | 1980-01-31 | 1982-04-20 | Ncr Canada Ltd - Ncr Canada Ltee | Method and apparatus for reducing the gray scale resolution of a digitized image |
US4398256A (en) * | 1981-03-16 | 1983-08-09 | Hughes Aircraft Company | Image processing architecture |
US4441205A (en) * | 1981-05-18 | 1984-04-03 | Kulicke & Soffa Industries, Inc. | Pattern recognition system |
US4811410A (en) * | 1986-12-08 | 1989-03-07 | American Telephone And Telegraph Company | Linescan inspection system for circuit boards |
US4853968A (en) * | 1987-09-21 | 1989-08-01 | Kulicke & Soffa Industries, Inc. | Pattern recognition apparatus and method |
US4926452A (en) * | 1987-10-30 | 1990-05-15 | Four Pi Systems Corporation | Automated laminography system for inspection of electronics |
US5392359A (en) * | 1991-12-27 | 1995-02-21 | Japan Tobacco, Inc. | Apparatus for inspecting appearance of cylindrical objects |
US6385338B1 (en) * | 1992-09-11 | 2002-05-07 | Canon Kabushiki Kaisha | Image processing method and apparatus |
US5443152A (en) * | 1992-12-21 | 1995-08-22 | Johnson & Johnson Vision Products, Inc. | Apparatus for carrying ophthalmic lenses |
US5555312A (en) * | 1993-06-25 | 1996-09-10 | Fujitsu Limited | Automobile apparatus for road lane and vehicle ahead detection and ranging |
US5578813A (en) | 1995-03-02 | 1996-11-26 | Allen; Ross R. | Freehand image scanning device which compensates for non-linear movement |
US5644139A (en) | 1995-03-02 | 1997-07-01 | Allen; Ross R. | Navigation technique for detecting movement of navigation sensors relative to an object |
US5892855A (en) * | 1995-09-29 | 1999-04-06 | Aisin Seiki Kabushiki Kaisha | Apparatus for detecting an object located ahead of a vehicle using plural cameras with different fields of view |
US5831254A (en) * | 1995-12-18 | 1998-11-03 | Welch Allyn, Inc. | Exposure control apparatus for use with optical readers |
US20020179713A1 (en) * | 1995-12-18 | 2002-12-05 | Welch Allyn Data Collection, Inc. | Exposure control method for use with optical readers |
US5886704A (en) * | 1996-05-03 | 1999-03-23 | Mitsubishi Electric Information Technology Center America, Inc. | System and method for exploring light spaces |
US20020001418A1 (en) * | 1996-11-01 | 2002-01-03 | Christer Fahraeus | Recording method and apparatus |
US6717518B1 (en) * | 1998-01-15 | 2004-04-06 | Holding B.E.V.S.A. | Method and apparatus for detection of drowsiness |
US20020060522A1 (en) * | 1998-09-18 | 2002-05-23 | Stam Joseph S. | Continuously variable headlamp control |
US6906467B2 (en) * | 1998-09-18 | 2005-06-14 | Gentex Corporation | Continuously variable headlamp control |
US20040179099A1 (en) * | 1998-11-25 | 2004-09-16 | Donnelly Corporation, A Corporation | Vision system for a vehicle |
US7065261B1 (en) * | 1999-03-23 | 2006-06-20 | Minolta Co., Ltd. | Image processing device and image processing method for correction of image distortion |
US6473522B1 (en) * | 2000-03-14 | 2002-10-29 | Intel Corporation | Estimating text color and segmentation of images |
US7236623B2 (en) * | 2000-04-24 | 2007-06-26 | International Remote Imaging Systems, Inc. | Analyte recognition for urinalysis diagnostic system |
US6603111B2 (en) | 2001-04-30 | 2003-08-05 | Agilent Technologies, Inc. | Image filters and source of illumination for optical navigation upon arbitrary surfaces are selected according to analysis of correlation during navigation |
US20030016851A1 (en) * | 2001-07-17 | 2003-01-23 | Accuimage Diagnostics Corp. | Methods and software for self-gating a set of images |
US7142703B2 (en) * | 2001-07-17 | 2006-11-28 | Cedara Software (Usa) Limited | Methods and software for self-gating a set of images |
US20050260583A1 (en) * | 2001-07-19 | 2005-11-24 | Paul Jackway | Chromatin segmentation |
US20060093193A1 (en) * | 2004-10-29 | 2006-05-04 | Viswanathan Raju R | Image-based medical device localization |
US20070230784A1 (en) * | 2006-03-30 | 2007-10-04 | Nidec Sankyo Corporation | Character string recognition method and device |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060074532A1 (en) * | 2004-10-05 | 2006-04-06 | Samsung Electronics Co., Ltd. | Apparatus and method for navigation based on illumination intensity |
US7996126B2 (en) * | 2004-10-05 | 2011-08-09 | Samsung Electronics Co., Ltd. | Apparatus and method for navigation based on illumination intensity |
US20070109267A1 (en) * | 2005-11-14 | 2007-05-17 | Microsoft Corporation | Speckle-based two-dimensional motion tracking |
US20110147101A1 (en) * | 2009-12-18 | 2011-06-23 | Bateman Steven S | Compensating for multi-touch signal bias drift in touch panels |
US8698015B2 (en) * | 2009-12-18 | 2014-04-15 | Intel Corporation | Compensating for multi-touch signal bias drift in touch panels |
US9123131B2 (en) | 2013-09-24 | 2015-09-01 | Pixart Imaging Inc. | Parallel correlation method and correlation apparatus using the same |
Also Published As
Publication number | Publication date |
---|---|
US20050281466A1 (en) | 2005-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7442916B2 (en) | Lift detection adapted for navigation on a transparent structure | |
US8780045B2 (en) | Optical navigation with a dynamic SQUAL threshold | |
JP4392377B2 (en) | Optical device that measures the distance between the device and the surface | |
US7221356B2 (en) | Data input device and method for detecting an off-surface condition by a laser speckle size characteristic | |
TWI363982B (en) | Apparatus for controlling the position of a screen pointer, method of generating movement data and navigation sensor | |
US8264461B2 (en) | Apparatus for controlling the position of a screen pointer with low sensitivity to particle contamination | |
US11307308B2 (en) | Tracking device and electronic device with improved work surface adaptability | |
US9292130B2 (en) | Optical touch system and object detection method therefor | |
US20040095323A1 (en) | Method for calculating movement value of optical mouse and optical mouse using the same | |
US20080181526A1 (en) | System and method for reducing jitter during an optical navigation operation | |
US7317447B2 (en) | Pointing device with adaptive illumination level | |
US7565034B2 (en) | Determination of a navigation window in an optical navigation system | |
EP1868066B1 (en) | Optimization of statistical movement measurement for optical mouse, with particular application to laser-illuminated surfaces | |
US20120262423A1 (en) | Image processing method for optical touch system | |
US7876307B2 (en) | Motion detection mechanism for laser illuminated optical mouse sensor | |
US8189954B2 (en) | System and method for performing optical navigation using enhanced frames of image data | |
US7315013B2 (en) | Optical navigation using one-dimensional correlation | |
US20050162393A1 (en) | Method of calculating sub-pixel movement and position tracking sensor using the same | |
US8692804B2 (en) | Optical touch system and method | |
KR100332639B1 (en) | Moving object detection method using a line matching technique | |
US11480664B2 (en) | Optical detection device of detecting a distance relative to a target object | |
US9268414B2 (en) | Motion detection mechanism for laser illuminated optical mouse sensor | |
US7193203B1 (en) | Method and device for tracking movement between a surface and an imager | |
US20050231479A1 (en) | Illumination spot alignment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AGILENT TECHNOLOGIES, INC., COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIAH, TIONG HENG;KONG, HENG YEW;REEL/FRAME:014868/0017;SIGNING DATES FROM 20040608 TO 20040610 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP PTE. LTD.,SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:017206/0666 Effective date: 20051201 Owner name: AVAGO TECHNOLOGIES GENERAL IP PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:017206/0666 Effective date: 20051201 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.,S Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017675/0518 Effective date: 20060127 Owner name: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD., Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017675/0518 Effective date: 20060127 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: MERGER;ASSIGNOR:AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.;REEL/FRAME:030369/0528 Effective date: 20121030 |
|
AS | Assignment |
Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT, NEW YORK Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:032851/0001 Effective date: 20140506 Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AG Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:032851/0001 Effective date: 20140506 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001);ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:037689/0001 Effective date: 20160201 Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001);ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:037689/0001 Effective date: 20160201 |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:037808/0001 Effective date: 20160201 Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:037808/0001 Effective date: 20160201 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE NAME PREVIOUSLY RECORDED AT REEL: 017206 FRAME: 0666. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:038632/0662 Effective date: 20051201 |
|
AS | Assignment |
Owner name: PIXART IMAGING INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:039788/0572 Effective date: 20160805 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:039862/0129 Effective date: 20160826 Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:039862/0129 Effective date: 20160826 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041710/0001 Effective date: 20170119 Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041710/0001 Effective date: 20170119 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |