WO2017173017A1 - Counterfeit detection of traffic control equipment using images captured under multiple, different lighting conditions - Google Patents

Counterfeit detection of traffic control equipment using images captured under multiple, different lighting conditions

Info

Publication number
WO2017173017A1
WO2017173017A1 (application PCT/US2017/024894)
Authority
WO
WIPO (PCT)
Prior art keywords
image
traffic
traffic material
different
character
Prior art date
Application number
PCT/US2017/024894
Other languages
English (en)
Inventor
Stephanie R. SCHUMACHER
Anthony D. Jacques
David J. Mcconnell
Benjamin W. WATSON
Original Assignee
3M Innovative Properties Company
Priority date
Filing date
Publication date
Application filed by 3M Innovative Properties Company
Publication of WO2017173017A1

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/63Scene text, e.g. street names
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/95Pattern authentication; Markers therefor; Forgery detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/1429Identifying or ignoring parts by sensing at different wavelengths
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625License plates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition

Definitions

  • the present application relates generally to traffic materials, methods of using these, and systems in which the articles may be used.
  • AVR: Automatic Vehicle Recognition
  • ALPR: Automated License Plate Recognition
  • AVR or ALPR may refer to the detection and recognition of a vehicle by an electronic system.
  • Exemplary uses for AVR or ALPR include, for example, automatic tolling (e.g. , electronic toll systems), traffic law enforcement (e.g., red light running systems, speed enforcement systems), searching for vehicles associated with crimes, access control systems, and facility access control.
  • AVR systems in use today may include systems using RFID technology to read an RFID tag attached to a vehicle.
  • ALPR systems may use cameras to capture images of license plates.
  • Some AVR systems use RFID, although not all vehicles may include RFID tags.
  • tag readers may have difficulty pinpointing the exact location of an unpowered RFID tag. As such, these tag readers may only detect the presence or absence of a tag in their field of sensitivity, rather than information included in the RFID tags. Some RFID tag readers may only operate at short range, function poorly in the presence of metal, and/or may be blocked by interference when many tagged objects are present.
  • ALPR systems use an image capture device to read information of a vehicle, such as a license plate number or other visual content of the license plate. In some instances, the information is attached to, printed on, or adjacent to a license plate. ALPR systems may be used in many environments, since almost all areas of the world require that vehicles have license plates with visually identifiable information thereon. However, image capture and recognition of license plate information for a vehicle may be complex. For example, the read accuracy from an ALPR system may be dependent on the quality of the captured image as assessed by the reader.
  • Traffic materials or articles such as license plates and traffic signs typically comprise identifying information.
  • Examples of identifying information include characters (e.g., letters and numerals).
  • ALPR systems typically employ cameras to detect or read the identifying information.
  • the cameras are capable of irradiating the traffic material with radiation having different wavelengths and capturing an image of the material under such conditions.
  • an ALPR camera is capable of irradiating an article with a wavelength in the near infrared ("near IR") range.
  • a separate light source is used to irradiate the traffic material.
  • the light source is controlled by the camera. In other examples, the camera relies on ambient light to produce an image in the visible spectrum.
  • Some ALPR systems are capable of extracting information from said images.
  • the extracted information may be recorded or compared with information provided in a database for identification purposes.
  • traffic materials may be tampered with to avoid detection and/or identification.
  • traffic materials may be counterfeited.
  • When the traffic material is a license plate, identifying information disposed thereon may be altered to avoid detection by Law Enforcement Officers. Tampering typically occurs when a vehicle is to be used in criminal activity.
  • the same traffic material may have a different appearance under two distinct conditions due to degradation of the underlying substrate and dirt present on the material. Techniques of this disclosure may generate notifications when discrepant results are detected so that the traffic material can be properly replaced or maintained.
  • Techniques of this disclosure may improve identification and authentication of traffic materials and/or improve the identification accuracy of traffic materials.
  • techniques of the disclosure may improve detection of tampered or counterfeited traffic materials.
  • techniques of this disclosure may produce notifications when discrepant results are found.
  • the present application relates to the detection of discrepancies between generated images of traffic materials, such images being generated under at least two different conditions. In some examples, the present application relates to a method to detect counterfeited or tampered traffic materials.
  • the traffic material is reflective or retroreflective. In some examples, the traffic material is at least one of a license plate or traffic sign. In some examples, the reflective article is non-retroreflective (e.g., an aluminum substrate).
  • the apparatus includes a first source of radiation and a second source of radiation. In some examples, the first source of radiation produces radiation in the visible spectrum, and the second source of radiation produces radiation in the near infrared spectrum. In other examples, the first source of radiation and the second source of radiation emit radiation in the near infrared spectrum. BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A is a block diagram illustrating an example system for detecting whether an optically active article is counterfeit, in accordance with techniques of this disclosure.
  • FIG. IB illustrates a flow diagram including example operations of a computing device configured to detect counterfeit optical articles, in accordance with one or more techniques of this disclosure.
  • FIG. 2A is a photograph of a traffic material taken under visible light conditions.
  • FIG. 2B is a photograph of the traffic material shown in FIG. 2A taken under near infrared conditions.
  • FIG. 3A is a photograph of a traffic material taken under visible light conditions.
  • FIG. 3B is a photograph of the traffic material shown in FIG. 3A taken under near infrared conditions.
  • FIG. 4A is a photograph of a traffic material taken under visible light conditions.
  • FIG. 4B is a photograph of the traffic material shown in FIG. 4A taken under near infrared conditions.
  • FIG. 5 illustrates a flow diagram including example operations performed by one or more capture devices and computing devices, in accordance with one or more techniques of this disclosure.
  • FIG. 6 is a block diagram illustrating example operations for accommodating occlusions, in accordance with techniques of the present application.
  • FIG. 7 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 8 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 9 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 10 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 11 shows a detailed example of various devices that may be configured to execute program code to practice some examples in accordance with the current disclosure.
  • Techniques of this disclosure may determine that an individual has changed the appearance of a license plate or modified at least part of the identifying information when the license plate is viewed in a first condition, such as, for example, in the visible light spectrum. However, such changes are typically not effective when the license plate is viewed in a second condition, the second condition being different from the first condition, such as, for example, in near infrared. Conversely, individuals may choose to tamper with the traffic material such that alterations are not detectable under visible light but may be detected in the near infrared.
  • a material that is transparent and substantially invisible in the visible spectrum but opaque in infrared is used to alter identifying information in a traffic material.
  • a material that is opaque (thus visible) in the visible spectrum but transparent and substantially invisible in the infrared spectrum is used. Regardless of which material is selected, the result is that the identifying information may appear differently when read under two different conditions.
  • a license plate may appear to contain a first set of identifying information to the unaided human eye, while to a detector under a second condition (e.g., near infrared) the license plate may appear to contain a second set of identifying information, different from the first set.
  • the traffic material of the present application is a vehicle license plate.
  • License plates, or vehicle number plates, may be used by Law Enforcement Officers to identify a vehicle and/or the vehicle's owner. With this information, Officers may rely on existing systems and software to check, for example, for pending tickets or unpaid registration fees.
  • information garnered from a vehicle license plate may be used to identify and/or locate vehicles that have been used in criminal activities. Techniques of the disclosure may accurately and automatically detect counterfeited or tampered traffic materials, specifically, vehicle license plates.
  • the present application relates to a method for identifying discrepancies between identifying information detected on a first image and identifying information detected on a second image of the same traffic material, wherein the first image is obtained under a first condition and the second image is obtained under a second condition, different from the first condition.
  • the present application relates to an apparatus that produces an alert or notification when discrepancies are found between the two images.
  • the present application relates to a method that uses the discrepancies as means to authenticate the identifying information.
  • a traffic material such as a license plate has identifying information detectable under a plurality of conditions, such as, for example, visible light and near infrared conditions.
  • a first detector detecting in the visible light spectrum would be able to detect and identify the same identifying information as a second detector operating in, for example, near infrared.
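The cross-condition consistency check described above can be sketched as follows. The normalization step and the exact matching rule are illustrative assumptions, since the disclosure does not prescribe a specific comparison:

```python
def normalize(text: str) -> str:
    """Canonicalize an OCR reading before comparison (illustrative:
    drop whitespace and ignore letter case)."""
    return "".join(text.split()).upper()


def identifying_information_matches(visible_reading: str,
                                    near_ir_reading: str) -> bool:
    """Return True when the identifying information recovered under the
    first condition (e.g., visible light) agrees with the information
    recovered under the second condition (e.g., near infrared).
    A mismatch suggests tampering or counterfeiting."""
    return normalize(visible_reading) == normalize(near_ir_reading)
```

For the example in FIG. 1A, a tampered plate reading "RZA EML" in visible light but "RZA LML" in near infrared would fail this check.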
  • FIG. 1A is a block diagram illustrating an example system 100 for detecting whether an optically active article 108 is counterfeit, in accordance with techniques of this disclosure.
  • system 100 includes an image capture device 102.
  • Image capture device 102 may include one or more image capture sensors 106 and one or more light sources 104.
  • System 100 may also include one or more optically active articles as described in this disclosure, such as license plate 108. License plate 108 may be attached to or otherwise associated with vehicle 110.
  • image capture device 102 is communicatively coupled to computing device 116 via network 114 using one or more communication links.
  • image capture device 102 may be communicatively coupled to computing device 116 via one or more forms of direct communication without network 114, such as via a wired or wireless connection that does not require a network.
  • system 100 may include image capture device 102.
  • Image capture device 102 may convert light or electromagnetic radiation sensed by image capture sensors 106 into information, such as a digital image or bitmap comprising a set of pixels. Each pixel may have chrominance and/or luminance components that represent the intensity and/or color of light or electromagnetic radiation.
  • image capture device 102 captures a first image of the optically active article in a first spectral range within the near-infrared spectrum.
  • Image capture device 102 may capture a second image of the optically active article in a second spectral range within the visible spectrum.
  • Image capture device 102 may include one or more image capture sensors 106 and one or more light sources 104.
  • image capture device 102 may include image capture sensors 106 and light sources 104 in a single integrated device, such as shown in FIG. 1A.
  • image capture sensors 106 or light sources 104 may be separate from or otherwise not integrated in image capture device 102.
  • Examples of image capture sensors 106 may include semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies.
  • Digital sensors include flat panel detectors.
  • image capture device 102 includes at least two different sensors for detecting light in two different wavelength spectrums.
  • a first image capture sensor and a second image capture sensor substantially concurrently detect the first and second wavelengths.
  • Substantially concurrently may refer to detecting the first and second wavelengths within 10 milliseconds of one another, within 50 milliseconds of one another, or within 100 milliseconds of one another to name only a few examples.
  • one or more light sources 104 include a first source of radiation and a second source of radiation.
  • the first source of radiation emits radiation in the visible spectrum
  • the second source of radiation emits radiation in the near infrared spectrum.
  • the first source of radiation and the second source of radiation emit radiation in the near infrared spectrum.
  • one or more light sources 104 may emit radiation (e.g., infrared light 127) in the near infrared spectrum.
  • image capture device 102 includes a first lens and a second lens. In some examples, image capture device 102 captures frames at 50 frames per second (fps).
  • frame capture rates include 60, 30 and 25 fps. It should be apparent to a skilled artisan that frame capture rates are dependent on application and different rates may be used, such as, for example, 100 or 200 fps. Factors that affect the required frame rate are, for example, application (e.g., parking vs. tolling), vertical field of view (e.g., lower frame rates can be used for larger fields of view, but depth of focus can be a problem), and vehicle speed (faster traffic requires a higher frame rate).
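A rough lower bound on the required frame rate follows from the factors above. The formula below is a simplified sketch that ignores depth of focus and exposure constraints; it is not taken from the disclosure:

```python
def min_frame_rate(vehicle_speed_mps: float, field_of_view_m: float,
                   captures_needed: int = 1) -> float:
    """Lower bound on capture rate (frames per second) so that a plate
    travelling at vehicle_speed_mps is imaged captures_needed times
    while it crosses a vertical field of view of field_of_view_m
    metres. Faster traffic or a smaller field of view raises the
    required rate, matching the factors listed above."""
    time_in_view_s = field_of_view_m / vehicle_speed_mps
    return captures_needed / time_in_view_s
```

For example, a vehicle at 30 m/s crossing a 1.5 m vertical field of view is in view for 50 ms, so a single capture needs at least 20 fps and two captures need at least 40 fps.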
  • image capture device 102 includes at least two channels.
  • the channels may be optical channels.
  • the two optical channels may pass through one lens onto a single sensor.
  • image capture device 102 includes at least one sensor, one lens and one band pass filter per channel.
  • the band pass filter permits the transmission of multiple near infrared wavelengths to be received by the single sensor.
  • the at least two channels may be differentiated by one of the following: (a) width of band (e.g., narrowband or wideband, wherein narrowband illumination may be any wavelength from the visible into the near infrared); (b) different wavelengths (e.g., narrowband processing at different wavelengths can be used to enhance features of interest, such as, for example, a license plate and its lettering (license plate identifier), while suppressing other features (e.g., other objects, sunlight, headlights); (c) wavelength region (e.g., broadband light in the visible spectrum and used with either color or monochrome sensors); (d) sensor type or characteristics; (e) time exposure; and (f) optical components (e.g., lensing).
  • image capture device 102 may be stationary or otherwise mounted in a fixed position, while optically active article 108 may not be stationary.
  • Image capture device 102 may capture one or more images of optically active article 108 as vehicle 110 approaches or passes by image capture device 102.
  • image capture device 102 may not be stationary.
  • image capture device 102 may be in another vehicle or moving object.
  • Optically active article 108 may be stationary in some examples.
  • image capture device 102 may be held by a human operator or robotic device, which changes the position of image capture device 102 relative to optically active article 108.
  • image capture device 102 may be communicatively coupled to computing device 116 by one or more communication links 130A and 130B. Image capture device 102 may send images of optically active article 108 to computing device 116.
  • Communication links 130A and 130B may represent wired or wireless connections.
  • communication links 130A and 130B may be wireless Ethernet connections using a WiFi protocol and/or may be wired Ethernet connections using Category 5 or Category 6 cable. Any suitable communication links are possible.
  • image capture device 102 is communicatively coupled to computing device 116 via network 114.
  • Network 114 may represent any number of one or more network connected devices including but not limited to routers, switches, hubs, and interconnecting communication links that provide for forwarding of packet and/or frame-based data.
  • network 114 may represent the Internet, a service provider network, a customer network, or any other suitable network.
  • image capture device 102 is communicatively coupled to computing device 116 by a direct connection, such as Universal Serial Bus (USB) link or other high-speed bus.
  • image capture device 102 and computing device 116 may be integrated in a single device or housing.
  • the single device or housing may be mounted at a vehicle, attached to a building or other stationary structure, or may not be stationary such that a human operator may carry the single device or housing as a portable structure.
  • Computing device 116 represents any suitable computing system, which may be remote from or tightly integrated with image capture device 102, such as one or more desktop computers, laptop computers, mainframes, servers, cloud computing systems, etc. capable of sending and receiving information with image capture device 102.
  • computing device 116 implements techniques of this disclosure. For instance, techniques of this disclosure provide for detecting whether an optically active article is counterfeit based on a combination of security elements and an article message. Using techniques of this disclosure, computing device 116 may determine whether the license plate is counterfeit.
  • computing device 116 includes an optical character recognition component 118 (or "OCR module 118"), security component 120, service component 122 and user interface (UI) component 124.
  • Components 118, 120, 122, and 124 may perform operations described herein using software, hardware, firmware, or a mixture of hardware, software, and firmware residing in and executing on computing device 116 and/or at one or more other remote computing devices.
  • components 118, 120, 122, and 124 may be implemented as hardware, software, and/or a combination of hardware and software.
  • Computing device 116 may execute components 118, 120, 122, and 124 with one or more processors.
  • Computing device 116 may execute any of components 118, 120, 122, and 124 as or within a virtual machine executing on underlying hardware.
  • Components 118, 120, 122, and 124 may be implemented in various ways.
  • any of components 118, 120, 122, and 124 may be implemented as a downloadable or pre-installed application or "app.”
  • any of components 118, 120, 122, 124 may be implemented as part of an operating system of computing device 116.
  • components 118, 120, 122, and 124 may execute at or be implemented at computing device 1100, which may be an example of computing device 116.
  • optically active article 108 is illustrated as a license plate attached to vehicle 110.
  • Vehicle 110 may be an automobile, motorcycle, airplane, water vessel, military equipment, bicycle, train, or any other transportation vehicle.
  • optically active article 108 may be attached to, included or embedded in, or otherwise comprise: a document, clothing, wearable equipment, a building, stationary equipment, or any other object to name only a few examples.
  • optically active article 108 may not be a separate object attached to vehicle 110 but rather printed on vehicle 110 or another suitable object.
  • optically active article 108 may include reflective, non-reflective, and/or retroreflective sheet applied to a base surface.
  • an optically active article may be a retroreflective article.
  • An article message (e.g., "RZA LML" in FIG. 1A), such as but not limited to characters, images, and/or any other information, may be printed, formed, or otherwise embodied on the optically active article 108.
  • the reflective, non-reflective, and/or retroreflective sheet may be applied to a base surface using one or more techniques and/or materials including but not limited to: mechanical bonding, thermal bonding, chemical bonding, or any other suitable technique for attaching retroreflective sheet to a base surface.
  • a base surface may include any surface of an object (such as described above, e.g., an aluminum plate) to which the reflective, non-reflective, and/or retroreflective sheet may be attached.
  • An article message may be printed, formed, or otherwise embodied on the sheeting using any one or more of an ink, a dye, a thermal transfer ribbon, a colorant, a pigment, and/or an adhesive coated film.
  • content is formed from or includes a multi-layer optical film, a material including an optically active pigment or dye, or an optically active pigment or dye.
  • Optically active article 108 in FIG. 1A includes article message 126A-126F ("article message 126").
  • article message 126 is a symbol from a symbol set.
  • the symbol set may be an alphabet, number set, and/or any other set of glyphs.
  • the symbol set includes at least the letters of the Roman alphabet and Arabic numerals.
  • article message 126 in FIG. 1A is described for illustration purposes as being formed by different areas that either retroreflect or do not retroreflect light.
  • article message 126 in FIG. 1A may be printed, formed, or otherwise embodied in an optically active article using any light reflecting technique in which validation information may be determined from article message 126.
  • article message 126 may be printed using visibly-opaque, infrared-transparent ink and/or visibly-opaque, infrared-opaque ink. Any suitable construction, in which article message 126 or portions thereof are distinguishable under one or more lighting conditions, may be used in accordance with techniques and articles of this disclosure.
  • characters of article message 126 may be printed using a flexographic printing process.
  • optically active article 108 may include a base layer (e.g., an aluminum sheet), an adhesive layer disposed on the base layer, a structured surface disposed on the adhesive layer, and an overlay layer disposed on the structured surface such as described in U.S. Publications US2013/0034682, US2013/0114142, US2014/0368902, and US2015/0043074, which are hereby expressly incorporated by reference in their entireties.
  • the structured surface may be formed from optical elements, such as full cubes (e.g., hexagonal cubes or preferred geometry (PG) cubes), or truncated cubes, or beads as described in, for example, U.S. Patent No. 7,422,334, which is hereby expressly incorporated by reference in its entirety.
  • a reflective layer is disposed adjacent to the structured surface of the optically active article, in addition to or in lieu of the seal film.
  • Suitable reflective layers include, for example, a metallic coating that can be applied by known techniques such as vapor depositing or chemically depositing a metal such as aluminum, silver, or nickel.
  • a primer layer may be applied to the backside of the cube-corner elements to promote the adherence of the metallic coating.
  • characters of article message 126 are printed with ink that is visible light opaque and infrared light opaque.
  • article message 126 appears as the color black in images captured under visible light and infrared light.
  • a person seeking to create a different, counterfeit article message may alter character 126D using a visible light opaque but infrared light transparent ink to create ink portions 127A, 127B, such that the "L" character appears as the character "E" in an image captured with light from the visible spectrum.
  • FIG. 1A illustrates an image 109 of optically active article 108 that is captured with light in the visible light spectrum.
  • the light may be visible light 127.
  • a first spectral range is from about 350 nm to about 700 nm (i.e., visible light spectrum) and a second spectral range is from about 700 nm to about 1100 nm (i.e., near infrared spectrum).
  • a first spectral range is from about 700 nm to about 850 nm
  • a second spectral range is between 860 nm and 1100 nm.
  • the first lighting condition includes a first range of wavelengths and the second lighting condition includes a second range of wavelengths that is substantially different from the first range of wavelengths.
  • first and second ranges of wavelengths may be substantially different if less than 1% of the wavelengths are the same in each range of wavelengths.
  • first and second ranges of wavelengths may be substantially different if fewer than between 1% and 10% of the wavelengths are the same in each range of wavelengths.
  • first and second ranges of wavelengths may be substantially different if the number of wavelengths that are the same in each range is less than a threshold amount.
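The "substantially different" test can be made concrete as an overlap computation. Treating wavelength ranges as simple intervals and applying the 1% figure as a default threshold is an interpretation for illustration, not a definition taken from the disclosure:

```python
def overlap_fraction(range_a: tuple[float, float],
                     range_b: tuple[float, float]) -> float:
    """Fraction of the narrower wavelength range (in nm) that is
    shared with the other range."""
    lo = max(range_a[0], range_b[0])
    hi = min(range_a[1], range_b[1])
    shared = max(0.0, hi - lo)
    narrower = min(range_a[1] - range_a[0], range_b[1] - range_b[0])
    return shared / narrower


def substantially_different(range_a: tuple[float, float],
                            range_b: tuple[float, float],
                            threshold: float = 0.01) -> bool:
    """True when less than `threshold` (default 1%) of the narrower
    range is shared between the two ranges."""
    return overlap_fraction(range_a, range_b) < threshold
```

The spectral ranges given earlier, e.g., about 700 nm to 850 nm and about 860 nm to 1100 nm, do not overlap at all and so qualify under any threshold.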
  • article message 126 includes ink compositions and is provided on optically active article 108 using flexographic printing.
  • ink compositions include at least one of coating compositions, films (e.g., polymeric), or additives that reduce or eliminate adhesion of the underlying adhesive layer.
  • other techniques may also be used, such as needle die coating, gravure printing, ink jet printing, screen printing, thermal mass transfer printing, laser printing, or any other suitable printing technique.
  • a second image (the first image being image 109) may be captured under IR lighting.
  • ink portions 127A, 127B may not appear black but rather as the color and/or brightness of an area of retroreflective sheeting without retroreflective elements, such as barrier material or other retroreflective structures described in this disclosure.
  • a counterfeit optically active article may include a first article message captured in a first image taken under a first lighting condition (e.g., "RZA EML" in visible light), but may include a second, different article message captured in a second image taken under a second lighting condition (e.g., "RZA LML" in infrared light).
  • a second portion of identifying information may include non-counterfeit information and represent a first character in a character set.
  • a first portion of identifying information may include counterfeit information and may be arranged with respect to the second portion of identifying information such that the first and second portions collectively represent a second character in the character set that is different than the first character in the character set.
  • computing device 116 may receive, from image capture device 102, a first image of a traffic material, e.g., optically active article 108, under a first lighting condition, such as visible light. Computing device 116 may also receive a second image of the traffic material, e.g., optically active article 108, under a second lighting condition, such as infrared light.
  • computing device 116 may receive the second image from image capture device 102, while in other examples, computing device 116 may receive the second image from a separate, different image capture device 102.
  • the first image and the second image are captured at a substantially same time.
  • substantially the same time may include capturing the first and second images within a time range of 50 milliseconds to 2 seconds of one another.
  • substantially the same time may include capturing the first and second images within a time range of 50 milliseconds to 200 milliseconds of one another.
  • substantially the same time may include capturing the first and second images within a threshold time range of one another, wherein the threshold time range is hard-coded, user-defined, or machine-generated.
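The "substantially the same time" check described in the bullets above can be sketched as follows. This is an illustrative sketch only; the function name and the 200-millisecond default threshold are assumptions, not details from the source.

```python
# Sketch of the "substantially same time" check: two capture timestamps
# are treated as simultaneous when they fall within a configurable
# threshold of one another. The threshold may be hard-coded,
# user-defined, or machine-generated, as the text notes.

def captured_at_substantially_same_time(t1_ms: float, t2_ms: float,
                                        threshold_ms: float = 200.0) -> bool:
    """Return True when two capture timestamps (in milliseconds)
    fall within threshold_ms of one another."""
    return abs(t1_ms - t2_ms) <= threshold_ms
```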
  • Security component 120 may receive each of the first and second images. Security component 120 may determine, based at least in part on the first portion of counterfeit information, that the first image of the traffic material and the second image of the traffic material are different.
  • the first portion of counterfeit information may be portions 127A, 127B that were added by a person after optically active article 109 was fabricated with an article message that represents authentic or non-counterfeit identifying information.
  • Security component 120 may determine that the first image and second image are different by comparing a first set of pixels that correspond to locations of a first portion of the identifying information to a second set of pixels that correspond to locations of a second portion of the identifying information. For instance, security component 120 may compare a region that includes character 126D in the first and second images. Under the first lighting condition, pixel values in the region may represent a character "E", while under a second lighting condition, pixel values for the same region may represent a character "L". Security component 120 may determine, based at least in part on the comparison, that the first set of pixels is different from the second set of pixels by at least a threshold amount. In some examples, the threshold amount may be user defined, hard-coded, or machine generated.
  • Security component 120 may determine, based at least in part on the comparison, that the first image of the traffic material and the second image of the traffic material are different. For instance, if the first set of pixels and the second set of pixels differ by an amount greater than or equal to the threshold amount, security component 120 may determine that the first and second images are different, in a way that indicates the optically active article is counterfeit.
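The pixel-region comparison described above can be sketched as a simple mismatch count against a threshold. This is a hypothetical illustration of the idea, not the patented implementation; the function name, the list-of-lists image representation, and the fractional threshold are all assumptions.

```python
# Sketch of the comparison performed by security component 120:
# count mismatched pixels between the same region of two images and
# report a difference when the mismatch fraction meets the threshold.

def regions_differ(region_a, region_b, threshold: float) -> bool:
    """region_a and region_b are equally sized 2-D lists of pixel
    values; returns True when the fraction of differing pixels
    is greater than or equal to the threshold."""
    total = 0
    differing = 0
    for row_a, row_b in zip(region_a, region_b):
        for px_a, px_b in zip(row_a, row_b):
            total += 1
            if px_a != px_b:
                differing += 1
    return total > 0 and differing / total >= threshold
```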
  • security component 120 may determine a confidence value that indicates at least one of a degree of difference or a degree of similarity between a first region of the first image and a second region of the second image.
  • the degree of difference or the degree of similarity may be within a range of degrees.
  • the confidence value may be a percentage or a probability that indicates a likelihood of a degree of difference or similarity between a first region of the first image and the second region of the second image.
  • security component 120 may determine that the first image of the traffic material and the second image of the traffic material are different.
  • security component 120 may cause service component 122 to perform at least one operation. That is, security component 120 may send the data that indicates whether optically active article 108 is authentic or counterfeit.
  • Service component 122 may provide any number of services, by performing one or more operations. For instance, service component 122, upon receiving data that indicates an optically active article is counterfeit, may generate one or more alerts, notifications, reports, or other communications that are sent to one or more other computing devices. Such alerts may include but are not limited to: emails, text messages, lists, phone calls, or any other suitable communications.
  • Service component 122 may store data representing any of the aforementioned data.
  • user interface (UI) component 124 may act as an intermediary between various components and modules of computing device 116 to process and send input detected by input devices to other components and modules, and generate output from other components and modules that may be presented at one or more output devices. For instance, UI component 124 may generate one or more user interfaces for display, which may include data and/or graphical representations of alerts, reports, or other communications as described above.
  • computing device 116 may include optical character recognition (OCR) component 118.
  • OCR component 118 may identify one or more characters of a character set from an image. For instance, OCR component 118 may identify a set of characters, "RSA EML", from an image of optically active article 108.
  • OCR component 118 may perform optical character recognition (OCR) on a first image taken under a first lighting condition and a second image taken under a second lighting condition to identify at least one character in the first image and at least one character in the second image.
  • OCR component 118 may send the first and second characters to security component 120, which may determine, based on the OCR, that the at least one character identified in the first image is different than the at least one character identified in the second image. Security component 120 may then determine that the first and second images are different, and/or that the optically active article is counterfeit.
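The character-level comparison that security component 120 performs on the two OCR reads can be sketched as below. This is an illustrative assumption about the comparison logic (position-by-position disagreement), not the actual patented method.

```python
# Sketch of comparing OCR reads taken under two lighting conditions,
# as in the "RZA EML" (visible) vs "RZA LML" example above: any
# disagreeing character position flags the images as different.

def ocr_reads_differ(read_first: str, read_second: str) -> bool:
    """Return True when the two OCR reads disagree in length or in
    any character position, indicating a possible counterfeit."""
    if len(read_first) != len(read_second):
        return True
    return any(a != b for a, b in zip(read_first, read_second))
```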
  • individuals may choose to alter identifying information detectable under a first condition, such as, for example, near infrared.
  • a material that is substantially invisible in the visible spectrum but opaque in the near infrared is used, as described in U.S. Patent No. 8,865,293, the disclosure of which is incorporated herein by reference in its entirety.
  • Example materials may include Multilayer Optical Films and adhesive tape "SCOTCH MAGIC TAPE", both commercially available from 3M Company, St. Paul, MN.
  • FIG. 1B illustrates a flow diagram 150 including example operations of a computing device configured to detect counterfeit optical articles, in accordance with one or more techniques of this disclosure. For purposes of illustration only, the example operations are described below as being performed by computing device 116 in FIG. 1. Some techniques of FIG. 1B may be performed by one or more image capture devices, computing devices or other hardware.
  • an image capture device or other light sensor captures images of an object of interest (e.g., a license plate), such as an optically active article.
  • at least two images captured under two different lighting conditions are processed through a plate-find process (154), in which computing device 116 determines a location within the image of the optically active article (156).
  • computing device 116 may perform OCR on the selected region or location.
  • Computing device 116 may perform OCR on the location or region of an image where the plate-find step indicated a license plate may be (158). If a result is not obtained by the computing device (e.g., a license plate is not found on the image), the full image may then be processed by the OCR engine to identify an article message. In some examples, the OCR and feature identification step is performed separately for each channel.
  • computing device 116 obtains a final result containing at least one image (which may be represented as a bitmap) and bundles of data (e.g., including date, time, images, barcode read data, OCR read data, and other metadata) (160).
  • the present apparatus and systems use a process step referred to as fusion.
  • Computing device 116 may collect consecutive read results from each channel (or sensor), and process these read results to determine consensus on an intra-channel (one channel) or inter-channel basis.
  • a fusion buffer of computing device 116 may accumulate incoming read results (and associated metadata thereof) until it determines that the vehicle transit is complete.
  • computing device 116 may detect conflicts and adjust read confidences accordingly. For example, a license plate having the character '0' (zero) and an infrared-opaque bolt positioned in the middle of the zero, could be misread as an '8' under infrared conditions by the second (infrared) channel. However, the first (color) channel would be able to distinguish the bolt from the character zero and read it correctly. In these circumstances, the system may not be able to decide by itself which read is correct, but it will flag it as a discrepant event for further review.
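The fusion step above can be sketched as accumulating consecutive reads per channel, taking a per-channel consensus, and flagging an inter-channel conflict for review. This is a minimal sketch under assumed data shapes (dict of channel name to list of read strings); the function and field names are illustrative, not from the source.

```python
# Minimal sketch of intra-/inter-channel fusion: majority vote inside
# each channel, then a discrepancy flag when channels disagree.

from collections import Counter

def fuse_reads(channel_reads: dict) -> dict:
    """channel_reads maps a channel name to consecutive read strings;
    returns per-channel consensus plus a flag marking a discrepant
    event when the channel consensuses disagree."""
    consensus = {ch: Counter(reads).most_common(1)[0][0]
                 for ch, reads in channel_reads.items() if reads}
    return {"consensus": consensus,
            "discrepant": len(set(consensus.values())) > 1}

# e.g. a '0' with an infrared-opaque bolt may be read as '8' by the
# infrared channel while the color channel reads it correctly:
result = fuse_reads({"color": ["A0C", "A0C"], "infrared": ["A8C", "A8C"]})
```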
  • identifying information disposed on a traffic material may be purposefully different when viewed under different conditions. This discrepancy is then used to authenticate the traffic material.
  • a license plate comprises one set of identifying information in the visible spectrum and a second set of identifying information in the near infrared spectrum. The system is then configured to recognize this discrepancy as authentication of the first set of identifying information.
  • at least one of the images is a color image, as illuminated by broad-spectrum radiation.
  • security component 120 may, upon determining that two images are different due to counterfeit information detectable in at least one lighting condition, cause service component 122 to perform at least one operation (162).
  • a final result may be generated by computing device 116 such as a report, notification or other indication of counterfeit information.
  • FIG. 2A is a photograph taken under visible light conditions of a retroreflective license plate 200 having identifying information 210.
  • the identifying information 210 is made up of printed alphanumeric characters "WG09 LML".
  • License plate 200 also includes pieces of 3M's SCOTCH MAGIC TAPE positioned next to the characters "L" of identifying information 210. These pieces of tape are substantially invisible in the visible spectrum and therefore are not seen in FIG. 2A.
  • FIG. 2B is a photograph of the license plate 200 taken under near infrared conditions and showing identifying information 220 comprising alphanumeric characters "WG09 EME". In near infrared conditions, the printed character "L" in identifying information 210 is made to look like a character "E". This discrepancy in the identifying information detected under visible light conditions and near infrared indicates that tampering of the vehicle license plate has occurred.
  • FIG. 3 A is a photograph of a license plate 300 taken under visible light conditions and including identifying information 310 comprising alphanumeric characters "SOI 2886".
  • a piece of SCOTCH MAGIC TAPE is positioned over the character "0" of identifying information 310. As previously explained, this tape is substantially invisible in the visible spectrum and therefore is not seen in FIG. 3A.
  • FIG. 3B is a photograph of the license plate 300 taken under near infrared conditions and having identifying information 320. As it may be seen in FIG. 3B, the character "0" of identifying information 310 appears obscured and therefore illegible under near infrared conditions.
  • FIG. 4A is a photograph of a vehicle license plate 400 taken under visible light conditions and including identifying information 410 comprising characters "Y249RTB". Identifying information 410 was prepared by printing characters
  • FIG. 4B is a photograph of the license plate 400 taken under near-infrared conditions and including identifying information 420 comprising characters "⁇249⁇".
  • the traffic material is one of reflective (non-retroreflective) or retroreflective.
  • the retroreflective article is a retroreflective sheeting.
  • the retroreflective sheeting can be either microsphere-based sheeting (often referred to as beaded sheeting) or cube corner sheeting (often referred to as prismatic sheeting).
  • Illustrative examples of microsphere-based sheeting are described in, for example, U.S. Patent Nos. 3,190,178 (McKenzie), 4,025,159 (McGrath), and 5,066,098 (Kult).
  • Illustrative examples of cube corner sheeting are described in, for example, U.S. Patent Nos.
  • a seal layer may be applied to the structured cube corner sheeting surface to keep contaminants away from individual cube corners.
  • Flexible cube corner sheetings such as those described, for example, in U.S. Patent No. 5,450,235 (Smith et al.) can also be incorporated in examples or implementations of the present disclosure.
  • Retroreflective sheeting for use in connection with the present disclosure can be, for example, either matte or glossy.
  • the traffic material or retroreflective sheeting can be used for, for example, as signage.
  • the term "signage” as used herein refers to an article that conveys information, usually by means of alphanumeric characters, symbols, graphics, or other indicia.
  • Specific signage examples include, but are not limited to, signage used for traffic control purposes, street signs, validation stickers or decals, identification materials (e.g. , licenses), and vehicle license plates.
  • Exemplary methods and systems for reading a traffic material or for reading identifying information on a traffic material include an apparatus and at least one source of radiation.
  • the present apparatus captures at least two images of the traffic material under two different conditions.
  • the different conditions include different wavelengths.
  • the apparatus of the present application is capable of capturing at least a first image of the traffic material at a first wavelength, and a second image of the traffic material at a second wavelength, the second wavelength being different from the first wavelength.
  • the first and second images are taken within a time interval of less than 40 milliseconds (ms). In other examples, the time interval is less than 20 ms, less than 5 ms, or less than 1 ms.
  • the apparatus of the present application is a camera.
  • the camera includes one sensor and a wavelength-selective filter (band pass filter) over at least a portion of the sensor.
  • the camera includes two sensors detecting at two different wavelengths.
  • the camera includes a first source of radiation and relies on a different source as a second source of radiation.
  • the first source of radiation produces radiation in the near infrared spectrum.
  • the second source of radiation is ambient light. In other examples, the second source of radiation is a light source separate from the camera.
  • the camera relies on ambient light to produce an image in the visible spectrum and a separate source to produce radiation in a different wavelength (e.g., infrared).
  • the separate source is controlled by the camera.
  • the camera includes a first source of radiation and a second source of radiation.
  • the first source of radiation produces radiation in the visible spectrum
  • the second source of radiation produces radiation in the near infrared spectrum.
  • the first source of radiation and the second source of radiation emit radiation in the near infrared spectrum.
  • at least one of the sources of radiation emit radiation in one of the ultraviolet or far infrared spectrum.
  • the camera includes one lens. In other examples, the camera further includes an image splitter. In yet another example, the camera includes at least a first lens and a second lens.
  • the present camera captures frames at 50 frames per second (fps).
  • Other exemplary frame capture rates include 60, 30 and 25 fps. It should be apparent to a skilled artisan that frame capture rates are dependent on application and different rates may be used, such as, for example, 100 or 200 fps. Factors that affect required frame rate are, for example, application (e.g., parking vs. tolling), vertical field of view (e.g., lower frame rates can be used for larger fields of view, but depth of focus can be a problem), and vehicle speed (faster traffic requires a higher frame rate).
  • the present camera includes at least two channels.
  • the channels are optical channels.
  • the two optical channels pass through one lens onto a single sensor.
  • the present camera includes at least one sensor, one lens and one band pass filter per channel.
  • the band pass filter permits the transmission of multiple near infrared wavelengths to be received by the single sensor.
  • the at least two channels may be differentiated by one of the following: (a) width of band (e.g., narrowband or wideband, wherein narrowband illumination may be any wavelength from the visible into the near infrared); (b) different wavelengths (e.g., narrowband processing at different wavelengths can be used to enhance features of interest, such as, for example, a license plate and its lettering (license plate identifier), while suppressing other features (e.g., other objects, sunlight, headlights); (c) wavelength region (e.g., broadband light in the visible spectrum and used with either color or monochrome sensors); (d) sensor type or characteristics; (e) time exposure; and (f) optical components (e.g., lensing).
  • the channels may follow separate logical paths through the system.
  • the camera further comprises a third channel detecting at a third wavelength. It should be apparent to a skilled artisan that even though the examples described above include two channels, the same inventive concepts and benefits may be applied to three or more channels. These examples are also included within the scope of the present disclosure.
  • FIG. 5 illustrates a flow diagram 500 including example operations performed by one or more capture devices and computing devices, in accordance with one or more techniques of this disclosure. For purposes of illustration only, the example operations are described below as being performed by computing device 116 in FIG. 1. Some techniques of FIG. 5 may be performed by one or more image capture devices, computing devices or other hardware.
  • computing device 116 captures at least two images of a license plate (502) and performs OCR on identifying information of the license plate under a first condition for a first OCR result (504) and under a second condition for a second OCR result (506), as generally shown and described in FIG. 1.
  • computing device 116 compares the OCR results (or reads) obtained under the first and second conditions (508). If the results are identical or satisfy a degree of similarity that satisfies a threshold (e.g., greater than or equal to the threshold) (510), no notification is displayed/produced by the computing device. If the results differ from each other by a degree of dissimilarity that satisfies a threshold (e.g., less than a threshold), a notification is produced/displayed or otherwise stored by computing device 116 (512).
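The FIG. 5 comparison step (508-512) can be sketched as below. The use of `difflib.SequenceMatcher` as the similarity measure is an assumption for illustration; the source does not specify the matcher, and the threshold is user-specified per the text.

```python
# Sketch of the FIG. 5 flow: score the similarity of the two OCR
# reads and signal a notification only when similarity falls below
# the configured threshold.

from difflib import SequenceMatcher

def needs_notification(first_read: str, second_read: str,
                       similarity_threshold: float = 1.0) -> bool:
    """Return True when the two reads are dissimilar enough that a
    notification should be produced, displayed, or stored."""
    similarity = SequenceMatcher(None, first_read, second_read).ratio()
    return similarity < similarity_threshold
```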
  • the threshold is pre-defined, hard-coded, user-specified, or machine generated. Examples of notifications may include email, text, phone call, user interface graphical element, sound, haptic feedback, or any other suitable notification.
  • FIG. 6 is a block diagram illustrating example operations for accommodating occlusions, in accordance with techniques of the present application.
  • FIG. 6 illustrates example operations 600 to interpret an OCR result obtained under a first condition that differs from the OCR result obtained under a second condition due to, in part, improper lighting, dirt present on the article to be detected, unfavorable weather conditions (e.g., rain), and poor quality of the underlying license plate substrate (e.g., retroreflective sheeting or aluminum blank).
  • the example operations are described below as being performed by computing device 116 in FIG. 1.
  • Some techniques of FIG. 6 may be performed by one or more image capture devices, computing devices or other hardware.
  • computing device 116 captures at least two images of a license plate (602) and performs OCR on identifying information of the license plate under a first condition for a first OCR result (604) and under a second condition for a second OCR result (606), as generally shown and described in FIG. 1.
  • computing device 116 compares the OCR results (or reads) obtained under the first and second conditions (608). If the results are identical or satisfy a degree of similarity that satisfies a threshold (e.g., greater than or equal to the threshold) (610), no notification is displayed/produced by the computing device. If the results differ from each other by a degree of dissimilarity that satisfies a threshold (e.g., less than a threshold), a notification is produced/displayed or otherwise stored by computing device 116 (612).
  • the threshold is pre-defined, hard-coded, user-specified, or machine generated. Examples of notifications may include email, text, phone call, user interface graphical element, sound, haptic feedback, or any other suitable notification.
  • computing device 116 may determine or assign a confidence level based on the first OCR result (e.g., article message under first lighting condition) and the second OCR result (e.g., article message under second lighting condition). The present method then compares the first and second results to calculate a difference, which may represent the confidence level (614). In one example, computing device 116 may generate a notification or perform another operation if the confidence level does not satisfy (e.g., is less than or equal to) a threshold (616). In one example, computing device 116 may not generate a notification or perform another operation if the confidence level satisfies (e.g., is greater than or equal to) a threshold (618).
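The FIG. 6 confidence-level step (614-618) can be sketched as below, under the assumption that the confidence level is the fraction of character positions on which the two reads agree. The function names, the agreement measure, and the 0.9 default threshold are illustrative assumptions.

```python
# Sketch of the FIG. 6 flow: derive a confidence level from the
# agreement between the two OCR reads, then notify only when the
# confidence level does not satisfy the threshold.

def read_confidence(first_read: str, second_read: str) -> float:
    """Fraction of character positions on which the reads agree."""
    length = max(len(first_read), len(second_read))
    if length == 0:
        return 1.0
    matches = sum(a == b for a, b in zip(first_read, second_read))
    return matches / length

def should_notify(first_read: str, second_read: str,
                  threshold: float = 0.9) -> bool:
    # Notify when the confidence level does not satisfy the threshold.
    return read_confidence(first_read, second_read) < threshold
```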
  • FIG. 7 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 7 includes similar operations to those in FIG. 6, but further describes operations for a confidence level being used to output the most likely correct read result. For purposes of illustration only, the example operations are described below as being performed by computing device 116 in FIG. 1. Some techniques of FIG. 7 may be performed by one or more image capture devices, computing devices or other hardware.
  • computing device 116 captures at least two images of a license plate (702) and performs OCR on identifying information of the license plate under a first condition for a first OCR result (704) and under a second condition for a second OCR result (706), as generally shown and described in FIG. 1.
  • computing device 116 compares the OCR results (or reads) obtained under the first and second conditions (708). If the results are identical or satisfy a degree of similarity that satisfies a threshold (e.g., greater than or equal to the threshold) (710), the article message is output for display without a counterfeit or tamper warning. If the results differ from each other by a degree of dissimilarity that satisfies a threshold (e.g., less than a threshold), computing device 116 may determine the article messages are different between the images (712).
  • the threshold is pre-defined, hard-coded, user-specified, or machine generated. Examples of notifications may include email, text, phone call, user interface graphical element, sound, haptic feedback, or any other suitable notification.
  • Computing device 116 may perform a confidence comparison (714), wherein confidence comparison relies on a character score assigned to each of the first and second OCR results. For instance, computing device 116 may generate a character score (e.g., a character string) for the OCR result of the first image that indicates a level of confidence that the characters recognized in the image represent the actual characters embodied on the article. Computing device 116 may generate a character score for the OCR result (e.g., a character string) of the second image that indicates a level of confidence that the characters recognized in the image represent the actual characters embodied on the article. The confidence comparison may determine whether the confidence score for the first image or the second image is greater.
  • if the character score for the first image is greater than the character score for the second image, computing device 116 may output the OCR result for the first image and a notification of a counterfeit or tamper warning. If the second image is captured under IR light and the character score for the second image is greater than or equal to the character score for the first image (716), then computing device 116 may output the OCR result for the second image and a notification of a counterfeit or tamper warning.
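The FIG. 7 selection step (714-716) can be sketched as choosing the read with the higher character score, paired with a tamper warning since this path is only reached after the reads were found to differ. The function name and the numeric score representation are assumptions for illustration.

```python
# Sketch of the FIG. 7 confidence comparison: output the OCR read
# whose character score is higher, together with a warning flag.

def select_best_read(first_read: str, first_score: float,
                     second_read: str, second_score: float):
    """Returns (read, warning); warning is always True because this
    path is only reached when the two reads disagree."""
    best = second_read if second_score >= first_score else first_read
    return best, True
```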
  • FIG. 8 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 8 includes similar operations to those in FIG. 6, but further describes operations for a difference comparison being used to output the most likely correct read result.
  • the example operations are described below as being performed by computing device 116 in FIG. 1.
  • Some techniques of FIG. 8 may be performed by one or more image capture devices, computing devices or other hardware.
  • computing device 116 captures at least two images of a license plate (802) and performs OCR on identifying information of the license plate under a first condition for a first OCR result (804) and under a second condition for a second OCR result (806), as generally shown and described in FIG. 1.
  • computing device 116 compares the OCR results (or reads) obtained under the first and second conditions (808). If the results are identical or satisfy a degree of similarity that satisfies a threshold (e.g., greater than or equal to the threshold) (810), the article message is output for display without a counterfeit or tamper warning. If the results differ from each other by a degree of dissimilarity that satisfies a threshold (e.g., less than a threshold), computing device 116 may determine the article messages are different between the images (812).
  • the threshold is pre-defined, hard-coded, user-specified, or machine generated. Examples of notifications may include email, text, phone call, user interface graphical element, sound, haptic feedback, or any other suitable notification.
  • Computing device 116 may perform a confidence comparison (814), wherein confidence comparison determines a difference or degree of difference for first and second OCR results, wherein the difference or degree of difference is compared to a threshold. For instance, computing device 116 may compare a first character (e.g., first OCR result) "L" from a first image to a second character (e.g., second OCR result) "E" from a second image to determine the existence of a difference or a degree of difference. If the difference or degree of difference satisfies a threshold (e.g., greater than or equal to), then computing device 116 may perform an operation such as outputting a notification or storing a value. If the difference or degree of difference does not satisfy a threshold (e.g., less than or equal to), then computing device 116 may refrain from outputting a notification or storing a value.
  • FIG. 9 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 9 includes similar operations to those in FIG. 6, but further describes operations for a difference comparison being used to output the most likely correct read result.
  • the example operations are described below as being performed by computing device 116 in FIG. 1.
  • Some techniques of FIG. 9 may be performed by one or more image capture devices, computing devices or other hardware.
  • computing device 116 captures at least two images of a license plate (902) and performs OCR on identifying information of the license plate under a first condition for a first OCR result (904) and under a second condition for a second OCR result (906), as generally shown and described in FIG. 1.
  • computing device 116 compares the OCR results (or reads) obtained under the first and second conditions (908). If the results are identical or satisfy a degree of similarity that satisfies a threshold (e.g., greater than or equal to the threshold) (910), the article message is output for display without a counterfeit or tamper warning. If the results differ from each other by a degree of dissimilarity that satisfies a threshold (e.g., less than a threshold), computing device 116 may determine the article messages are different between the images (912).
  • the threshold is pre-defined, hard-coded, user-specified, or machine generated. Examples of notifications may include email, text, phone call, user interface graphical element, sound, haptic feedback, or any other suitable notification.
  • computing device 116 may determine a confidence value of the OCR result in the visible spectrum is 96%, and determine a confidence value of the OCR result in infrared as 90% (914). The confidence value may indicate the degree of accuracy of the read in relation to the actual information that is read on the article. Computing device 116 may compare the confidence of the two reads with a threshold set to 5%. The difference between the confidence values of the two reads is calculated to be 6%, therefore satisfying the threshold (e.g., greater than or equal to the threshold value). A notification or other operation is performed or generated by computing device 116 in response to determining the threshold is satisfied (918). If the difference in confidence values is less than the threshold (916), no notification or other operation may be performed or generated by computing device 116.
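The 96% / 90% worked example above can be sketched directly. The confidence values and the 5% threshold come from the text; the function name is an illustrative assumption.

```python
# Sketch of the FIG. 9 confidence-gap check: notify when the gap
# between the visible-spectrum and infrared read confidences meets
# or exceeds the threshold.

def confidence_gap_notifies(visible_conf: float, infrared_conf: float,
                            threshold: float = 0.05) -> bool:
    """Return True when the confidence gap satisfies the threshold."""
    return abs(visible_conf - infrared_conf) >= threshold

gap_flagged = confidence_gap_notifies(0.96, 0.90)  # 6% gap vs 5% threshold
```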
  • FIG. 10 is a block diagram illustrating example operations of a computing device, in accordance with techniques of the present application.
  • FIG. 10 includes similar operations to those in FIG. 6, but further describes operations for a matching function executed by computing device 116. For purposes of illustration only, the example operations are described below as being performed by computing device 116 in FIG. 1. Some techniques of FIG. 10 may be performed by one or more image capture devices, computing devices or other hardware.
  • computing device 116 captures at least two images of a license plate (1002) and performs OCR on identifying information of the license plate under a first condition for a first OCR result (1004) and under a second condition for a second OCR result (1006), as generally shown and described in FIG. 1.
  • Computing device 116 may execute a pattern matching technique to process the two images, as shown in FIG. 10 (1008).
  • Example matching functions may include pixel comparisons, shape comparisons, feature comparisons (e.g., curvature, length, etc.), or any other suitable matching function or combination of such matching functions.
  • a different image processing technique may be used in addition to or instead of OCR and pattern matching.
  • a similarity score instead of a confidence read level may be assigned to each image (especially when pattern matching is used) and compared to a threshold.
  • Computing device 116 may determine if the similarity of the images based on the matching function satisfies a threshold. If the threshold is satisfied (e.g., the degree of similarity or confidence level is greater than or equal to the threshold), computing device 116 may refrain from outputting a notification or storing an indication of tampering or a counterfeit article (1016). If the threshold is not satisfied (e.g., the degree of similarity or confidence level is less than the threshold), computing device 116 may output a notification or store an indication of tampering or a counterfeit article.
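The threshold decision of FIG. 10 can be sketched as follows. The function names and the character-level similarity measure are illustrative assumptions, not the disclosed matching function itself:

```python
def character_similarity(ocr_a: str, ocr_b: str) -> float:
    """Fraction of character positions that agree between two OCR reads."""
    if not ocr_a and not ocr_b:
        return 1.0
    length = max(len(ocr_a), len(ocr_b))
    matches = sum(1 for a, b in zip(ocr_a, ocr_b) if a == b)
    return matches / length

def check_for_tampering(ocr_first: str, ocr_second: str,
                        threshold: float = 0.9) -> bool:
    """Flag a possible counterfeit when reads from two lighting
    conditions differ more than the threshold allows."""
    return character_similarity(ocr_first, ocr_second) < threshold

# e.g., a visible-light read versus an infrared read of the same plate
flag_same = check_for_tampering("ABC1234", "ABC1234")  # -> False (no flag)
flag_diff = check_for_tampering("ABC1234", "A8C1Z34")  # -> True (flag tampering)
```

A real system would replace `character_similarity` with whichever matching function is in use (pixel, shape, or feature comparison) and tune the threshold empirically.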
  • FIG. 11 shows a detailed example of various devices that may be configured to execute program code to practice some examples in accordance with the current disclosure.
  • Computing device 1100 may perform any of the techniques described herein.
  • A computing device 1100 includes a processor 1110 that is operable to execute program instructions or software, causing the computer to perform various methods or tasks.
  • Processor 1110 is coupled via bus 1120 to a memory 1130, which is used to store information such as program instructions and other data while the computer is in operation.
  • A storage device 1140, such as a hard disk drive, nonvolatile memory, or other non-transient storage device, stores information such as program instructions, data files of the multidimensional data and the reduced data set, and other information.
  • The computer also includes various input-output elements 1150, including parallel or serial ports, USB, FireWire or IEEE 1394, Ethernet, and other such ports to connect the computer to external devices such as a printer, video camera, surveillance equipment, or the like.
  • Other input-output elements may include wireless communication interfaces such as Bluetooth, Wi-Fi, and cellular data networks.
  • infrared refers to electromagnetic radiation with longer wavelengths than those of visible radiation, extending from the nominal red edge of the visible spectrum at around 700 nanometers (nm) to over 1000 nm. It is recognized that the infrared spectrum extends beyond this value.
  • Near infrared refers to electromagnetic radiation with wavelengths between 700 nm and 1300 nm.
  • visible spectrum may refer to the portion of the electromagnetic spectrum that is visible to (i.e., can be detected by) the human eye.
  • a typical human eye will respond to wavelengths from about 390 to 700 nm.
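The wavelength bands defined above can be expressed as a small helper. The boundary values are taken directly from the definitions; the function name is illustrative:

```python
def spectral_region(wavelength_nm: float) -> str:
    """Classify a wavelength using the approximate boundaries given above."""
    if 390 <= wavelength_nm < 700:
        return "visible"        # range detectable by a typical human eye
    if 700 <= wavelength_nm <= 1300:
        return "near infrared"  # nominal red edge out to 1300 nm
    return "other"
```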
  • The term "substantially visible" may refer to the property of being discernible to most humans' naked eye when viewed at a distance of greater than 10 meters (i.e., an observer can identify, with repeatable results, a sample with a unique marking from a group without the marking).
  • Substantially visible information can be seen by a human's naked eye when viewed either unaided and/or through a machine (e.g., by using a camera, or in a printed or on-screen printout of a photograph taken at any wavelength of radiation) provided that no magnification is used.
  • The term "substantially invisible" may refer to the property of being not "substantially visible," as defined above. For purposes of clarity, substantially invisible information cannot be seen by a human's naked eye, whether viewed unaided or through a machine without magnification, at a distance of greater than 10 meters.
  • the term “detectable” may refer to the ability of a machine vision system to extract a piece of information from an image through the use of standard image processing techniques such as, but not limited to, thresholding.
  • non-interfering may mean that information will not interfere with the extraction of other information that may overlap with the information to be extracted.
  • The term "optically active" with reference to an article may refer to an article that is at least one of reflective (e.g., aluminum plates), non-retroreflective, or retroreflective.
  • the term "retroreflective” as used herein may refer to the attribute of reflecting an obliquely incident radiation ray in a direction generally antiparallel to its incident direction such that it returns to the radiation source or the immediate vicinity thereof.
  • the term "set" with respect to identifying information can include one or more individual pieces or portions.
  • Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol.
  • computer-readable media generally may correspond to (1) tangible computer-readable storage media, which is non-transitory or (2) a communication medium such as a signal or carrier wave.
  • Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure.
  • A computer program product may include a computer-readable medium.
  • such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
  • Disk and disc includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
  • processors such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • processors may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described.
  • the functionality described may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.
  • the techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set).
  • Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
  • a computer-readable storage medium includes a non-transitory medium.
  • the term "non-transitory” indicates, in some examples, that the storage medium is not embodied in a carrier wave or a propagated signal.
  • a non-transitory storage medium stores data that can, over time, change (e.g., in RAM or cache).
  • A computing device comprises at least one single-core or multi-core central processing unit (CPU) and/or graphics processing unit (GPU).
  • the CPU is co-located with a camera, that is, disposed within close proximity to the camera.
  • the CPU is mounted on the same board as the camera.
  • the CPU is not co-located with the camera and is connected to the camera by other means of communication, such as, for example, coaxial cables and/or wireless connections.
  • the CPU substantially concurrently processes multiple frames via operating system provided services, such as, for example, time slicing and scheduling.
  • The apparatus further comprises at least one multi-core CPU.
  • an apparatus or computing device produces bundles of data including, for example, date, time, images, barcode read data, OCR read data, and other metadata, that may be useful in vehicle identification for, for example, parking, tolling and public safety applications.
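A data bundle of the kind described above can be sketched as a simple record builder; the field names here are illustrative assumptions, not a disclosed schema:

```python
import datetime

def make_read_bundle(plate_text, confidence, barcode_data=None):
    """Bundle one vehicle read with metadata for parking, tolling,
    or public-safety applications (field names are illustrative)."""
    return {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "ocr_read": plate_text,
        "ocr_confidence": confidence,
        "barcode_read": barcode_data,
    }

bundle = make_read_bundle("ABC1234", 0.97)
```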
  • an apparatus or computing device captures information for at least one vehicle. In some examples, this is accomplished by reading multiple sets of information on a traffic material (e.g., license plate).
  • An apparatus or computing device captures information related to the vehicle transit. Any vehicle transit normally involves generating and processing dozens of images per channel. This is important because the camera performs automatic exposure bracketing, so more than one image is needed to cover different exposures. In addition, multiple reads are required because the license plate position and exposure change from frame to frame.
  • pre-processing may increase the rate of processing images.
  • Intelligent selection is performed via field-programmable gate array (FPGA) preprocessing, which can process multiple channels at 50 fps.
  • fifteen images may be processed by OCR from a first channel, but only three barcode images from a second channel may be processed during the same period. This difference in the number of images processed per channel may happen when one of the images (e.g., barcode image) is more complex.
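Because plate position and exposure change from frame to frame, the multiple reads per transit can be fused. A minimal sketch, assuming each frame yields an OCR string, is a per-position majority vote (an illustration, not the disclosed selection logic):

```python
from collections import Counter

def consensus_read(reads):
    """Fuse multiple per-frame OCR reads by majority vote at each
    character position; reads may differ in length."""
    length = max(len(r) for r in reads)
    chars = []
    for i in range(length):
        votes = Counter(r[i] for r in reads if i < len(r))
        chars.append(votes.most_common(1)[0][0])
    return "".join(chars)

# Three frames of one transit, two with single-character misreads:
plate = consensus_read(["ABC123", "A8C123", "ABC1Z3"])  # -> "ABC123"
```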
  • the images of the traffic material may be captured at ambient radiation and/or under radiation conditions added by a designated radiation source (for example, coaxial radiation that directs radiation rays onto the traffic material when the camera is preparing to record an image).
  • the radiation rays produced by the coaxial radiation in combination with the reflective or retroreflective properties of the traffic material create a strong, bright (e.g., above a pre-defined threshold) signal coincident with the location of the traffic material in an otherwise large image scene.
  • the bright signal may be used to identify the location of the traffic material.
  • the method and/or system for reading traffic materials focuses on the region of interest (the region of brightness) and searches for matches to expected indicia or identifying information by looking for recognizable patterns of contrast.
  • The recognized indicia or identifying information, often with some assessment of the confidence in the match, are provided to another computer or other communication device for dispatching.
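Locating the traffic material by its bright retroreflective return can be sketched as a threshold-and-bounding-box pass over a grayscale image (represented here as a list of rows of 0-255 intensities; the threshold value is an arbitrary assumption):

```python
def locate_bright_region(gray, threshold=200):
    """Return the (row0, col0, row1, col1) bounding box of pixels at or
    above the brightness threshold, or None if nothing is bright enough."""
    bright_rows = [r for r, row in enumerate(gray)
                   if any(px >= threshold for px in row)]
    if not bright_rows:
        return None
    bright_cols = [c for row in gray
                   for c, px in enumerate(row) if px >= threshold]
    return (min(bright_rows), min(bright_cols),
            max(bright_rows), max(bright_cols))
```

Downstream OCR or pattern matching would then operate only inside this region of interest rather than on the full scene.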
  • the radiation detected by the camera can come from any of a number of sources. Of particular interest is the radiation reflected from the traffic material, and specifically, the amount of radiation reflected from each area inside that region of interest on the article.
  • The camera or detection system collects radiation from each region of the traffic material with the goal of creating a difference (contrast) between the background and each indicium or piece of identifying information on the traffic material. Contrast can be effected in numerous ways, including the use of coaxial radiation to overwhelm the amount of ambient radiation.
  • the use of filters on the camera can help accentuate the differences between the indicia or identifying information and background by selectively removing undesired radiation wavelengths and passing only the desired radiation wavelengths.
  • the traffic material is one of a license plate or signage.
  • useful wavelengths of radiation at which to capture images of traffic materials are divided into the following spectral regions: visible and near infrared.
  • Typical cameras include sensors that are sensitive to both of these ranges, although the sensitivity of a standard camera system decreases significantly for wavelengths longer than 1100 nm.
  • Various radiation (or light) emitting diodes (LEDs) can emit radiation over the entire visible and near infrared spectra range, and typically most LEDs are characterized by a central wavelength and a narrow distribution around that central wavelength.
  • Multiple radiation sources (e.g., LEDs) may be used.
  • the cameras and radiation sources for the systems of the present application are typically mounted to view, for example, license plates at some angle to the direction of vehicle motion.
  • Exemplary mounting locations include positions above the traffic flow or from the side of the roadway.
  • Images may be collected at an incidence angle of between about 10 degrees to about 60 degrees from normal incidence (head-on) to the license plate.
  • the images are collected at an incidence angle of between about 20 degrees to about 45 degrees from normal incidence (head-on) to the license plate.
  • Some exemplary preferred angles include, for example, 30 degrees, 40 degrees, and 45 degrees.
  • a sensor which is sensitive to infrared or ultraviolet radiation as appropriate would be used to detect retroreflected radiation outside of the visible spectrum.
  • exemplary commercially available cameras include but are not limited to the P372, P382, and P492 cameras sold by 3M Company.
  • the present apparatus further includes a third channel capable of detecting at a third wavelength and capable of producing a third image of the traffic material through the third channel.
  • the first, second and third wavelengths are all different from each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Inspection Of Paper Currency And Valuable Securities (AREA)

Abstract

In some examples, techniques include steps by which a computing device: receives, from an image capture device, a first image of a traffic material in a first lighting condition, the traffic material comprising identifying information of which at least a first portion comprises counterfeit information and at least a second portion comprises non-counterfeit information; receives a second image of the traffic material in a second lighting condition; determines, based at least in part on the first portion of counterfeit information, that the first image of the traffic material and the second image of the traffic material are different; and performs at least one operation in response to determining that the first image of the traffic material and the second image of the traffic material are different.
PCT/US2017/024894 2016-04-01 2017-03-30 Counterfeit detection of traffic materials using images captured under multiple, different lighting conditions WO2017173017A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662317045P 2016-04-01 2016-04-01
US62/317,045 2016-04-01

Publications (1)

Publication Number Publication Date
WO2017173017A1 true WO2017173017A1 (fr) 2017-10-05

Family

ID=58530684

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/024894 WO2017173017A1 (fr) 2016-04-01 2017-03-30 Counterfeit detection of traffic materials using images captured under multiple, different lighting conditions

Country Status (1)

Country Link
WO (1) WO2017173017A1 (fr)


Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1591572A (en) 1925-02-05 1926-07-06 Jonathan C Stimson Process and apparatus for making central triple reflectors
US3190178A (en) 1961-06-29 1965-06-22 Minnesota Mining & Mfg Reflex-reflecting sheeting
US4025159A (en) 1976-02-17 1977-05-24 Minnesota Mining And Manufacturing Company Cellular retroreflective sheeting
US4588258A (en) 1983-09-12 1986-05-13 Minnesota Mining And Manufacturing Company Cube-corner retroreflective articles having wide angularity in multiple viewing planes
US4775219A (en) 1986-11-21 1988-10-04 Minnesota Mining & Manufacturing Company Cube-corner retroreflective articles having tailored divergence profiles
US5066098A (en) 1987-05-15 1991-11-19 Minnesota Mining And Manufacturing Company Cellular encapsulated-lens high whiteness retroreflective sheeting with flexible cover sheet
US5138488A (en) 1990-09-10 1992-08-11 Minnesota Mining And Manufacturing Company Retroreflective material with improved angularity
US5450235A (en) 1993-10-20 1995-09-12 Minnesota Mining And Manufacturing Company Flexible cube-corner retroreflective sheeting
US5557836A (en) 1993-10-20 1996-09-24 Minnesota Mining And Manufacturing Company Method of manufacturing a cube corner article
GB2354898A (en) * 1999-07-07 2001-04-04 Pearpoint Ltd Vehicle licence plate imaging using two-part optical filter
US7422334B2 (en) 2003-03-06 2008-09-09 3M Innovative Properties Company Lamina comprising cube corner elements and retroreflective sheeting
WO2010100485A1 (fr) * 2009-03-02 2010-09-10 E-Plate Limited Data collection device and method
US20110228085A1 (en) * 2010-03-16 2011-09-22 Yoram Hofman Method and Apparatus for Acquiring Images of Car License Plates
US20130034682A1 (en) 2010-04-15 2013-02-07 Michael Benton Free Retroreflective articles including optically active areas and optically inactive areas
US20130114142A1 (en) 2010-04-15 2013-05-09 3M Innovative Properties Company Retroreflective articles including optically active areas and optically inactive areas
US8865293B2 (en) 2008-12-15 2014-10-21 3M Innovative Properties Company Optically active materials and articles and systems in which they may be used
US20140368902A1 (en) 2011-09-23 2014-12-18 3M Innovative Properties Company Retroreflective articles including a security mark
US20150043074A1 (en) 2011-09-23 2015-02-12 3M Innovative Properties Company Retroreflective articles including a security mark
WO2016025207A1 (fr) * 2014-08-13 2016-02-18 3M Innovative Properties Company Optically active articles and systems in which they may be used


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220292800A1 (en) * 2018-02-08 2022-09-15 Genetec Inc. Systems and methods for locating a retroreflective object in a digital image
US11830256B2 (en) * 2018-02-08 2023-11-28 Genetec Inc. Systems and methods for locating a retroreflective object in a digital image
US10874759B2 (en) 2018-03-20 2020-12-29 3M Innovative Properties Company Sterilization process management
US11244439B2 (en) 2018-03-20 2022-02-08 3M Innovative Properties Company Vision system for status detection of wrapped packages
US11462319B2 (en) 2018-03-20 2022-10-04 3M Innovative Properties Company Sterilization process management
CN112602089A (zh) * 2018-08-17 2021-04-02 3M Innovative Properties Company Structured texture embedding in pathway articles for machine recognition
US11887375B2 (en) 2021-06-18 2024-01-30 Getac Technology Corporation Techniques for capturing enhanced images for pattern identifications
WO2022271468A1 (fr) * 2021-06-21 2022-12-29 Getac Technology Corporation Techniques for improving the readability of an image using one or more patterns
WO2023279058A1 (fr) * 2021-07-02 2023-01-05 Viavi Solutions Inc. Security article authentication
CN115578500A (zh) * 2022-10-17 2023-01-06 广州唯墨间科技有限公司 Hybrid lighting method based on three-dimensional photogrammetry modeling
CN115578500B (zh) * 2022-10-17 2023-04-28 广州唯墨间科技有限公司 Hybrid lighting method based on three-dimensional photogrammetry modeling

Similar Documents

Publication Publication Date Title
WO2017173017A1 (fr) Counterfeit detection of traffic materials using images captured under multiple, different lighting conditions
JP7018878B2 (ja) Increasing dissimilarity of characters disposed on an optically active article
US10532704B2 (en) Retroreflective articles having a machine-readable code
US20170236019A1 (en) Optically active articles and systems in which they may be used
CN107924469B (zh) Encoding data in symbols disposed on an optically active article
CN109476262B (zh) Counterfeit detection of optically active articles using security elements
US9245203B2 (en) Collecting information relating to identity parameters of a vehicle
TW201541371A (zh) 能夠在alpr系統中使用之物件
Bhatt et al. Recognition of Criminal Faces From Wild Videos Surveillance System Using VGG-16 Architecture

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17716732

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17716732

Country of ref document: EP

Kind code of ref document: A1
