WO2007075519A2 - Digital image capture and processing system permitting modification and/or extension of system features and functions - Google Patents

Digital image capture and processing system permitting modification and/or extension of system features and functions

Info

Publication number
WO2007075519A2
WO2007075519A2 (PCT/US2006/048148, US2006048148W)
Authority
WO
WIPO (PCT)
Prior art keywords
image capture
digital image
processing system
subsystem
illumination
Prior art date
Application number
PCT/US2006/048148
Other languages
French (fr)
Other versions
WO2007075519A3 (en)
Inventor
Anatoly Kotlarsky
Xiaoxun Zhu
Original Assignee
Metrologic Instruments, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/305,895 external-priority patent/US7607581B2/en
Priority to EP06845674A priority Critical patent/EP1971952A4/en
Application filed by Metrologic Instruments, Inc. filed Critical Metrologic Instruments, Inc.
Publication of WO2007075519A2 publication Critical patent/WO2007075519A2/en
Priority to US11/880,087 priority patent/US8042740B2/en
Priority to US11/900,651 priority patent/US7954719B2/en
Priority to US11/977,432 priority patent/US7878407B2/en
Priority to US11/977,413 priority patent/US7546952B2/en
Priority to US11/977,422 priority patent/US7731091B2/en
Priority to US11/977,430 priority patent/US7614560B2/en
Priority to US11/978,535 priority patent/US7571858B2/en
Priority to US11/978,522 priority patent/US7588188B2/en
Priority to US11/978,525 priority patent/US7575170B2/en
Priority to US11/978,521 priority patent/US7661597B2/en
Priority to US11/980,192 priority patent/US7806336B2/en
Priority to US11/980,329 priority patent/US20080249884A1/en
Priority to US11/978,951 priority patent/US7775436B2/en
Priority to US11/980,078 priority patent/US7806335B2/en
Priority to US11/978,943 priority patent/US7665665B2/en
Priority to US11/978,981 priority patent/US7762465B2/en
Priority to US11/980,083 priority patent/US7784695B2/en
Priority to US11/980,317 priority patent/US7770796B2/en
Priority to US11/980,080 priority patent/US7784698B2/en
Priority to US11/980,084 priority patent/US7793841B2/en
Priority to US11/980,319 priority patent/US8172141B2/en
Publication of WO2007075519A3 publication Critical patent/WO2007075519A3/en
Priority to US12/283,439 priority patent/US20090134221A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10712Fixed beam scanning
    • G06K7/10722Photodetector array or CCD scanning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10554Moving beam scanning
    • G06K7/10594Beam path
    • G06K7/10683Arrangement of fixed elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10712Fixed beam scanning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10712Fixed beam scanning
    • G06K7/10722Photodetector array or CCD scanning
    • G06K7/10732Light sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10792Special measures in relation to the object to be scanned
    • G06K7/10801Multidistance reading
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10821Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
    • G06K7/10831Arrangement of optical elements, e.g. lenses, mirrors, prisms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10821Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
    • G06K7/10851Circuits for pulse shaping, amplifying, eliminating noise signals, checking the function of the sensing device

Definitions

  • The present invention relates to hand-supportable and portable area-type digital bar code readers having diverse modes of digital image processing for reading one-dimensional (1D) and two-dimensional (2D) bar code symbols, as well as other forms of graphically-encoded intelligence.
  • the state of the automatic-identification industry can be understood in terms of (i) the different classes of bar code symbologies that have been developed and adopted by the industry, and (ii) the kinds of apparatus developed and used to read such bar code symbologies in various user environments.
  • 1D bar code symbologies, such as UPC/EAN, Code 39, etc.
  • 1D stacked bar code symbologies, such as Code 49, PDF417, etc.
  • Two-dimensional (2D) data matrix symbologies.
  • One-dimensional optical bar code readers are well known in the art. Examples of such readers include readers of the Metrologic Voyager® Series Laser Scanner manufactured by Metrologic Instruments, Inc. Such readers include processing circuits that are able to read one-dimensional (1D) linear bar code symbologies, such as the UPC/EAN code, Code 39, etc., that are widely used in supermarkets. Such 1D linear symbologies are characterized by data that is encoded along a single axis, in the widths of bars and spaces, so that such symbols can be read from a single scan along that axis, provided that the symbol is imaged with a sufficiently high resolution along that axis.
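Because a 1D symbol encodes its data solely in the widths of bars and spaces along one axis, a single scanline suffices to recover it. A minimal sketch of that run-length measurement step follows; the binary scanline model is illustrative, not from the patent:

```python
# Sketch of the single-axis property described above: a 1D symbol can be
# read from one scanline by measuring the run lengths of dark bars (1)
# and light spaces (0). Real readers would then match the measured
# widths against a symbology's bar/space patterns.
def run_lengths(scanline):
    """Collapse a scanline of 0/1 pixels into (value, width) runs."""
    runs = []
    for px in scanline:
        if runs and runs[-1][0] == px:
            runs[-1][1] += 1          # extend the current bar or space
        else:
            runs.append([px, 1])      # start a new bar or space
    return [(v, w) for v, w in runs]

# A toy pattern: a 3-wide bar, a 1-wide space, a 1-wide bar.
print(run_lengths([1, 1, 1, 0, 1]))  # → [(1, 3), (0, 1), (1, 1)]
```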
  • The third class of bar code symbologies, known as 2D matrix symbologies, offer orientation-free scanning and greater data densities and capacities than their 1D counterparts.
  • In 2D matrix codes, data is encoded as dark or light data elements within a regular polygonal matrix, accompanied by graphical finder, orientation and reference structures.
  • the horizontal and vertical relationships of the data elements are recorded with about equal resolution.
  • It is desirable to have an optical reader that is able to read symbols of any of these types, including their various subtypes, interchangeably and automatically. More particularly, it is desirable to have an optical reader that is able to read all three of the above-mentioned types of bar code symbols, without human intervention, i.e., automatically. This in turn requires that the reader have the ability to automatically discriminate between and decode bar code symbols, based only on information read from the symbol itself. Readers that have this ability are referred to as "auto-discriminating" or as having an "auto-discrimination" capability.
  • If an auto-discriminating reader is able to read only 1D bar code symbols (including their various subtypes), it may be said to have a 1D auto-discrimination capability. Similarly, if it is able to read only 2D bar code symbols, it may be said to have a 2D auto-discrimination capability. If it is able to read both 1D and 2D bar code symbols interchangeably, it may be said to have a 1D/2D auto-discrimination capability. Often, however, a reader is said to have a 1D/2D auto-discrimination capability even if it is unable to discriminate between and decode 1D stacked bar code symbols.
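The auto-discrimination capability described above can be sketched as a chain of symbology decoders tried in turn, so that the symbology is identified only from information read from the symbol itself. The decoder names and dictionary-based image stand-in below are hypothetical, purely for illustration:

```python
# Sketch of 1D/2D auto-discrimination: try each registered symbology
# decoder against the captured image until one succeeds. A decoder
# returns decoded data, or None if the image does not contain a symbol
# of its symbology.
def make_auto_discriminator(decoders):
    """decoders: list of (name, decode_fn) pairs, tried in order."""
    def discriminate(image):
        for name, decode in decoders:
            data = decode(image)
            if data is not None:
                return name, data    # symbology identified from the symbol itself
        return None                  # no supported symbology found
    return discriminate

# Toy decoders standing in for real UPC/EAN, PDF417 and Data Matrix logic.
decoders = [
    ("UPC/EAN",     lambda img: img.get("upc")),
    ("PDF417",      lambda img: img.get("pdf417")),
    ("Data Matrix", lambda img: img.get("datamatrix")),
]
read = make_auto_discriminator(decoders)
```

A reader supporting only the first decoder would have a 1D auto-discrimination capability; registering all three yields the 1D/2D capability described above.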
  • Optical readers that are capable of 1D auto-discrimination are well known in the art.
  • An early example of such a reader is Metrologic's VoyagerCG® Laser Scanner, manufactured by Metrologic Instruments, Inc.
  • Optical readers, particularly hand-held optical readers, that are capable of 1D/2D auto-discrimination and based on the use of an asynchronously moving 1D image sensor are described in US Patent Nos. 5,288,985 and 5,354,977, which patents are hereby expressly incorporated herein by reference.
  • Optical readers, whether of the stationary or movable type, usually operate at a fixed scanning rate, which means that the readers are designed to complete some fixed number of scans during a given amount of time.
  • This scanning rate generally has a value between 30 and 200 scans/sec for 1D readers. In such readers, the results of the successive scans are decoded in the order of their occurrence.
  • Imaging-based bar code symbol readers have a number of advantages over laser-scanning-based bar code symbol readers, namely: they are more capable of reading stacked 2D symbologies, such as the PDF417 symbology; more capable of reading matrix 2D symbologies, such as the Data Matrix symbology; more capable of reading bar codes regardless of their orientation; have lower manufacturing costs; and have the potential for use in other applications, which may or may not be related to bar code scanning, such as OCR, security systems, etc.
  • Prior art imaging-based bar code symbol readers with integrated illumination subsystems also provide a relatively short optical depth of field, which limits the ability of such systems to read large or high-density bar code labels.
  • Prior art imaging-based bar code symbol readers generally require separate apparatus for producing a visible aiming beam to help the user to aim the camera's field of view at the bar code label on a particular target object.
  • Prior art imaging-based bar code symbol readers generally require capturing multiple frames of image data of a bar code symbol, and special apparatus for synchronizing the decoding process with the image capture process within such readers, as required in US Patent Nos. 5,932,862 and 5,942,741 assigned to Welch Allyn, Inc.
  • Prior art imaging-based bar code symbol readers generally require large arrays of LEDs in order to flood the field of view within which a bar code symbol might reside during image capture operations, oftentimes wasting large amounts of electrical power, which can be significant in portable or mobile imaging-based readers.
  • Prior art imaging-based bar code symbol readers generally require processing the entire pixel data set of captured images to find and decode bar code symbols represented therein.
  • Some prior art imaging systems use the inherent programmable (pixel) windowing feature of conventional CMOS image sensors to capture only partial image frames, reducing the pixel data set to be processed and thereby improving image processing speed and overall imaging system performance.
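The pixel-windowing idea can be sketched as follows: only a region of interest (ROI) is read out, so downstream processing touches a fraction of the full pixel data set. The list-of-rows frame below stands in for a sensor's ROI registers; the frame dimensions are illustrative:

```python
# Sketch of CMOS pixel windowing: reading out only a region of interest
# (ROI) so that only a fraction of the full pixel data set needs to be
# processed and decoded.
def capture_window(frame, row0, row_count, col0, col_count):
    """Return only the ROI pixels, e.g. a few central rows for a
    narrow-area partial-frame capture."""
    return [row[col0:col0 + col_count] for row in frame[row0:row0 + row_count]]

# A small synthetic frame: each pixel tagged with its (row, column).
frame = [[(r, c) for c in range(16)] for r in range(1024)]
# Partial-frame capture: four central rows, full width.
roi = capture_window(frame, 510, 4, 0, 16)
```

With 4 of 1024 rows enabled, the decoder sees roughly 0.4% of the full frame's pixels, which is the source of the speed improvement noted above.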
  • Some prior art imaging-based bar code symbol readers require the use of a manually-actuated trigger to actuate their image capture and processing cycle.
  • Prior art imaging-based bar code symbol readers generally require separate sources of illumination for producing visible aiming beams and for producing visible illumination beams used to flood the field of view of the bar code reader.
  • Prior art imaging-based bar code symbol readers generally utilize, during a single image capture and processing cycle, a single decoding methodology for decoding bar code symbols represented in captured images.
  • Some prior art imaging-based bar code symbol readers require exposure control circuitry integrated with the image detection array for measuring the light exposure levels on selected portions thereof.
  • imaging-based readers also require processing portions of captured images to detect the image intensities thereof and determine the reflected light levels at the image detection component of the system, and thereafter to control the LED-based illumination sources to achieve the desired image exposure levels at the image detector.
  • Prior art imaging-based bar code symbol readers employing integrated illumination mechanisms control image brightness and contrast by controlling the time the image sensing device is exposed to the light reflected from the imaged objects. While this method has proven effective for CCD-based bar code scanners, it is not suitable for CMOS-based image sensing devices, which require a more sophisticated shuttering mechanism, leading to increased complexity, lower reliability and, ultimately, more expensive bar code scanning systems.
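The illumination-based alternative to exposure-time control amounts to a feedback loop: measure the captured image's brightness and nudge the LED drive level toward a target. A minimal sketch, with the 0-255 brightness scale, target, tolerance and step values all chosen purely for illustration:

```python
# Sketch of automatic light exposure control via illumination: compare
# measured mean image brightness against a target window and step the
# LED drive level up or down accordingly.
def adjust_illumination(mean_brightness, led_level,
                        target=128, tolerance=8, step=16, lo=0, hi=255):
    """Return the new LED drive level given the measured mean brightness."""
    if mean_brightness < target - tolerance:
        return min(hi, led_level + step)   # underexposed: raise illumination
    if mean_brightness > target + tolerance:
        return max(lo, led_level - step)   # overexposed: lower illumination
    return led_level                        # within tolerance: hold steady
```

In a real reader this loop would run once per captured frame, with the brightness measurement taken over selected portions of the image detection array rather than the whole image.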
  • Prior art imaging-based bar code symbol readers generally require the use of tables and bar code menus to manage which decoding algorithms are to be used within any particular mode of system operation to be programmed by reading bar code symbols from a bar code menu.
  • Dedicated image-processing based bar code symbol reading devices usually have very limited resources, such as the amount of volatile and non-volatile memory. Therefore, they usually do not have the rich set of tools normally available to universal computer systems. Further, if a customer or a third party needs to enhance or alter the behavior of a conventional image-processing based bar code symbol reading system or device, they need to contact the device manufacturer and negotiate the necessary changes in the "standard" software or the ways to integrate their own software into the device, which usually involves re-design or re-compilation of the software by the original equipment manufacturer (OEM). This software modification process is both costly and time consuming.
  • Prior art imaging-based bar code symbol readers generally: (i) fail to enable users to read high-density 1D bar codes with the ease and simplicity of laser-scanning-based bar code symbol readers, as well as 2D symbologies such as PDF417 and Data Matrix; and (ii) have not enabled end-users to modify the features and functionalities of such prior art systems without detailed knowledge about the hardware platform, communication interfaces and the user interfaces of such systems.
  • Control operations in prior art image-processing bar code symbol reading systems have not been sufficiently flexible or agile to adapt to the demanding lighting conditions presented in challenging retail and industrial work environments, where 1D and 2D bar code symbols need to be reliably read.
  • A primary object of the present invention is to provide a novel method of and apparatus for enabling the recognition of graphically-encoded information, including 1D and 2D bar code symbologies and alphanumerical character strings, using novel image capture and processing based systems and devices, which avoid the shortcomings and drawbacks of prior art methods and apparatus.
  • Another object of the present invention is to provide a digital image capture and processing system employing multi-layer software-based system architecture permitting modification of system features and functionalities by way of third party code plug-ins.
  • Another object of the present invention is to provide such an image capture and processing system that allows customers, VARs and third parties to modify and/or extend a set of standard features and functions of the system without needing to contact the system's OEM and negotiate ways of integrating their desired enhancements to the system.
  • Another object of the present invention is to provide such an image capture and processing system that allows customers, VARs and third parties to independently design their own software according to the OEM specifications, and plug this software into the system, thereby effectively changing the device's behavior, without detailed knowledge about the hardware platform of the system, its communications with the outside environment, and user-related interfaces.
  • Another object of the present invention is to provide a customer of such an image capture and processing system, or any third party thereof, with a way of and means for enhancing or altering the behavior of the system without interfering with underlying hardware, communications and user-related interfaces.
  • Another object of the present invention is to provide end-users of such an image capture and processing system, as well as third parties, with a way of and means for designing, developing, and installing in the device their own plug-in modules, without need for knowledge of the details of the device's hardware.
  • Another object of the present invention is to provide original equipment manufacturers (OEM) with a way of and means for installing the OEM's plug-in modules into an image capture and processing system, without knowledge of the third-party's plug-in (software) modules that have been installed therein, provided established specifications for system features and functionalities for the third-party plug-ins are met.
  • Another object of the present invention is to provide customers of an image capture and processing system, and third-parties thereof, with a way of and means for installing their own modules to enhance or alter the "standard" behavior of the device according to their own needs and independently from each other.
  • Another object of the present invention is to provide an image capture and processing system that supports designer/manufacturer-constrained system behavior modification, without requiring detailed knowledge about the hardware platform of the system, its communications with the outside environment, and user-related interfaces.
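One plausible shape for the plug-in mechanism contemplated in these objects is a feature registry: standard system behaviors are invoked by name, and third-party code can wrap or replace them without touching the hardware, communication, or user-interface layers. The class name, feature name, and chaining convention below are assumptions for illustration, not the patent's actual design:

```python
# Sketch of a plug-in mechanism: standard features are looked up through
# a registry, and a third-party plug-in can override or chain onto a
# feature without any knowledge of the underlying hardware platform.
class FeatureRegistry:
    def __init__(self):
        self._features = {}

    def register(self, name, fn):
        """Install a standard (OEM-provided) feature implementation."""
        self._features[name] = fn

    def plug_in(self, name, wrapper):
        """Replace a feature with third-party code; the plug-in receives
        the previous implementation so it can extend rather than replace."""
        previous = self._features[name]
        self._features[name] = lambda *a, **kw: wrapper(previous, *a, **kw)

    def invoke(self, name, *args, **kwargs):
        return self._features[name](*args, **kwargs)

registry = FeatureRegistry()
registry.register("format_decode", lambda data: data)          # standard behavior
registry.plug_in("format_decode",
                 lambda prev, data: "<" + prev(data) + ">")    # third-party extension
```

Because plug-ins only see the registry interface, the OEM and third parties can ship modules independently, which is the independence property the objects above call for.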
  • Another object of the present invention is to provide a novel hand-supportable digital imaging-based bar code symbol reader capable of automatically reading 1D and 2D bar code symbologies using state-of-the-art imaging technology, at the speed and with the reliability achieved by conventional laser scanning bar code symbol readers.
  • Another object of the present invention is to provide a novel hand-supportable digital imaging- based bar code symbol reader that is capable of reading stacked 2D symbologies such as PDF417, as well as Data Matrix.
  • Another object of the present invention is to provide a novel hand-supportable digital imaging- based bar code symbol reader that is capable of reading bar codes independent of their orientation with respect to the reader.
  • Another object of the present invention is to provide a novel hand-supportable digital imaging- based bar code symbol reader that utilizes an architecture that can be used in other applications, which may or may not be related to bar code scanning, such as OCR, OCV, security systems, etc.
  • Another object of the present invention is to provide a novel hand-supportable digital imaging- based bar code symbol reader that is capable of reading high-density bar codes, as simply and effectively as "flying-spot" type laser scanners do.
  • Another object of the present invention is to provide a hand-supportable imaging-based bar code symbol reader capable of reading 1D and 2D bar code symbologies in a manner as convenient to the end users as when using a conventional laser scanning bar code symbol reader.
  • Fig. 1A is a schematic representation of a digital image capture and processing system of the present invention, employing a multi-tier software system architecture capable of supporting various subsystems providing numerous standard system features and functions that can be modified and/or extended using the innovative plug-in programming methods of the present invention;
  • Fig. 1B is a schematic representation of the system architecture of the digital image capture and processing system of the present invention, represented in Fig. 1A;
  • Figs. 1C1-1C2, taken together, set forth a table indicating the features and functions supported by each of the subsystems provided in the system architecture of the digital image capture and processing system of the present invention, represented in Figs. 1A and 1B;
  • Fig. 1D is a schematic representation indicating that the digital image capture and processing system of the present invention, shown in Figs. 1A through 1C2, can be implemented using a digital camera board and a printed circuit (PC) board that are interfaced together;
  • Fig. 1E is a schematic representation indicating that the digital image capture and processing system of the present invention, shown in Figs. 1A through 1C2, can be implemented using a single hybrid digital camera/PC board;
  • Fig. 1F is a schematic representation illustrating that the digital image capture and processing system of the present invention, shown in Figs. 1A through 1E, can be integrated or embodied within third-party products, such as, for example, but not limited to, digital image-processing based bar code symbol reading systems, OCR systems, object recognition systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 2D or 3D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, personal identification systems, and the like;
  • Fig. 2A is a rear perspective view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention
  • Fig. 2B is a front perspective view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention
  • Fig. 2C is an elevated left side view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention
  • Fig. 2D is an elevated right side view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention
  • Fig. 2E is an elevated rear view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention
  • Fig. 2F is an elevated front view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention, showing components associated with its illumination subsystem and its image capturing subsystem;
  • Fig. 2G is a bottom view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention
  • Fig. 2H is a top rear view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention
  • Fig. 2I is a first perspective exploded view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
  • Fig. 2J is a second perspective exploded view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
  • Fig. 2K is a third perspective exploded view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
  • Fig. 2L1 is a schematic block diagram representative of a system design for the hand-supportable digital imaging-based bar code symbol reading device illustrated in Figs. 2A through 2K, wherein the system design is shown comprising (1) a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged, and a CMOS or like area-type image sensing array for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which all rows of the image sensing array are enabled; (2) a Multi-Mode LED-Based Illumination Subsystem for producing narrow and wide area fields of narrowband illumination within the FOV of the Image Formation and Detection Subsystem during narrow and wide area modes of image capture, respectively, so that only light transmitted from the Multi-Mode Illumination Subsystem, reflected from the illuminated object, and transmitted through a narrowband transmission-type optical filter realized within the hand-supportable housing is used to form images, while all other components of ambient light are substantially rejected; (3) an IR-based object presence and range detection subsystem for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem; (4) an Automatic Light Exposure Measurement and Illumination Control Subsystem for controlling the operation of the LED-Based Multi-Mode Illumination Subsystem; (5) an Image Capturing and Buffering Subsystem for capturing and buffering 2D images detected by the Image Formation and Detection Subsystem; (6) a Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem for processing images captured and buffered by the Image Capturing and Buffering Subsystem and reading 1D and 2D bar code symbols represented therein; and (7) an Input/Output Subsystem for outputting processed image data and the like to an external host system or other information receiving or responding device, each said subsystem component being integrated about a common system bus;
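A sketch of how these subsystems might cooperate: the IR-based range detector chooses between the narrow-area mode (a few central sensor rows enabled) and the wide-area mode (all rows enabled). The row counts, range threshold, and near-object-implies-narrow-mode policy are illustrative assumptions, not the patent's stated logic:

```python
# Sketch of narrow-area vs wide-area capture mode selection driven by
# the IR-based object presence and range detection subsystem.
SENSOR_ROWS = 1024
NARROW_ROWS = 8          # "a few central rows" (illustrative count)

def select_capture_mode(object_detected, object_range_mm, near_limit_mm=100):
    """Pick a capture mode from IR detection results (policy is assumed)."""
    if not object_detected:
        return None                       # nothing in the detection field
    return "narrow" if object_range_mm <= near_limit_mm else "wide"

def enabled_rows(mode):
    """Rows of the image sensing array enabled for the given mode."""
    if mode == "narrow":
        start = (SENSOR_ROWS - NARROW_ROWS) // 2
        return list(range(start, start + NARROW_ROWS))   # central rows only
    return list(range(SENSOR_ROWS))                      # wide-area: all rows
```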
  • Fig. 2L2 is a schematic block representation of the Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem, realized using the three-tier computing platform illustrated in Fig. 2M;
  • Fig. 2M is a schematic diagram representative of a system implementation for the hand-supportable digital imaging-based bar code symbol reading device illustrated in Figs. 2A through 2K, comprising (1) an illumination board 33 carrying components realizing electronic functions performed by the Multi-Mode LED-Based Illumination Subsystem and the Automatic Light Exposure Measurement and Illumination Control Subsystem; (2) a CMOS camera board carrying a high-resolution (1280 x 1024, 7-bit, 6-micron pixel size) CMOS image sensor array running at a 25 MHz master clock, at 7 frames/second at 1280 x 1024 resolution, with randomly accessible region of interest (ROI) window capabilities, realizing electronic functions performed by the multi-mode area-type Image Formation and Detection Subsystem; (3) a CPU board (i.e. computing platform) including (i) an Intel Sabinal 32-bit microprocessor PXA210 running at 200 MHz with a 1.0 V core voltage and a 16-bit 100 MHz external bus, (ii) an expandable (e.g. 7+ megabyte) Intel J3 asynchronous 16-bit Flash memory, (iii) 16 megabytes of 100 MHz SDRAM, (iv) a Xilinx Spartan II FPGA FIFO 39 running at a 50 MHz clock frequency and 60 MB/sec data rate, configured to control the camera timings and drive the image acquisition process, (v) a multimedia card socket for realizing the other subsystems of the system, (vi) a power management module for the MCU adjustable by the system bus, and (vii) a pair of UARTs (one for an IrDA port and one for a JTAG port); (4) an interface board for realizing the functions performed by the I/O subsystem; and (5) an IR-based object presence and range detection circuit for realizing the IR-based object presence and range detection subsystem;
  • Fig. 3A is a schematic representation showing the spatial relationships between the near and far and narrow and wide area fields of narrow-band illumination within the FOV of the Multi-Mode Image Formation and Detection Subsystem during narrow and wide area image capture modes of operation;
  • Fig. 3B is a perspective partially cut-away view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment, showing the LED-Based Multi-Mode Illumination Subsystem transmitting visible narrow-band illumination through its narrow-band transmission-type optical filter system and illuminating an object with such narrow-band illumination, and also showing the image formation optics, including the low pass filter before the image sensing array, for collecting and focusing light rays reflected from the illuminated object, so that an image of the object is formed and detected using only the optical components of light contained within the narrow-band of illumination, while all other components of ambient light are substantially rejected before image detection at the image sensing array;
  • Fig. 3C is a schematic representation showing the geometrical layout of the optical components used within the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment, wherein the red-wavelength reflecting high-pass lens element is positioned at the imaging window of the device before the image formation lens elements, while the low-pass filter is disposed before the image sensor and between the image formation elements, so as to image the object at the image sensing array using only optical components within the narrow-band of illumination, while rejecting all other components of ambient light;
  • Fig. 3E is a schematic representation of the lens holding assembly employed in the image formation optical subsystem of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment, showing a two-piece barrel structure which holds the lens elements, and a base structure which holds the image sensing array, wherein the assembly is configured so that the barrel structure slides within the base structure so as to focus the assembly;
  • Fig. 3F1 is a first schematic representation showing, from a side view, the physical position of the LEDs used in the Multi-Mode Illumination Subsystem, in relation to the image formation lens assembly, and the image sensing array employed therein (e.g. a Motorola MCM20027 or National Semiconductor LM9638 CMOS 2-D image sensing array having a 1280x1024 pixel resolution (1/2" format), 6 micron pixel size, 13.5 MHz clock rate, with randomly accessible region of interest (ROI) window capabilities);
  • Fig. 3F2 is a second schematic representation showing, from an axial view, the physical layout of the LEDs used in the Multi-Mode Illumination Subsystem of the digital imaging-based bar code symbol reading device, shown in relation to the image formation lens assembly, and the image sensing array employed therein;
  • Fig. 4A1 is a schematic representation specifying the range of narrow-area illumination, near-field wide-area illumination, and far-field wide-area illumination produced from the LED-Based Multi-Mode Illumination Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
  • Fig. 4A2 is a table specifying the geometrical properties and characteristics of each illumination mode supported by the LED-Based Multi-Mode Illumination Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
  • Fig. 4B is a schematic representation illustrating the physical arrangement of LED light sources associated with the narrow-area illumination array and the near-field and far-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention, wherein the LEDs in the far-field wide-area illuminating arrays are located behind spherical lenses, the LEDs in the narrow-area illuminating array are disposed behind cylindrical lenses, and the LEDs in the near-field wide-area illuminating array are unlensed in the first illustrative embodiment of the Digital Imaging-Based Bar Code Reading Device;
  • Fig. 4C1 is a graphical representation showing the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the narrow-area illumination array in the Multi-Mode Illumination Subsystem of the present invention;
  • Fig. 4C2 is a graphical representation showing the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the narrow-area illumination array in the Multi-Mode Illumination Subsystem of the present invention
  • Fig. 4C3 is a schematic representation of the cylindrical lenses used before the LEDs in the narrow-area (linear) illumination arrays in the digital imaging-based bar code symbol reading device of the present invention, wherein the first surface of the cylindrical lens is curved vertically to create a narrow-area (i.e. linear) illumination pattern, and the second surface of the cylindrical lens is curved horizontally to control the height of the narrow-area illumination pattern to produce a narrow-area (i.e. linear) illumination field;
  • Fig. 4C4 is a schematic representation of the layout of the pairs of LEDs and two cylindrical lenses used to implement the narrow-area (linear) illumination array employed in the digital imaging-based bar code symbol reading device of the present invention;
  • Fig. 4C5 is a set of six illumination profiles for the narrow-area (linear) illumination fields produced by the narrow-area (linear) illumination array employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 30, 40, 50, 80, 120, and 220 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the narrow-area illumination field begins to become substantially uniform at about 80 millimeters;
  • Fig. 4D1 is a graphical representation showing the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention;
  • Fig. 4D2 is a graphical representation showing the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the far-field and near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention
  • Fig. 4D3 is a schematic representation of the plano-convex lenses used before the LEDs in the far-field wide-area illumination arrays in the illumination subsystem of the present invention
  • Fig. 4D4 is a schematic representation of the layout of LEDs and plano-convex lenses used to implement the far-field and near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention, wherein the illumination beam produced therefrom is aimed by positioning the lenses at angles before the LEDs in the near-field (and far-field) wide-area illumination arrays employed therein;
  • Fig. 4D5 is a set of six illumination profiles for the near-field wide-area illumination fields produced by the near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 10, 20, 30, 40, 60, and 100 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the near-field wide-area illumination field begins to become substantially uniform at about 40 millimeters;
  • Fig. 4D6 is a set of three illumination profiles for the far-field wide-area illumination fields produced by the far-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 100, 150 and 220 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the far-field wide-area illumination field begins to become substantially uniform at about 100 millimeters;
  • Fig. 4D7 is a table illustrating a preferred method of calculating the pixel intensity value for the center of the far-field wide-area illumination field produced from the Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device of the present invention, showing a significant signal strength (greater than 80 DN);
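An editorial sketch of the kind of center-intensity calculation Fig. 4D7 describes follows. This is illustrative only and not the patent's disclosed method: the region-of-interest size and the simple averaging rule are assumptions; only the 80 DN threshold comes from the description above.

```python
# Illustrative sketch (not the disclosed method): estimate the brightness
# of the illumination-field center by averaging an roi x roi window of
# pixels around the middle of a captured frame.
def center_intensity(frame, roi=8):
    """frame: 2-D list of pixel values in digital numbers (DN)."""
    rows, cols = len(frame), len(frame[0])
    r0, c0 = rows // 2 - roi // 2, cols // 2 - roi // 2
    window = [frame[r][c] for r in range(r0, r0 + roi)
                          for c in range(c0, c0 + roi)]
    return sum(window) / len(window)

# A field is considered well illuminated when the center value is
# comfortably above the 80 DN figure quoted in the table.
frame = [[100] * 64 for _ in range(64)]
assert center_intensity(frame) > 80
```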
  • Fig. 5A1 is a schematic representation showing how the red-wavelength reflecting (high-pass) imaging window integrated within the hand-supportable housing of the digital imaging-based bar code symbol reading device, and the low-pass optical filter disposed before its CMOS image sensing array therewithin, cooperate to form a narrow-band optical filter subsystem for transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device, and rejecting all other optical wavelengths outside this narrow optical band however generated (i.e. ambient light sources);
  • Fig. 5A2 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element disposed after the red-wavelength reflecting high-pass imaging window within the hand-supportable housing of the digital imaging-based bar code symbol reading device, but before its CMOS image sensing array, showing that optical wavelengths below 700 nanometers are transmitted and wavelengths above 700 nm are substantially blocked (e.g. absorbed or reflected);
  • Fig. 5A3 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the red-wavelength reflecting high-pass imaging window integrated within the hand-supportable housing of the digital imaging-based bar code symbol reading device of the present invention, showing that optical wavelengths above 620 nanometers are transmitted and wavelengths below 620 nm are substantially blocked (e.g. absorbed or reflected);
  • Fig. 5A4 is a schematic representation of the transmission characteristics of the narrow-band spectral filter subsystem integrated within the hand-supportable imaging-based bar code symbol reading device of the present invention, plotted against the spectral characteristics of the LED-emissions produced from the Multi-Mode Illumination Subsystem of the illustrative embodiment of the present invention;
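How the two filter elements combine into the narrow-band (roughly 620-700 nm) subsystem can be modeled with idealized step responses. This is a hedged sketch: real filter elements roll off gradually, and only the cutoff wavelengths and pass band come from the description above.

```python
# Idealized model of the narrow-band optical filter subsystem: a high-pass
# imaging window and a low-pass element whose transmissions multiply.
def high_pass(wavelength_nm, cutoff=620):
    # Red-wavelength reflecting imaging window: passes longer wavelengths.
    return 1.0 if wavelength_nm >= cutoff else 0.0

def low_pass(wavelength_nm, cutoff=700):
    # Filter element before the image sensor: passes shorter wavelengths.
    return 1.0 if wavelength_nm <= cutoff else 0.0

def narrow_band(wavelength_nm):
    # Only the 620-700 nm band survives both elements.
    return high_pass(wavelength_nm) * low_pass(wavelength_nm)

assert narrow_band(640) == 1.0   # red LED emission passes
assert narrow_band(550) == 0.0   # green ambient light rejected
assert narrow_band(850) == 0.0   # near-IR ambient light rejected
```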
  • Fig. 6A is a schematic representation showing the geometrical layout of the spherical/parabolic light reflecting/collecting mirror and photodiode associated with the Automatic Light Exposure Measurement and Illumination Control Subsystem, and arranged within the hand-supportable digital imaging-based bar code symbol reading device of the illustrative embodiment, wherein incident illumination is collected from a selected portion of the center of the FOV of the system using a spherical light collecting mirror, and then focused upon a photodiode for detection of the intensity of reflected illumination and subsequent processing by the Automatic Light Exposure Measurement and Illumination Control Subsystem, so as to then control the illumination produced by the LED-based Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device of the present invention;
  • Fig. 6B is a schematic diagram of the Automatic Light Exposure Measurement and Illumination Control Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention, wherein illumination is collected from the center of the FOV of the system and automatically detected so as to generate a control signal for driving, at the proper intensity, the narrow-area illumination array as well as the far-field and near-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem, so that the CMOS image sensing array produces digital images of illuminated objects of sufficient brightness;
  • Figs. 6C1 and 6C2, taken together, set forth a schematic diagram of a hybrid analog/digital circuit designed to implement the Automatic Light Exposure Measurement and Illumination Control Subsystem of Fig. 6B employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
  • Fig. 6D is a schematic diagram showing that, in accordance with the principles of the present invention, the CMOS image sensing array employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, once activated by the System Control Subsystem (or directly by the trigger switch), and when all rows in the image sensing array are in a state of integration operation, automatically activates the Automatic Light Exposure Measurement and Illumination Control Subsystem which, in response thereto, automatically activates the LED illumination driver circuitry to automatically drive the appropriate LED illumination arrays associated with the Multi-Mode Illumination Subsystem in a precise manner and globally expose the entire CMOS image detection array with narrowly tuned LED-based illumination when all of its rows of pixels are in a state of integration, and thus have a common integration time, thereby capturing high quality images independent of the relative motion between the bar code reader and the object;
  • Figs. 6E1 and 6E2, taken together, set forth a flow chart describing the steps involved in carrying out the global exposure control method of the present invention, within the digital imaging-based bar code symbol reading device of the illustrative embodiments;
  • Fig. 7 is a schematic block diagram of the IR-based automatic Object Presence and Range Detection Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention, wherein a first range indication control signal is generated upon detection of an object within the near-field region of the Multi-Mode Illumination Subsystem, and wherein a second range indication control signal is generated upon detection of an object within the far-field region of the Multi-Mode Illumination Subsystem;
  • Fig. 8 is a schematic representation of the hand-supportable digital imaging-based bar code symbol reading device of the present invention, showing that its CMOS image sensing array is operably connected to its microprocessor through a FIFO (realized by way of an FPGA) and a system bus, and that its SDRAM is also operably connected to the microprocessor by way of the system bus, enabling the mapping of pixel data captured by the imaging array into the SDRAM under the control of the direct memory access (DMA) module within the microprocessor;
  • Fig. 9 is a schematic representation showing how the bytes of pixel data captured by the CMOS imaging array within the hand-supportable digital imaging-based bar code symbol reading device of the present invention, are mapped into the addressable memory storage locations of its SDRAM during each image capture cycle carried out within the device;
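The mapping of captured pixel bytes into addressable SDRAM locations can be sketched as a row-major address calculation. This is illustrative only: the base address and the one-byte-per-pixel layout are assumptions, and only the 1280 x 1024 sensor geometry comes from the specification above.

```python
# Illustrative row-major mapping of captured pixel bytes into a
# contiguous SDRAM buffer during an image capture cycle.
SDRAM_BASE = 0xA000_0000     # hypothetical buffer start address
WIDTH, HEIGHT = 1280, 1024   # sensor resolution from the specification

def pixel_address(row, col):
    """Address of the byte holding pixel (row, col), one byte per pixel."""
    return SDRAM_BASE + row * WIDTH + col

assert pixel_address(0, 0) == SDRAM_BASE
assert pixel_address(1, 0) == SDRAM_BASE + WIDTH
# A full frame occupies WIDTH * HEIGHT contiguous bytes.
assert pixel_address(HEIGHT - 1, WIDTH - 1) == SDRAM_BASE + WIDTH * HEIGHT - 1
```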
  • Fig. 10 is a schematic representation showing the software modules associated with the three-tier software architecture of the hand-supportable digital imaging-based bar code symbol reading device of the present invention, namely: the Main Task module, the CodeGate Task module, the Narrow-Area Illumination Task module, the Metroset Task module, the Application Events Manager module, the User Commands Table module, the Command Handler module, Plug-In Controller, and Plug-In Libraries and Configuration Files, all residing within the Application layer of the software architecture; the Tasks Manager module, the Events Dispatcher module, the Input/Output Manager module, the User Commands Manager module, the Timer Subsystem module, the Input/Output Subsystem module and the Memory Control Subsystem module residing within the System Core (SCORE) layer of the software architecture; and the Linux Kernel module in operable communication with the Plug-In Controller, the Linux File System module, and Device Drivers modules residing within the Linux Operating System (OS) layer of the software architecture, and in operable communication with an external (host) Plug-In Development Platform via standard or proprietary
  • Fig. 11 is a perspective view of an illustrative embodiment of a computer software development platform for developing plug-ins for tasks within the application layer of the imaging-based bar code reading system of the present invention;
  • Fig. 12A is a schematic representation of the Events Dispatcher software module which provides a means of signaling and delivering events to the Application Events Manager, including the starting of a new task, stopping a currently running task, doing something, or doing nothing and ignoring the event;
  • Fig. 12B is a table listing examples of system-defined events which can occur and be dispatched within the hand-supportable digital imaging-based bar code symbol reading device of the present invention, namely: SCORE_EVENT_POWER_UP which signals the completion of system start-up and involves no parameters; SCORE_EVENT_TIMEOUT which signals the timeout of the logical timer, and involves the parameter "pointer to timer id"; SCORE_EVENT_UNEXPECTED_INPUT which signals that unexpected input data is available and involves the parameter "pointer to connection id"; SCORE_EVENT_TRIG_ON which signals that the user pulled the trigger switch and involves no parameters; SCORE_EVENT_TRIG_OFF which signals that the user released the trigger switch and involves no parameters; SCORE_EVENT_OBJECT_DETECT_ON which signals that the object is positioned under the bar code reader and involves no parameters; SCORE_EVENT_OBJECT_DETECT_OFF which signals that the object is removed from the field of view
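The dispatch pattern behind these system-defined events can be sketched as follows. The dispatcher class and handler behavior here are hypothetical illustrations; only the event names come from the table described above.

```python
# Minimal sketch of the event-dispatch pattern: the dispatcher delivers
# an event (with optional parameter) to whatever handler the Application
# Events Manager registered, and ignores events with no handler.
SCORE_EVENT_POWER_UP = "POWER_UP"
SCORE_EVENT_TRIG_ON = "TRIG_ON"
SCORE_EVENT_OBJECT_DETECT_ON = "OBJECT_DETECT_ON"

class EventsDispatcher:
    def __init__(self):
        self.handlers = {}

    def register(self, event, handler):
        self.handlers[event] = handler

    def dispatch(self, event, param=None):
        handler = self.handlers.get(event)
        # Unhandled events are simply ignored ("doing nothing").
        return handler(param) if handler else None

log = []
d = EventsDispatcher()
d.register(SCORE_EVENT_TRIG_ON, lambda _: log.append("start Main Task"))
d.dispatch(SCORE_EVENT_TRIG_ON)
d.dispatch(SCORE_EVENT_POWER_UP)   # no handler registered: ignored
assert log == ["start Main Task"]
```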
  • Fig. 12C is a schematic representation of the Tasks Manager software module which provides a means for executing and stopping application specific tasks (i.e. threads);
  • Fig. 12D is a schematic representation of the Input/Output Manager software module (i.e. Input/Output Subsystem), which runs in the background and monitors activities of external devices and user connections, and signals appropriate events to the Application Layer when such activities are detected;
  • Figs. 12E1 and 12E2 set forth a schematic representation of the Input/Output Subsystem software module which provides a means for creating and deleting input/output connections, and communicating with external systems and devices;
  • Figs. 12F1 and 12F2 set forth a schematic representation of the Timer Subsystem which provides a means for creating, deleting, and utilizing logical timers;
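A logical-timer facility of the kind the Timer Subsystem provides can be sketched as follows. The method names and id-based interface are assumptions for illustration, not the disclosed API; only the create/delete/utilize capabilities come from the description above.

```python
import time

# Hedged sketch of a Timer Subsystem: logical timers identified by id,
# which callers create, poll for expiry, and delete.
class TimerSubsystem:
    def __init__(self):
        self._timers = {}
        self._next_id = 0

    def create(self, timeout_s):
        """Create a logical timer; returns its timer id."""
        self._next_id += 1
        self._timers[self._next_id] = time.monotonic() + timeout_s
        return self._next_id

    def expired(self, timer_id):
        return time.monotonic() >= self._timers[timer_id]

    def delete(self, timer_id):
        self._timers.pop(timer_id, None)

ts = TimerSubsystem()
tid = ts.create(0.0)   # zero timeout: fires immediately
assert ts.expired(tid)
ts.delete(tid)
```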
  • Figs. 12G1 and 12G2 set forth a schematic representation of the Memory Control Subsystem which provides an interface for managing the thread-level dynamic memory within the device, fully compatible with standard dynamic memory management functions, as well as a means for buffering collected data;
  • Fig. 12H is a schematic representation of the user commands manager which provides a standard way of entering user commands, and executing application modules responsible for handling the same;
  • Fig. 12I is a schematic representation of the device driver software modules, which includes trigger switch drivers for establishing a software connection with the hardware-based manually-actuated trigger switch employed on the digital imaging-based bar code symbol reading device, an image acquisition driver for implementing image acquisition functionality aboard the digital imaging-based bar code symbol reading device, and an IR driver for implementing object detection functionality aboard the imaging-based bar code symbol reading device;
  • Fig. 13A is an exemplary flow chart representation showing how, when the user points the bar code reader towards a bar code symbol, the IR device drivers detect that object within the field, and then wake up the Input/Output Manager software module at the System Core Layer;
  • Fig. 13B is an exemplary flow chart representation showing how upon detecting an object, the Input/Output Manager posts the SCORE_OBJECT_DETECT_ON event to the Events Dispatcher software module;
  • Fig. 13C is an exemplary flow chart representation showing how, in response to detecting an object, the Events Dispatcher software module passes the SCORE_OBJECT_DETECT_ON event to the Application Layer;
  • Fig. 13D is an exemplary flow chart representation showing how upon receiving the SCORE_OBJECT_DETECT_ON event at the Application Layer, the Application Events Manager executes an event handling routine which activates the narrow-area illumination array associated with the Multi-Mode Illumination Subsystem, and executes either the CodeGate Task described in Fig. 13E (when required by System Mode in which the Device is programmed) or the Narrow-Area Illumination Task described in Fig. 13M (when required by System Mode in which the Device is programmed);
  • Fig. 13E is an exemplary flow chart representation showing what operations are carried out when the CodeGate Task is (enabled and) executed within the Application Layer;
  • Fig. 13F is an exemplary flow chart representation showing how, when the user pulls the trigger switch on the bar code reader while the CodeGate Task is executing, the trigger device driver wakes up the Input/Output Manager at the System Core Layer;
  • Fig. 13G is an exemplary flow chart representation showing how, in response to waking up, the Input/Output Manager posts the SCORE_TRIGGER_ON event to the Events Dispatcher;
  • Fig. 13H is an exemplary flow chart representation showing how the Events Dispatcher passes on the SCORE_TRIGGER_ON event to the Application Events Manager at the Application Layer;
  • Figs. 13I1 and 13I2, taken together, set forth an exemplary flow chart representation showing how the Application Events Manager responds to the SCORE_TRIGGER_ON event by invoking a handling routine within the Task Manager at the System Core Layer which deactivates the narrow-area illumination array associated with the Multi-Mode Illumination Subsystem, cancels the CodeGate Task or the Narrow-Area Illumination Task (depending on which System Mode the Device is programmed), and executes the Main Task;
  • Fig. 13J is an exemplary flow chart representation showing what operations are carried out when the Main Task is (enabled and) executed within the Application Layer;
  • Fig. 13K is an exemplary flow chart representation showing what operations are carried out when the Data Output Procedure, called in the Main Task, is executed within the Input/Output Subsystem software module in the Application Layer;
  • Fig. 13L is an exemplary flow chart representation showing decoded symbol character data being sent from the Input/Output Subsystem to the Device Drivers within the Linux OS Layer of the system;
  • Fig. 13M is an exemplary flow chart representation showing what operations are carried out when the Narrow-Area Illumination Task is (enabled and) executed within the Application Layer;
  • Figs. 13M1 through 13M3, taken together, set forth a flow chart describing a novel method of generating wide-area illumination, for use during the Main Task routine so as to illuminate objects with a wide-area illumination field in a manner which substantially reduces specular-type reflection at the CMOS image sensing array in the digital imaging-based bar code reading device of the present invention;
  • Fig. 14 is a table listing various bar code symbologies supported by the Multi-Mode Bar Code Symbol Reading Subsystem module employed within the hand-supportable digital imaging-based bar code reading device of the present invention
  • Fig. 15 is a table listing the four primary modes in which the Multi-Mode Bar Code Symbol Reading Subsystem module can be programmed to operate, namely: the Automatic Mode wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured frame of digital image data so as to search for one or more bar codes represented therein in an incremental manner, and to continue searching until the entire image is processed; the Manual Mode wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured frame of digital image data, starting from the center or sweet spot of the image at which the user would have aimed the bar code reader, so as to search for (i.e.
  • the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a specified "region of interest" (ROI) in a captured frame of digital image data so as to search for one or more bar codes represented therein, in response to coordinate data specifying the location of the bar code within the field of view of the multi-mode image formation and detection system; the NoFinder Mode wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured narrow-area (linear) frame of digital image data, without feature extraction and marking operations used in the Automatic and Manual Modes, so as read one or more bar code symbols represented therein; and the Omniscan Mode, wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured frame of digital image data along any one or
  • Fig. 16 is an exemplary flow chart representation showing the steps involved in setting up and cleaning up the software sub-Application entitled "Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem", once called from either (i) the CodeGate Task software module at the Block entitled "READ BAR CODE(S) IN CAPTURED NARROW-AREA IMAGE" indicated in Fig. 13E, or (ii) the Main Task software module at the Block entitled "READ BAR CODE(S) IN CAPTURED WIDE-AREA IMAGE" indicated in Fig. 13J;
  • Figs. 17A and 17B provide a table listing the primary Programmable Modes of Bar Code Reading Operation supported within the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device of the present invention, namely: Programmed Mode of System Operation No. 1: Manually-Triggered Single-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem;
  • Programmable Mode of Operation No. 14: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode And The Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem;
  • Programmable Mode of Operation No. 15: Continuously-Automatically-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The Automatic, Manual and/or Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem;
  • Fig. 18 is a schematic representation specifying the four modes of illumination produced from the Multi-Mode Illumination Subsystem employed in the second illustrative embodiment of the Digital Imaging-Based Bar Code Symbol Reader of the present invention, which supports both near and far fields of narrow-area illumination generated during the narrow-area image capture mode of its Multi- Mode Image Formation and Detection Subsystem;
  • Fig. 19 is a schematic representation illustrating the physical arrangement of LEDs and light focusing lenses associated with the near and far field narrow-area and wide-area illumination arrays employed in the digital imaging-based bar code reading device according to the second illustrative embodiment of the present invention
  • Fig. 20A is a first perspective view of a second illustrative embodiment of the portable POS digital imaging-based bar code reading device of the present invention, shown having a hand-supportable housing of a different form factor than that of the first illustrative embodiment, and configured for use in its hands-free/presentation mode of operation, supporting primarily wide-area image capture;
  • Fig. 20B is a second perspective view of the second illustrative embodiment of the portable POS digital imaging-based bar code reading device of the present invention, shown configured and operated in its hands-free/presentation mode of operation, supporting primarily wide-area image capture;
  • Fig. 20C is a third perspective view of the second illustrative embodiment of the portable digital imaging-based bar code reading device of the present invention, shown configured and operated in a hands-on type mode, supporting both narrow and wide area modes of image capture;
  • Fig. 21 is a perspective view of a third illustrative embodiment of the digital imaging-based bar code reading device of the present invention, realized in the form of a Multi-Mode Image Capture And Processing Engine that can be readily integrated into various kinds of information collection and processing systems, including wireless portable data terminals (PDTs), reverse-vending machines, retail product information kiosks and the like;
  • Fig. 22 is a schematic representation of a wireless bar code-driven portable data terminal embodying the imaging-based bar code symbol reading engine of the present invention, shown configured and operated in a hands-on mode;
  • Fig. 23 is a perspective view of the wireless bar code-driven portable data terminal of Fig. 22 shown configured and operated in a hands-on mode, wherein the imaging-based bar code symbol reading engine embodied therein is used to read a bar code symbol on a package and the symbol character data representative of the read bar code is being automatically transmitted to its cradle- providing base station by way of an RF-enabled 2-way data communication link;
  • Fig. 24 is a side view of the wireless bar code-driven portable data terminal of Figs. 31 and 32 shown configured and operated in a hands-free mode, wherein the imaging-based bar code symbol reading engine is configured in a wide-area image capture mode of operation, suitable for presentation- type bar code reading at point of sale (POS) environments;
  • Fig. 25 is a block schematic diagram showing the various subsystem blocks associated with a design model for the Wireless Hand-Supportable Bar Code Driven Portable Data Terminal System of Figs. 31, 32 and 33, shown interfaced with possible host systems and/or networks;
  • Fig. 26 is a schematic block diagram representative of a system design for the hand-supportable digital imaging-based bar code symbol reading device according to an alternative embodiment of the present invention, wherein the system design is similar to that shown in Fig. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, involving the real-time analysis of captured digital images for unacceptable spatial-intensity distributions;
  • Fig. 26A is a schematic representation of the system illustrated in Fig. 26, showing in greater detail how the current illumination duration determined by the Automatic Light Exposure Measurement and Illumination Control Subsystem is automatically overridden by the illumination duration computed by a software-implemented, image-processing based illumination metering program carried out within the Image-Processing Based Bar Code Symbol Reading Subsystem, and used to control the illumination produced during the next image frame captured by the system, in accordance with this enhanced auto-illumination control scheme of the present invention;
  • Fig. 26B is a flow chart setting forth the steps involved in carrying out the enhanced auto-illumination control scheme illustrated in Fig. 26A;
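The enhanced auto-illumination control scheme of Figs. 26A and 26B can be sketched in a few lines: a software metering routine analyzes the brightness of the captured frame and, when the spatial-intensity distribution is unacceptable, overrides the illumination duration that the hardware exposure-measurement subsystem would otherwise apply to the next frame. The metering rule, thresholds, and function names below are illustrative assumptions, not taken from the specification:

```python
# Hedged sketch of the software illumination-metering override
# (target brightness, tolerance and clamping range are assumed values).
TARGET_MEAN = 128          # desired mean pixel value for an 8-bit image
TOLERANCE = 30             # acceptable deviation from the target

def metered_duration(pixels, hw_duration):
    """Return the illumination duration to use for the next frame:
    the hardware-determined value if the exposure is acceptable,
    otherwise a software-computed override scaled toward the target."""
    mean = sum(pixels) / len(pixels)
    if abs(mean - TARGET_MEAN) <= TOLERANCE:
        return hw_duration                      # exposure acceptable
    # Scale the duration toward the target brightness, clamped to a
    # sane range so a nearly black frame cannot demand infinite light.
    override = hw_duration * TARGET_MEAN / max(mean, 1)
    return max(1.0, min(override, 10.0 * hw_duration))

# An under-exposed frame (mean ~40) forces a longer illumination period.
dark_frame = [40] * 100
assert metered_duration(dark_frame, hw_duration=2.0) > 2.0
```

In this sketch the override simply replaces the hardware value for one frame, mirroring the "override for the next image frame" behavior described for Fig. 26A.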
  • Figs. 27A and 27B, taken together, set forth a flow chart illustrating the steps involved in carrying out the adaptive method of controlling system operations (e.g. illumination, image capturing, image processing, etc.) within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, wherein the "exposure quality" of captured digital images is automatically analyzed in real-time and system control parameters (SCPs) are automatically reconfigured based on the results of such exposure quality analysis;
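The adaptive control loop of Figs. 27A and 27B (grade the exposure quality of each frame, then reconfigure the SCPs) can be illustrated with a minimal sketch; the grading rule and the particular parameter names (`exposure_time`, `sensor_gain`) are assumptions made for illustration only:

```python
# Illustrative exposure-quality analysis and SCP reconfiguration.
def exposure_quality(pixels):
    """Grade a frame as 'under', 'over' or 'good' from its mean level
    (a stand-in for the richer histogram analysis a real system uses)."""
    mean = sum(pixels) / len(pixels)
    if mean < 60:
        return "under"
    if mean > 200:
        return "over"
    return "good"

def reconfigure_scps(scps, quality):
    """Return a new SCP dict adapted to the measured exposure quality."""
    new = dict(scps)
    if quality == "under":
        new["exposure_time"] = min(scps["exposure_time"] * 2, 64)
        new["sensor_gain"] = min(scps["sensor_gain"] + 1, 8)
    elif quality == "over":
        new["exposure_time"] = max(scps["exposure_time"] // 2, 1)
        new["sensor_gain"] = max(scps["sensor_gain"] - 1, 1)
    return new

scps = {"exposure_time": 8, "sensor_gain": 2}
q = exposure_quality([30] * 64)            # an under-exposed frame
assert q == "under"
assert reconfigure_scps(scps, q) == {"exposure_time": 16, "sensor_gain": 3}
```

Each pass through the loop consumes the reconfigured SCPs for the next capture, which is the essence of the adaptive method the flow chart describes.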
  • Fig. 27C is a schematic representation illustrating the Single Frame Shutter Mode of operation of the CMOS image sensing array employed within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, while the system is operated in its Global Exposure Mode of Operation illustrated in Figs. 6D through 6E2;
  • Fig. 27D is a schematic representation illustrating the Rolling Shutter Mode of operation of the CMOS image sensing array employed within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, while the system is operated according to its adaptive control method illustrated in Figs. 27A through 27B;
  • Fig. 27E is a schematic representation illustrating the Video Mode of operation of the CMOS image sensing array employed within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, while the system is operated according to its adaptive control method illustrated in Figs. 27A through 27B;
  • Fig. 28 is a perspective view of a hand-supportable image-processing based bar code symbol reader employing an image cropping zone (ICZ) targeting/marking pattern, and automatic post-image capture cropping methods to abstract the ICZ within which the targeted object to be imaged has been encompassed during illumination and imaging operations;
  • Fig. 29 is a schematic system diagram of the hand-supportable image-processing based bar code symbol reader shown in Fig. 28, shown employing an image cropping zone (ICZ) illumination targeting/marking source(s) operated under the control of the System Control Subsystem;
  • Fig. 30 is a flow chart setting forth the steps involved in carrying out the first illustrative embodiment of the image cropping zone targeting/marking and post-image capture cropping process of the present invention embodied within the bar code symbol reader illustrated in Figs. 28 and 29;
  • Fig. 31 is a perspective view of another illustrative embodiment of the hand-supportable image-processing based bar code symbol reader of the present invention, showing its visible illumination-based Image Cropping Pattern (ICP) being projected within the field of view (FOV) of its Multi-Mode Image Formation And Detection Subsystem;
  • Fig. 32 is a schematic block diagram representative of a system design for the hand-supportable digital imaging-based bar code symbol reading device illustrated in Fig. 31, wherein the system design is shown comprising (1) a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled, (2) a Multi-Mode LED-Based Illumination Subsystem for producing narrow and wide area fields of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem during narrow and wide area modes of image capture, respectively, so that only light transmitted from the Multi-Mode Illumination Subsystem and reflected from the illuminated object and transmitted through a narrow-band transmission-type optical filter realized within the hand-supportable housing is detected, and an Image Cropping Pattern Generator for generating a visible illumination-based Image Cropping Pattern (ICP) projected within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem, (3) an IR-based object presence and range detection subsystem for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem, (4) an Automatic Light Exposure Measurement and Illumination Control Subsystem for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem, (5) an Image Capturing and Buffering Subsystem for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem, (6) an Image Processing and Cropped Image Locating Module for processing captured and buffered images to locate the image region corresponding to the region defined by the Image Cropping Pattern;
  • Fig. 33A is a schematic representation of a first illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention, comprising a VLD located at the symmetrical center of the focal plane of a pair of flat-convex lenses arranged before the VLD, and capable of generating and projecting a two (2) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Figs. 33B and 33C, taken together, provide a composite ray-tracing diagram for the first illustrative embodiment of the VLD-based Image Cropping Pattern Generator depicted in Fig. 33A, showing that the pair of flat-convex lenses focus naturally diverging light rays from the VLD into two substantially parallel beams of laser illumination which produce a two (2) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, wherein the distance between the two spots of illumination in the ICP is a function of distance from the pair of lenses;
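One simple way to see why the imaged dot spacing varies with range (as the simulated images of Figs. 33D1 through 33D5 illustrate) is a pinhole-camera model: if the two beams are nominally parallel with a fixed physical separation, the separation of the two dots in the captured image falls off inversely with object distance, so the measured pixel spacing can also serve as a range estimate. The focal length and baseline values below are illustrative assumptions, not figures from the specification:

```python
# Hedged geometric sketch of the two-dot ICP under a pinhole-camera model.
FOCAL_LENGTH_PX = 500.0    # assumed camera focal length, in pixels
BASELINE_MM = 20.0         # assumed physical separation of the two beams

def dot_spacing_px(distance_mm):
    """Pixel separation of the two ICP dots at a given object distance."""
    return FOCAL_LENGTH_PX * BASELINE_MM / distance_mm

def distance_from_spacing(spacing_px):
    """Invert the model: estimate object distance from measured spacing."""
    return FOCAL_LENGTH_PX * BASELINE_MM / spacing_px

# Doubling the distance (80mm -> 160mm) halves the imaged dot spacing.
assert dot_spacing_px(160.0) == dot_spacing_px(80.0) / 2
assert abs(distance_from_spacing(dot_spacing_px(120.0)) - 120.0) < 1e-6
```

The same inverse-distance relationship underlies the progressively tighter dot spacing shown in the 40mm through 200mm simulated images that follow.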
  • Fig. 33D1 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 40mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 33D2 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 80mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 33D3 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 120mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 33D4 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 160mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 33D5 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 200mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 34A is a schematic representation of a second illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention, comprising a VLD located at the focus of a biconical lens (having a biconical surface and a cylindrical surface) arranged before the VLD, and four flat-convex lenses arranged in four corners, and which optical assembly is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Figs. 34B and 34C, taken together, provide a composite ray-tracing diagram for the second illustrative embodiment of the VLD-based Image Cropping Pattern Generator depicted in Fig. 34A, showing that the biconical lens enlarges naturally diverging light rays from the VLD in the cylindrical direction (but not the other) and thereafter, the four flat-convex lenses focus the enlarged laser light beam to generate four parallel beams of laser illumination which form a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, wherein the spacing between the four dots of illumination in the ICP is a function of distance from the flat-convex lens;
  • Fig. 34D1 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 40mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 34D2 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 80mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 34D3 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 120mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 34D4 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 160mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 34D5 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 200mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
  • Fig. 35 is a schematic representation of a third illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention, comprising a VLD and a light diffractive optical element (DOE) (e.g. a volume holographic optical element) forming an optical assembly which is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, similar to that generated using the refractive optics based device shown in Fig. 34A;
  • Fig. 36 is a schematic representation of a digital image captured within the field of view (FOV) of the bar code symbol reader illustrated in Figs. 31 and 32, wherein the clusters of pixels indicated by reference characters (a,b,c,d) represent the four illumination spots (i.e. dots) associated with the Image Cropping Pattern (ICP) projected in the FOV;
  • Fig. 37 is a flow chart setting forth the steps involved in carrying out the second illustrative embodiment of the image cropping pattern targeting/marking and post-image capture cropping process of the present invention embodied within the bar code symbol reader illustrated in Figs. 31 and 32;
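The post-image-capture cropping step of Fig. 37 can be sketched as follows: the four bright ICP spots (a, b, c, d in Fig. 36) are located in the buffered frame, their bounding box defines the image cropping zone, and the frame is cropped to that zone. Real spot detection would use clustering and centroiding; simple thresholding is used here as an illustrative stand-in, with all parameter values assumed:

```python
# Minimal sketch of ICP spot location and post-capture cropping.
def locate_icp_bounds(image, threshold=200):
    """Return (row0, row1, col0, col1) bounding all bright ICP pixels."""
    coords = [(r, c) for r, row in enumerate(image)
              for c, v in enumerate(row) if v >= threshold]
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return min(rows), max(rows), min(cols), max(cols)

def crop_to_icp(image, threshold=200):
    """Crop the buffered image to the zone marked by the ICP spots."""
    r0, r1, c0, c1 = locate_icp_bounds(image, threshold)
    return [row[c0:c1 + 1] for row in image[r0:r1 + 1]]

# A 6x6 frame with four bright spots at (1,1), (1,4), (4,1) and (4,4).
frame = [[0] * 6 for _ in range(6)]
for r, c in [(1, 1), (1, 4), (4, 1), (4, 4)]:
    frame[r][c] = 255
cropped = crop_to_icp(frame)
assert len(cropped) == 4 and len(cropped[0]) == 4
```

Only the cropped region then needs to be handed to the decode stage, which is the efficiency the ICP-based cropping process is after.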
  • Fig. 38A is a first perspective view of an alternative housing design for use with the unitary PLITM-based object identification and attribute acquisition subsystem of the present invention;
  • Fig. 38B1 is a schematic representation of a first illustrative embodiment of the bioptical PLIIM-based product dimensioning, analysis and identification system of the present invention, comprising a pair of PLIIM-based object identification and attribute acquisition subsystems, wherein each PLIIM-based subsystem employs visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB), and a 1-D (linear-type) CCD image detection array within the compact system housing to capture images of objects (e.g. produce) that are processed in order to determine the shape/geometry, dimensions and color of such products in diverse retail shopping environments;
  • Fig. 38B2 is a schematic representation of the bioptical PLIIM-based product dimensioning, analysis and identification system, showing its PLIIM-based subsystems and 2-D scanning volume in greater detail;
  • Fig. 38C1 is a schematic representation of a second illustrative embodiment of the bioptical PLIIM-based product dimensioning, analysis and identification system of the present invention, comprising a pair of PLIIM-based object identification and attribute acquisition subsystems, wherein each PLIIM-based subsystem employs visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB), and a 2-D (area-type) CCD image detection array within the compact system housing to capture images of objects (e.g. produce) that are processed in order to determine the shape/geometry, dimensions and color of such products in diverse retail shopping environments;
  • Fig. 38C2 is a schematic representation of the bioptical PLIIM-based product dimensioning, analysis and identification system of Fig. 38C1, showing its PLIIM-based subsystems and 3-D scanning volume in greater detail;
  • Fig. 39A is a perspective view of a first illustrative embodiment of the PLIIM-based hand-supportable linear imager of the present invention which contains within its housing, (1) a PLIIM-based image capture and processing engine comprising a dual-VLD PLIA and a 1-D (i.e. linear) image detection array with vertically-elongated image detection elements and configured within an optical assembly that operates in accordance with a method of speckle-pattern noise reduction of the present invention, (2) an LCD display panel for displaying images captured by said engine and information provided by a host computer system or other information supplying device, and (3) a manual data entry keypad for manually entering data into the imager during diverse types of information-related transactions supported by the PLIIM-based hand-supportable imager;
  • Fig. 39B is a schematic presentation of a transportable PLIIM-based 3-D digitization device ("3-D digitizer") of the present invention capable of producing 3-D digitized data models of scanned objects, for viewing on an LCD viewfinder integrated with the device housing (or on the display panel of an external computer graphics workstation), wherein the object under analysis is controllably rotated through a single planar laser illumination beam (PLIB) and a single amplitude modulated (AM) laser scanning beam generated by the 3-D digitization device so as to optically scan the object and automatically capture linear images and range-profile maps thereof relative to a coordinate reference system symbolically embodied within the 3-D digitization device, for subsequent reconstruction therein using computer-assisted tomographic (CAT) techniques to generate a 3-D digitized data model of the object for display, viewing and use in diverse applications;
  • Fig. 39C is a schematic presentation of a transportable PLIIM-based 3-D digitization device ("3-D digitizer") of the present invention capable of producing 3-D digitized data models of scanned objects, for viewing on an LCD viewfinder integrated with the device housing (or on the display panel of an external computer graphics workstation), wherein a single planar laser illumination beam (PLIB) and a single amplitude modulated (AM) laser scanning beam are generated by the 3-D digitization device and automatically swept through the 3-D scanning volume in which the object under analysis resides so as to optically scan the object and automatically capture linear images and range-profile maps thereof relative to a coordinate reference system symbolically embodied within the 3-D digitization device, for subsequent reconstruction therein using computer-assisted tomographic (CAT) techniques to generate a 3-D digitized data model of the object for display, viewing and use in diverse applications;
  • Fig. 39D is a perspective view of a "vertical-type" 3-D PLIIM-based CAT scanning system of the present invention capable of producing 3-D geometrical models of human beings, animals, and other objects, for viewing on a computer graphics workstation, wherein three orthogonal planar laser illumination beams (PLIBs) and three orthogonal amplitude modulated (AM) laser scanning beams are controllably transported vertically through the 3-D scanning volume disposed above the support platform of the system so as to optically scan the object under analysis and capture linear images and range-profile maps thereof relative to a global coordinate reference system, for subsequent reconstruction in the computer workstation using computer-assisted tomographic (CAT) techniques to generate a 3-D geometrical model of the object;
  • Fig. 40 is a perspective view of the digital image capture and processing engine of the present invention, showing the projection of a visible illumination-based Image Cropping Pattern (ICP) within the field of view (FOV) of the engine, during object illumination and image capture operations;
  • Fig. 41 is a close-up, perspective view of the digital image capture and processing engine of the present invention depicted in Fig. 40, showing the assembly of an illumination/targeting optics panel, an illumination board, a lens barrel assembly, a camera housing, and a camera board, into an ultra-compact form factor offering the advantages of lightweight construction, excellent thermal management, and exceptional image capture performance;
  • Fig. 42 is a side perspective view of the digital image capture and processing engine of Fig. 40, showing how the various components are arranged with respect to each other;
  • Fig. 43 is an elevated front view of the digital image capture and processing engine of Fig. 40, taken along the optical axis of its image formation optics;
  • Fig. 44 is a bottom view of the digital image capture and processing engine of Fig. 40, showing the bottom of its mounting base for use in mounting the engine within diverse host systems;
  • Fig. 45 is a top view of the digital image capture and processing engine of Fig. 40;
  • Fig. 46 is a first side view of the digital image capture and processing engine of Fig. 40;
  • Fig. 47 is a second partially cut-away side view of the digital image capture and processing engine taken in Fig. 46, revealing the light conductive pipe used to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem, and direct it to the photo-detector associated with the Automatic Light Exposure Measurement and Illumination Control Subsystem;
  • Fig. 48 is a first cross-sectional view of the digital image capture and processing engine taken in Fig. 46, revealing the light conductive pipe used to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem;
  • Fig. 49 is a second cross-sectional view of the digital image capture and processing engine taken in Fig. 46, revealing the light conductive pipe used to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem;
  • Fig. 50 is an exploded, perspective view of the digital image capture and processing engine of Fig. 40, showing how the illumination/targeting optics panel, the illumination board, the lens barrel assembly, the camera housing, the camera board and its assembly pins are arranged and assembled with respect to each other in accordance with the principles of the present invention;
  • Fig. 51 is a perspective view of the illumination/targeting optics panel, the illumination board and the camera board of the digital image capture and processing engine of Fig. 40, shown assembled, with the lens barrel assembly and the camera housing removed for clarity of illustration;
  • Fig. 52 is a perspective view of the illumination/targeting optics panel and the illumination board of the engine of the present invention assembled together as a subassembly using the assembly pins;
  • Fig. 53 is a perspective view of the subassembly of Fig. 52 arranged in relation to the lens barrel assembly, the camera housing and the camera board of the engine of the present invention, and showing how these system components are assembled together to produce the digital image capture and processing engine of Fig. 40;
  • Fig. 54 is a schematic block diagram representative of a system design for the digital image capture and processing engine illustrated in Figs. 40 through 53, wherein the system design is shown comprising (1) a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled, (2) an LED-Based Illumination Subsystem for producing a wide area field of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem during the image capture mode, so that only light transmitted from the LED-Based Illumination Subsystem and reflected from the illuminated object and transmitted through a narrow-band transmission-type optical filter realized within the hand-supportable housing is detected, and an Image Cropping Pattern Generator for generating a visible illumination-based Image Cropping Pattern (ICP) projected within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem, (3) an IR-based object presence and range detection subsystem for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem, (4) an Automatic Light Exposure Measurement and Illumination Control Subsystem for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem during the image capture mode, (5) an Image Capturing and Buffering Subsystem for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem, (6) an Image Processing and Cropped Image Locating Module for processing captured and buffered images to locate the image region corresponding to the region defined by the Image Cropping Pattern;
  • Fig. 55A1 is a perspective view of an alternative illustrative embodiment of the digital image capture and processing engine shown in Figs. 40 through 53, adapted for POS applications and reconfigured so that the illumination/aiming subassembly shown in Fig. 52 is mounted adjacent the light transmission window of the engine housing, whereas the remaining subassembly is mounted relative to the bottom of the engine housing so that the optical axis of the camera lens is parallel with the light transmission aperture, and a field of view (FOV) folding mirror is mounted beneath the illumination/aiming subassembly for directing the FOV of the system out through the central aperture formed in the illumination/aiming subassembly;
  • Fig. 55A2 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in Fig. 55A1, wherein the system design is similar to that shown in Fig. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, involving the real-time exposure quality analysis of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E;
  • Fig. 55B1 is a perspective view of an automatic imaging-based bar code symbol reading system of the present invention supporting a presentation-type mode of operation using wide-area illumination and video image capture and processing techniques, and employing the general engine design shown in Fig. 55A1;
  • Fig. 55B2 is a cross-sectional view of the system shown in Fig. 55B1;
  • Fig. 55B3 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in Fig. 55B1, wherein the system design is similar to that shown in Fig. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, performing the real-time exposure quality analysis of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E;
  • Fig. 55C1 is a perspective view of an automatic imaging-based bar code symbol reading system of the present invention supporting a pass-through mode of operation using narrow-area illumination and video image capture and processing techniques, as well as a presentation-type mode of operation using wide-area illumination and video image capture and processing techniques;
  • Fig. 55C2 is a schematic representation illustrating the system of Fig. 55C1 operated in its Pass-Through Mode of system operation;
  • Fig. 55C3 is a schematic representation illustrating the system of Fig. 55C1 operated in its Presentation Mode of system operation;
  • Fig. 55C4 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in Figs. 55C1 and 55C2, wherein the system design is similar to that shown in Fig. 2A1, except for the following differences: (1) the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, carrying out real-time quality analysis of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E; (2) the narrow-area field of illumination and image capture is oriented in the vertical direction with respect to the counter surface of the POS environment, to support the Pass-Through Mode of the system, as illustrated in Fig. 55C2; and (3) the IR-based object presence and range detection system employed in Fig. 55A2 is replaced with an automatic IR-based object presence and direction detection subsystem which comprises four independent IR-based object presence and direction detection channels;
  • Fig. 55C5 is a schematic block diagram of the automatic IR-based object presence and direction detection subsystem employed in the bar code reading system illustrated in Figs. 55C1 and 55C4, showing four independent IR-based object presence and direction detection channels which automatically generate activation control signals for four orthogonal directions within the FOV of the system, which are received and processed by a signal analyzer and control logic block;
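The signal analyzer and control logic block of Fig. 55C5 can be illustrated with a small sketch: each IR channel reports an activation time, and the direction of an object passing through the FOV is inferred from which of a pair of opposing channels fired first. The channel names and the decision rule below are assumptions used only to illustrate the idea, not details from the specification:

```python
# Hedged sketch of direction inference from IR channel activation order.
def object_direction(activations):
    """activations: dict mapping channel name -> activation time
    (None if the channel never fired). Returns 'left-to-right',
    'right-to-left', or None when the direction cannot be determined."""
    left = activations.get("left")
    right = activations.get("right")
    if left is None or right is None or left == right:
        return None                     # not enough evidence to decide
    return "left-to-right" if left < right else "right-to-left"

# The left channel fires 30ms before the right one: a left-to-right pass.
assert object_direction({"left": 10, "right": 40}) == "left-to-right"
assert object_direction({"left": 40, "right": 10}) == "right-to-left"
```

A full four-channel analyzer would apply the same pairwise comparison to the near/far channels as well, yielding one of four orthogonal pass directions.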
  • Fig. 56A is a perspective view of a first illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system, employing the digital image capture and processing engine shown in Fig. 55;
  • Fig. 56B is a perspective view of a second illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system, employing the digital image capture and processing engine shown in Fig. 55;
  • Fig. 56C is a perspective view of a third illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system, employing the digital image capture and processing engine shown in Fig. 55;
  • Fig. 57 is a perspective view of a price lookup unit (PLU) system employing a digital image capture and processing subsystem of the present invention for identifying bar-coded consumer products in retail store environments, and displaying the price thereof on the LCD panel integrated in the system;
  • Fig. 58 is a high-level flow chart illustrating the steps involved in carrying out the method of the present invention, wherein the system behavior (i.e. features) of the imaging-based bar code symbol reading system of the present invention can be modified by the end-user, within a set of manufacturer-defined constraints (i.e. imposed on modifiable features and functions within features), by the end-user developing, installing/deploying and configuring "plug-in modules" (i.e. libraries) for any modifiable task within the Application Layer of the system, so as to allow the end user to flexibly modify and/or extend standard (i.e. prespecified) features and functionalities of the system, and thus satisfy customized end-user application requirements, but without requiring detailed knowledge about the hardware platform of the system, its communication with the environment, and/or its user interfaces.
  • Fig. 59 is an exemplary flow chart representation showing what operations are carried out when the "Modifiable" Main Task is (enabled and) executed within the Application Layer of the system;
  • Fig. 59A is an exemplary flow chart representation showing what operations are carried out when the system feature called "Image Preprocessing" is executed within the Image-Processing Based Bar Code Symbol Reading Subsystem software module in the Application Layer of the system;
  • Fig. 59B is an exemplary flow chart representation showing what operations are carried out when the system feature called "Image Processing and Bar Code Decoding" is executed within the Modifiable Main Task software module in the Application Layer of the system;
  • Fig. 59C is an exemplary flow chart representation showing what operations are carried out when the system feature called "Data Output Procedure" is executed within the Modifiable Main Task in the Application Layer of the system;
  • Fig. 59C1 is an exemplary flow chart representation showing what operations are carried out when the system feature called "Data Formatting Procedure" is executed within the Data Output Procedure software module in the Application Layer of the system;
  • Fig. 59C2 is an exemplary flow chart representation showing what operations are carried out when the system feature called "Scanner Configuration Procedure" is executed within the Data Output Procedure software module in the Application Layer of the system.
  • the present invention addresses the shortcomings and drawbacks of prior art digital image capture and processing systems and devices, including laser and digital imaging-based bar code symbol readers, by providing a novel system architecture, platform and development environment which enables VARs, OEMs and others (i.e. other than the original system designers) to modify and/or extend the standard system features and functions of a very broad class of digital image capture and processing systems and devices, without requiring such third parties to possess detailed knowledge about the hardware platform of the system, its communications with the outside environment, and/or its user-related interfaces.
  • the digital image capture and processing system of the present invention 1000 employs a multi-tier software system architecture capable of supporting various subsystems providing numerous standard system features and functions that can be modified and/or extended using the innovative plug-in programming methods of the present invention.
  • such subsystems include: an object presence detection subsystem; an object range detection subsystem; an object velocity detection subsystem; an object dimensioning subsystem; a field of view (FOV) illumination subsystem; an imaging formation and detection (IFD) subsystem; a digital image processing subsystem; a sound indicator output subsystem; a visual indicator output subsystem; a power management subsystem; an image time/space stamping subsystem; a network (IP) address storage subsystem; a remote monitoring/servicing subsystem; an input/output subsystem; and a system control and/or coordination subsystem, generally integrated as shown.
  • Triggering Feature i.e. Trigger Event Generation: Object Presence Detection Subsystem
  • IR Object Presence Detection e.g. ON, OFF
  • Manual Triggering e.g. ON, OFF
  • Object Range Detection Feature Object Range Detection Subsystem Standard System Functions:
  • IR-Based Long/Short Range Detection e.g. ON, OFF
  • IR-Based Quantized/Incremental Range Detection e.g. ON, OFF
  • Object Velocity Detection Feature Object Velocity Detection Subsystem Standard System Functions:
  • LIDAR-Based Object Velocity Detection e.g. ON, OFF
  • IR-PULSE-DOPPLER Object Velocity Detection e.g. ON, OFF
  • Object Dimensioning Feature Object Dimensioning Subsystem Standard System Functions:
  • LIDAR-based Object Dimensioning e.g. ON or OFF
  • Structured-Laser Light Object Dimensioning e.g. ON or OFF
  • Illumination Mode e.g. Ambient/OFF, LED Continuous, and LED Strobe/Flash
  • Illumination Field Type e.g. Narrow-Area Near-Field Illumination, Wide-Area Far-Field Illumination,
  • Imaging Formation and Detection Feature Imaging Formation and Detection (IFD) Subsystem
  • Image Capture Mode e.g. Narrow-Area Image Capture Mode, Wide-Area Image Capture Mode
  • Image Capture Control e.g. Single Frame, Video Frames
  • Exposure Time For Each Block Of Imaging Pixels Within The Image Sensing Array (e.g. programmable in increments of milliseconds)
  • Field Of View Marking e.g. One Dot Pattern; Two Dot Pattern; Four Dot Pattern; Visible Line Pattern;
  • Image Cropping Pattern on Image Sensing Array e.g. x1,y1,x2,y2,x3,y3,x4,y4
  • Image frames e.g. digital filter 1, digital filter 2, ... digital filter n
  • Post-Processing e.g. Digital Data Filter 1 , Digital Data Filter 2, etc.
  • Sound Pitch e.g. freq1, freq2, freq3, ... sound1, ... soundN
  • Visual Indicator Output Feature Visual Indicator Output Subsystem Standard System Functions:
  • Indicator Brightness e.g. High, Low, Medium Brightness
  • Indicator Color e.g. red, green, yellow, blue, white
  • GPS-Based Time/Space Stamping (e.g. ON, OFF)
  • Network Server Time Assignment (e.g. ON, OFF)
  • IP Address Storage Feature IP Address Storage Subsystem
  • Remote Monitoring/Servicing Feature Remote Monitoring/Servicing Subsystem
  • TCP/IP Connection (e.g. ON, OFF)
  • SNMP Agent e.g. ACTIVE or DEACTIVE
  • Output Image File Formats e.g. JPG/EXIF, TIFF, PICT, PDF, etc.
  • Output Video File Formats e.g. MPEG, AVI, etc.
  • LCD Graphical Display
  • the digital image capture and processing system of the present invention 1000 can be implemented using a digital camera board and a printed circuit (PC) board that are interfaced together.
  • the digital image capture and processing system of the present invention 1000 can also be implemented using a single hybrid digital camera/PC board, as shown.
  • the digital image capture and processing system of the present invention can be integrated or embodied within third-party products, such as, for example, but not limited to, image-processing based bar code symbol reading systems, OCR systems, object recognition systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 2D or 3D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, and personal identification systems, and the like.
  • the digital image capture and processing system of the present invention has a set of standard features and functions as described above, and a set of custom features and functionalities that satisfy customized end-user application requirements, which typically aim to modify and/or extend such standard system features and functions for particular applications at hand.
  • the digital image capture and processing system of the present invention (regardless of the third-party product into which the system is integrated or embodied), generally comprises: a digital camera subsystem for projecting a field of view (FOV) upon an object to be imaged in said FOV, and detecting imaged light reflected off the object during illumination operations in an image capture mode in which one or more digital images of the object are formed and detected by said digital camera subsystem; a digital image processing subsystem for processing digital images and producing raw or processed output data or recognizing or acquiring information graphically represented therein, and producing output data representative of the recognized information; an input/output subsystem for transmitting said output data to an external host system or other information receiving or responding device; a system control system for controlling and/or coordinating the operation of the subsystems above; and a computing platform for supporting the implementation of one or more of the subsystems above, and the features and functions of the digital image capture and processing system.
  • the computing platform includes (i) memory for storing pieces of original product code written by the original designers of the digital image capture and processing system, and (ii) a microprocessor for running one or more Applications by calling and executing pieces of said original product code in a particular sequence, so as to support a set of standard features and functions which characterize a standard behavior of the digital image capture and processing system.
  • these pieces of original product code have a set of place holders into which third-party product code can be inserted or plugged by third parties, including value-added resellers (VARs), original equipment manufacturers (OEMs), and also end-users of the digital image capture and processing system.
  • one or more pieces of third- party code are inserted or plugged into the set of place holders, and operate to extend the standard features and functions of the digital image capture and processing system, and modify the standard behavior thereof into a custom behavior for the digital image capture and processing system.
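The place-holder mechanism described above can be sketched in miniature as follows. All identifiers (PlaceHolder, install, format_output) are illustrative assumptions for this sketch, not names from the actual product code:

```python
# Hypothetical sketch of the "place holder" plug-in mechanism: a point in the
# original product code where third-party code may be inserted to modify the
# standard behavior without permanently altering it.

class PlaceHolder:
    def __init__(self, name, standard_function):
        self.name = name
        self.standard = standard_function   # behavior shipped by the original designers
        self.plugin = None                  # optionally installed third-party code

    def install(self, plugin_function):
        # A VAR, OEM, or end-user plugs custom code into the place holder.
        self.plugin = plugin_function

    def __call__(self, *args):
        # Run the plug-in if one is installed; otherwise fall back to the
        # standard product code, leaving standard behavior intact.
        return (self.plugin or self.standard)(*args)

# Standard data-formatting behavior: pass symbol character data through unchanged.
format_output = PlaceHolder("data_formatting", lambda data: data)

# A third party extends the standard behavior for a custom application:
format_output.install(lambda data: "<symbol>" + data + "</symbol>")
```

Removing the plug-in (setting it back to None in this sketch) restores the standard behavior, mirroring the non-permanent nature of the modification described in the text.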
  • the digital image capture and processing system will further comprise a housing having a light transmission window, wherein the FOV is projected through the light transmission window and upon an object to be imaged in the FOV.
  • these pieces of original product code as well as third-party product code are maintained in one or more libraries supported in the memory structure of the computing platform.
  • such memory comprises a memory architecture having different kinds of memory, each having a different access speed and performance characteristics.
  • the end-user can write such pieces of third-party code (i.e. plug-ins) according to specifications set by the original system designers, and these pieces of custom code can be plugged into the place holders, so as to modify and extend the features and functions of the digital image capture and processing system (or third-party product into which the system is integrated or embodied), and modify the standard behavior of the digital image capture and processing system into a custom behavior for the digital image capture and processing system, without permanently modifying the standard features and functions of the digital image capture and processing system.
  • the digital camera system comprises: a digital image formation and detection subsystem having (i) image formation optics for projecting the FOV through a light transmission window and upon the object to be imaged in the FOV, and (ii) an image sensing array for detecting imaged light reflected off the object during illumination operations in an image capture mode in which sensor elements in the image sensing array are enabled so as to detect one or more digital images of the object formed on the image sensing array; an illumination subsystem having an illumination array for producing and projecting a field of illumination through the light transmission window and within the FOV during the image capture mode; and an image capturing and buffering subsystem for capturing and buffering these digital images detected by the image formation and detection subsystem.
  • the image sensing array can be realized by a digital image sensing structure selected from the group consisting of an area-type image sensing array, and a linear-type image sensing array.
  • the memory employed in the computing platform of the system maintains system parameters used to configure the functions of the digital image capture and processing system.
  • the memory comprises a memory architecture that supports a three-tier modular software architecture characterized by an Operating System (OS) layer, a System CORE (SCORE) layer, and an Application layer, and responsive to the generation of a triggering event within said digital-imaging based code symbol reading system.
  • the OS layer includes one or more software modules selected from the group consisting of an OS kernel module, an OS file system module, and device driver modules.
  • the SCORE layer includes one or more software modules selected from the group consisting of a tasks manager module, an events dispatcher module, an input/output manager module, a user commands manager module, a timer subsystem module, an input/output subsystem module and a memory control subsystem module.
  • the application layer includes one or more software modules selected from the group consisting of a code symbol decoding module, a function programming module, an application events manager module, a user commands table module, and a command handler module.
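The three-tier arrangement enumerated above can be summarized, purely for illustration, as a table of layers and their modules. The module names are taken from the text; the dictionary layout and lookup helper are assumptions of this sketch:

```python
# Illustrative map of the three-tier modular software architecture:
# OS layer, System CORE (SCORE) layer, and Application layer.

SOFTWARE_ARCHITECTURE = {
    "OS": ["os_kernel", "os_file_system", "device_drivers"],
    "SCORE": ["tasks_manager", "events_dispatcher", "input_output_manager",
              "user_commands_manager", "timer_subsystem",
              "input_output_subsystem", "memory_control_subsystem"],
    "Application": ["code_symbol_decoding", "function_programming",
                    "application_events_manager", "user_commands_table",
                    "command_handler"],
}

def layer_of(module):
    """Return the layer a given software module belongs to, or None."""
    for layer, modules in SOFTWARE_ARCHITECTURE.items():
        if module in modules:
            return layer
    return None
```

In this scheme, end-user plug-ins target only modules in the Application layer, while the OS and SCORE layers remain under the control of the original system designers.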
  • the field of illumination projected from the illumination subsystem can be narrow-band illumination produced from an array of light emitting diodes (LEDs).
  • the digital image processing subsystem is typically adapted to process captured digital images so as to read one or more code symbols graphically represented in the digital images, and to produce output data in the form of symbol character data representative of the one or more code symbols so read.
  • Each code symbol can be a bar code symbol selected from the group consisting of a 1D bar code symbol, a 2D bar code symbol, and a data matrix type code symbol structure.
  • the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention 1 is shown in detail comprising a hand-supportable housing 2 having a handle portion 2A and a head portion 2B that is provided with a light transmission window 3 with a high-pass (red-wavelength reflecting) optical filter element 4A having light transmission characteristics set forth in Fig. 5A2, in the illustrative embodiment.
  • high-pass optical filter element 4A cooperates with an interiorly mounted low-pass optical filter element 4B characterized in Fig.
  • the hand-supportable housing 2 of the illustrative embodiment comprises: left and right housing handle halves 2A1 and 2A2; a foot-like structure 2A3 which is mounted between the handle halves 2A1 and 2A2; a trigger switch structure 2C which snap fits within and pivots within a pair of spaced apart apertures 2D1 and 2D2 provided in the housing halves; a light transmission window panel 5 through which light transmission window 3 is formed and supported within a recess formed by handle halves 2A1 and 2A2 when they are brought together, and which supports all LED illumination arrays provided by the system; an optical bench 6 for supporting electro-optical components and operably connected to an orthogonally-mounted PC board 7 which is mounted within the handle housing halves; a top housing portion 2B1 for connection with the housing handle halves 2A1 and 2A2 and enclosing the head portion of the housing; light pipe lens element 7 for mounting over an array of light emitting diodes (LEDs)
  • the form factor of the hand-supportable housing might be different.
  • the housing need not even be hand-supportable, but rather might be designed for stationary support on a desktop or countertop surface, or for use in a commercial or industrial application.
  • the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device 1 of the illustrative embodiment comprises: an IR-based Object Presence and Range Detection Subsystem 12; a Multi-Mode Area-type Image Formation and Detection (i.e. Camera) Subsystem 13 having a narrow-area mode of image capture, a near-field wide-area mode of image capture, and a far-field wide-area mode of image capture; a Multi-Mode LED-Based Illumination Subsystem 14 having a narrow-area mode of illumination, a near-field wide-area mode of illumination, and a far-field wide-area mode of illumination; an Automatic Light Exposure Measurement and Illumination Control Subsystem 15; an Image Capturing and Buffering Subsystem 16; a Multi-Mode Image-Processing Bar Code Symbol Reading Subsystem 17 having five modes of image-processing based bar code symbol reading indicated in Fig.
  • the primary function of the IR-based Object Presence and Range Detection Subsystem 12 is to automatically produce an IR-based object detection field 20 within the FOV of the Multi-Mode Image Formation and Detection Subsystem 13, detect the presence of an object within predetermined regions of the object detection field (20A, 20B), and generate control activation signals A1 which are supplied to the System Control Subsystem 19 for indicating when and where an object is detected within the object detection field of the system.
  • the Multi-Mode Image Formation And Detection (i.e. Camera) Subsystem 13 has image formation (camera) optics 21 for producing a field of view (FOV) 23 upon an object to be imaged and a CMOS area-image sensing array 22 for detecting imaged light reflected off the object during illumination and image acquisition/capture operations.
  • the primary function of the Multi-Mode LED-Based Illumination Subsystem 14 is to produce a narrow-area illumination field 24, a near-field wide-area illumination field 25, and a far-field wide-area illumination field 26, each having a narrow optical-bandwidth and confined within the FOV of the Multi-Mode Image Formation And Detection Subsystem 13 during narrow-area and wide-area modes of imaging, respectively.
  • This arrangement is designed to ensure that only light transmitted from the Multi-Mode Illumination Subsystem 14 and reflected from the illuminated object is ultimately transmitted through a narrow-band transmission-type optical filter subsystem 4, realized by (1) a high-pass (i.e. red-wavelength reflecting) optical filter element 4A mounted at the light transmission window 3, and (2) a low-pass optical filter element 4B mounted before the image sensing array 22.
  • Fig. 5A4 sets forth the resulting composite transmission characteristics of the narrow-band transmission spectral filter subsystem 4, plotted against the spectral characteristics of the emission from the LED illumination arrays employed in the Multi-Mode Illumination Subsystem 14.
  • the primary function of the narrow-band integrated optical filter subsystem 4 is to ensure that the CMOS image sensing array 22 only receives the narrow-band visible illumination transmitted by the three sets of LED-based illumination arrays 27, 28 and 29 driven by LED driver circuitry 30 associated with the Multi-Mode Illumination Subsystem 14, whereas all other components of ambient light collected by the light collection optics are substantially rejected at the image sensing array 22, thereby providing improved SNR thereat, thus improving the performance of the system.
  • the primary function of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is twofold: (1) to measure, in real-time, the power density [joules/cm] of photonic energy (i.e. light) collected by the optics of the system at about its image sensing array 22, and generate Auto-Exposure Control Signals indicating the amount of exposure required for good image formation and detection; and (2) in combination with the Illumination Array Selection Control Signal provided by the System Control Subsystem 19, to automatically drive and control the output power of selected LED arrays 27, 28 and/or 29 in the Multi-Mode Illumination Subsystem, so that objects within the FOV of the system are optimally exposed to LED-based illumination and optimal images are formed and detected at the image sensing array 22.
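As a rough illustration of this twofold function, the following sketch derives an exposure setting and an LED drive level from a measured light-energy value. The thresholds, gains and clamping limits are invented for the sketch and are not the device's actual control law:

```python
# Hypothetical auto-exposure sketch: (1) measure collected light energy and
# derive an exposure time; (2) derive a drive level for the selected LED array.

def auto_exposure_control(measured_energy, target_energy=100.0,
                          max_exposure_ms=30.0):
    """Return (exposure_ms, led_drive) clamped to illustrative safe ranges."""
    if measured_energy <= 0:
        # Dark scene: longest allowed exposure, full LED output power.
        return max_exposure_ms, 1.0
    # Exposure scales with the light deficit, clamped to the sensor maximum.
    exposure_ms = min(max_exposure_ms, 10.0 * target_energy / measured_energy)
    # Drive the LED array harder when the scene is under-illuminated.
    led_drive = min(1.0, target_energy / measured_energy)
    return exposure_ms, led_drive
```

In the real subsystem these two outputs correspond to the Auto-Exposure Control Signals and the controlled output power of LED arrays 27, 28 and/or 29.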
  • the primary function of the Image Capturing and Buffering Subsystem 16 is to (1) detect the entire 2-D image focused onto the 2D image sensing array 22 by the image formation optics 21 of the system, (2) generate a frame of digital pixel data 31 for either a selected region of interest of the captured image frame, or for the entire detected image, and then (3) buffer each frame of image data as it is captured.
  • a single 2D image frame (31) is captured during each image capture and processing cycle, or during a particular stage of a processing cycle, so as to eliminate the problems associated with image frame overwriting, and synchronization of image capture and decoding processes, as addressed in US Patent Nos. 5,932,862 and 5,942,741, assigned to Welch Allyn, and incorporated herein by reference.
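The single-frame-per-cycle discipline can be illustrated with a minimal buffer that refuses to overwrite an unprocessed frame. This is a hypothetical sketch; the actual subsystem is realized in hardware and firmware:

```python
# Sketch of a one-frame buffer enforcing the capture/decode synchronization
# described above: a new frame cannot be captured until the previous frame
# has been handed to the decoder and released.

class FrameBuffer:
    def __init__(self):
        self.frame = None

    def capture(self, new_frame):
        if self.frame is not None:
            # Overwrite guard: the decoder has not yet consumed this frame.
            raise RuntimeError("previous frame not yet processed")
        self.frame = new_frame
        return True

    def process(self, decoder):
        result = decoder(self.frame)
        self.frame = None        # buffer released; the next capture may proceed
        return result
```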
  • the primary function of the Multi-Mode Imaging-Based Bar Code Symbol Reading Subsystem 17 is to process images that have been captured and buffered by the Image Capturing and Buffering Subsystem 16, during both narrow-area and wide-area illumination modes of system operation.
  • this image processing operation includes the image-based bar code decoding methods illustrated in Figs. 14 through 25, and described in detail hereinafter.
  • the primary function of the Input/Output Subsystem 18 is to support standard and/or proprietary communication interfaces with external host systems and devices, and output processed image data and the like to such external host systems or devices by way of such interfaces. Examples of such interfaces, and technology for implementing the same, are given in US Patent No. 6,619,549, incorporated herein by reference in its entirety.
  • the primary function of the System Control Subsystem 19 is to provide some predetermined degree of control or management signaling services to each subsystem component integrated, as shown. While this subsystem can be implemented by a programmed microprocessor, in the illustrative embodiment it is implemented by the three-tier software architecture supported on the computing platform shown in Fig. 2M, and as represented in Figs. 11A through 13L, and described in detail hereinafter.
  • the primary function of the manually-activatable Trigger Switch 2C integrated with the hand- supportable housing is to enable the user to generate a control activation signal upon manually depressing the Trigger Switch 2C, and to provide this control activation signal to the System Control Subsystem 19 for use in carrying out its complex system and subsystem control operations, described in detail herein.
  • the primary function of the System Mode Configuration Parameter Table 70 is to store (in nonvolatile/persistent memory) a set of configuration parameters for each of the available Programmable Modes of System Operation specified in the Programmable Mode of Operation Table shown in Figs. 26A and 26B, and which can be read and used by the System Control Subsystem 19 as required during its complex operations.
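A simplified sketch of such a configuration parameter table, keyed by Programmable Mode of System Operation, is shown below. JSON serialization stands in for the nonvolatile/persistent flash storage of the real system, and all parameter names are hypothetical:

```python
import json

# Sketch of the System Mode Configuration Parameter Table: one parameter set
# per Programmable Mode of System Operation, persisted so that the System
# Control Subsystem can read it back as required during operation.

class SystemModeConfigTable:
    def __init__(self):
        self.table = {}     # mode number -> {parameter: value}

    def set_mode(self, mode, **params):
        self.table[mode] = params

    def get(self, mode, parameter):
        return self.table[mode][parameter]

    def persist(self):
        # Stand-in for a write to nonvolatile flash memory.
        return json.dumps(self.table)

    def restore(self, blob):
        # JSON object keys are strings; convert back to mode numbers.
        self.table = {int(k): v for k, v in json.loads(blob).items()}
```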
  • Fig. 2B shows a schematic diagram of a system implementation for the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device 1 illustrated in Figs. 2A through 2L.
  • the bar code symbol reading device is realized using a number of hardware components comprising: an illumination board 33 carrying components realizing electronic functions performed by the LED-Based Multi-Mode Illumination Subsystem 14 and Automatic Light Exposure Measurement And Illumination Control Subsystem 15; a CMOS camera board 34 carrying a high resolution (1280 x 1024, 7-bit, 6 micron pixel size) CMOS image sensing array 22 running at a 25 MHz master clock, at 7 frames/second at 1280 x 1024 resolution, with randomly accessible region of interest (ROI) window capabilities, realizing electronic functions performed by the Multi-Mode Image Formation and Detection Subsystem 13; a CPU board 35 (i.e. computing platform) including (i) an Intel Sabinal 32-Bit Microprocessor PXA210 36 running at 200 MHz with 1.0 V core voltage and a 16-bit 100 MHz external bus speed, (ii) an expandable (e.g. 7+ megabyte) Intel J3 Asynchronous 16-bit Flash memory 37, (iii) 16 Megabytes of 100 MHz SDRAM 38, (iv) a Xilinx Spartan II FPGA FIFO 39 running at a 50 MHz clock frequency and 60 MB/sec data rate, configured to control the camera timings and drive an image acquisition process, (v) a multimedia card socket 40, for realizing the other subsystems of the system, (vi) a power management module 41 for the MCU, adjustable by the I2C bus, and (vii) a pair of UARTs 42A and 42B (one for an IRDA port and one for a JTAG port); an interface board 43 for realizing the functions performed by the I/O subsystem 18; and an IR-based object presence and range detection circuit for realizing Subsystem 12.
  • the image formation optics 21 supported by the bar code reader provides a field of view of 103 mm at the nominal focal distance to the target of approximately 70 mm from the edge of the bar code reader.
  • the minimal size of the field of view (FOV) is 62 mm at the nominal focal distance to the target of approximately 10 mm.
  • the distance in Fig. 4B is given from the position of the image sensing array 22, which is located inside the bar code symbol reader approximately 80 mm from the edge.
  • the depth of field of the image formation optics varies from approximately 69 mm for the bar codes with resolution of 5 mils per narrow module, to 181 mm for the bar codes with resolution of 13 mils per narrow module.
  • the Multi-Mode Illumination Subsystem 14 is designed to cover the optical field of view (FOV) 23 of the bar code symbol reader with sufficient illumination to generate high-contrast images of bar codes located at both short and long distances from the imaging window.
  • the illumination subsystem also provides a narrow-area (thin height) targeting beam 24 having dual purposes: (a) to indicate to the user where the optical view of the reader is; and (b) to allow a quick scan of just a few lines of the image and attempt a super-fast bar code decoding if the bar code is aligned properly.
  • the entire field of view is illuminated with a wide-area illumination field 25 or 26 and the image of the entire field of view is acquired by Image Capture and Buffering Subsystem 16 and processed by Multi-Mode Bar Code Symbol Reading Subsystem 17, to ensure reading of a bar code symbol presented therein regardless of its orientation.
  • the interface board 43 employed within the bar code symbol reader provides the hardware communication interfaces for the bar code symbol reader to communicate with the outside world.
  • the interfaces implemented in system will typically include RS232, keyboard wedge, and/or USB, or some combination of the above, as well as others required or demanded by the particular application at hand.
  • the Multi-Mode Image Formation And Detection (IFD) Subsystem 13 has a narrow-area image capture mode (i.e. where only a few central rows of pixels about the center of the image sensing array are enabled) and a wide-area image capture mode of operation (i.e. where all pixels in the image sensing array are enabled).
  • the CMOS image sensing array 22 in the Image Formation and Detection Subsystem 13 has image formation optics 21 which provides the image sensing array with a field of view (FOV) 23 on objects to be illuminated and imaged. As shown, this FOV is illuminated by the Multi-Mode Illumination Subsystem 14 integrated within the bar code reader.
  • FOV field of view
  • the Multi-Mode Illumination Subsystem 14 includes three different LED-based illumination arrays 27, 28 and 29 mounted on the light transmission window panel 5, and arranged about the light transmission window 4A. Each illumination array is designed to illuminate a different portion of the FOV of the bar code reader during different modes of operation. During the narrow-area (linear) illumination mode of the Multi-Mode Illumination Subsystem 14, the central narrow-wide portion of the FOV indicated by 23 is illuminated by the narrow-area illumination array 27, shown in Fig. 3A.
  • during the near-field wide-area illumination mode of the Multi-Mode Illumination Subsystem 14, which is activated in response to the IR Object Presence and Range Detection Subsystem 12 detecting an object within the near-field portion of the FOV, the near-field wide-area portion of the FOV is illuminated by the near-field wide-area illumination array 28, shown in Fig. 3A.
  • during the far-field wide-area illumination mode of the Multi-Mode Illumination Subsystem 14, which is activated in response to the IR Object Presence and Range Detection Subsystem 12 detecting an object within the far-field portion of the FOV, the far-field wide-area portion of the FOV is illuminated by the far-field wide-area illumination array 29, shown in Fig. 3A.
  • the spatial relationships are shown between these fields of narrow-band illumination and the far and near field portions of the FOV of the Image Formation and Detection Subsystem 13.
  • the Multi-Mode LED-Based Illumination Subsystem 14 is shown transmitting visible narrow-band illumination through its narrow-band transmission-type optical filter subsystem 4, shown in Fig. 3C and integrated within the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device.
  • the narrow-band illumination from the Multi-Mode Illumination Subsystem 14 illuminates an object with the FOV of the image formation optics of the Image Formation and Detection Subsystem 13, and light rays reflected and scattered therefrom are transmitted through the high-pass and low-pass optical filters 4A and 4B and are ultimately focused onto image sensing array 22 to form a focused detected image thereupon, while all other components of ambient light are substantially rejected before reaching image detection at the image sensing array 22.
  • the red-wavelength reflecting high-pass optical filter element 4A is positioned at the imaging window of the device before the image formation optics 21, whereas the low-pass optical filter element 4B is disposed before the image sensing array 22 between the focusing lens elements of the image formation optics 21.
  • This forms the narrow-band optical filter subsystem 4, which is integrated within the bar code reader to ensure that the object within the FOV is imaged at the image sensing array 22 using only spectral components within the narrow band of illumination produced from Subsystem 14, while rejecting substantially all other components of ambient light outside this narrow range (e.g. 15 nm).
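Conceptually, the composite pass-band of subsystem 4 is the product of the high-pass element's and the low-pass element's transmission curves, as the following idealized sketch shows. The cutoff wavelengths are illustrative placeholders, not the device's actual values:

```python
# Idealized band-pass composition: a high-pass filter (element 4A, at the
# imaging window) multiplied by a low-pass filter (element 4B, before the
# image sensing array) passes only a narrow band of red wavelengths.

def high_pass(wavelength_nm, cutoff=620.0):
    """Ideal red-wavelength-passing element 4A (cutoff is a placeholder)."""
    return 1.0 if wavelength_nm >= cutoff else 0.0

def low_pass(wavelength_nm, cutoff=635.0):
    """Ideal low-pass element 4B (cutoff is a placeholder)."""
    return 1.0 if wavelength_nm <= cutoff else 0.0

def composite_transmission(wavelength_nm):
    # The composite curve is the product of the two elements' transmissions,
    # yielding a narrow (here 15 nm wide) pass-band.
    return high_pass(wavelength_nm) * low_pass(wavelength_nm)
```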
  • these lenses are held together within a lens holding assembly 45, as shown in Fig. 3E, and form an image formation subsystem arranged along the optical axis of the CMOS image sensing array 22 of the bar code reader.
  • the lens holding assembly 45 comprises: a barrel structure 45A1, 45A2 for holding lens elements 21A, 21B and 21C; and a base structure 45B for holding the image sensing array 22; wherein the assembly is configured so that the barrel structure 45A slides within the base structure 45B so as to focus the fixed-focus lens assembly during manufacture.
  • the lens holding assembly 45 and imaging sensing array 22 are mounted along an optical path defined along the central axis of the system.
  • the image sensing array 22 has, for example, a 1280x1024 pixel resolution (1/2" format), 6 micron pixel size, with randomly accessible region of interest (ROI) window capabilities. It is understood, though, that many other kinds of image sensing devices (e.g. CCD) can be used to practice the principles of the present invention disclosed herein, without departing from the scope or spirit of the present invention.
  • the LED-Based Multi-Mode Illumination Subsystem 14 comprises: narrow-area illumination array 27; near-field wide-area illumination array 28; and far-field wide-area illumination array 29.
  • the three fields of narrow-band illumination produced by the three illumination arrays of subsystem 14 are schematically depicted in Fig. 4A1.
  • as will be described hereinafter with reference to the figures, narrow-area illumination array 27 can be realized as two independently operable arrays, namely: a near-field narrow-area illumination array and a far-field narrow-area illumination array, which are activated when the target object is detected within the near and far fields, respectively, of the automatic IR-based Object Presence and Range Detection Subsystem 12 during wide-area imaging modes of operation.
  • the first illustrative embodiment of the present invention employs only a single-field narrow-area (linear) illumination array which is designed to illuminate over substantially the entire working range of the system, as shown in Fig. 4A1.
  • the narrow-area (linear) illumination array 27 includes two pairs of LED light sources 27A1 and 27A2 provided with cylindrical lenses 27B1 and 27B2, respectively, and mounted on left and right portions of the light transmission window panel 5.
  • the narrow-area (linear) illumination array 27 produces narrow-area illumination field 24 of narrow optical- bandwidth within the FOV of the system.
  • narrow-area illumination field 24 has a height less than 10 mm at the far field, creating the appearance of a substantially linear, or rather planar, illumination field.
  • the near-field wide-area illumination array 28 includes two sets of (flattop) LED light sources 28A1-28A6 and 28A7-28A13 without any lenses mounted on the top and bottom portions of the light transmission window panel 5, as shown in Fig. 4B.
  • the near-field wide-area illumination array 28 produces a near-field wide-area illumination field 25 of narrow optical-bandwidth within the FOV of the system.
  • the far-field wide-area illumination array 29 includes two sets of LED light sources 29A1-29A6 and 29A7-29A13 provided with spherical (i.e. plano-convex) lenses 29B1 -29B6 and 29B7-29B13, respectively, and mounted on the top and bottom portions of the light transmission window panel 5.
  • the far-field wide-area illumination array 29 produces a far- field wide-area illumination beam of narrow optical-bandwidth within the FOV of the system.
  • the narrow-area (linear) illumination field 24 extends from about 30 mm to about 200 mm within the working range of the system, and covers both the near and far fields of the system.
  • the near-field wide-area illumination field 25 extends from about 0 mm to about 100 mm within the working range of the system.
  • the far-field wide-area illumination field 26 extends from about 100 mm to about 200 mm within the working range of the system.
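The three working ranges above overlap, which makes illumination-array selection a simple range lookup. The sketch below is a hypothetical illustration (the names, and the assumption that detected distance alone drives the choice, are not from the specification, which also consults the programmed system mode):

```python
# Hypothetical sketch: determine which LED illumination arrays of
# Subsystem 14 can cover an object detected at a given working
# distance (mm), using the ranges stated for the illustrative
# embodiment (narrow-area: 30-200 mm; near-field wide-area: 0-100 mm;
# far-field wide-area: 100-200 mm).
FIELD_RANGES_MM = {
    "narrow_area_27": (30, 200),
    "near_field_wide_28": (0, 100),
    "far_field_wide_29": (100, 200),
}

def arrays_covering(distance_mm):
    """Return, sorted by name, the arrays whose field spans the distance."""
    return sorted(name for name, (lo, hi) in FIELD_RANGES_MM.items()
                  if lo <= distance_mm <= hi)
```

For example, an object detected at 60 mm falls within both the narrow-area and near-field wide-area fields, while one at 150 mm falls within the narrow-area and far-field wide-area fields.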
  • the Table shown in Fig. 4A2 specifies the geometrical properties and characteristics of each illumination mode supported by the Multi-Mode LED-based Illumination Subsystem 14 of the present invention.
  • the narrow-area illumination array 27 employed in the Multi-Mode LED-Based Illumination Subsystem 14 is optically designed to illuminate a thin area at the center of the field of view (FOV) of the imaging-based bar code symbol reader, measured from the boundary of the left side of the field of view to the boundary of its right side, as specified in Fig. 4A1.
  • the narrow-area illumination field 24 is automatically generated by the Multi-Mode LED- Based Illumination Subsystem 14 in response to the detection of an object within the object detection field of the automatic IR-based Object Presence and Range Detection Subsystem 12.
  • the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV of the Image Formation and Detection Subsystem 13 are spatially co-extensive and the object detection field spatially overlaps the FOV along the entire working distance of the imaging-based bar code symbol reader.
  • the narrow-area illumination field 24, produced in response to the detection of an object, serves a dual purpose: it provides a visual indication to an operator about the location of the optical field of view of the bar code symbol reader, and thus serves as a field of view aiming instrument; and during the image acquisition mode, the narrow-area illumination beam is used to illuminate a thin area of the FOV within which the object resides, so that a narrow 2-D image of the object can be rapidly captured (by a small number of rows of pixels in the image sensing array 22), buffered and processed in order to read any linear bar code symbols that may be represented therewithin.
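The "small number of rows" capture described above can be pictured as reading only a thin, centered band of sensor rows instead of the full frame. The following is an illustrative sketch only; the band size and helper names are assumptions, not taken from the specification:

```python
# Illustrative sketch: a narrow-area capture reads only a thin band of
# rows centered vertically on the image sensing array, rather than the
# full frame (e.g. 1024 rows in the 1280x1024 sensor of the text).
def narrow_area_window(full_rows, band_rows):
    """Return (start, stop) row indices of a vertically centered band."""
    start = (full_rows - band_rows) // 2
    return start, start + band_rows

def capture_narrow_area(frame, band_rows=32):
    """Extract the centered band of rows from a frame (list of rows)."""
    start, stop = narrow_area_window(len(frame), band_rows)
    return frame[start:stop]
```

For a 1024-row sensor and a hypothetical 32-row band, rows 496 through 527 are read, buffered and handed to the linear decoder.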
  • Fig. 4Cl shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the narrow-area illumination array 27 in the Multi-Mode Illumination Subsystem 14.
  • Fig. 4C2 shows the Lambertian emittance versus polar angle characteristics of the same LEDs.
  • Fig. 4C3 shows the cylindrical lenses used before the LEDs (633 nm InGaAlP) in the narrow-area (linear) illumination arrays in the illumination subsystem of the present invention. As shown, the first surface of the cylindrical lens is curved vertically to create a narrow-area (linear) illumination pattern, and the second surface of the cylindrical lens is curved horizontally to control the height of the linear illumination pattern to produce a narrow-area illumination pattern.
  • FIG. 4C4 shows the layout of the pairs of LEDs and two cylindrical lenses used to implement the narrow-area illumination array of the illumination subsystem of the present invention.
  • each LED produces a total output power of about 11.7 mW under typical conditions.
  • Fig. 4C5 sets forth a set of six illumination profiles for the narrow-area illumination fields produced by the narrow-area illumination arrays of the illustrative embodiment, taken at 30, 40, 50, 80, 120, and 220 millimeters along the field away from the imaging window (i.e. working distance) of the bar code reader of the present invention, illustrating that the spatial intensity of the narrow-area illumination field begins to become substantially uniform at about 80 millimeters.
  • the narrow-area illumination beam is usable beginning 40 mm from the light transmission/imaging window.
  • the near-field wide-area illumination array 28 employed in the LED-Based Multi-Mode Illumination Subsystem 14 is optically designed to illuminate a wide area over a near-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in Fig. 4A1.
  • the near-field wide-area illumination field 25 is automatically generated by the LED-based Multi-Mode Illumination Subsystem 14 in response to: (1) the detection of any object within the near-field of the system by the IR-based Object Presence and Range Detection Subsystem 12; and (2) one or more of the following events, including, for example: (i) failure of the image processor to successfully decode a linear bar code symbol during the narrow-area illumination mode; (ii) detection of code elements such as control words associated with a 2-D bar code symbol; and/or (iii) detection of pixel data in the image which indicates that the object was captured in a state of focus.
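The trigger condition above combines a range event with at least one escalation event. A minimal sketch, with hypothetical event labels standing in for conditions (i)-(iii):

```python
# Hypothetical sketch of the near-field wide-area trigger: the object
# must be in the near field AND at least one escalation event from
# conditions (i)-(iii) must have occurred.
ESCALATION_EVENTS = {
    "linear_decode_failed",    # (i) narrow-area decode attempt failed
    "2d_code_elements_found",  # (ii) control words of a 2-D symbol seen
    "image_in_focus",          # (iii) pixel data indicates a focused capture
}

def near_field_wide_area_triggered(object_in_near_field, events):
    """True iff the near-field wide-area illumination mode should start."""
    return object_in_near_field and bool(ESCALATION_EVENTS & set(events))
```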
  • the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV of the Image Formation And Detection Subsystem 13 are spatially coextensive and the object detection field spatially overlaps the FOV along the entire working distance of the imaging-based bar code symbol reader.
  • the intensity of the near-field wide-area illumination field during object illumination and image capture operations is determined by how the LEDs associated with the near-field wide array illumination arrays 28 are electrically driven by the Multi-Mode Illumination Subsystem 14.
  • the degree to which the LEDs are driven is determined by the intensity of reflected light measured near the image formation plane by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15; when the measured light exposure level is low, Subsystem 15 will drive the LEDs more intensely (i.e. at higher operating currents).
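The relationship is inverse: dimmer measured returns produce a stronger LED drive. The mapping can be sketched as follows; every numeric value here is invented for illustration:

```python
def led_drive_current_ma(measured_intensity, full_scale=1.0,
                         min_ma=20.0, max_ma=120.0):
    """Map measured reflected-light intensity (0..full_scale) to an LED
    operating current in mA: lower measured intensity -> stronger drive.
    All numeric values are illustrative, not from the specification."""
    frac = max(0.0, min(measured_intensity / full_scale, 1.0))
    return max_ma - frac * (max_ma - min_ma)
```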
  • Fig. 4Dl shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the wide area illumination arrays in the illumination subsystem of the present invention.
  • Fig. 4D2 shows the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the near field wide-area illumination arrays in the Multi-Mode Illumination Subsystem 14.
  • Fig. 4D4 shows the geometrical layout of the LEDs used to implement the near-field wide-area illumination array of the Multi-Mode Illumination Subsystem 14, wherein the illumination beam produced therefrom is aimed by angling the lenses before the LEDs in the near-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem 14.
  • Fig. 4D5 sets forth a set of six illumination profiles for the near-field wide-area illumination fields produced by the near-field wide-area illumination arrays of the illustrative embodiment, taken at 10, 20, 30, 40, 60, and 100 millimeters along the field away from the imaging window (i.e. working distance) of the imaging-based bar code symbol reader 1.
  • the far-field wide-area illumination array 29 employed in the Multi-Mode LED-based Illumination Subsystem 14 is optically designed to illuminate a wide area over a far-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in Fig. 4A1.
  • the far-field wide-area illumination field 26 is automatically generated by the LED-Based Multi-Mode Illumination Subsystem 14 in response to: (1) the detection of any object within the far-field of the system by the IR-based Object Presence and Range Detection Subsystem 12; and (2) one or more of the following events, including, for example: (i) failure of the image processor to successfully decode a linear bar code symbol during the narrow-area illumination mode; (ii) detection of code elements such as control words associated with a 2-D bar code symbol; and/or (iii) detection of pixel data in the image which indicates that the object was captured in a state of focus.
  • the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV 23 of the Image Formation and Detection Subsystem 13 are spatially coextensive and the object detection field 20 spatially overlaps the FOV 23 along the entire working distance of the imaging-based bar code symbol reader.
  • the intensity of the far-field wide-area illumination field during object illumination and image capture operations is determined by how the LEDs associated with the far-field wide-area illumination array 29 are electrically driven by the Multi-Mode Illumination Subsystem 14.
  • the degree to which the LEDs are driven is determined by the intensity of reflected light measured near the image formation plane by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15; when the measured light exposure level is low, Subsystem 15 will drive the LEDs more intensely (i.e. at higher operating currents).
  • the Automatic Light Exposure Measurement and Illumination Control Subsystem (i.e. module) 15 measures and controls the time duration for which the Multi-Mode Illumination Subsystem 14 exposes the image sensing array 22 to narrow-band illumination (e.g. 633 nanometers, with approximately 15 nm bandwidth) during the image capturing/acquisition process, and automatically terminates the generation of such illumination when such computed time duration expires.
  • this global exposure control process ensures that each and every acquired image has good contrast and is not saturated, two conditions essential for consistent and reliable bar code reading.
  • Fig. 4Dl shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the far-field wide-area illumination arrays 29 in the Multi-Mode Illumination Subsystem 14.
  • Fig. 4D2 shows the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the same.
  • Fig. 4D3 shows the plano-convex lenses used before the LEDs in the far-field wide-area illumination arrays in the Multi-Mode Illumination Subsystem 14.
  • Fig. 4D4 shows a layout of LEDs and plano-convex lenses used to implement the far-field wide-area illumination array 29 of the illumination subsystem, wherein the illumination beam produced therefrom is aimed by angling the lenses before the LEDs in the far-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem 14.
  • Fig. 4D6 sets forth a set of three illumination profiles for the far-field wide-area illumination fields produced by the far-field wide-area illumination arrays of the illustrative embodiment, taken at 100, 150 and 220 millimeters along the field away from the imaging window (i.e. working distance) of the imaging-based bar code symbol reader 1, illustrating that the spatial intensity of the far-field wide-area illumination field begins to become substantially uniform at about 100 millimeters.
  • Fig. 4D7 shows a table illustrating a preferred method of calculating the pixel intensity value for the center of the far field wide-area illumination field produced from the Multi-Mode Illumination Subsystem 14, showing a significant signal strength (greater than 80 DN at the far center field).
  • the hand-supportable housing of the bar code reader of the present invention has integrated within its housing, narrow-band optical filter subsystem 4 for transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the narrow-band Multi-Mode Illumination Subsystem 14, and rejecting all other optical wavelengths outside this narrow optical band however generated (i.e. ambient light sources).
  • narrow-band optical filter subsystem 4 comprises: red-wavelength reflecting (high-pass) imaging window filter 4A integrated within its light transmission aperture 3 formed on the front face of the hand-supportable housing; and low pass optical filter 4B disposed before the CMOS image sensing array 22.
  • optical filters 4A and 4B cooperate to form the narrow-band optical filter subsystem 4 for the purpose described above.
  • the light transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element 4B indicate that optical wavelengths below 700 nanometers are transmitted therethrough, whereas optical wavelengths above 700 nm are substantially blocked (e.g. absorbed or reflected).
  • the light transmission characteristics (energy versus wavelength) associated with the high-pass imaging window filter 4A indicate that optical wavelengths above 620 nanometers are transmitted therethrough, thereby producing a red-color appearance to the user, whereas optical wavelengths below 620 nm are substantially blocked (e.g. absorbed or reflected) by optical filter 4A.
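The cooperation of the two elements can be checked numerically: a wavelength passes the subsystem only if both elements transmit it, yielding the 620-700 nm pass band stated for subsystem 4. An idealized sketch (step-edge responses are assumed; real filter elements roll off gradually):

```python
# Idealized sketch of narrow-band optical filter subsystem 4: a
# high-pass element and a low-pass element in series pass only the
# 620-700 nm band (ideal step edges assumed for illustration).
HIGH_PASS_CUTON_NM = 620.0   # high-pass element transmits above this
LOW_PASS_CUTOFF_NM = 700.0   # low-pass element transmits below this

def subsystem_transmits(wavelength_nm):
    """True iff both elements transmit: 620 nm <= wavelength <= 700 nm."""
    return HIGH_PASS_CUTON_NM <= wavelength_nm <= LOW_PASS_CUTOFF_NM
```

The 633 nm LED illumination passes, while 550 nm (green) ambient light and 750 nm (near-IR) ambient light are both rejected.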
  • spectral band-pass filter subsystem 4 greatly reduces the influence of the ambient light, which falls upon the CMOS image sensing array 22 during the image capturing operations.
  • an optical shutter mechanism is thereby eliminated from the system.
  • the optical filter can reject more than 85% of incident ambient light, and in typical environments, the intensity of LED illumination is significantly more than the ambient light on the CMOS image sensing array 22.
  • the imaging-based bar code reading system of the present invention effectively manages the exposure time of narrow-band illumination onto its CMOS image sensing array 22 by simply controlling the illumination time of its LED-based illumination arrays 27, 28 and 29 using control signals generated by Automatic Light Exposure Measurement and Illumination Control Subsystem 15 and the CMOS image sensing array 22 while controlling illumination thereto by way of the band-pass optical filter subsystem 4 described above.
  • the result is a simple system design, without moving parts, and having a reduced manufacturing cost.
  • band-pass optical filter subsystem 4 is shown comprising a high-pass filter element 4A and a low-pass filter element 4B, separated spatially from each other by other optical components along the optical path of the system.
  • subsystem 4 may be realized as an integrated multi-layer filter structure installed in front of the image formation and detection (IFD) module 13, or before its image sensing array 22, without the use of the high-pass window filter 4A, or with the use thereof so as to obscure viewing within the imaging-based bar code symbol reader while creating an attractive red-colored protective window.
  • the red-color window filter 4A will have substantially planar surface characteristics to avoid focusing or defocusing of light transmitted therethrough during imaging operations.
  • the primary function of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is to control the brightness and contrast of acquired images by (i) measuring light exposure at the image plane of the CMOS imaging sensing array 22 and (ii) controlling the time duration that the Multi-Mode Illumination Subsystem 14 illuminates the target object with narrow-band illumination generated from the activated LED illumination array.
  • the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 eliminates the need for a complex shuttering mechanism for CMOS-based image sensing array 22. This novel mechanism ensures that the imaging-based bar code symbol reader of the present invention generates non-saturated images with enough brightness and contrast to guarantee fast and reliable image-based bar code decoding in demanding end-user applications.
  • the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 measures the amount of light reflected from the target object, calculates the maximum time that the CMOS image sensing array 22 should be kept exposed to the actively-driven LED-based illumination array associated with the Multi-Mode Illumination Subsystem 14, and then automatically deactivates the illumination array when the calculated time to do so expires (i.e. lapses).
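The measure-then-terminate behavior can be modeled as computing an exposure time inversely proportional to the measured reflected-light level, capped at a system maximum. The constants and units below are illustrative assumptions:

```python
def exposure_time_ms(measured_irradiance, target_exposure=1.0,
                     max_time_ms=10.0):
    """Illustrative sketch: time needed to accumulate a target exposure
    at the measured irradiance, capped at the system maximum. When this
    time lapses, the active LED illumination array is deactivated.
    Constants and units are invented for illustration."""
    if measured_irradiance <= 0:
        return max_time_ms
    return min(target_exposure / measured_irradiance, max_time_ms)
```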
  • the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 comprises: a parabolic light-collecting mirror 55 mounted within the head portion of the hand-supportable housing, for collecting narrow-band LED-based light reflected from a central portion of the FOV of the system, which is then transmitted through the narrow-band optical filter subsystem 4, eliminating wide-band spectral interference; a light-sensing device (e.g. photo-diode) 56 mounted at the focal point of the light collection mirror 55, for detecting the filtered narrow-band optical signal focused thereon by the light collecting mirror 55; and electronic circuitry 57 for processing electrical signals produced by the photo-diode 56 indicative of the intensity of detected light exposure levels within the focal plane of the CMOS image sensing array 22.
  • incident narrow-band LED-based illumination is gathered from the center of the FOV of the system by the parabolic light-collecting mirror 55 and narrow-band filtered by the narrow-band optical filter subsystem 4 before being focused upon the photodiode 56 for intensity detection.
  • the photo-diode 56 converts the detected light signal into an electrical signal having an amplitude which directly corresponds to the intensity of the collected light signal.
  • the System Control Subsystem 19 generates an illumination array selection control signal which determines which LED illumination array (i.e. the narrow-area illumination array 27, or the near-field or far-field wide-area illumination arrays 28 or 29) will be selectively driven at any instant in time of system operation by LED Array Driver Circuitry 64 in the Automatic Light Exposure Measurement and Illumination Control Subsystem 15.
  • electronic signal processing circuitry 57 processes the electrical signal from photo-detector 56 and generates an auto exposure control signal for the selected LED illumination array.
  • this auto exposure control signal is provided to the LED array driver circuitry 64, along with an illumination array selection control signal from the System Control Subsystem 19, for selecting and driving the appropriate LED illumination array(s).
  • the illumination array selection control signal is generated by the System Control Subsystem 19 in response to (i) reading the system mode configuration parameters from the system mode configuration parameter table 70, shown in Fig. 2A1, for the programmed mode of system operation at hand, and (ii) detecting the output from the automatic IR-based Object Presence and Range Detection Subsystem 12.
  • there are three LED-based illumination arrays 27, 28 and 29 which can be selected for activation by the System Control Subsystem 19, and the upper and/or lower LED subarrays in illumination arrays 28 and 29 can be selectively activated or deactivated on a subarray-by-subarray basis, for various purposes taught herein, including automatic specular reflection noise reduction during wide-area image capture modes of operation.
  • Each one of these illumination arrays can be driven to different states depending on the auto-exposure control signal generated by electronic signal processing circuit 57, which will generally be a function of object distance, object surface reflectivity and the ambient light conditions sensed at photo-detector 56, and measured by signal processing circuit 57.
  • the operation of signal processing circuitry 57 will now be detailed below.
  • the narrow-band filtered optical signal that is produced by the parabolic light focusing mirror 55 is focused onto the photo-detector D1 56 which generates an analog electrical signal whose amplitude corresponds to the intensity of the detected optical signal.
  • This analog electrical signal is supplied to the signal processing circuit 57 for various stages of processing.
  • the first step of processing involves converting the analog electrical signal from a current-based signal to a voltage-based signal, which is achieved by passing it through a constant-current source buffer circuit 58, realized by one half of transistor Q1 (58). This inverted voltage signal is then buffered by the second half of the transistor Q1 (58) and is supplied as a first input to a summing junction 59.
  • the CMOS image sensing array 22 produces, as output, a digital electronic rolling shutter (ERS) pulse signal 60, wherein the duration of this ERS pulse signal 60 is fixed to a maximum exposure time allowed in the system.
  • the ERS pulse signal 60 is buffered through transistor Q2 61 and forms the other side of the summing junction 59.
  • the outputs from transistors Ql and Q2 form an input to the summing junction 59.
  • a capacitor C5 is provided on the output of the summing junction 59 and provides a minimum integration time sufficient to reduce any voltage overshoot in the signal processing circuit 57.
  • the output signal across the capacitor C5 is further processed by a comparator U1 62.
  • the comparator reference voltage signal is set to 1.7 volts. This reference voltage signal sets the minimum threshold level for the light exposure measurement circuit 57.
  • the output signal from the comparator 62 is inverted by inverter U3 63 to provide a positive logic pulse signal which is supplied, as auto exposure control signal, to the input of the LED array driver circuit 64 shown in Fig. 1C.
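The chain above (buffered photo-signal summed with the ERS pulse, integrated on capacitor C5, compared against the 1.7 V reference, then inverted to positive logic) can be mimicked in discrete time. Only the 1.7 V threshold comes from the text; the integrator coefficient and signal levels are invented:

```python
# Discrete-time mimic of light exposure measurement circuit 57: a leaky
# integrator stands in for capacitor C5, followed by a comparator with
# a 1.7 V reference and an output inverter giving a positive-logic
# signal. Sample values and the coefficient alpha are illustrative.
THRESHOLD_V = 1.7

def process(samples, alpha=0.5):
    v = 0.0
    out = []
    for s in samples:
        v += alpha * (s - v)            # C5 integration (low-pass)
        comparator = v > THRESHOLD_V    # comparator against 1.7 V reference
        out.append(not comparator)      # inverter -> positive logic output
    return out
```

Feeding a strong signal followed by darkness, the integrated voltage decays below the 1.7 V threshold after a couple of samples, at which point the inverted output goes high.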
  • the LED array driver circuit 64 shown in Fig. 7C automatically drives an activated LED illuminated array, and the operation of LED array driver circuit 64 depends on the mode of operation in which the Multi-Mode Illumination Subsystem 14 is configured.
  • the mode of operation in which the Multi-Mode Illumination Subsystem 14 is configured at any moment in time will typically depend on (i) the state of operation of the Object Presence and Range Detection Subsystem 12 and (ii) the programmed mode of operation in which the entire Imaging-Based Bar Code Symbol Reading System is configured using system mode configuration parameters read from Table 70 shown in Fig. 2A1.
  • the LED array driver circuit 64 comprises analog and digital circuitry which receives two input signals: (i) the auto exposure control signal from signal processing circuit 57; and (ii) the illumination array selection control signal.
  • the LED array driver circuit 64 generates, as output, digital pulse-width modulated (PWM) drive signals provided to either the narrow-area illumination array 27, the upper and/or lower LED subarrays employed in the near-field wide-area illumination array 28, and/or the upper and/or lower LED subarrays employed in the far-field wide-area illumination array 29.
  • the LED array driver circuit 64 will drive one or more of the above- described LED illumination arrays during object illumination and imaging operations.
  • LED illumination array(s) are automatically driven by the LED array driver circuit 64 at an intensity and for a duration computed (in an analog manner) by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 so as to capture digital images having good contrast and brightness, independent of the light intensity of the ambient environment and the relative motion of the target object with respect to the imaging-based bar code symbol reader.
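The driver's two inputs divide the work: the illumination array selection control signal chooses which array(s) to energize, and the auto exposure control signal gates whether drive is applied. A hypothetical sketch (the array labels are illustrative):

```python
# Hypothetical sketch of LED array driver circuit 64: the selection
# signal picks the array(s), including individual upper/lower subarrays
# of the wide-area arrays, and the auto exposure control signal gates
# whether drive is applied at this instant.
ARRAYS = ("narrow_27", "near_wide_28_upper", "near_wide_28_lower",
          "far_wide_29_upper", "far_wide_29_lower")

def drive_signals(selected, auto_exposure_active):
    """Return a mapping: array name -> driven right now (True/False)."""
    return {name: (name in selected and auto_exposure_active)
            for name in ARRAYS}
```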
  • the CMOS image sensing array 22 is operated in its Single Frame Shutter Mode (i.e. rather than its Continuous Frame Shutter Mode) as shown in Fig. 6D, and employs a novel exposure control method which ensures that all rows of pixels in the CMOS image sensing array 22 have a common integration time, thereby capturing high quality images even when the object is in a state of high speed motion.
  • This novel exposure control technique shall be referred to as "the global exposure control method" of the present invention, and the flow chart of Figs. 6E1 and 6E2 describes clearly and in great detail how this method is implemented in the imaging-based bar code symbol reader of the illustrative embodiment.
  • the global exposure control method will now be described in detail below.
  • Step A in the global exposure control method involves selecting the single frame shutter mode of operation for the CMOS imaging sensing array provided within an imaging-based bar code symbol reading system employing an automatic light exposure measurement and illumination control subsystem, a multi-mode illumination subsystem, and a system control subsystem integrated therewith, and image formation optics providing the CMOS image sensing array with a field of view into a region of space where objects to be imaged are presented.
  • Step B in the global exposure control method involves using the automatic light exposure measurement and illumination control subsystem to continuously collect illumination from a portion of the field of view, detect the intensity of the collected illumination, and generate an electrical analog signal corresponding to the detected intensity, for processing.
  • Step C in the global exposure control method involves activating (e.g. by way of the system control subsystem 19 or directly by way of trigger switch 2C) the CMOS image sensing array so that its rows of pixels begin to integrate photonically generated electrical charge in response to the formation of an image onto the CMOS image sensing array by the image formation optics of the system.
  • Step D in the global exposure control method involves the CMOS image sensing array 22 automatically (i) generating an electronic rolling shutter (ERS) digital pulse signal when all rows of pixels in the image sensing array are operated in a state of integration, and (ii) providing this ERS pulse signal to the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 so as to activate light exposure measurement and illumination control functions/operations therewithin.
  • Step E in the global exposure control method involves, upon activation of light exposure measurement and illumination control functions within Subsystem 15, (i) processing the electrical analog signal being continuously generated therewithin, (ii) measuring the light exposure level within a central portion of the field of view 23 (determined by light collecting optics 55 shown in Fig. 6A), and (iii) generating an auto-exposure control signal for controlling the generation of visible field of illumination from at least one LED-based illumination array (27, 28 and/or 29) in the Multi-Mode Illumination Subsystem 14 which is selected by an illumination array selection control signal produced by the System Control Subsystem 19.
  • Step F in the global exposure control method involves using (i) the auto exposure control signal and (ii) the illumination array selection control signal to drive the selected LED-based illumination array(s) and illuminate the field of view of the CMOS image sensing array 22 in whatever image capture mode it may be configured, precisely when all rows of pixels in the CMOS image sensing array are in a state of integration, as illustrated in Fig. 6D, thereby ensuring that all rows of pixels in the CMOS image sensing array have a common integration time.
  • By enabling all rows of pixels in the CMOS image sensing array 22 to have a common integration time, high-speed "global exposure control" is effectively achieved within the imaging-based bar code symbol reader of the present invention, and consequently, high quality images are captured independent of the relative motion between the bar code symbol reader and the target object.
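Steps A through F can be summarized as an ordered sequence whose key invariant is that LED illumination (Step F) is applied only after the ERS pulse (Step D) reports that every row is integrating. A minimal ordering sketch; it records events only and does not model the sensor or analog electronics:

```python
# Minimal sketch of the global exposure control method (Steps A-F).
# Only the ordering is modeled: illumination is driven strictly after
# the ERS pulse indicates all rows share a common integration window.
def global_exposure_capture(log):
    log.append("A: single frame shutter mode selected")
    log.append("B: continuous light exposure measurement started")
    log.append("C: all pixel rows begin integrating charge")
    log.append("D: ERS pulse -> exposure/illumination control armed")
    log.append("E: auto-exposure control signal computed")
    log.append("F: selected LED array driven during common integration")
    return log
```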
  • IR-wavelength based Automatic Object Presence and Range Detection Subsystem 12 is realized in the form of a compact optics module 76 mounted on the front portion of optics bench 6, as shown in Fig. 1J.
  • the object presence and range detection module 12 of the illustrative embodiment comprises a number of subcomponents, namely: an optical bench 77 having an ultra-small footprint for supporting the optical and electro-optical components used to implement the subsystem 12; at least one IR laser diode 78 mounted on the optical bench 77, for producing a low power IR laser beam 79; IR beam shaping optics 80, supported on the optical bench, for shaping the IR laser beam; IR light collection/focusing optics 81 supported on the optical bench 77; an amplitude modulation (AM) circuit 82 supported on the optical bench 77, for modulating the amplitude of the IR laser beam produced from the IR laser diode at a frequency f0 (e.g. 75 MHz) with up to 7.5 milliwatts of optical power; and an optical detector (e.g. an avalanche-type IR photo-detector) mounted at the focal point of the IR light collection/focusing optics 81, for receiving the IR optical signal reflected off an object within the object detection field, and converting the received optical signal into a corresponding electrical signal.
  • Step E in the global exposure control method involves, upon activation of light exposure measurement and illumination control functions within Subsystem 15, (i) processing the electrical analog signal being continuously generated therewithin, (ii) measuring the light exposure level within a central portion of the field of view 23 (determined by light collecting optics 55 shown in Fig. 6A), and (iii) generating an auto-exposure control signal for controlling the generation of visible field of illumination from at least one LED-based illumination array (27, 28 and/or 29) in the Multi-Mode Illumination Subsystem 14 which is selected by an illumination array selection control signal produced by the System Control Subsystem 19.
  • Step F in the global exposure control method involves using (i) the auto exposure control signal and (ii) the illumination array selection control signal to drive the selected LED-based illumination array(s) and illuminate the field of view of the CMOS image sensing array 22 in whatever image capture mode it may be configured, precisely when all rows of pixels in the CMOS image sensing array are in a state of integration, as illustrated in Fig. 6D, thereby ensuring that all rows of pixels in the CMOS image sensing array have a common integration time.
  • By enabling all rows of pixels in the CMOS image sensing array 22 to have a common integration time, high-speed "global exposure control" is effectively achieved within the imaging-based bar code symbol reader of the present invention, and consequently, high quality images are captured independent of the relative motion between the bar code symbol reader and the target object.
  • IR-wavelength based Automatic Object Presence and Range Detection Subsystem 12 is realized in the form of a compact optics module 76 mounted on the front portion of optics bench 6, as shown in Fig. IJ.
  • the object presence and range detection module 12 of the illustrative embodiment comprises a number of subcomponents, namely: an optical bench 77 having an ultra-small footprint for supporting optical and electro-optical components used to implement the subsystem 12; at least one IR laser diode 78 mounted on the optical bench 77, for producing a low power IR laser beam 79; IR beam shaping optics 80, supported on the optical bench for shaping the IR laser beam (e.g.
  • IR light collection/focusing optics 81 supported on the optical bench 77
  • AM amplitude modulation circuit 82 supported on the optical bench 77, for modulating the amplitude of the IR laser beam produced from the IR laser diode at a frequency f0 (e.g. 75 MHz) with up to 7.5 milliwatts of optical power
  • optical detector e.g.
  • an avalanche-type IR photo-detector mounted at the focal point of the IR light collection/focusing optics 81, for receiving the IR optical signal reflected off an object within the object detection field, and converting the received optical signal 84 into an electrical signal 85;
  • an amplifier and filter circuit 86 mounted on the optical bench 77, for isolating the f0 signal component and amplifying it;
  • a limiting amplifier 87 mounted on the optical bench, for maintaining a stable signal level;
  • a phase detector 88 mounted on the optical bench 77, for mixing the reference signal component f0 from the AM circuit 82 and the received signal component f0 reflected from the packages and producing a resulting signal which is equal to a DC voltage proportional to the cosine of the phase difference between the reference and the reflected f0 signals;
  • an amplifier circuit 89 mounted on the optical bench 77, for amplifying the phase difference signal;
  • a received signal strength indicator (RSSI) 90 mounted on the optical bench 77
  • The function of range analysis circuitry 93 is to analyze the digital range data from the A/D converter 90 and generate two control activation signals, namely: (i) an "object presence detection" type of control activation signal A1A indicating simply whether an object is present in or absent from the object detection field, regardless of the mode of operation in which the Multi-Mode Illumination Subsystem 14 might be configured; and (ii) a "near-field/far-field" range indication type of control activation signal A1B indicating whether a detected object is located in either the predefined near-field or far-field portions of the object detection field, which correspond to the near-field and far-field portions of the FOV of the Multi-Mode Image Formation and Detection Subsystem 13.
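The phase-comparison ranging scheme described above can be sketched numerically. A minimal Python model follows; the modulation frequency f0 = 75 MHz comes from the text, while the near-field boundary, the unit detector gain, and the A1A/A1B naming of the returned flags are assumptions (the actual subsystem is analog circuitry):

```python
import math

C = 3.0e8                  # speed of light, m/s
F0 = 75.0e6                # AM modulation frequency f0 from the text (75 MHz)
NEAR_FIELD_LIMIT_M = 0.1   # hypothetical near-field/far-field boundary

def phase_difference(distance_m):
    """Phase shift between the reference f0 signal and the f0 signal
    reflected from an object at the given distance (round trip)."""
    return (4 * math.pi * F0 * distance_m / C) % (2 * math.pi)

def phase_detector_output(delta_phi):
    """DC voltage proportional to the cosine of the phase difference
    (unit proportionality constant assumed)."""
    return math.cos(delta_phi)

def range_from_phase(delta_phi):
    """Invert the phase shift to a distance; unambiguous out to c/(2*f0)."""
    return delta_phi * C / (4 * math.pi * F0)

def control_activation_signals(distance_m):
    """Emit (object-presence, near-field) flags analogous to A1A and A1B."""
    present = 0.0 < distance_m < C / (2 * F0)
    near = distance_m < NEAR_FIELD_LIMIT_M
    return present, near
```

With f0 = 75 MHz the round-trip phase wraps every c/(2 f0) = 2 m, which comfortably covers the near- and far-field portions of a hand-supportable reader's object detection field.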
  • Automatic Object Presence and Range Detection Subsystem 12 operates as follows. In System Modes of Operation requiring automatic object presence and/or range detection, Automatic Object Presence and Range Detection Subsystem 12 will be activated at system start-up and operational at all times of system operation, typically continuously providing the System Control Subsystem 19 with information about the state of objects within both the far and near portions of the object detection field 20 of the imaging-based symbol reader. In general, this Subsystem detects two basic states of presence and range, and therefore has two basic states of operation.
  • In its first state of operation, the IR-based automatic Object Presence and Range Detection Subsystem 12 automatically detects an object within the near-field region of the FOV 20, and in response thereto generates a first control activation signal which is supplied to the System Control Subsystem 19 to indicate the occurrence of this first fact. In its second state of operation, the IR-based automatic Object Presence and Range Detection Subsystem 12 automatically detects an object within the far-field region of the FOV 20, and in response thereto generates a second control activation signal which is supplied to the System Control Subsystem 19 to indicate the occurrence of this second fact.
  • control activation signals are used by the System Control Subsystem 19 during particular stages of the system control process, such as determining (i) whether to activate either the near-field and/or far-field LED illumination arrays, and (ii) how strongly these LED illumination arrays should be driven to ensure quality image exposure at the CMOS image sensing array 22.
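A hedged sketch of that decision logic follows: array choice from the near/far range signal, drive level from the exposure measurement. The proportional correction and [0, 1] clamping are illustrative assumptions, not the patented control law:

```python
def select_illumination(near_field_signal, measured_exposure, target_exposure):
    """Pick the LED array from the range signal and scale its drive level
    toward the target exposure (names and scaling are illustrative only)."""
    array = "near-field array 28" if near_field_signal else "far-field array 29"
    # Simple proportional correction, clamped to a normalized [0, 1] drive range.
    drive = min(1.0, max(0.0, target_exposure / max(measured_exposure, 1e-9)))
    return array, drive
```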
  • the CMOS image sensing array 22 employed in the digital imaging-based bar code symbol reading device hereof is operably connected to its microprocessor 36 through FIFO 39 (realized by way of an FPGA) and the system bus shown in Fig. 2M.
  • SDRAM 38 is also operably connected to the microprocessor 36 by way of the system bus, thereby enabling the mapping of pixel data captured by the CMOS image sensing array 22 into the SDRAM 38 under the control of the direct memory access (DMA) module within the microprocessor 36.
  • DMA direct memory access
  • Images detected by the CMOS image sensing array 22 are automatically mapped (i.e. captured and stored) into the addressable memory storage locations of its SDRAM 38 during each image capture cycle carried out within the hand-supportable imaging-based bar code reading device of the present invention.
  • the CMOS image sensing array 22 sends 7-bit gray-scale data bytes over a parallel data connection to FPGA 39 which implements a FIFO using its internal SRAM.
  • the FIFO 39 stores the pixel data temporarily and the microprocessor 36 initiates a DMA transfer from the FIFO (which is mapped to address 0X0C000000, chip select 3) to the SDRAM 38.
  • the DMA module will contain a 32-byte buffer.
  • the DMA module can be programmed to read data from the FIFO 39, store read data bytes in the DMA's buffer, and subsequently write the data to the SDRAM 38.
  • a DMA module can reside in FPGA 39 to directly write the FIFO data into the SDRAM 38. This is done by sending a bus request signal to the microprocessor 36, so that the microprocessor 36 releases control of the bus to the FPGA 39 which then takes over the bus and writes data into the SDRAM 38.
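The FIFO-to-SDRAM transfer can be modeled in a few lines. The sketch below represents the FIFO as a deque and the SDRAM as a dict keyed by address; the 32-byte chunk size comes from the text, while the function names and the byte-wise memory model are illustrative:

```python
from collections import deque

DMA_BUFFER_SIZE = 32          # 32-byte DMA buffer, per the text
SDRAM_BASE = 0xA0EC0000       # reserved frame-buffer start address

def dma_transfer(fifo, sdram, base_addr=SDRAM_BASE):
    """Drain the FIFO into SDRAM in buffer-sized chunks, mimicking the
    read / buffer / write cycle of the DMA module."""
    addr = base_addr
    while fifo:
        chunk = [fifo.popleft()
                 for _ in range(min(DMA_BUFFER_SIZE, len(fifo)))]
        for offset, byte in enumerate(chunk):
            sdram[addr + offset] = byte
        addr += len(chunk)
    return addr - base_addr    # total bytes transferred
```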
  • The following describes how pixel data output from the CMOS image sensing array 22 is stored in the SDRAM 38, and how the microprocessor 36 (i.e. implementing a decode algorithm) accesses such stored pixel data bytes.
  • Fig. 9F represents the memory space of the SDRAM 38.
  • a reserved memory space of 1.3 MB is used to store the output of the CMOS image sensing array 22.
  • This memory space is a 1 :1 mapping of the pixel data from the CMOS image sensing array 22.
  • Each byte represents a pixel in the image sensing array 22.
  • Memory space is a mirror image of the pixel data from the image sensing array 22.
  • When the decode program (36) accesses the memory, it is as if it is accessing the raw pixel image of the image sensing array 22. No time code is needed to track the data since the modes of operation of the bar code reader guarantee that the microprocessor 36 is always accessing the up-to-date data, and the pixel data sets are a true representation of the last optical exposure. To prevent data corruption, i.e. new data coming in while old data are still being processed, the reserved space is protected by disabling further DMA access once a whole frame of pixel data is written into memory. The DMA module is not re-enabled until either the microprocessor 36 has finished going through its memory, or a timeout has occurred.
  • the image pixels are sequentially read out of the image sensing array 22. Although one may choose to read out column-wise or row-wise for some CMOS image sensors, without loss of generality, the row-by-row read-out of the data is preferred.
  • the pixel image data set is arranged in the SDRAM 38 sequentially, starting at address 0XA0EC0000. To randomly access any pixel in the SDRAM 38 is a straightforward matter: the pixel at row y, column x is located at address (0XA0EC0000 + y × 1280 + x).
  • Since each image frame always has a frame start signal out of the image sensing array 22, that signal can be used to start the DMA process at address 0XA0EC0000, and the address is continuously incremented for the rest of the frame. The reading of each image frame is started at address 0XA0EC0000 to avoid any misalignment of data. Notably, however, if the microprocessor 36 has programmed the CMOS image sensing array 22 to have a ROI window, then the starting address will be modified to (0XA0EC0000 + 1280 × R1), where R1 is the first row of the ROI window.
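The addressing arithmetic above reduces to a one-liner. A sketch, using the base address and row width from the text (the ROI helper assumes R1 denotes the first row of the programmed window):

```python
BASE_ADDR = 0xA0EC0000   # frame-buffer start address from the text
ROW_WIDTH = 1280         # pixels (one byte each) per row

def pixel_address(x, y):
    """Linear SDRAM address of the pixel at column x, row y."""
    return BASE_ADDR + y * ROW_WIDTH + x

def roi_start_address(r1):
    """Modified start address when a ROI window beginning at row r1
    has been programmed into the image sensor."""
    return BASE_ADDR + ROW_WIDTH * r1
```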
  • the hand-supportable digital imaging-based bar code symbol reading device of the present invention 1 is provided with a three-tier software architecture comprising the following software modules: (1) the Main Task module, the CodeGate Task module, the Metroset Task module, the Application Events Manager module, the User Commands Table module, the Command Handler module, the Plug-In Controller (Manager) and Plug-In Libraries and Configuration Files, each residing within the Application layer of the software architecture; (2) the Tasks Manager module, the Events Dispatcher module, the Input/Output Manager module, the User Commands Manager module, the Timer Subsystem module, the Input/Output Subsystem module and the Memory Control Subsystem module, each residing within the System Core (SCORE) layer of the software architecture; and (3) the Linux Kernel module, the Linux File System module, and Device Drivers modules, each residing within the Linux Operating System (OS) layer of the software architecture.
  • OS Linux Operating System
  • the operating system layer of the imaging-based bar code symbol reader is based upon the Linux operating system, it is understood that other operating systems can be used (e.g. Microsoft Windows, Mac OS X, Unix, etc.), and the design preferably provides for independence between the main Application Software Layer and the Operating System Layer, and therefore enables the Application Software Layer to potentially be ported to other platforms.
  • the system design principles of the present invention provide extensibility of the system to other future products with extensive usage of common software components, which should make the design of such products easier, decrease their development time, and ensure their robustness.
  • the above features are achieved through the implementation of an event-driven multi-tasking, potentially multi-user, Application layer running on top of the System Core software layer, called SCORE.
  • the SCORE layer is statically linked with the product Application software, and therefore, runs in the Application Level or layer of the system.
  • the SCORE layer provides a set of services to the Application in such a way that the Application would not need to know the details of the underlying operating system, although all operating system APIs are, of course, available to the application as well.
  • the SCORE software layer provides a real-time, event-driven, OS- independent framework for the product Application to operate.
  • the event-driven architecture is achieved by creating a means for detecting events (usually, but not necessarily, when the hardware interrupts occur) and posting the events to the Application for processing in a real-time manner.
  • the event detection and posting is provided by the SCORE software layer.
  • the SCORE layer also provides the product Application with a means for starting and canceling the software tasks, which can be running concurrently, hence, the multi-tasking nature of the software system of the present invention.
  • the SCORE layer provides a number of services to the Application layer.
  • the Tasks Manager provides a means for executing and canceling specific application tasks (threads) at any time during the product Application run.
  • the Events Dispatcher provides a means for signaling and delivering all kinds of internal and external synchronous and asynchronous events
  • the Events Dispatcher dispatches them to the Application Events Manager, which acts on the events accordingly as required by the Application based on its current state. For example, based on the particular event and current state of the application, the Application Events Manager can decide to start a new task, or stop currently running task, or do something else, or do nothing and completely ignore the event.
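The dispatch-then-react pattern just described can be sketched as a small state machine. Event names follow the SCORE_EVENT_* convention used in the text, but the states and reactions shown are illustrative, not the device's actual state table:

```python
class ApplicationEventsManager:
    """Reacts to dispatched events according to its current state."""
    def __init__(self):
        self.state = "idle"
        self.actions = []

    def handle(self, event):
        if event == "SCORE_EVENT_OBJECT_DETECT_ON" and self.state == "idle":
            self.state = "scanning"
            self.actions.append("start CodeGate Task")
        elif event == "SCORE_EVENT_TRIG_ON" and self.state == "scanning":
            self.state = "decoding"
            self.actions.append("cancel CodeGate Task; start Main Task")
        else:
            self.actions.append("ignore " + event)  # do nothing for this state

class EventsDispatcher:
    """Delivers posted events to the Application Events Manager."""
    def __init__(self, manager):
        self.manager = manager

    def post(self, event):
        self.manager.handle(event)  # synchronous delivery in this sketch
```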
  • the Input/Output Manager provides a means for monitoring activities of input / output devices and signaling appropriate events to the Application when such activities are detected.
  • the Input/Output Manager software module runs in the background and monitors activities of external devices and user connections, and signals appropriate events to the Application Layer when such activities are detected.
  • the Input/Output Manager is a high-priority thread that runs in parallel with the Application and reacts to the input/output signals coming asynchronously from the hardware devices, such as serial port, user trigger switch 2C, bar code reader, network connections, etc. Based on these signals and optional input/output requests (or lack thereof) from the Application, it generates appropriate system events, which are delivered through the Events Dispatcher to the Application Events Manager as quickly as possible as described above.
  • the User Commands Manager provides a means for managing user commands; it utilizes the User Commands Table provided by the Application, and executes the appropriate User Command Handler based on the data entered by the user.
  • the Input/Output Subsystem software module provides a means for creating and deleting input/output connections and communicating with external systems and devices
  • the Timer Subsystem provides a means of creating, deleting, and utilizing all kinds of logical timers.
  • the Memory Control Subsystem provides an interface for managing the multi-level dynamic memory within the device, fully compatible with standard dynamic memory management functions, as well as a means for buffering collected data.
  • the Memory Control Subsystem provides a means for thread-level management of dynamic memory.
  • the interfaces of the Memory Control Subsystem are fully compatible with standard C memory management functions.
  • the system software architecture is designed to provide connectivity of the device to potentially multiple users, which may have different levels of authority to operate with the device.
  • the User Commands Manager which provides a standard way of entering user commands, and executing application modules responsible for handling the same.
  • Each user command described in the User Commands Table is a task that can be launched by the User Commands Manager per user input, but only if the particular user's authority matches the command's level of security.
  • the Events Dispatcher software module provides a means of signaling and delivering events to the Application Events Manager, including the starting of a new task, stopping a currently running task, or doing something or nothing and simply ignoring the event.
  • Fig. 12B provides a Table listing examples of System-Defined Events which can occur and be dispatched within the hand-supportable digital imaging-based bar code symbol reading device of the present invention, namely: SCORE_EVENT_POWER_UP which signals the completion of system start-up and involves no parameters; SCORE_EVENT_TIMEOUT which signals the timeout of the logical timer, and involves the parameter "pointer to timer id"; SCORE_EVENT_UNEXPECTED_INPUT which signals that unexpected input data is available and involves the parameter "pointer to connection id"; SCORE_EVENT_TRIG_ON which signals that the user pulled the trigger and involves no parameters; SCORE_EVENT_TRIG_OFF which signals that the user released the trigger and involves no parameters; SCORE_EVENT_OBJECT_DETECT_ON which signals that the object is positioned under the bar code reader and involves no parameters; SCORE_EVENT_OBJECT_DETECT_OFF which signals that the object is removed from the field of view.
  • the imaging-based bar code symbol reading device of the present invention provides the user with a command-line interface (CLI), which can work over the standard communication lines, such as RS232, available in the bar code reader.
  • CLI command-line interface
  • the CLI is used mostly for diagnostic purposes, but can also be used for configuration purposes in addition to the MetroSet® and MetroSelect® programming functionalities.
  • To send commands to the bar code reader utilizing the CLI, a user must first enter the User Command Manager by typing in a special character, which could actually be a combination of multiple and simultaneous keystrokes, such as Ctrl and S for example. Any standard and widely available software communication tool, such as Windows HyperTerminal, can be used to communicate with the bar code reader.
  • the bar code reader acknowledges the readiness to accept commands by sending the prompt, such as "MTLG>" back to the user.
  • the user can now type in any valid Application command.
  • a user must enter another special character, which could actually be a combination of multiple and simultaneous keystrokes, such as Ctrl and R for example.
  • An example of the valid command could be the "Save Image” command, which is used to upload an image from the bar code reader's memory to the host PC.
  • This command has the following CLI format: save [ filename [ compr ] ] where
  • filename is the name of the file the image gets saved in. If omitted, the default filename is "image.bmp".
  • compr is the compression number, from 0 to 10. If omitted, the default compression number is 0, meaning no compression. The higher the compression number, the higher the image compression ratio and the faster the image transmission, but the more distorted the image gets.
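Parsing that command line with the documented defaults can be sketched as follows; the error handling is an assumption, as the text does not specify how malformed input is reported:

```python
def parse_save_command(line):
    """Parse 'save [ filename [ compr ] ]', applying the documented
    defaults: filename 'image.bmp', compression number 0 (none)."""
    parts = line.split()
    if not parts or parts[0] != "save":
        raise ValueError("not a save command")
    filename = parts[1] if len(parts) > 1 else "image.bmp"
    compr = int(parts[2]) if len(parts) > 2 else 0
    if not 0 <= compr <= 10:
        raise ValueError("compression number must be between 0 and 10")
    return filename, compr
```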
  • the imaging-based bar code symbol reader of the present invention can have numerous commands. All commands are described in a single table (User Commands Table shown in Fig. 10) contained in the product Applications software layer. For each valid command, the appropriate record in the table contains the command name, a short description of the command, the command type, and the address of the function that implements the command.
  • the User Command Manager When a user enters a command, the User Command Manager looks for the command in the table. If found, it executes the function the address of which is provided in the record for the entered command. Upon return from the function, the User Command Manager sends the prompt to the user indicating that the command has been completed and the User Command Manager is ready to accept a new command.
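The lookup-and-execute cycle can be sketched as a table of records, each holding the command name, a description, and its handler (the handler body and return strings are hypothetical; the "MTLG>" prompt is from the text):

```python
USER_COMMANDS_TABLE = {
    "save": {
        "description": "upload an image from the reader's memory to the host PC",
        "handler": lambda args: "saved",   # placeholder implementation
    },
}

def user_command_manager(line):
    """Look the command up, run its handler, then re-issue the prompt."""
    name, *args = line.split()
    record = USER_COMMANDS_TABLE.get(name)
    if record is None:
        return "unknown command\nMTLG> "
    record["handler"](args)
    return "MTLG> "   # prompt indicates completion and readiness for a new command
```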
  • the image processing software employed within the system hereof performs its bar code reading function by locating and recognizing the bar codes within the frame of a captured image comprising pixel data.
  • the modular design of the image processing software provides a rich set of image processing functions, which could be utilized in the future for other potential applications, related or not related to bar code symbol reading, such as: optical character recognition (OCR) and verification (OCV); reading and verifying directly marked symbols on various surfaces; facial recognition and other biometrics identification; etc.
  • the CodeGate Task, in an infinite loop, performs the following task. It illuminates a "thin" narrow horizontal area at the center of the field-of-view (FOV) and acquires a digital image of that area. It then attempts to read bar code symbols represented in the captured frame of image data using the image processing software facilities supported by the Image-Processing Bar Code Symbol Reading Subsystem 17 of the present invention to be described in greater detail hereinafter. If a bar code symbol is successfully read, then Subsystem 17 saves the decoded data in the special Decode Data Buffer. Otherwise, it clears the Decode Data Buffer. Then, it continues the loop.
  • the CodeGate Task routine never exits on its own. It can be canceled by other modules in the system when reacting to other events.
  • the event TRIGGER_ON is posted to the application.
  • the Application software responsible for processing this event checks if the CodeGate Task is running, and if so, it cancels it and then starts the Main Task.
  • the CodeGate Task can also be canceled upon OBJECT_DETECT_OFF event, posted when the user moves the bar code reader away from the object, or when the user moves the object away from the bar code reader.
  • the CodeGate Task routine is enabled (with Main Task) when "semi-automatic-triggered" system modes of programmed operation (Modes of System Operation Nos. 11-14 in Fig. 17A) are to be implemented on the illumination and imaging platform of the present invention.
  • the Narrow-Area Illumination Task illustrated in Fig. 13M is a simple routine which is enabled (with Main Task) when "manually-triggered" system modes of programmed operation (Modes of System Operation Nos. 1-5 in Fig. 17A) are to be implemented on the illumination and imaging platform of the present invention.
  • this routine is never enabled simultaneously with CodeGate Task.
  • either CodeGate Task or Narrow-Area Illumination Task are enabled with the Main Task routine to realize the diverse kinds of system operation described herein.
  • Main Task will typically perform differently, but within the limits described in Fig. 13J.
  • the Main Task first checks if the Decode Data Buffer contains data decoded by the CodeGate Task. If so, then it immediately sends the data out to the user by executing the Data Output procedure and exits. Otherwise, in a loop, the Main Task does the following: it illuminates an entire area of the field-of-view and acquires a full-frame image of that area.
  • the Main Task attempts to read a bar code symbol in the captured image. If it successfully reads a bar code symbol, then it immediately sends the data out to the user by executing the Data Output procedure and exits. Otherwise, it continues the loop. Notably, upon successful read and prior to executing the Data Output procedure, the Main Task analyzes the decoded data for a "reader programming" command or a sequence of commands. If necessary, it executes the MetroSelect functionality. The Main Task can be canceled by other modules within the system when reacting to other events. For example, the bar code reader of the present invention can be re-configured using standard Metrologic configuration methods, such as MetroSelect® and MetroSet®. The MetroSelect functionality is executed during the Main Task.
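The Main Task control flow just described can be sketched with the capture and decode steps stubbed out. The attempt cap stands in for the Read Timeout Timer, and all names are hypothetical:

```python
def main_task(decode_buffer, capture_frame, decode, max_attempts=5):
    """Return data for the Data Output procedure, or None if no read
    succeeds before the (simulated) timeout."""
    if decode_buffer:                  # data left behind by the CodeGate Task
        return decode_buffer.pop()
    for _ in range(max_attempts):
        image = capture_frame()        # wide-area illumination + full-frame capture
        data = decode(image)
        if data is not None:
            return data                # reader-programming commands would be checked here
    return None
```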
  • the MetroSet functionality is executed by the special MetroSet Task.
  • the Focus RS232 software driver detects a special NULL-signal on its communication lines, it posts the METROSET_ON event to the Application.
  • the Application software responsible for processing this event starts the MetroSet task. Once the MetroSet Task is completed, the scanner returns to its normal operation.
  • the function of the Plug-In Controller (i.e. Manager) is to read configuration files and find plug-in libraries within the Plug-In and Configuration File Library, and install plug-ins into the memory of the operating system, which returns an address to the Plug-In Manager indicating where the plug-in has been installed, for future access.
  • the Plug-In Development Platform supports development of plug-ins that enhance, extend and/or modify the features and functionalities of the image-processing based bar code symbol reading system, and, once developed, enables uploading of developed plug-ins into the file system of the operating system layer, while storing the addresses of such plug-ins within the Plug-In and Configuration File Library in the Application Layer.
  • the Device Drivers software modules, which include trigger drivers, provide a means for establishing a software connection with the hardware-based manually-actuated trigger switch 2C employed on the imaging-based device, an image acquisition driver for implementing image acquisition functionality aboard the imaging-based device, and an IR driver for implementing object detection functionality aboard the imaging-based device.
  • the Device Drive software modules include: trigger drivers for establishing a software connection with the hardware-based manually-actuated trigger switch 2C employed on the imaging-based bar code symbol reader of the present invention; an image acquisition driver for implementing image acquisition functionality aboard the imaging-based bar code symbol reader; and an IR driver for implementing object detection functionality aboard the imaging-based bar code symbol reader.
  • In Figs. 13A through 13L, the basic system operations supported by the three-tier software architecture of the digital imaging-based bar code symbol reader of the present invention are schematically depicted.
  • these basic operations represent functional modules (or building blocks) within the system architecture of the present invention, which can be combined in various combinations to implement the numerous Programmable Modes of System Operation listed in Fig. 23 and described in detail below, using the image acquisition and processing platform disclosed herein.
  • these basic system operations will be described below with reference to Programmable Mode of System Operation No. 12: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Manual Or Automatic Modes Of the Multi-Mode Bar Code Reading Subsystem 17.
  • Fig. 13A shows the basic operations carried out within the System Core Layer of the system when the user points the bar code reader towards a bar code symbol on an object. Such operations include the enabling, by the IR device drivers, of automatic detection of the object within the field, and the waking up of the Input/Output Manager software module.
  • the Input/Output Manager posts the SCORE_OBJECT_DETECT_ON event to the Events Dispatcher software module in response to detecting an object.
  • the Events Dispatcher software module passes the SCORE_OBJECT_DETECT_ON event to the Application Layer.
  • Upon receiving the SCORE_OBJECT_DETECT_ON event at the Application Layer, the Application Events Manager executes an event handling routine (shown in Fig. 13D) which activates the narrow-area (linear) illumination array 27 (i.e. during narrow-area illumination and image capture modes), and then, depending on whether the presentation mode has been selected and whether CodeGate Task or Narrow-Area Illumination Mode has been enabled during system configuration, this event handling routine executes either Main Task described in Fig. 13J, CodeGate Task described in Fig. 13E, or Narrow-Area Illumination Task described in Fig. 13M. As shown in the flow chart of Fig. 13D, the system event handling routine first involves determining whether the Presentation Mode has been selected (i.e.
  • the event handling routine determines whether the CodeGate Task or Narrow-Area Illumination Routines have been enabled (with Main Task). If CodeGate Task has been enabled, then Application Layer starts CodeGate Task. If the Narrow-Area Illumination Task has been enabled, then the Application Layer starts the Narrow-Area Illumination Task, as shown.
  • the Application Layer executes the CodeGate Task by first activating the narrow-area image capture mode in the Multi-Mode Image Formation and Detection Subsystem 13 (i.e. by enabling a few middle rows of pixels in the CMOS sensor array 22), and then acquiring/capturing a narrow image at the center of the FOV of the Bar Code Reader. CodeGate Task then performs image processing operations on the captured narrow-area image using No-Finder Module which has been enabled by the selected Programmable Mode of System Operation No. 12.
  • the Codegate Task saves the decoded symbol character data in the Codegate Data Buffer; and if not, then the task clears the Codegate Data Buffer, and then returns to the main block of the Task where image acquisition reoccurs.
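One pass of that loop can be sketched as follows; the capture and decode calls are stubs, and the list models the Codegate Data Buffer:

```python
def codegate_pass(capture_narrow_image, decode, buffer):
    """One iteration of the (normally infinite) CodeGate loop."""
    image = capture_narrow_image()   # a few middle rows of the CMOS array
    data = decode(image)
    if data is not None:
        buffer.append(data)          # save decoded symbol character data
    else:
        buffer.clear()               # clear stale data after a failed read
    return data
```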
  • the trigger switch driver in the OS Layer automatically wakes up the Input/Output Manager at the System Core Layer.
  • the Input/Output Manager in response to being woken up by the trigger device driver, posts the SCORE_TRIGGER_ON event to the Events Dispatcher also in the System Core Layer.
  • the Events Dispatcher then passes on the SCORE_TRIGGER_ON event to the Application Events Manager at the Application Layer.
  • the Application Events Manager responds to the SCORE_TRIGGER_ON event by invoking a handling routine (Trigger On Event) within the Task Manager at the System Core Layer.
  • the routine determines whether the Presentation Mode (i.e. Programmed Mode of System Operation No. 10) has been enabled, and if so, then the routine exits. If the routine determines that the Presentation Mode (i.e. Programmed Mode of System Operation No. 10) has not been enabled, then it determines whether the CodeGate Task is running, and if it is running, then it first cancels the CodeGate Task and then deactivates the narrow-area illumination array 27 associated with the Multi-Mode Illumination Subsystem 14, and thereafter executes the Main Task.
• the routine determines whether the Narrow-Area Illumination Task is running, and if it is not running, then the Main Task is started. However, if the Narrow-Area Illumination Task is running, then the routine increases the narrow-illumination beam to full power and acquires a narrow-area image at the center of the field of view of the system, then attempts to read the bar code in the captured narrow-area image. If the read attempt is successful, then the decoded (symbol character) data is saved in the Decode Data Buffer, the Narrow-Area Illumination Task is canceled, the narrow-area illumination beam is stopped, and the routine starts the Main Task, as shown. If the read attempt is unsuccessful, then the routine clears the Decode Data Buffer, the Narrow-Area Illumination Task is canceled, the narrow-area illumination beam is stopped, and the routine starts the Main Task, as shown.
• the Narrow-Area Illumination Task routine is an infinite loop routine that simply keeps a narrow-area illumination beam produced and directed at the center of the field of view of the system in a recursive manner (e.g. typically at half or less power in comparison with the full-power narrow-area illumination beam produced during the running of the CodeGate Task).
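The branching of the Trigger On Event handling routine described above can be sketched as follows. This is a hypothetical Python sketch; the state flags and return strings are illustrative stand-ins for the conditions the routine tests, not names from this disclosure:

```python
def on_trigger_on(state):
    """Illustrative Trigger On Event handler: check Presentation Mode,
    cancel a running CodeGate Task or Narrow-Area Illumination Task,
    then hand control to the Main Task."""
    if state.get("presentation_mode_enabled"):
        return "exit"                                 # Presentation Mode: do nothing
    if state.get("codegate_task_running"):
        state["codegate_task_running"] = False        # cancel CodeGate Task
        state["narrow_area_illumination_on"] = False  # deactivate narrow-area array
        return "run_main_task"
    if state.get("narrow_area_illumination_task_running"):
        # raise the beam to full power, attempt a narrow-area read,
        # then stop the beam and start the Main Task (read attempt elided)
        state["narrow_area_illumination_task_running"] = False
        state["narrow_area_illumination_on"] = False
        return "narrow_read_then_main_task"
    return "run_main_task"
```

Each branch mirrors one bullet of the handler description above.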
• the first step performed in the Main Task by the Application Layer is to determine whether CodeGate Data is currently available (i.e. stored in the Decode Data Buffer), and if such data is available, then the Main Task directly executes the Data Output Procedure described in Fig. 13K. However, if the Main Task determines that no such data is currently available, then it starts the Read Timeout Timer, and then acquires a wide-area image of the detected object, within the time frame permitted by the Read Timeout Timer.
• this wide-area image acquisition process involves carrying out the following operations, namely: (i) first activating the wide-area illumination mode in the Multi-Mode Illumination Subsystem 14 and the wide-area capture mode in the CMOS image formation and detection module; (ii) determining whether the object resides in the near-field or far-field portion of the FOV (through object range measurement by the IR-based Object Presence and Range Detection Subsystem 12); and (iii) then activating either the near-field illumination array 28 or the far-field illumination array 29 (or possibly both 28 and 29 in special programmed cases) to illuminate the object in the near or far field portion of the FOV, at an intensity and duration determined by the automatic light exposure measurement and control subsystem 15; while (iv) sensing the spatial intensity of light imaged onto the CMOS image sensing array 22 in accordance with the Global Exposure Control Method of the present invention, described in detail hereinabove.
• the Main Task performs image processing operations on the captured image using either the Manual, ROI-Specific or Automatic Modes of operation (although it is understood that other image-processing based reading methods taught herein, such as Automatic or OmniScan, as well as other suitable alternative decoding algorithms/processes not disclosed herein, can be used depending on which Programmed Mode of System Operation has been selected by the end user for the imaging-based bar code symbol reader of the present invention).
• the time duration of each image acquisition/processing frame is set by the Start Read Timeout Timer and Stop Read Timeout Timer blocks shown therein, and within the Programmed Mode of System Operation No. 12, the Main Task will support repeated (i.e. multiple) attempts to read bar code symbols so long as the trigger switch 2C is being pulled.
  • Main Task will then execute the Data Output Procedure.
  • Main Task No. 2 would be executed to enable the required system behavior during run-time.
• the main point to be made here is that the selection and application of image-processing based bar code reading methods will preferably occur through the selective activation of the different modes available within the multi-mode image-processing based bar code symbol reading Subsystem 17, in response to information learned about the graphical intelligence represented within the structure of the captured image, and that such dynamic mode selection should occur in accordance with principles of dynamic adaptive learning commonly used in advanced image processing systems, speech understanding systems, and the like.
  • This general approach is in marked contrast with the approaches used in prior art imaging-based bar code symbol readers, wherein permitted methods of bar code reading are pre-selected based on statically defined modes selected by the end user, and not in response to detected conditions discovered in captured images on a real-time basis.
  • the first step carried out by the Data Output Procedure involves determining whether the symbol character data generated by the Main Task is for programming the bar code reader or not. If the data is not for programming the bar code symbol reader, then the Data Output Procedure sends the data out according to the bar code reader system configuration, and then generates the appropriate visual and audio indication to the operator, and then exits the procedure. If the data is for programming the bar code symbol reader, then the Data Output Procedure sets the appropriate elements of the bar code reader configuration (file) structure, and then saves the Bar Code Reader Configuration Parameters in non-volatile RAM (i.e. NOVRAM).
  • the Data Output Procedure then reconfigures the bar code symbol reader and then generates the appropriate visual and audio indication to the operator, and then exits the procedure.
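The Data Output Procedure described above can be sketched as follows. This is an illustrative Python sketch under assumed names; the dict keys, return strings, and the `outputs` list are hypothetical stand-ins for the reader configuration structure, NOVRAM persistence, and the host-bound output path:

```python
def data_output_procedure(symbol_data, is_programming_data, config, outputs):
    """Illustrative sketch: programming bar codes update the reader
    configuration (which would then be saved to NOVRAM and applied);
    ordinary decoded data is sent out per the reader configuration."""
    if not is_programming_data:
        outputs.append(symbol_data)      # send data out to the host system
        return "data sent"               # (visual/audio indication elided)
    config["last_programming_code"] = symbol_data  # set configuration elements
    return "reader reconfigured"         # persisting to NOVRAM is elided
```

A decoded product code flows straight to the host, while a programming code mutates the configuration structure instead.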
  • decoded data is sent from the Input/Output Module at the System Core Layer to the Device Drivers within the Linux OS Layer of the system.
• This control routine can be called during the wide-area image acquisition step in the Main Task routine, shown in Fig. 13J.
  • the first step of the illumination control method involves using the Automatic Light Exposure Measurement And Illumination Control Subsystem 15 to measure the ambient light level to which the CMOS image sensing array 22 is exposed prior to commencing each illumination and imaging cycle within the Bar Code Symbol Reading System.
  • the illumination control method involves using the Automatic IR-based Object Presence and Range Detection Subsystem 12 to measure the presence and range of the object in either the near or far field portion of the field of view (FOV) of the System.
  • the illumination control method involves using the detected range and the measured light exposure level to drive both the upper and lower LED illumination subarrays associated with either the near-field wide-area illumination array 28 or far-field wide-area illumination array 29.
  • the illumination control method involves capturing a wide-area image at the CMOS image sensing array 22 using the illumination field produced during Step C.
  • the illumination control method involves rapidly processing the captured wide-area image during Step D to detect the occurrence of high spatial-intensity levels in the captured wide-area image, indicative of a specular reflection condition.
  • the illumination control method involves determining if a specular reflection condition is detected in the processed wide-area image, and if so then driving only the upper LED illumination subarray associated with either the near-field or far-field wide-area illumination array. Also, if a specular reflection condition is not detected in the processed wide-area image, then the detected range and the measured light exposure level is used to drive both the upper and lower LED subarrays associated with either the near-field or far-field wide-area illumination array.
  • the illumination control method involves capturing a wide-area image at the CMOS image sensing array 22 using the illumination field produced during Step F.
  • the illumination control method involves rapidly processing the captured wide-area image during Step G to detect the occurrence of high spatial-intensity levels in the captured wide-area image, indicative of a specular reflection condition.
  • the illumination control method involves determining if a specular reflection condition is still detected in the processed wide-area image, and if so, then drive the other LED subarray associated with either the near-field or far-field wide-area illumination array. If a specular reflection condition is not detected in the processed wide-area image, then the detected Range and the measured Light Exposure Level is used to drive the same LED illumination subarray (as in Step C) associated with either the near-field wide-area illumination array 28 or far field wide-area illumination array 29. As indicated at Step J, the illumination control method involves capturing a wide-area image at the CMOS image sensing array using the illumination field produced during Step I.
  • the illumination control method involves rapidly processing the captured wide-area image during Step J to detect the absence of high spatial-intensity levels in the captured wide-area image, confirming the elimination of the earlier detected specular reflection condition.
• the illumination control method involves determining whether any specular reflection condition is still detected in the processed wide-area image at Step K; if none is detected, then the wide-area image is processed using the mode(s) selected for the Multi-Mode Image-Processing Bar Code Reading Subsystem 17. If a specular reflection condition is still detected in the processed wide-area image, then the control process returns to Step A and repeats Steps A through K, as described above.
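The specular-reflection avoidance loop of Steps A through L above can be sketched as follows. This is a minimal illustrative Python sketch, assuming a `capture` callable that images under a given LED subarray selection and a `detects_specular` predicate; the "both"/"upper"/"lower" strings are hypothetical labels for the subarray drive choices (Steps C, F and I):

```python
def illumination_control(capture, detects_specular, max_restarts=2):
    """Drive both LED subarrays first; on a detected specular reflection,
    fall back to the upper subarray alone, then to the other (lower)
    subarray, restarting the whole cycle if glare still persists.

    Returns (image, subarrays_used) for the first glare-free image,
    or None if glare persists after max_restarts full cycles.
    """
    for _ in range(max_restarts):
        for subarrays in ("both", "upper", "lower"):  # Steps C/D, F/G, I/J
            image = capture(subarrays)                # illuminate and capture
            if not detects_specular(image):           # Steps E, H, K
                return image, subarrays
        # still specular after all subarray choices: return to Step A
    return None
```

The ambient-light and range measurements (Steps A and B) are folded into `capture` here for brevity.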
• Fig. 14 lists the various bar code symbologies supported by the Multi-Mode Bar Code Symbol Reading Subsystem 17 employed within the hand-supportable digital imaging-based bar code symbol reading device of the present invention. As shown therein, these bar code symbologies include: Code 128; Code 39; I2of5; Code93; Codabar; UPC/EAN; Telepen; UK-Plessey; Trioptic; Matrix 2of5; Airline 2of5; Straight 2of5; MSI-Plessey; Code 11; and PDF417.
  • the Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem 17 of the illustrative embodiment supports five primary modes of operation, namely: the Automatic Mode of Operation; the Manual Mode of Operation; the ROI-Specific Mode of Operation; the No-Finder Mode of Operation; and Omniscan Mode of Operation. As will be described in greater detail herein, various combinations of these modes of operation can be used during the lifecycle of the image-processing based bar code reading process of the present invention.
• Fig. 16 is an exemplary flow chart representation showing the steps involved in setting up and cleaning up the software sub-Application entitled "Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem 17", once called from either (i) the CodeGate Task software module at the Block entitled "READ BAR CODE(S) IN CAPTURED NARROW-AREA IMAGE" indicated in Fig. 13E, or (ii) the Main Task software module at the Block entitled "READ BAR CODE(S) IN CAPTURED WIDE-AREA IMAGE" indicated in Fig. 13J.
  • the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically start processing a captured frame of digital image data, prior to the complete buffering thereof, so as to search for one or more bar codes represented therein in an incremental manner, and to continue searching until the entire image is processed.
  • This mode of image-based processing enables bar code locating and reading when no prior knowledge about the location of, or the orientation of, or the number of bar codes that may be present within an image, is available.
  • the Multi-Mode Bar Code Symbol Reading Subsystem 17 starts processing the image from the top-left corner and continues until it reaches the bottom-right corner, reading any potential bar codes as it encounters them.
• the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data, starting from the center or sweet spot of the image at which the user would have aimed the bar code reader, so as to search for (i.e. find) at least one bar code symbol represented therein. Unlike the Automatic Mode, this is done by searching in a helical manner through frames or blocks of extracted image feature data, and then marking the same and image-processing the corresponding raw digital image data until a bar code symbol is recognized/read within the captured frame of image data.
  • This mode of image processing enables bar code locating and reading when the maximum number of bar codes that could be present within the image is known a priori and when portions of the primary bar code have a high probability of spatial location close to the center of the image.
• the Multi-Mode Bar Code Symbol Reading Subsystem 17 starts processing the image from the center, along rectangular strips progressively further from the center, and continues until either the entire image has been processed or the programmed maximum number of bar codes has been read.
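The center-outward traversal described above can be sketched as follows. This is an illustrative Python sketch only (the function name and cell-grid abstraction are assumptions, not part of the disclosure); it yields rectangular "rings" of cell coordinates progressively further from the image center, so the center region is examined first:

```python
def center_out_strips(rows, cols):
    """Yield successive rectangular strips (rings) of (row, col)
    coordinates, starting at the image center and growing outward,
    until every cell of the rows x cols grid has been visited."""
    cr, cc = rows // 2, cols // 2     # center cell
    seen = set()
    for radius in range(max(rows, cols)):
        ring = []
        for r in range(cr - radius, cr + radius + 1):
            for c in range(cc - radius, cc + radius + 1):
                if 0 <= r < rows and 0 <= c < cols and (r, c) not in seen:
                    seen.add((r, c))
                    ring.append((r, c))
        if ring:
            yield ring                # process this strip for bar codes
        if len(seen) == rows * cols:
            break                     # entire image processed
```

A caller would decode each yielded strip in turn and stop once the programmed maximum number of bar codes has been read.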
• the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data, starting from the region of interest (ROI) in the captured image, specified by coordinates acquired during a previous mode of operation within the Multi-Mode Bar Code Symbol Reading Subsystem 17. Unlike the Manual Mode, this is done by analyzing the received ROI-specified coordinates, derived during either a previous NoFinder Mode, Automatic Mode, or Omniscan Mode of operation, and then immediately processing the image feature data and image-processing the corresponding raw digital image data until a bar code symbol is recognized/read within the captured frame of image data.
  • the ROI-Specific Mode is used in conjunction with other modes of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
• This mode of image processing enables bar code locating and reading when the maximum number of bar codes that could be present within the image is known a priori and when portions of the primary bar code have a high probability of spatial location close to the specified ROI in the image.
  • the Multi-Mode Bar Code Symbol Reading Subsystem starts processing the image from these initially specified image coordinates, and then progressively further in a helical manner from the ROI-specified region, and continues until either the entire image has been processed or the programmed maximum number of bar codes have been read.
• the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured narrow-area (linear) frame of digital image data, without the feature extraction and marking operations used in the Automatic, Manual and ROI-Specific Modes, so as to read one or more bar code symbols represented therein.
• This mode enables bar code reading when it is known, a priori, that the image contains at most one (1-dimensional) bar code symbol, portions of which have a high likelihood of spatial location close to the center of the image and when the bar code is known to be oriented at zero degrees relative to the horizontal axis.
  • this is typically the case when the bar code reader is used in a hand-held mode of operation, where the bar code symbol reader is manually pointed at the bar code symbol to be read.
• the Multi-Mode Bar Code Symbol Reading Subsystem 17 starts at the center of the image, skips all bar code location steps, and filters the image at zero (0) degrees and 180 degrees relative to the horizontal axis. Using the "bar-and-space-count" data generated by the filtration step, it reads the potential bar code symbol.
  • the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data along any one or more predetermined virtual scan line orientations, without feature extraction and marking operations used in the Automatic, Manual and ROI-Specific Modes, so as to read a single bar code symbol represented in the processed image.
  • Multi-Mode Bar Code Symbol Reading Subsystem 17 starts at the center of the image, skips all bar code location steps, and filters the image at different start-pixel positions and at different scan-angles. Using the bar-and-space-count data generated by the filtration step, the Omniscan Mode reads the potential bar code symbol.
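The "bar-and-space-count" data produced by the filtration step above can be sketched as a run-length encoding of a binarized scan line. This is an illustrative Python sketch under assumed conventions (1 = bar/dark pixel, 0 = space/light pixel); the function name is hypothetical, not from the disclosure:

```python
def bar_space_counts(scanline):
    """Collapse a binarized scan line into (element, run-length) pairs,
    i.e. the bar-and-space-count data a decoder would match against
    the element-width patterns of a symbology."""
    counts = []
    run = 1
    for prev, cur in zip(scanline, scanline[1:]):
        if cur == prev:
            run += 1                      # same element continues
        else:
            counts.append((prev, run))    # element boundary: emit run
            run = 1
    if scanline:
        counts.append((scanline[-1], run))  # emit the final run
    return counts
```

In the Omniscan Mode, such counts would be extracted along virtual scan lines at several start-pixel positions and scan angles before the decode attempt.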
• the imaging-based bar code symbol reader of the present invention has at least seventeen (17) Programmable System Modes of Operation, namely: Programmed Mode of System Operation No. 1 —Manually-Triggered Single-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode Of System Operation No. 2— Manually-Triggered Multiple-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode Of System Operation No.
• these Modes Of System Operation can be programmed by reading a sequence of bar code symbols from a programming menu as taught, for example, in US Patent No. 6,565,005, which describes a bar code scanner programming technology developed by Metrologic Instruments, Inc., and marketed under the name MetroSelect® Single Line Configuration Programming Method.
• when the bar code reader hereof boots up, its FPGA is programmed automatically with 12.5/50/25 MHz clock firmware and all required device drivers are also installed automatically.
  • the login to the Operating System is also done automatically for the user "root", and the user is automatically directed to the /root/ directory.
  • the IR object detection software driver is installed automatically.
  • the narrow-area illumination software drivers are automatically installed, so that a Pulse Width Modulator (PWM) is used to drive the narrow-area LED-based illumination array 27.
• the operating system changes to the /tmp/ directory first ("cd /tmp"), and then the focusapp program, located in the /root/ directory, is run. This is because the /root/ directory is located in Flash ROM, while to save captured images the current directory should be /tmp/ (where the image is stored in transition to the host), which is located in RAM.
• the hand-supportable image-processing bar code symbol reader of the present invention can be programmed to operate in any one of a number of different "manually-triggered" modes of system operation, as identified in Nos. 1 through 5 in Fig. 17A. However, during each of these manually-triggered modes of operation, the image-processing bar code symbol reader controls and coordinates its subsystem components in accordance with a generalized method of manually-triggered operation.
  • the IR-based object presence detection subsystem automatically generates an object detection event, and in response thereto, the multi-mode LED-based illumination subsystem automatically produces a narrow-area field of narrow-band illumination within the FOV of said image formation and detection subsystem.
• the image capturing and buffering subsystem automatically captures and buffers a narrow-area digital image of the object using the narrow-area field of narrow-band illumination within the FOV, during the narrow-area image capture mode of said multi-mode image formation and detection subsystem;
• the image processing bar code symbol reading subsystem automatically processes the narrow-area digital image in an effort to read a 1D bar code symbol represented therein, and upon successfully decoding a 1D bar code symbol therein, automatically produces symbol character data representative thereof.
  • the multi-mode LED-based illumination subsystem automatically produces a wide-area field of narrow-band illumination within the FOV of the multi-mode image formation and detection subsystem
  • the image capturing and buffering subsystem captures and buffers a wide-area digital image during the wide-area image capture mode of the image capturing and buffering subsystem
• the image processing bar code symbol reading subsystem processes the wide-area digital image in an effort to read a 1D or 2D bar code symbol represented therein, and upon successfully decoding a 1D or 2D bar code symbol therein, automatically produces symbol character data representative thereof.
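The generalized two-stage method of manually-triggered operation above (narrow-area 1D read first, wide-area 1D/2D read only on failure) can be sketched as follows. This is an illustrative Python sketch; the two callables are hypothetical stand-ins for the full illuminate/capture/decode sequences of the respective subsystems:

```python
def manually_triggered_read(read_narrow, read_wide):
    """Attempt a narrow-area 1D read first; only if it fails, switch to
    wide-area illumination/capture and attempt a 1D/2D read.

    Returns (stage, symbol_character_data) on success, or None.
    """
    data = read_narrow()      # narrow-area illumination + capture + decode
    if data is not None:
        return ("narrow", data)
    data = read_wide()        # wide-area illumination + capture + decode
    if data is not None:
        return ("wide", data)
    return None
```

The wide-area stage is reached only when the cheaper narrow-area attempt fails, matching the mode transitions described for Programmed Modes Nos. 3 through 5.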
  • Programmed Mode of System Operation No. 1 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
• the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17. The captured image is then processed using the No-Finder Mode.
• if a single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If a single cycle of programmed image processing does not result in a successful reading of a 1D bar code symbol, then the cycle is terminated, all subsystems are deactivated, and the bar code reader returns to its sleep mode of operation, and waits for the next event (e.g. manually pulling trigger switch 2C) which will trigger the system into active operation.
  • Programmed Mode of System Operation No. 2 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
• the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured narrow-area image is then processed using the No-Finder Mode.
  • the resulting symbol character data is sent to the Input/Output Subsystem for use by the host system. If the cycle of programmed image processing does not produce a successful read, then the system automatically enables successive cycles of illumination/capture/processing so long as the trigger switch 2C is being pulled, and then until the system reads a bar code symbol within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code symbol reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
• the default decode timeout is set to 500 ms, which can be changed simply by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
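The retry behavior governed by the decode timeout above can be sketched as follows. This is an illustrative Python sketch with hypothetical names; timing is simulated by accumulating the timeout per failed attempt rather than by sleeping:

```python
def retry_reads(trigger_pulled, attempt_read, decode_timeout_ms=500):
    """While the trigger remains pulled, re-attempt a read after each
    decode timeout (default 500 ms) until a symbol is decoded or the
    trigger is released.

    Returns (symbol_data_or_None, simulated_elapsed_ms).
    """
    elapsed_ms = 0
    while trigger_pulled():
        data = attempt_read()             # one illumination/capture/decode cycle
        if data is not None:
            return data, elapsed_ms       # success: stop retrying
        elapsed_ms += decode_timeout_ms   # failed cycle consumed one timeout
    return None, elapsed_ms               # trigger released without a read
```

With the default setting, two failed cycles followed by a success correspond to roughly one second of retrying.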
• Programmed Mode of System Operation No. 3 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle (in its sleep mode) until a user points the bar code reader towards an object with a bar code label, and then pulls the trigger switch 2C.
• the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14 (i.e. drives the narrow-area illumination array 27), the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
• the captured narrow-area image is then processed using the No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
• the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
• the bar code reader illuminates the target object using both near-field and far-field wide-area illumination, captures a wide-area image of the target object, and launches the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
• the captured wide-area image is then processed using the Manual, ROI-Specific or Automatic Mode. If this single cycle of programmed image processing results in the successful reading of a 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the subsystem 19 deactivates all subsystems and then returns to its sleep mode, and waits for an event, which will cause it to re-enter its active mode of operation.
  • Programmed Mode of System Operation No. 4 involves configuration of the system as follows: disabling the IR-based object detection subsystem 12; and enabling the use of manual-trigger activation, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes of the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No- Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured narrow-area image is then processed using the No-Finder Mode.
• if this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader illuminates the target object using both near-field and far-field wide-area illumination, captures a wide-area image of the target object, and launches the Manual (or Automatic) Mode of the Multi-Mode Bar Code Reading Subsystem.
• the captured wide-area image is then processed using the Manual Mode of bar code symbol reading. If this single cycle of programmed processing results in the successful reading of a 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
• the Subsystem 19 automatically enables successive cycles of wide-area illumination/wide-area image capture and processing so long as the trigger switch 2C is being pulled, and then until the system reads a single 1D or 2D bar code symbol within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
• the default decode timeout is set to 500 ms, which can be changed simply by programming. This default decode timeout setting ensures that while the trigger switch is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
  • Programmed Mode of System Operation No. 5 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes of the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
• the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem. The captured narrow-area image is then processed using the No-Finder Mode.
• the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader illuminates the target object using both near-field and far-field wide-area illumination, captures a wide-area image of the target object, and launches the Manual (ROI-Specific and/or Automatic) Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the captured wide-area image is then processed using the Manual Mode of reading. If this single cycle of programmed processing results in the successful reading of a 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
  • the system automatically enables successive cycles of wide-area illumination/wide-area image capture/image processing so long as the trigger switch is being pulled, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
  • the default decode timeout is set to 500 ms which can be simply changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
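The narrow-area-then-wide-area sequence described for this mode can be summarized as a two-phase read pipeline: a first decode attempt on a narrow-area image, followed on failure by a wide-area capture and decode. This is a hedged sketch of that flow, with all function names assumed for illustration:

```python
def two_phase_read(capture_narrow, decode_narrow, capture_wide, decode_wide):
    """Phase 1: narrow-area illumination/capture and a No-Finder-style decode.
    Phase 2 (only on failure): wide-area illumination/capture and decode."""
    symbol = decode_narrow(capture_narrow())   # narrow-area first pass
    if symbol is not None:
        return symbol                          # 1D symbol read in phase 1
    return decode_wide(capture_wide())         # fall through to wide-area pass
```

In the described system the phase switch also re-programs the illumination and image-capture subsystems; here that is abstracted into the capture callbacks.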
  • Programmed Mode of System Operation No. 6 involves configuration of the system as follows: disabling the use of manual-trigger activation; and enabling IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode only within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode only in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • Programmed Mode of System Operation No. 7 involves configuration of the system as follows: disabling the use of manual-trigger activation; and enabling IR-based Object Presence And Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user points the bar code reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system automatically enables successive cycles of narrow-area illumination/narrow-area image capture/processing so long as the trigger switch 2C is being pulled, and then until the system reads a single 1D bar code symbol within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
  • the default decode timeout is set to 500 ms which can be simply changed by programming. This default decode timeout setting ensures that while the trigger switch is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
  • Programmed Mode of System Operation No. 8 involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the scanner, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code symbol reader illuminates the target object using either near-field or far-field wide-area illumination (depending on the detected range of the target object), captures a wide-area image of the target object, and launches the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the captured wide-area image is then processed using the Manual Mode of reading. If this cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
  • the system automatically enables successive cycles of wide-area illumination/wide-area image capture/processing so long as the target object is being detected, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user moves the object out of the FOV of the bar code reader, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
  • the default decode timeout is set to 500 ms which can be simply changed by programming.
  • This default decode timeout setting ensures that while the object is being detected by the bar code reader, the bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
  • Programmed Mode of System Operation No. 9: Automatically-Triggered Multi-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Symbol Reading Subsystem
  • Programmed Mode of System Operation No. 9 involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual or Automatic Modes of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
  • the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader illuminates the target object using either near-field or far-field wide-area illumination (depending on the detected range of the target object), captures a wide-area image of the target object, and launches the Manual (ROI-Specific or Automatic) Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the captured wide-area image is then processed using the Manual Method of decoding. If this cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
  • the system automatically enables successive cycles of wide-area-illumination/wide-area image-capture/processing so long as the target object is being detected, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user moves the object out of the FOV of the bar code symbol reader, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
  • the default decode timeout is set to 500 ms which can be simply changed by programming.
  • This default decode timeout setting ensures that while the object is being detected by the bar code reader, the bar code reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
  • Programmed Mode of System Operation No. 10 involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific, Automatic or Omniscan Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user presents an object with a bar code symbol under the field-of-view of the bar code reader. Once the object is automatically detected, the bar code reader "wakes up" and the system activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and either the Manual, ROI-Specific, Automatic or Omniscan Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system automatically enables successive cycles of wide-area illumination/wide-area-image-capture/processing so long as the target object is being detected, and then until the system reads a single 1D and/or 2D bar code symbol within a captured image of the target object; only thereafter, or when the user moves the object out of the FOV of the bar code reader, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
  • the default decode timeout is set to 500 ms which can be simply changed by programming.
  • Programmed Mode of System Operation No. 11 involves configuration of the system as follows: disabling the use of the manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader will automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and launch the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
  • the captured wide-area image is then processed using the Manual, ROI-Specific or Automatic Mode/Method of bar code reading. If this single cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
  • the subsystem 19 automatically deactivates all subsystems, causing the bar code reader to return to its sleep mode of operation and wait for the next event that will trigger the system into active operation.
  • Programmed Mode of System Operation No. 12 involves configuration of the system as follows: disabling the use of manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader will automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and launch the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the captured wide-area image is then processed using the Manual Mode of reading. If this single cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
  • the system automatically enables successive cycles of wide-area illumination/wide-area-image-capture/processing so long as the trigger switch 2C is being pulled, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
  • the default decode timeout is set to 500 ms which can be simply changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
  • When the Focus IR module detects an object in front of the object detection field 20, it posts the OBJECT_DETECT_ON event to the Application Layer.
  • the Application Layer software responsible for processing this event starts the CodeGate Task.
  • the TRIGGER_ON event is posted to the Application.
  • the Application Layer software responsible for processing this event checks if the CodeGate Task is running, and if so, it cancels it and then starts the Main Task.
  • When the user releases the trigger switch 2C, the TRIGGER_OFF event is posted to the Application.
  • the Application Layer software responsible for processing this event checks if the Main Task is running, and if so, it cancels it. If the object is still within the object detection field 20, the Application Layer starts the CodeGate Task again.
  • the OBJECT_DETECT_OFF event is posted to the Application Layer.
  • the Application Layer software responsible for processing this event checks if the CodeGate Task is running, and if so, it cancels it.
  • the CodeGate Task, running in an infinite loop, does the following: it activates the narrow-area illumination array 27 which illuminates a "narrow" horizontal area at the center of the field-of-view; then the Image Formation and Detection Subsystem 13 acquires an image of that narrow area (i.e. a few rows of pixels on the CMOS image sensing array 22), and then attempts to read a bar code symbol represented in the image.
  • the event TRIGGER_ON is posted to the Application Layer.
  • the Application Layer software responsible for processing this event checks if the CodeGate Task is running, and if so, it cancels it and then starts the Main Task.
  • the CodeGate Task can also be canceled upon OBJECT_DETECT_OFF event, posted when the user moves the bar code reader away from the object, or the object away from the bar code reader.
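The event handling described in the preceding bullets can be summarized as a small dispatcher that starts and cancels the CodeGate and Main Tasks. This interpretive sketch uses the event and task names from the text; everything else (the class, the flag-based task model) is an illustrative assumption:

```python
class ApplicationLayer:
    """Sketch of the described event handling for the CodeGate/Main Tasks."""
    def __init__(self):
        self.codegate_running = False
        self.main_running = False
        self.object_in_field = False

    def handle(self, event):
        if event == "OBJECT_DETECT_ON":          # Focus IR module sees an object
            self.object_in_field = True
            self.codegate_running = True         # start CodeGate Task
        elif event == "TRIGGER_ON":              # user pulls trigger switch 2C
            if self.codegate_running:
                self.codegate_running = False    # cancel CodeGate Task
            self.main_running = True             # start Main Task
        elif event == "TRIGGER_OFF":             # user releases trigger switch 2C
            if self.main_running:
                self.main_running = False        # cancel Main Task
            if self.object_in_field:
                self.codegate_running = True     # restart CodeGate Task
        elif event == "OBJECT_DETECT_OFF":       # object leaves detection field 20
            self.object_in_field = False
            if self.codegate_running:
                self.codegate_running = False    # cancel CodeGate Task
```

In the actual firmware these would be real tasks rather than booleans, but the start/cancel transitions follow the same event ordering.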
  • Programmed Mode of System Operation No. 13 involves configuration of the system as follows: disabling the use of manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user points the reader towards an object with a bar code label.
  • the bar code reader "wakes up” and the system activates the narrow-area illumination mode in the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader will automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and invoke the Manual, ROI-Specific and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the captured wide-area image is then processed using the Manual, ROI-Specific or Automatic Mode of reading. If this single cycle of programmed image processing results in the successful reading of one or more 1 D and/or 2D bar code symbols, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system.
  • the system automatically enables successive cycles of wide-area illumination/wide-area-image-capture/image-processing so long as the trigger switch 2C is being pulled, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation, and wait for the next event that will trigger the system into active operation.
  • the default decode timeout is set to 500 ms which can be simply changed by programming.
  • This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the Imaging-Based Bar Code Symbol Reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
  • Programmed Mode of System Operation No. 14 involves configuration of the system as follows: disabling the use of manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and OmniScan Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader is idle until a user points the reader towards an object with a bar code label.
  • the bar code reader "wakes up” and the system activates the narrow-area illumination mode in the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • Subsystem 13 captures/acquires a narrow-area image which is then processed by Subsystem 17 using its No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system, and then the system deactivates all subsystems and resumes its sleep state of operation.
  • if this cycle of programmed image processing does not produce a successful read, it may nevertheless produce one or more code fragments indicative of the symbology represented in the image (e.g. PDF417).
  • the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17; and then, if the user is pulling the trigger switch 2C at about this time, the system activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem, and either the Omniscan Mode of the Multi-Mode Bar Code Reading Subsystem 17 if code fragments have been found indicating a 2D code format (e.g.
  • the bar code reader proceeds to automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and invoke the Omniscan Mode of the Multi-Mode Bar Code Reading Subsystem 17.
  • the captured wide-area image is then first processed using the Omniscan Mode at a first processing direction (e.g. at 0 degrees), and the system sequentially advances the Omniscan Mode of reading to a different angular orientation (e.g. among 6 possible directions/orientations) until a single bar code symbol is successfully read.
  • if this single cycle of programmed decode processing results in the successful decoding of a single 1D and/or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful reading of a single 1D and/or 2D bar code symbol, then the system automatically enables successive cycles of wide-area illumination/wide-area image capture/processing so long as the trigger switch 2C is being pulled, and then until the system reads a single 1D and/or 2D bar code symbol within a captured image of the target object.
  • the default decode timeout is set to 500 ms which can be simply changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the Imaging-Based Bar Code Symbol Reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch is manually released.
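The Omniscan orientation sweep described above (a first pass at 0 degrees, then successive angular orientations until a read succeeds) can be sketched as follows. The six 30-degree steps are an illustrative assumption, since the text says only "6 possible directions/orientations", and the decode callback is hypothetical:

```python
# Illustrative set of scan-line orientations: 6 directions, 30 degrees apart.
OMNISCAN_ORIENTATIONS = [0, 30, 60, 90, 120, 150]

def omniscan_read(image, decode_at_angle):
    """Attempt a decode along successive scan-line orientations, starting
    at 0 degrees, until one orientation yields a bar code symbol."""
    for angle in OMNISCAN_ORIENTATIONS:
        symbol = decode_at_angle(image, angle)
        if symbol is not None:
            return symbol            # first successful orientation wins
    return None                      # no orientation produced a read
```

A failed sweep corresponds to the unsuccessful cycle described above, after which the system re-illuminates and re-captures before sweeping again.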
  • Programmed Mode of System Operation No. 15, typically used for testing purposes, involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the wide-area illumination mode in the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific, Automatic or OmniScan Modes of the Multi-Mode Bar Code Reading Subsystem 17.
  • the bar code reader continuously and sequentially illuminates a wide area of the target object within the field-of-view (FOV) of the bar code reader with both far-field and near-field wide-area illumination, captures a wide-area image thereof, and then processes the same using either the Manual, ROI-Specific, Automatic or Omniscan Modes of operation. If any cycle of programmed image processing results in the successful reading of a 1D or 2D bar code symbol (when the Manual, ROI-Specific and Automatic Modes are used), then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system (i.e. typically a test measurement system).
  • the system automatically enables successive cycles of wide-area illumination/wide-area image-capture/processing.
  • the default decode timeout is set to 500 ms which can be simply changed by programming. This default decode timeout setting ensures that while the object is being detected by the bar code reader, the bar code reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
  • Programmed Mode of System Operation No. 16 is a Diagnostic Mode.
  • An authorized user can send a special command to the bar code reader to launch a Command Line Interface (CLI) with the bar code reader.
  • when the bar code reader receives such a request from the user, it sends a prompt "MTLG>" back to the user as a handshaking indication that the scanner is ready to accept the user commands.
  • the user then can enter any valid command to the bar code reader and view the results of its execution.
  • the user can use any standard communication program, such as Windows HyperTerminal for example. This mode of operation can be used to test/debug the newly introduced features or view/change the bar code reader configuration parameters. It can also be used to download images and/or a backlog of the previously decoded bar code data from the reader memory to the host computer.
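The Diagnostic Mode handshake and command loop can be sketched as below. The "MTLG>" prompt comes from the text; the command set, the "exit" convention, and the I/O callbacks are purely illustrative assumptions:

```python
def cli_session(read_line, write, commands):
    """Minimal Command Line Interface loop: prompt with "MTLG>", execute
    valid commands, and return when the user enters "exit"."""
    while True:
        write("MTLG>")                       # handshaking prompt to the user
        line = read_line().strip()
        if line == "exit":
            return
        handler = commands.get(line)
        if handler is not None:
            write(handler() + "\n")          # show the result of execution
        elif line:
            write("unknown command\n")
```

In practice the `read_line`/`write` callbacks would be bound to the serial link that the terminal program (e.g. HyperTerminal) talks to.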
  • In Programmed Mode of System Operation No. 17, automatic IR-based object presence detection is enabled, and the CMOS imaging array is operated in its Video Mode as illustrated in Fig. 27E.
  • once a trigger signal is automatically generated in response to the automatic detection of an object in the field of view (FOV) of the system, frames of digital images are automatically captured by the CMOS imaging array and are processed by Subsystem 17 et al. in accordance with the principles of the present invention.
  • Such captured frames of digital video data can be transmitted to the host computer in real-time, along with the results of image-processing (i.e. symbol character data) by Subsystem 17.
  • This mode of system operation is well suited for use at point of sale (POS) applications, as shown in Figs. 55B and 55C, where bar coded objects are either presented to or passed by the bar code reading system.
  • Programmed Mode of System Operation No. 17 can be used in combination with any other supported imaging modes.
  • the Multi-mode Illumination Subsystem 14 had three primary modes of illumination: (1) narrow-area illumination mode; (2) near-field wide-area illumination mode; and (3) far-field wide-area illumination mode.
  • the Multi-Mode Illumination Subsystem 14 is modified to support four primary modes of illumination: (1) near-field narrow-area illumination mode; (2) far-field narrow-area illumination mode; (3) near-field wide-area illumination mode; and (4) far-field wide-area illumination mode.
  • these near-field and far-field narrow-area illumination modes of operation are conducted during the narrow-area image capture mode of the Multi-Mode Image Formation and Detection Subsystem 13, and are supported by a near-field narrow-area illumination array 27A and a far-field narrow-area illumination array 27B illustrated in Fig. 19, and as shown in Fig. 2A1.
  • each of these illumination arrays 27A, 27B is realized using at least a pair of LEDs, each having a cylindrical lens of appropriate focal length to focus the resulting narrow-area (i.e. linear) illumination beam into the near-field portion 24A and far-field portion 24B of the field of view of the system, respectively.
  • One of the advantages of using a pair of independent illumination arrays to produce narrow-area illumination fields over the near and far field portions of the FOV is that it is possible to more tightly control the production of a relatively "narrow" or "narrowly-tapered" narrow-area illumination field along its widthwise dimension. For example, as shown in Fig.
  • the near-field narrow-area illumination array 27A can be used to generate (over the near-field portion of the FOV) an illumination field 24A that is narrow along both its widthwise and height-wise dimensions, to enable the user to easily align the illumination field (beam) with a single bar code symbol to be read from a bar code menu of one type or another, thereby avoiding inadvertent reads of two or more bar code symbols or simply the wrong bar code symbol.
  • the far-field narrow-area illumination array 27B can be used to generate (over the far-field portion of the FOV) an illumination field 24B that is sufficiently wide along its widthwise dimension, to enable the user to easily read elongated bar code symbols in the far-field portion of the field of view of the bar code reader, by simply moving the object toward the far-field portion of the field of view.
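Under the arrangement described above, choosing between the near-field array 27A and the far-field array 27B reduces to a range comparison. The sketch below is purely illustrative: the boundary distance, the function name, and the use of an IR-based range reading as the input are assumptions, not details fixed by the specification.

```python
# Illustrative sketch: selecting which narrow-area LED illumination array
# to drive, based on the object range reported by the IR-based object
# presence and range detection subsystem. The boundary value between the
# near-field portion 24A and far-field portion 24B is a hypothetical number.

NEAR_FAR_BOUNDARY_MM = 175  # assumed boundary between FOV portions 24A and 24B

def select_narrow_area_array(object_range_mm):
    """Return the reference numeral of the illumination array to activate."""
    if object_range_mm <= NEAR_FAR_BOUNDARY_MM:
        return "27A"  # near-field array: narrow in width and height
    return "27B"      # far-field array: wider, for elongated symbols

# Example: a nearby bar code menu selects 27A; a distant symbol selects 27B.
print(select_narrow_area_array(100), select_narrow_area_array(400))
```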
  • the imaging-based bar code symbol reading device of the present invention can have virtually any type of form factor that would support the reading of bar code symbols in diverse application environments.
  • One alternative form factor for the bar code symbol reading device of the present invention is shown in Figs. 29A through 29C, wherein a portable digital imaging-based bar code symbol reading device of the present invention 1" is shown from various perspective views, while arranged in a Presentation Mode (i.e. configured in Programmed System Mode No. 12).
  • the digital imaging-based bar code symbol reading device of the present invention 1', 1" can also be realized in the form of a Digital Imaging-Based Bar Code Reading Engine 100 that can be readily integrated into various kinds of information collection and processing systems.
  • trigger switch 2C shown in Fig. 21 is symbolically represented on the housing of the engine design, and it is understood that this trigger switch 2C or functionally equivalent device will be typically integrated with the housing of the resultant system into which the engine is embedded so that the user can interact with and actuate the same.
  • Such Engines according to the present invention can be realized in various shapes and sizes and be embedded within various kinds of systems and devices requiring diverse image capture and processing functions as taught herein. Details regarding one illustrative embodiment of the Digital Imaging-Based Bar Code Reading Engine of the present invention are shown in Figs. 34 et seq., and will be described in detail hereinafter.
  • Figs. 22, 23, and 24 show a Wireless Bar Code-Driven Portable Data Terminal (PDT) System 140 according to the present invention which comprises: a Bar Code Driven PDT 150 embodying the Digital Imaging-Based Bar Code Symbol Reading Engine of the present invention 100, described herein; and a cradle-providing Base Station 155.
  • the Digital Imaging-Based Bar Code Symbol Reading Engine 100 can be used to read bar code symbols on packages and the symbol character data representative of the read bar code can be automatically transmitted to the cradle-providing Base Station 155 by way of an RF-enabled 2-way data communication link 170.
  • robust data entry and display capabilities are provided on the PDT 150 to support various information based transactions that can be carried out using System 140 in diverse retail, industrial, educational and other environments.
  • the Wireless Bar Code Driven Portable Data Terminal System 140 comprises: a hand-supportable housing 151; Digital Imaging-Based Bar Code Symbol Reading Engine 100 as shown in Fig. 21, and described hereinabove, mounted within the head portion of the hand-supportable housing 151; a user control console 151A; a high-resolution color LCD display panel 152 and drivers mounted below the user control console 151A and integrated with the hand-supportable housing, for displaying, in a real-time manner, captured images, data being entered into the system, and graphical user interfaces (GUIs) generated by the end-user application running on the virtual machine of the wireless PDT; and PDT computing subsystem 180 contained within the PDT housing, for carrying out system control operations according to the requirements of the end-user application to be implemented upon the hardware and software platforms of the wireless PDT 2B of this illustrative embodiment.
  • a design model for the Wireless Hand-Supportable Bar Code Driven Portable Data Terminal System 140 shown in Figs. 31 and 32, and its cradle-supporting Base Station 155 interfaced with possible host systems 173 and/or networks 174, comprises a number of subsystems integrated about a system bus, namely: a data transmission circuit 156 for realizing the PDT side of the electromagnetic-based wireless 2-way data communication link 170; program memory (e.g. DRAM) 158; non-volatile memory (e.g. SRAM); and a serial bus interconnecting these subsystems.
  • a battery power supply circuit 164 is provided for supplying regulated power supplies to the various subsystems, at particular voltages determined by the technology used to implement the PDT device.
  • the Base Station 155 also comprises a number of integrated subsystems, namely: a data receiver circuit 165 for realizing the base side of the electromagnetic-based wireless 2-way data communication link 170; a data transmission subsystem 171 including a communication control module; and a base station controller 172 (e.g. programmed microcontroller) for controlling the operations of the Base Station 155.
  • the data transmission subsystem 171 interfaces with the host system 173 or network 174 by way of the USB or RS232 communication interfaces, TCP/IP, AppleTalk or the like, well known in the art.
  • data transmission and reception circuits 156 and 165 realize the wireless electromagnetic 2-way digital data communication link 170 employed by the wireless PDT of the present invention.
  • Wireless Hand-Supportable Bar Code Driven Portable Data Terminal System 140, as well as the POS Digital Imaging-Based Bar Code Symbol Reader 1" shown in Figs. 20A through 20C, each have two primary modes of operation: (1) a hands-on mode of operation, in which the PDT 150 or POS Reader 1" is removed from its cradle and used as a bar code driven transaction terminal or simply a bar code symbol reader; and (2) a hands-free mode of operation, in which the PDT 150 or POS Reader 1" remains in its cradle-providing Base Station 155 and is used as a presentation-type bar code symbol reader, as required in most retail point-of-sale (POS) environments.
  • the trigger switch 2C employed in the digital imaging-based bar code symbol reading device of the present invention can be readily modified and augmented with a suitable stand-detection mechanism, which is designed to automatically configure and invoke the PDT 150 and its Engine 100 into its Presentation Mode (i.e. System Mode of Operation No. 12) or other suitable system mode when the PDT is placed in its Base Station 155, as shown in Fig. 24. Then, when the PDT 150 is picked up and removed from its cradle-providing Base Station 155, as shown in Figs. 22 and 23, the trigger switch 2C and stand-detection mechanism can be arranged so as to automatically configure and invoke the PDT 150 and its Engine 100 into a suitable hands-on mode of system operation.
  • the trigger switch 2C employed in the POS Digital Imaging Bar Code Symbol Reading Device 1" can be readily modified and augmented with a stand-detection mechanism designed to automatically configure and invoke the POS Reader 1" into its Presentation Mode (i.e. System Mode of Operation No. 12) or other suitable system mode when the Reader 1" is resting on a countertop surface, as shown in Figs. 20A and 20B. Then, when the POS Reader 1" is picked up off the countertop surface for use in its hands-on mode of operation, the trigger switch 2C and stand-detection mechanism will automatically configure and invoke Reader 1" into a suitable hands-on mode of system operation, as shown in Fig. 20C.
  • the stand-detection mechanism can employ a physical contact switch, or an IR object sensing switch, which is actuated when the device is picked up off the countertop surface. Such mechanisms will become apparent in view of the teachings disclosed herein.
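The stand-detection behavior described above amounts to a simple mode switch driven by the sensor state. The following sketch is a hypothetical illustration only; the mode-name strings and the function are not taken from the specification.

```python
# Hypothetical sketch of stand-detection logic: a contact or IR switch reports
# whether the device is resting in its base station (or on a countertop), and
# the system is configured into Presentation Mode (System Mode No. 12) or a
# hands-on triggered mode accordingly. Mode names are illustrative.

def configure_operating_mode(in_stand: bool) -> str:
    """Map the stand-detection sensor state to a system operating mode."""
    if in_stand:
        return "presentation_mode"    # i.e. System Mode of Operation No. 12
    return "hands_on_triggered_mode"  # e.g. a manually-triggered reading mode
```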
  • automatic illumination control is provided by precisely controlling the duration of LED illumination during exposure, thereby capturing well-illuminated images.
  • greater degrees of illumination control may be required and the method shown in Figs. 26 through 26B may be helpful.
  • an enhanced auto-illumination control scheme is embodied within the hand-held image-processing bar code reader of the present invention.
  • the illumination level of a captured image is first (i.e. initially) determined by measuring the actual light illumination level at a central portion of the image detection array, and then computing an appropriate illumination duration level based on this measurement. Then, after an image is captured using this initial illumination level, a software illumination metering program is used to analyze the spatial intensity distribution of the captured image and determine whether a new illumination duration should be calculated for use in subsequent image illumination and capture operations, to provide more finely-tuned images.
  • the program automatically (i) calculates a corrected illumination duration (count) for use by the Automatic Light Exposure Measurement and Illumination Control Subsystem, and (ii) provides the corrected illumination duration thereto. Then the Automatic Light Exposure Measurement and Illumination Control Subsystem uses this corrected illumination duration to control the illumination delivered to the field of view (FOV) during the next object illumination and image capturing operation supported by the system.
  • Fig. 26 schematically illustrates the hand-supportable digital imaging-based bar code symbol reading device of the present invention, wherein a Software- Based Illumination Metering Program is used to help the Automatic Light Exposure Measurement and Illumination Control Subsystem control the operation of the LED-Based Multi-Mode Illumination Subsystem.
  • Fig. 26A illustrates in greater detail this enhanced method of automatic illumination control, namely how the current illumination duration (determined by the Automatic Light Exposure Measurement and Illumination Control Subsystem) is automatically over-written by the illumination duration computed by a software-implemented, image-processing-based Illumination Metering Program carried out within the Image-Processing Based Bar Code Symbol Reading Subsystem.
  • This over-written illumination duration is then used by the Automatic Light Exposure Measurement and Illumination Control Subsystem to control the amount of LED illumination produced and delivered to the CMOS image detection array during the next image frame captured by the system, in accordance with this Enhanced Auto-Illumination Control Scheme of the present invention.
  • Fig. 26B is a flow chart setting forth the steps involved in carrying out the enhanced auto-illumination control scheme/method illustrated in Fig. 26A.
  • the first step of the method involves using the Automatic Light Exposure Measurement and Illumination Control Subsystem to automatically (i) measure the illumination level at a particular (e.g. central) portion of the field of view of the CMOS image sensing array and (ii) determine the illumination duration (i.e. time count) necessary to achieve a desired spatial intensity in the captured image.
  • the Automatic Light Exposure Measurement and Illumination Control Subsystem uses this computed/determined illumination duration to drive the LED-based illumination subsystem and capture a digital image of the object within the field of view of the Image Formation and Detection Subsystem.
  • the Image-Processing Bar Code Reading Subsystem (e.g. image processor) analyzes and measures in real-time the spatial intensity distribution of the captured image and determines whether or not a corrected illumination duration is required or desired when capturing the next or subsequent frames of image data, during the current or subsequent image capture cycle.
  • the Automatic Light Exposure Measurement and Illumination Control Subsystem uses the corrected illumination duration (computed by the software-based Illumination Metering Program) to drive the LED-based Illumination Subsystem and capture a subsequent digital image of the illuminated object within the field of view of the system.
  • the steps indicated at Blocks C through E can be repeated a number of times in a recursive manner, during each image capture cycle, to produce a digital image having an optimized spatial intensity level with excellent image contrast.
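The recursive metering scheme of Figs. 26A and 26B can be summarized in code. The sketch below is an illustrative model only: the hardware subsystems are replaced by stand-in functions, and the target intensity, the inverse light-level model, and the proportional correction rule are all assumptions rather than details taken from the specification.

```python
# Sketch of the Enhanced Auto-Illumination Control Scheme (Fig. 26B).
# All numeric models here (target intensity, inverse relation, proportional
# correction) are assumed for illustration; the real subsystems are hardware.

TARGET_MEAN_INTENSITY = 128  # desired mean pixel value (assumed target)

def initial_illumination_duration(center_light_level):
    """Step A: convert the light level measured at the central portion of
    the image detection array into an LED illumination duration (count)."""
    # Simple inverse relation: darker scene -> longer illumination (assumed)
    return max(1, int(5000 / max(center_light_level, 1)))

def metering_correction(duration, mean_intensity):
    """Steps C-D: the software Illumination Metering Program analyzes the
    captured image's intensity and computes a corrected duration (count)."""
    if mean_intensity == 0:
        return duration * 2
    return max(1, int(duration * TARGET_MEAN_INTENSITY / mean_intensity))

def capture_cycle(center_light_level, capture_fn, max_passes=3):
    """Steps B-E, repeated recursively until the exposure converges.
    capture_fn(duration) stands in for illuminate-and-capture, returning
    the mean intensity of the captured frame."""
    duration = initial_illumination_duration(center_light_level)
    image_mean = capture_fn(duration)
    for _ in range(max_passes):
        corrected = metering_correction(duration, image_mean)
        if corrected == duration:          # converged: exposure acceptable
            break
        duration = corrected               # over-write illumination duration
        image_mean = capture_fn(duration)  # capture next frame with new count
    return duration
```

With a toy sensor model in which mean intensity grows linearly with illumination duration (e.g. `lambda d: min(255, d // 4)`), the loop converges after a single correction pass.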
  • Figs. 27A and 27B illustrate how object illumination and image capturing operations are dynamically controlled within the multi-mode image-processing based bar code symbol reader system of the present invention, by analyzing the exposure quality of captured digital images and reconfiguring system control parameters based on the results of such exposure quality analysis.
  • Figs. 27C through 27E illustrate the three basic modes of operation of the CMOS image sensing array employed in the illustrative embodiment, (i.e. Single Frame Shutter Mode, Rolling Shutter Mode and Video Mode), which are dynamically and automatically controlled within the system in accordance with the adaptive system control method of the present invention.
  • shutter mode of the image sensing array (e.g. Single Frame Shutter Mode illustrated in Fig. 27C, and Rolling Shutter Mode illustrated in Fig. 27D);
  • illumination mode (e.g. off, continuous, and strobe/flash);
  • illumination field type (e.g. narrow-area near-field illumination, wide-area far-field illumination, narrow-area field of illumination, and wide-area field of illumination);
  • image capture mode (e.g. narrow-area image capture, and wide-area image capture);
  • image capture control (e.g. single frame, video frames);
  • STEP 2 Illuminate an object using the method of illumination indicated by the Illumination Mode parameter, and capture a digital image thereof.
  • STEP 3 Analyze the captured digital image for exposure quality.
  • exposure quality is a quantitative measure of the quality of the image brightness.
  • Setting system control parameters such as the type and the intensity of the object illumination, value of the image sensor gain, and the type and the value of the image sensor exposure parameters, will affect the image brightness.
  • the value of the exposure quality can be presented in the range from 0 to 100, with 0 being an extremely poor exposure that would generally be fruitless to process (in cases when the image is too dark or too bright), and 100 being an excellent exposure. It is almost always worthwhile to process an image when the value of the exposure quality is close to 100. Conversely, it is almost never worthwhile to process an image when the value of the exposure quality is as low as 0. As will be explained in greater detail below, for the latter case where the computed exposure quality is as low as 0, the system control parameters (SCPs) will need to be dynamically re-evaluated and set to the proper values in accordance with the principles of the present invention.
  • STEP 4 If the exposure quality measured in STEP 3 does not satisfy the Exposure Quality Threshold (EQT) parameters set in STEP 0, then calculate new SCPs for the system and set the SCPR flag to TRUE indicating that system must be reconfigured prior to acquiring a digital image during the next image acquisition cycle. Otherwise, maintain the current SCPs for the system.
  • STEP 5 If barcode decoding is required in the application at hand, then attempt to process the digital image and decode a barcode symbol represented therein.
  • STEP 6 If barcode decoding fails, or if barcode decoding was not required but the exposure quality did not satisfy the Exposure Quality Threshold parameters, go to STEP 1.
  • STEP 7 If barcode decoding succeeded, then transmit the results to the host system.
  • STEP 8 If necessary, transmit the digital image to the host system, or store the image in internal memory.
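The STEP 0 through STEP 8 control flow above can be sketched as a loop. Everything below is illustrative: the histogram-style exposure-quality proxy, the callback structure, and all names are assumptions; the specification fixes only the 0-100 quality convention and the SCPR-flag reconfiguration logic.

```python
# Sketch of the adaptive system control method of Figs. 27A/27B.
# Subsystem interactions are stand-in callables; the exposure-quality
# formula is an assumed proxy consistent with the 0-100 convention.

def exposure_quality(pixels):
    """Score brightness on a 0-100 scale: 0 for an image that is entirely
    too dark or too bright (clipped), approaching 100 for a well-spread
    exposure. (Illustrative metric; the text does not fix a formula.)"""
    n = len(pixels)
    usable = sum(1 for p in pixels if 10 <= p <= 245)  # non-clipped pixels
    return int(100 * usable / n) if n else 0

def control_loop(capture, decode, reconfigure, scp, eqt=50, max_cycles=4):
    """STEPs 0-8: capture, score exposure, flag reconfiguration, decode."""
    scpr = False                       # STEP 0: SCPR flag initially FALSE
    for _ in range(max_cycles):
        if scpr:                       # STEP 1: reconfigure only if flagged
            scp = reconfigure(scp)
            scpr = False
        image = capture(scp)           # STEP 2: illuminate and capture
        if exposure_quality(image) < eqt:
            scpr = True                # STEPs 3-4: poor exposure -> new SCPs
        result = decode(image)         # STEP 5: attempt bar code decoding
        if result is not None:
            return result              # STEPs 7-8: success, transmit results
        # STEP 6: decode failed (or exposure unsatisfactory) -> go to STEP 1
    return None
```

For example, a first flash-illuminated capture that saturates the sensor scores 0, triggers reconfiguration to ambient illumination, and the second capture decodes successfully.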
  • this system control process is intended for practice during any "system mode" of any digital image capture and processing system, including the bar code symbol reader of the illustrative embodiments, with its various modes of system operation described in Figs. 17A and 17B.
  • while this control method is generally described in Figs. 27A and 27B, it is understood that its principles can be used to modify particular system control processes that might be supported in any particular digital image capture and processing system.
  • the salient features of this adaptive control method involve using (i) automated real-time analysis of the exposure quality of captured digital images, and (ii) automated reconfiguring of system control parameters (particularly illumination and exposure control parameters) based on the results of such exposure quality analysis, so as to achieve improved system functionality and/or performance in diverse environments.
  • in response to a "trigger event" (automatically or manually generated), the system will be able to automatically generate (i) a narrow-area field of illumination during the narrow-area image capture mode of the system; and if the system fails to read a bar code symbol during this mode, then the system will automatically generate (ii) a wide-area field of illumination during its wide-area image capture mode.
  • the adaptive control method described in Figs. 27A and 27B will now be described as an illustrative embodiment of the control method. It is understood that there are many ways to practice this control method.
  • Case 1: System Operated in Programmed Mode of System Operation No. 8: Automatically-Triggered Multi-Attempt 1D/2D Single-Read Mode Employing the No-Finder and Manual and/or Automatic Modes of Operation
  • the system control parameters will be configured to implement the selected Programmed Mode of System Operation.
  • For System Mode No. 8, the SCPs would be initially configured as follows:
  • the shutter mode parameter will be set to the "single frame shutter mode" (illustrated in Fig. 27C, for implementing the Global Illumination/Exposure Method of the present invention described in Figs. 6D through 6E2);
  • the illumination mode parameter will be set to "flash/strobe"
  • the illumination field type will be set to "narrow-area field"
  • the image capture mode parameter will be set to "narrow-area image capture"
  • the image capture control parameter will be set to "single frame"
  • the image processing mode will be set, for example, to a default value
  • the automatic object detection mode will be set to ON. Also, the SCPR flag will be set to its FALSE value.
  • Upon the occurrence of a trigger signal from the system (e.g. generated by automatic object detection by the IR object presence and range detection subsystem in System Modes No. 8-10, or by manually pulling the activation switch in System Modes 11-12), the system will reconfigure itself only if the SCPR flag is TRUE; otherwise, the system will maintain its current SCPs.
  • During the first pass through STEP 1, the SCPR flag will be FALSE, and therefore the system will maintain its SCPs at their default settings.
  • the object will be illuminated within a narrow-field of LED-based illumination produced by the illumination subsystem, and a narrow-area digital image will be captured by the image formation and detection subsystem.
  • the narrow-area digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
  • the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next image acquisition cycle. Otherwise, the SCPs are maintained by the system.
  • the system attempts to read a 1D bar code symbol in the captured narrow-area image.
  • If the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image).
  • the system might reset the SCPs as follows:
  • the illumination field type will be set to "narrow-area field"
  • the image capture mode parameter will be set to "narrow-area image capture"
  • the image capture control parameter will be set to "single frame"
  • the system captures a second narrow-area image using ambient illumination, with the image sensing array configured in its rolling shutter mode (illustrated in Fig. 27D), and re-measures exposure quality; if the exposure quality does not satisfy the current Exposure Quality Threshold parameters, then the system calculates new SCPs (possibly including switching to the wide-area image capture mode) and sets the SCPR flag to TRUE. Otherwise, the system maintains the SCPs and proceeds to attempt to decode a bar code symbol in the narrow-area digital image captured using ambient illumination.
  • the object is illuminated with ambient illumination and captured at STEP 2, and at STEP 3, the captured image is analyzed for exposure quality, as described above.
  • the exposure quality measured in STEP 3 is compared with the Exposure Quality Threshold parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise, the system maintains its current SCPs.
  • bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and the system exits the control process at STEP 9. If bar code decoding fails, then the system returns to STEP 1 to repeat the STEPS within Blocks B1 and B2 of Figs. 27A and 27B.
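Case 1 traverses two SCP configurations: the initial flash/single-frame-shutter setup and the ambient/rolling-shutter fallback. These can be written out as data; the dictionary keys and values below merely paraphrase the text and are not identifiers from the specification.

```python
# Illustrative SCP dictionaries for Case 1 (System Mode No. 8). Key and value
# names paraphrase the text; they are not identifiers from the specification.

SCP_INITIAL = {
    "shutter_mode": "single_frame",      # Fig. 27C global illumination/exposure
    "illumination_mode": "flash_strobe",
    "illumination_field": "narrow_area",
    "image_capture_mode": "narrow_area",
    "image_capture_control": "single_frame",
    "auto_object_detection": True,
}

def reconfigure_after_poor_exposure(scp):
    """Fallback used when the flash-illuminated narrow-area image fails the
    exposure-quality test: switch to ambient light and the rolling shutter
    (Fig. 27D), leaving the narrow-area capture settings in place."""
    fallback = dict(scp)                 # copy so the defaults stay intact
    fallback["illumination_mode"] = "ambient"  # LED illumination off
    fallback["shutter_mode"] = "rolling"       # rolling shutter mode
    return fallback
```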
  • the system control parameters will be configured to implement the selected Programmed Mode of System Operation.
  • the SCPs would be initially configured as follows:
  • the shutter mode parameter will be set to the "Video Mode" (illustrated in Fig. 27E);
  • the illumination field type will be set to "wide-area field"
  • the image capture mode parameter will be set to "wide-area image capture"
  • the image processing mode will be set, for example, to a default value
  • the automatic object detection mode will be set to ON. Also, the SCPR flag will be set to its FALSE value.
  • Upon the occurrence of a trigger signal from the system (i.e. generated by automatic object detection by the IR object presence and range detection subsystem), the system will reconfigure itself only if the SCPR flag is TRUE; otherwise, the system will maintain its current SCPs. During the first pass through STEP 1, the SCPR flag will be FALSE, and therefore the system will maintain its SCPs at their default settings. Then, at STEP 2 in Fig. 27A, the object will be continuously illuminated within a wide field of LED-based illumination produced by the illumination subsystem, and a wide-area digital image will be captured by the image formation and detection subsystem while the CMOS image sensing array is operated in its Video Mode of operation.
  • the wide-area digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
  • the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next image acquisition cycle while the CMOS sensing array is operated in its Video Mode. Otherwise, the SCPs are maintained by the system.
  • the system attempts to read a 1D bar code symbol in the captured wide-area digital image.
  • If the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image). In the case of reconfiguration, the system might reset the SCPs as follows:
  • the illumination field type will be set to "wide-area field"
  • the image capture mode parameter will be set to "wide-area image capture"
  • the system captures a second wide-area image using continuous LED illumination, with the image sensing array configured in its Video Mode (illustrated in Fig. 27E), and re-measures exposure quality; if the exposure quality does not satisfy the current Exposure Quality Threshold parameters, then the system calculates new SCPs and sets the SCPR flag to TRUE. Otherwise, the system maintains the SCPs and proceeds to attempt to decode a bar code symbol in the wide-area digital image captured using continuous LED illumination.
  • the object is illuminated with continuous LED illumination and captured at STEP 2, and at STEP 3, the captured image is analyzed for exposure quality, as described above.
  • the exposure quality measured in STEP 3 is compared with the Exposure Quality Threshold parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise, the system maintains its current SCPs.
  • bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and the system exits the control process at STEP 9. If bar code decoding fails, then the system returns to STEP 1 to repeat the STEPS within Blocks B1 and B2 of Figs. 27A and 27B.
  • the adaptive control method of the present invention described above can be applied to any of the System Modes of Operation specified in Figs. 17A and 17B, as well as to any system modes not specified herein.
  • the particular SCPs that will be set in a given system will depend on the structure of and functionalities supported by the system.
  • the subsystems within the system may have single or multiple modes of sub-operation, depending on the nature of the system design.
  • each system will involve using (i) automated real-time analysis of the exposure quality of captured digital images and (ii) automated reconfiguring of system control parameters (particularly illumination and exposure control parameters) based on the results of such exposure quality analysis, so as to achieve improved system functionality and/or performance in diverse environments.
  • the hand-held image-processing bar code symbol readers described hereinabove employ a narrow-area illumination beam which provides the user with a visual indication of the vicinity of the narrow-area field of view of the system.
  • Fig. 28 shows a hand-supportable image-processing based bar code symbol reader of the present invention 1' employing an image cropping zone (ICZ) framing pattern, and an automatic post-image capture cropping method involving the projection of the ICZ within the field of view (FOV) of the reader and onto a targeted object to be imaged during object illumination and imaging operations.
  • this hand-supportable image-processing based bar code symbol reader 1' is similar to the designs described above in Figs. 1B through 14, except that it includes one or more image cropping zone (ICZ) illumination framing source(s) operated under the control of the System Control Subsystem.
  • these ICZ framing sources are realized using four relatively bright LEDs indicating the corners of the ICZ in the FOV, which will be cropped during post-image capture operations.
  • the ICZ framing source could be a VLD producing a visible laser beam transmitted through a light diffractive element (e.g. volume transmission hologram) to produce four beamlets indicating the corners of the ICZ, or bright lines that appear in the captured image.
  • the ICZ frame created by such corner points or border lines (formed thereby) can be located using edge-tracing algorithms, and then the corners of the ROI can be identified from the traced border lines.
  • the first step of the method involves projecting an ICZ framing pattern within the FOV of the system during wide-area illumination and image capturing operations.
  • the second step of the method involves the user visually aligning the object to be imaged within the ICZ framing pattern (however it might be realized).
  • the third step of the method involves the Image Formation and Detection Subsystem and the Image Capture and Buffering Subsystem forming and capturing the wide-area image of the entire FOV of the system, which embraces (i.e. spatially encompasses) the ICZ framing pattern aligned about the object to be imaged.
  • the fourth step of the method involves using an automatic software-based image cropping algorithm, implemented within the Image-Processing Bar Code Reading Subsystem, to automatically crop the pixels within the spatial boundaries defined by the ICZ, from those pixels contained in the entire wide-area image frame captured at Block B. Due to the fact that image distortion may exist in the captured image of the ICZ framing pattern, the cropped rectangular image may partially contain the ICZ framing pattern itself and some neighboring pixels that may fall outside the ICZ framing pattern.
  • the fifth step of the method involves the Image-Processing Bar Code Reading Subsystem automatically decode-processing the image represented by the cropped image pixels in the ICZ so as to read a 1D or 2D bar code symbol graphically represented therein.
  • the sixth step of the method involves the Image-Processing Bar Code Reading Subsystem outputting (to the host system) the symbol character data representative of the decoded bar code symbol.
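The cropping operation at the fourth step can be sketched as follows, assuming the four ICZ corner points have already been located (e.g. by the edge-tracing approach mentioned earlier). The padding parameter reflects the observation that the cropped image may include part of the framing pattern and some neighboring pixels; all names are illustrative, not from the specification.

```python
# Illustrative post-capture ICZ cropping: given the four located corner
# points of the ICZ framing pattern, crop their padded bounding box from the
# full wide-area frame. A real implementation would additionally handle the
# perspective distortion discussed in the text.

def crop_icz(frame, corners, pad=4):
    """frame: 2-D list of pixel rows; corners: four (x, y) corner points.
    Returns the axis-aligned bounding box of the corners, expanded by pad
    pixels and clamped to the frame boundaries."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    h, w = len(frame), len(frame[0])
    x0, x1 = max(0, min(xs) - pad), min(w, max(xs) + pad + 1)
    y0, y1 = max(0, min(ys) - pad), min(h, max(ys) + pad + 1)
    return [row[x0:x1] for row in frame[y0:y1]]

# Example: crop a 100 x 80 frame around four detected corner points.
frame = [[0] * 100 for _ in range(80)]
roi = crop_icz(frame, [(20, 10), (70, 12), (22, 60), (68, 58)])
```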
  • the ICZ framing pattern (however realized) does not have to coincide with the field of view of the Image Formation And Detection Subsystem.
  • the ICZ framing pattern also does not have to have parallel optical axes.
  • the only basic requirement of this method is that the ICZ framing pattern fall within the field of view (FOV) of the Image Formation And Detection Subsystem, along the working distance of the system.
  • the imager can provide a visual or audio feedback to the user so that he may repeat the image acquisition process at a more appropriate distance.
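The automatic image cropping step of the method above (Block D) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; the function name, the margin parameter, and the assumption that the framing-pattern dot centers have already been located are all assumptions:

```python
import numpy as np

def crop_icz_region(frame, dot_coords, margin=8):
    """Crop the pixels bounded by the located ICZ framing-pattern dots.

    frame      : 2-D numpy array (grayscale wide-area image)
    dot_coords : list of (row, col) centers of the located framing dots
    margin     : extra pixels kept around the pattern, since the captured
                 ICZ pattern may be distorted and the crop may include
                 some neighboring pixels outside it
    """
    rows = [r for r, _ in dot_coords]
    cols = [c for _, c in dot_coords]
    r0 = max(min(rows) - margin, 0)
    r1 = min(max(rows) + margin + 1, frame.shape[0])
    c0 = max(min(cols) - margin, 0)
    c1 = min(max(cols) + margin + 1, frame.shape[1])
    return frame[r0:r1, c0:c1]
```

Keeping a small margin reflects the note above that the cropped rectangle may partially include the framing pattern itself and some neighboring pixels outside it.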
  • the hand-supportable image-processing based bar code symbol reader 1" is provided with the capacity to generate and project a visible illumination-based Image Cropping Pattern (ICP) 200 within the field of view (FOV) of the reader.
  • the operator will align the visibly projected ICP onto the object (or graphical indicia) to be imaged so that the graphical indicia generally falls within, or is framed by the outer boundaries covered by the ICP.
  • the object to be imaged may be perfectly planar in geometry, or it may have a particular degree of surface curvature.
  • the angle of the object surface may also be inclined with respect to the bar code symbol reader, which may produce "keystone" type effects during the projection process.
  • the operator will then proceed to use the reader to illuminate the object using its multi-mode illumination subsystem 14, and capture an image of the graphical indicia and the ICP aligned therewith using the multi-mode image formation and detection subsystem 13.
  • After the image has been captured and buffered within the image capturing and buffering subsystem 16, it is then transferred to the ICP locating/finding module 201 for image processing that locates the features and elements of the ICP and determines therefrom an image region (containing the graphical indicia) to be cropped for subsequent processing.
  • the coordinate/pixel locations of the ICP elements relative to each other in the captured image are then analyzed using computational analysis to determine whether or not the captured image has been distorted due to rotation or tilting of the object relative to the bar code reader during image capture operations. If this condition is indicated, then the cropped image will be transferred to the image perspective correction and scaling module 202 for several stages of image processing.
  • the first stage of image processing will typically involve correction of image "perspective", which is where the cropped image requires processing to correct for perspective distortion caused by rotation or tilting of the object during imaging. Perspective distortion is also known as the keystone effect.
  • the perspective/tilt corrected image is then cropped. Thereafter, the cropped digital image is processed to scale (i.e. magnify or minify) the same.
  • the digital image-processing based bar code symbol reader 1" shown in Fig. 31 is very similar to the system 1 shown in Figs. 1B through 14, with the exception of a few additional subcomponents indicated below.
  • the digital imaging-based bar code symbol reading device depicted in Fig. 31 comprises the following system components: a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem 13 having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array 22 for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled; a Multi-Mode LED-Based Illumination Subsystem 14 for producing narrow and wide area fields of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem 13 during narrow and wide area modes of image capture, respectively, so that only light transmitted from the Multi-Mode Illumination Subsystem 14 and reflected from the illuminated object and transmitted
  • an Image Cropping Pattern Generator 203 for generating a visible illumination-based Image Cropping Pattern (ICP) 200 projected within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem 13; an IR-based object presence and range detection subsystem 12 for producing an IR- based object detection field within the FOV of the Image Formation and Detection Subsystem 13: an Automatic Light Exposure Measurement and Illumination Control Subsystem 15 for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem 14; an Image Capturing and Buffering Subsystem for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem 13; an Image Processing and Cropped Image Locating Module 201 for processing captured and buffered images to locate the image region corresponding to the
  • In Figs. 33A through 34D5, several refractive-based designs are disclosed for generating an image cropping pattern (ICP) 200, ranging from a simple two-dot pattern to a more complex four-dot pattern. While the four-dot ICP is a preferred pattern, in some applications the two-dot pattern may be suitable for the requirements at hand, where 1D bar code symbols are primarily employed. Also, as shown in Fig. 35, light diffractive technology (e.g. volume holograms, computer generated holograms (CGHs), etc.) can be used in conjunction with a VLD and a light focusing lens to generate an image cropping pattern (ICP) having diverse characteristics. It is appropriate at this juncture to describe these various embodiments of the Image Cropping Pattern Generator of the present invention.
  • a first illustrative embodiment of the VLD-based Image Cropping Pattern Generator 203A comprising: a VLD 205 located at the symmetrical center of the focal plane of a pair of flat-convex lenses 206A and 206B arranged before the VLD 205, and capable of generating and projecting a two (2) dot image cropping pattern (ICP) 200 within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem 13.
  • the pair of flat-convex lenses 206A and 206B focus naturally diverging light rays from the VLD 205 into two substantially parallel beams of laser illumination which produce a two (2) dot image cropping pattern (ICP) 200 within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem.
  • Figs. 33D1 through 33D5 are simulated images of the two dot Image Cropping Pattern produced by the ICP Generator 203A of Fig. 33A, at distances of 40mm, 80mm, 120mm, 160mm and 200mm, respectively, from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem.
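Since the two beams leaving the flat-convex lens pair are only substantially (not perfectly) parallel, the projected dot spacing stays nearly constant over the working distance. A small numeric sketch of this point, with an assumed beam separation and residual divergence half-angle (neither value is from the patent):

```python
import math

# Spacing between the two projected dots at working distance d (mm),
# for beams separated by s0 at the lens pair and each diverging by a
# small residual half-angle theta.  The values below are illustrative
# assumptions, not figures from the patent.
def dot_spacing(d_mm, s0_mm=10.0, theta_deg=0.1):
    return s0_mm + 2.0 * d_mm * math.tan(math.radians(theta_deg))

# The distances simulated in Figs. 33D1 through 33D5:
for d in (40, 80, 120, 160, 200):
    print(d, round(dot_spacing(d), 3))
```

Over the 40mm to 200mm range the spacing changes by well under a millimeter, which is why the dots frame a nearly distance-independent cropping zone.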
  • a second illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention 203B comprising: a VLD 206 located at the focus of a biconical lens 207 (having a biconical surface and a cylindrical surface) arranged before the VLD 206, and four flat-convex lenses 208A, 208B, 208C and 208D arranged in four corners.
  • This optical assembly is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem.
  • Figs. 34B and 34C show a composite ray-tracing diagram for the second illustrative embodiment of the VLD-based Image Cropping Pattern Generator depicted in Fig. 34A.
  • the biconical lens 207 enlarges naturally diverging light rays from the VLD 206 in the cylindrical direction (but not the other) and thereafter, the four flat-convex lenses 208A through 208D focus the enlarged laser light beam to generate four parallel beams of laser illumination which form a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem.
  • Figs. 34D1 through 34D5 are simulated images of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at distances of 40mm, 80mm, 120mm, 160mm and 200mm, respectively, from its flat-convex lens, within the field of view of the Multi-Mode Image Formation and Detection Subsystem 13.
  • a third illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention 203C comprising: a VLD 210, focusing optics 211, and a light diffractive optical element (DOE) 212 (e.g. volume holographic optical element) forming an ultra-compact optical assembly.
  • This optical assembly is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, similar to that generated using the refractive optics based device shown in Fig. 34A.
  • the bar code symbol reader during wide-area imaging operations, projects an illumination-based Image Cropping Pattern (ICP) 200 within the field of view (FOV) of the system, as schematically illustrated in Fig. 36.
  • the operator aligns an object to be imaged within the projected Image Cropping Pattern (ICP) of the system.
  • the bar code symbol reader captures a wide-area digital image of the entire FOV of the system.
  • the bar code symbol reader uses module 201 to process the captured digital image and locate/find features and elements (e.g. illumination spots) associated with the Image Cropping Pattern 200 within the captured digital image.
  • the clusters of pixels indicated by reference characters (a,b,c,d) represent the four illumination spots (i.e. dots) associated with the Image Cropping Pattern (ICP) projected in the FOV.
  • the coordinates associated with such features and elements of the ICP would be located/found using module 201 during this step of the image processing method of the present invention.
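The locating step performed by module 201 can be sketched as follows. This is a simplified stand-in, not the patented algorithm: it assumes a grayscale image with exactly one bright ICP spot per image quadrant, and the threshold value is an assumption:

```python
import numpy as np

def locate_icp_dots(image, thresh=200):
    """Locate the four ICP illumination spots (a, b, c, d).

    The image is thresholded to isolate the bright laser spots, then
    the centroid of the bright pixels in each image quadrant is taken
    as one dot center.  Assumes one spot per quadrant.
    """
    h, w = image.shape
    quadrants = [(slice(0, h // 2), slice(0, w // 2)),    # a: top-left
                 (slice(0, h // 2), slice(w // 2, w)),    # b: top-right
                 (slice(h // 2, h), slice(0, w // 2)),    # c: bottom-left
                 (slice(h // 2, h), slice(w // 2, w))]    # d: bottom-right
    centers = []
    for rs, cs in quadrants:
        ys, xs = np.nonzero(image[rs, cs] > thresh)
        # Centroid in quadrant coordinates, offset back to full-image coordinates.
        centers.append((ys.mean() + rs.start, xs.mean() + cs.start))
    return centers
```

The four returned (row, col) centers correspond to the clusters of pixels labeled (a, b, c, d) in the description above.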
  • the bar code symbol reader uses module 201 to analyze the coordinates of the located image features (a,b,c,d) and determine the geometrical relationships among certain of such features (e.g. if the vertices of the ICP have been distorted during projection and imaging due to tilt angles, rotation of the object, etc), and reconstruct an undistorted image cropping pattern (ICP) independent of the object tilt angle (or perspective) computed therefrom.
  • Module 201 supports real-time computational analysis to analyze the coordinates of the pixel locations of the ICP elements relative to each other in the captured image, and determine whether or not the captured image has been distorted due to rotation or tilting of the object relative to the bar code reader during image capture operations.
  • the digital image will be transferred to the image perspective correction and scaling module 202 for several stages of image processing.
  • the first stage of image processing performed by module 202 will typically involve correction of image "perspective", which is where the cropped image requires processing to correct for perspective distortion caused by rotation or tilting of the object during imaging.
  • Perspective distortion is also known as the keystone effect.
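The perspective (keystone) correction stage can be sketched with a standard four-point homography computed from the located ICP dot coordinates. This is a textbook direct linear transform (DLT) formulation, offered as an illustration of the kind of processing module 202 performs rather than the patented method; the function names are assumptions:

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H with H @ [x, y, 1] ~ the dst point,
    from four (x, y) correspondences (standard DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The null vector of A (last row of V^T from the SVD) gives H up to scale.
    _, _, vt = np.linalg.svd(np.asarray(A, float))
    return vt[-1].reshape(3, 3)

def correct_keystone(image, corners, out_w, out_h):
    """Warp the quadrilateral `corners` (tl, tr, br, bl, each (x, y)) into
    an upright out_w x out_h rectangle, by inverse mapping each output
    pixel back into the source with nearest-neighbor sampling."""
    dst = [(0, 0), (out_w - 1, 0), (out_w - 1, out_h - 1), (0, out_h - 1)]
    Hinv = homography(dst, corners)          # output pixel -> source pixel
    out = np.zeros((out_h, out_w), image.dtype)
    for v in range(out_h):
        for u in range(out_w):
            x, y, w = Hinv @ (u, v, 1.0)
            xi, yi = int(round(x / w)), int(round(y / w))
            if 0 <= yi < image.shape[0] and 0 <= xi < image.shape[1]:
                out[v, u] = image[yi, xi]
    return out
```

`correct_keystone` maps the quadrilateral spanned by the four located dots back to an upright rectangle, undoing the rotation- and tilt-induced distortion described above.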
  • the bar code symbol reader uses module 202 to crop a set of pixels, corresponding to the ICP projected in the FOV of the system, from the corrected digital image.
  • the bar code symbol reader uses module 202 to carry out a digital zoom algorithm to process the cropped and perspective-corrected ICP region and produce a scaled digital image having a predetermined pixel size independent of object distance.
  • This step involves processing the cropped perspective-corrected image so as to scale (i.e. magnify or minify) the same so that it has a predetermined pixel size (e.g. NxM) optimized for image processing by the image processing based bar code symbol reading module 17.
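The digital zoom step above can be sketched as a resampling to the predetermined pixel size. This is a minimal nearest-neighbor stand-in for the scaling stage of module 202 (the function name and sampling scheme are assumptions):

```python
import numpy as np

def digital_zoom(image, n_rows, m_cols):
    """Scale (i.e. magnify or minify) a cropped image to a fixed
    n_rows x m_cols size, independent of object distance, using
    nearest-neighbor index sampling."""
    h, w = image.shape
    r_idx = (np.arange(n_rows) * h / n_rows).astype(int)
    c_idx = (np.arange(m_cols) * w / m_cols).astype(int)
    return image[np.ix_(r_idx, c_idx)]
```

Because the output size is fixed at N x M, the decode module downstream always receives an image of the same pixel dimensions regardless of how far the object was from the reader.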
  • the bar code symbol reader transmits the scaled perspective-corrected digital image to the decode processing module 17 (and optionally, a visual display).
  • the bar code symbol reader decode-processes the scaled digital image so as to read 1D or 2D bar code symbols represented therein and generate symbol character data representative of a decoded bar code symbol.
  • the input/output subsystem 18 of the bar code symbol reader outputs the generated symbol character data to a host system.
  • a PLIIM-based object identification and attribute acquisition system of the present invention having a housing of unitary design.
  • the housing 1540 has the same light transmission apertures of the housing design shown in Figs. 12A and 12B of WIPO Publication No. WO 02/43195, incorporated herein by reference in its entirety, but has no housing panels disposed about the light transmission apertures 1541A, 1541B and 1542, through which planar laser illumination beams (PLIBs) and the field of view (FOV) of the PLIIM-based subsystem extend, respectively.
  • This feature of the present invention provides a region of space (i.e.
  • Light transmission aperture 1543 enables the AM laser beams 1167A/1167B from the LDIP subsystem 1122 to project out from the housing.
  • a pair of PLIIM-based package identification (PID) systems 25' of Figs. 3E4 through 3E8 of WIPO Publication No. WO 02/43195 are modified and arranged within a compact POS housing 1581 having bottom and side light transmission apertures 1582 and 1583 (beneath bottom and side imaging windows 1584 and 1585, respectively), to produce a bioptical PLIIM-based product identification, dimensioning and analysis (PIDA) system 1580 according to a first illustrative embodiment of the present invention.
  • the bioptical PIDA system 1580 comprises: a bottom PLIIM-based unit 1586A mounted within the bottom portion of the housing 1581; a side PLIIM-based unit 1586B mounted within the side portion of the housing 1581; an electronic product weigh scale 1587, mounted beneath the bottom PLIIM-based unit 1586A, in a conventional manner; and a local data communication network 1588, mounted within the housing, and establishing a high-speed data communication link between the bottom and side units 1586A and 1586B, the electronic weigh scale 1587, and a host computer system (e.g. cash register) 1589.
  • each PLIIM-based subsystem 25' employs (i) a plurality of visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB) from the bottom and side light transmission apertures 1582 and 1583, and also (ii) a 1-D (linear-type) CCD image detection array for capturing color images of objects (e.g. produce) as the objects are manually transported past the imaging windows 1584 and 1585 of the bioptical system, along the direction of the indicator arrow, by the user or operator of the system (e.g. retail sales clerk).
  • a pair of PLIIM-based package identification (PID) systems 25" of Figs. 6D1 through 6E3 in WIPO Publication No. 02/43195, supra, are modified and arranged within a compact POS housing 1601 having bottom and side light transmission windows 1602 and 1603 (beneath bottom and side imaging windows 1604 and 1605, respectively), to produce a bioptical PLIIM-based product identification, dimensioning and analysis (PIDA) system 1600 according to a second illustrative embodiment of the present invention.
  • the bioptical PIDA system 1600 comprises: a bottom PLIIM-based unit 1606A mounted within the bottom portion of the housing 1601; a side PLIIM-based unit 1606B mounted within the side portion of the housing 1601 ; an electronic product weigh scale 1589, mounted beneath the bottom PLIIM-based unit 1606A, in a conventional manner; and a local data communication network 1588, mounted within the housing, and establishing a high-speed data communication link between the bottom and side units 1606A and 1606B, and the electronic weigh scale 1589.
  • each PLIIM-based subsystem 25" employs (i) a plurality of visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB) from the bottom and side imaging windows 1604 and 1605, and also (ii) a 2-D (area-type) CCD image detection array for capturing color images of objects (e.g. produce) as the objects are presented to the imaging windows of the bioptical system by the user or operator of the system (e.g. retail sales clerk).
  • the PLIIM-based imager 1200 comprises: a hand-supportable housing 1201; a PLIIM-based image capture and processing engine 1202 contained therein, for projecting a planar laser illumination beam (PLIB) 1203 through its imaging window 1204 in coplanar relationship with the field of view (FOV) 1205 of the linear image detection array 1206 employed in the engine; a LCD display panel 1207 mounted on the upper top surface 1208 of the housing in an integrated manner, for displaying, in a real-time manner, captured images, data being entered into the system, and graphical user interfaces (GUIs) required in the support of various types of information-based transactions; a data entry keypad 1209 mounted on the middle top surface of the housing 1210 for enabling the user to manually enter data into the imager required during the course of such information-based transactions; and an embedded-type computer and
  • A First Illustrative Embodiment Of The Transportable PLIIM-Based 3-D Digitization Device ("3-D Digitizer") Of The Present Invention
  • a first illustrative embodiment of the transportable PLIIM-based 3-D digitization device ("3-D digitizer") 2830 of the present invention comprising: a transportable housing 2831 of lightweight construction, having a handle 2832 on its top portion for transporting the device from one location to another, and four rubber feet 2834 on its base portion for supporting the device on any stable surface, indoors and outdoors alike; a PLIIM-based imaging and profiling subsystem 120 as described above, contained within the transportable housing 2831, and including a PLIIM-based camera subsystem 25' and a LDIP subsystem 122, both described in detail in WIPO Publication No. 02/43195, supra.
  • A Second Illustrative Embodiment Of The Transportable PLIIM-Based 3-D Digitization Device ("3-D Digitizer") Of The Present Invention
  • a second illustrative embodiment of the transportable PLIIM-based 3-D digitization device ("3-D digitizer") of the present invention 2850 comprising: a transportable housing 2851 of lightweight construction, having a handle 2852 on its top portion for transporting the device from one location to another, and four rubber feet 2853 on its base portion for supporting the device on any stable surface, indoors and outdoors alike; a PLIIM-based imaging and profiling subsystem 2855, contained within the transportable housing, and including a PLIIM-based camera subsystem 25" with a 2-D area CCD image detection array as shown in Figs. 6D1 through 6D5 and described above, and a LDIP subsystem 122 as described in detail in WIPO Publication No. 02/43195, supra.
  • a "vertical-type" 3-D PLIIM-based CAT scanning system of the present invention 2800 comprising: a support base 2801 for supporting a human or animal subject during imaging operations; a pair of vertically extending rail structures 2802A and 2802B supported from the support base 2801; a motorized carriage 2803 supported on and adapted to travel along the length of each rail structure 2802A and 2802B at a programmably controlled velocity; a PLIIM-based imaging and profiling subsystem 120 mounted to each motorized carriage 2803 for producing a pair of amplitude modulated (AM) laser scanning beams 2804 and a single planar laser illumination beam (PLIB) 2805, wherein the sets of PLIBs are orthogonal to each other; and a computer workstation 2806 with LCD monitor 2807, operably connected to each PLIIM-based imaging and profiling subsystem 120, for collecting and storing both linear image slices and 3-D range data profiles of the subject generated during scanning operations, so that the workstation can reconstruct
  • a hand-supportable mobile-type PLIIM-based 3-D digitization device 2810 of the present invention comprising: a hand-supportable housing 2811 having a handle structure 2812; a PLIIM-based camera subsystem 25' (or 25) mounted in the hand-supportable housing; a miniature version of LDIP subsystem 122 mounted in the hand-supportable housing 2811; a set of optically isolated light transmission apertures 2813A and 2813B for transmission of the PLIBs from the PLIIM-based camera subsystem mounted therein, and a light transmission aperture 2814 for transmission of the FOV of the PLIIM-based camera subsystem, during object imaging operations; a light transmission aperture 2815, optically isolated from light transmission apertures 2813A, 2813B and 2814, for transmission of the AM laser beam transmitted from the LDIP subsystem 1122 during object profiling operations; a LCD view finder 2816 integrated with the housing, for displaying 3-D digital data models and 3-
  • the mobile laser scanning 3-D digitization device 2810 of Fig. 33E1 also has an Ethernet data communications port 2817 for communicating information files with other computing machines on a LAN to which the mobile device is connected, as described in detail in WIPO Publication No. 02/43195, supra.
  • Referring to Figs. 40 through 54, it is appropriate at this juncture to describe the digital image capture and processing engine of the present invention 220 employing light-pipe technology 221 for collecting and conducting LED-based illumination in the automatic light exposure measurement and illumination control subsystem 15 during object illumination and image capture modes of operation.
  • the digital image capture and processing engine 220 is shown generating and projecting a visible illumination-based Image Cropping Pattern (ICP) 200 within the field of view (FOV) of the engine, during object illumination and image capture operations, as described in connection with Figs. 31 through 37B.
  • the digital image capture and processing engine 220 will be embedded or integrated within a host system 222 which uses the digital output generated from the digital image capture and processing engine 220.
  • the host system 222 can be any system that requires the kind of information that the digital image capture and processing engine 220 can capture and process.
  • the digital image capture and processing engine 220 depicted in Fig. 40 comprises: an assembly of an illumination/targeting optics panel 223; an illumination board 224; a lens barrel assembly 225; a camera housing 226; a camera board 227; and an image processing board 230.
  • these components are assembled into an ultra-compact form factor offering advantages of light-weight construction, excellent thermal management, and exceptional image capture and processing performance.
  • camera housing 226 has a pair of integrated engine mounting projections 226A and 226B, each provided with a hole through which a mounting screw can be passed to fix the engine relative to an optical bench or other support structure within the housing of the host system or device.
  • the digital image capture and processing engine 220 shown in Fig. 46 reveals the integration of a linear optical waveguide (i.e. light conductive pipe) component 221 within the engine housing.
  • optical waveguide 221 is made from a plastic material having high light transmission characteristics, and low energy absorption characteristics over the optical band of the engine (which is tuned to the spectral characteristics of the LED illumination arrays and band-pass filter employed in the engine design).
  • the function of optical waveguide 221 is to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem 13, and direct it to the photo-detector 228 mounted on the camera board 227, and associated with the Automatic Light Exposure Measurement and Illumination Control Subsystem 15.
  • the optical waveguide 221 replaces the parabolic light collecting mirror 55 which is employed in the system design shown in Fig. 6A.
  • Use of the optical waveguide 221 in subsystem 15 offers the advantage of ultra-small size and tight integration within the miniature housing of the digital image capture and processing engine.
  • the optical waveguide 221 aligns with the photodiode 228 on the camera board which supports subsystem 15, specified in great detail in Figs. 6B through 6C2.
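The Automatic Light Exposure Measurement and Illumination Control Subsystem 15 uses the light level collected by the waveguide and measured at photodiode 228 to regulate exposure and illumination. One step of such a feedback loop might look like the following sketch; the target window, tolerance, and exposure bounds are illustrative assumptions, not values from the patent:

```python
def adjust_exposure(exposure_ms, mean_level, target=110, tol=25,
                    min_ms=0.1, max_ms=20.0):
    """One step of a simple exposure feedback loop: if the mean measured
    light level misses the target window, scale the next exposure time
    proportionally, clipped to the subsystem's operating bounds."""
    if abs(mean_level - target) <= tol or mean_level <= 0:
        return exposure_ms                    # exposure acceptable as-is
    scaled = exposure_ms * target / mean_level
    return max(min_ms, min(max_ms, scaled))
```

For example, a frame measured at twice the target brightness would have its next exposure time halved, while a very dark frame drives the exposure toward the upper bound.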
  • In Fig. 50, an exploded, perspective view of the digital image capture and processing engine 220 is provided to show how the illumination/targeting optics panel 223, the illumination board 224, the lens barrel assembly 225, the camera housing 226, the camera board 227, and its assembly pins 231A through 231D are easily arranged and assembled with respect to each other in accordance with the principles of the present invention.
  • the illumination board 224 of the illustrative embodiment supports four (4) LEDs 238A through 238D, along with driver circuitry, as generally taught in Figs. 6C1 and 6C2.
  • illumination/targeting optics panel 223 supports light focusing lenses 239A through 239D, for the LEDs in the illumination array supported on the illumination board 224.
  • Optical principles and techniques for specifying lenses 239A through 239D are taught in Figs. 4B through 4D7, and corresponding disclosure here.
  • While a wide-area near/far field LED illumination array is shown used in the digital image capture and processing engine of the illustrative embodiment 220, it is understood that the illumination array can be readily modified to support separate wide-area near field illumination and wide-area far field illumination, as well as narrow-area far and near fields of illumination, as taught in great detail herein with respect to systems disclosed in Figs. 1 through 39C2.
  • the illumination/targeting optics panel 223, the illumination board 224 and the camera board 230 of digital image capture and processing engine 220 are shown assembled with the lens barrel assembly 225 and the camera housing 226 removed for clarity of illustration.
  • the illumination/targeting optics panel 223 and the illumination board 224 are shown assembled together as a subassembly 232 using the assembly pins.
  • the subassembly 232 of Fig. 52 is arranged in relation to the lens barrel assembly 225, the camera housing 226, the camera board 227 and the image processing board 230, showing how these system components are assembled together to produce the digital image capture and processing engine 220 of Fig. 40.
  • the digital image capture and processing engine 220 illustrated in Figs. 40 through 53 is shown comprising: a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem 13 having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array 22 for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled; a LED-Based Illumination Subsystem 14 for producing a wide area field of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem 13 during the image capture mode, so that only light transmitted from the LED-Based Illumination Subsystem 14 and reflected from the illuminated object and transmitted through a narrow-band transmission-type optical filter realized
  • an Image Cropping Pattern Generator 203 for generating a visible illumination-based Image Cropping Pattern (ICP) 200 projected within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem 13; an IR-Based Object Presence And Range Detection Subsystem 12 for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem 13; an Automatic Light Exposure Measurement and Illumination Control Subsystem 15 for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem 14 during the image capture mode; an Image Capturing and Buffering Subsystem 16 for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem 13; an Image Processing and Cropped Image Locating Module 201 for processing captured and buffered
  • a Multimode Image-Processing Based Bar Code Symbol Reading Subsystem 17 for processing cropped and scaled images generated by the Image Perspective Correction and Scaling Module 202 and reading 1D and 2D bar code symbols represented therein; and an Input/Output Subsystem 18 for outputting processed image data and the like to an external host system or other information receiving or responding device, in which each said subsystem component is integrated about a System Control Subsystem 19, as shown.
  • the FOV folding mirror 236 can help to achieve a wider FOV beyond the light transmission window, while using a housing having narrower depth dimensions. Also, use of the linear optical waveguide 221 obviates the need for large aperture light collection optics, which would require significant space within the housing.
  • In Fig. 55A, an alternative embodiment of the digital image capture and processing engine 220 of the present invention is shown reconfigured in such a way that the illumination/aiming subassembly 232 (depicted in Fig. 52) is detached from the camera housing 226 and mounted adjacent the light transmission window 233 of the engine housing 234.
  • the remaining subassembly, including lens barrel assembly 225, the camera housing 226, the camera board 227 and the image processing board 230 is mounted relative to the bottom of the engine housing 234 so that the optical axis of the camera lens assembly 225 is parallel with the light transmission aperture 233.
  • a curved optical waveguide 221 is used to collect light from a central portion of the field of view of the engine, and guide the collected light to photodiode 228 on the camera board 227.
  • a field of view (FOV) folding mirror 236 is mounted beneath the illumination/aiming subassembly 232 for directing the FOV of the system out through the central aperture 237 formed in the illumination/aiming subassembly 232.
  • the FOV folding mirror 236 in this design can help to achieve a wider FOV beyond the light transmission window, while using a housing having narrower depth dimensions.
  • use of the curved optical waveguide 221 obviates the need for large aperture light collection optics, which would require significant space within the housing.
  • a presentation-type imaging-based bar code symbol reading system 300 is shown constructed using the general components of the digital image capture and processing engine of Fig. 55A1.
  • the illumination/aiming subassembly 232' of Fig. 52 is mounted adjacent the light transmission window 233' of the system housing 301.
  • the remaining subassembly, including lens barrel assembly 225', the camera housing 226', the camera board 227' and the image processing board 230, is mounted relative to the bottom of the engine housing 234' so that the optical axis of the camera lens is parallel with the light transmission aperture 233'.
  • a field of view (FOV) folding mirror 236' is mounted beneath the illumination/aiming subassembly 232' for directing the FOV of the system out through the central aperture formed in the illumination/aiming subassembly 232'.
  • In Figs. 55C1 through 55C4, there is shown an automatic imaging-based bar code symbol reading system of the present invention 400 supporting a pass-through mode of operation illustrated in Fig. 55C2 using narrow-area illumination and video image capture and processing techniques, and a presentation-type mode of operation illustrated in Fig. 55C3 using wide-area illumination and video image capture and processing techniques.
  • the POS-based imaging system 400 employs a digital image capture and processing engine similar in design to that shown in Figs. 55B1 and 55B2 and that shown in Fig. 2A1, except for the following differences:
  • the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem 14 in cooperation with the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem 17, employing software for performing real-time "exposure quality analysis" of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E;
  • the substantially-coplanar narrow-area field of illumination and narrow-area FOV 401 are oriented in the vertical direction (i.e. oriented along Up and Down directions) with respect to the counter surface of the POS environment, so as to support the "pass-through" imaging mode of the system, as illustrated in Fig. 55C2;
  • the IR-based object presence and range detection system 12 employed in Fig. 55A2 is replaced with an automatic IR-based object presence and direction detection subsystem 12' comprising four independent IR-based object presence and direction detection channels (i.e. fields) 402A, 402B, 402C and 402D, generated by IR LED and photodiode pairs 12A1, 12A2, 12A3 and 12A4 respectively, which automatically produce activation control signals A1(t), A2(t), A3(t) and A4(t) upon detecting an object moving through the object presence and direction detection fields, and a signal analyzer and control logic block 12B' for receiving and processing these activation control signals A1(t), A2(t), A3(t) and A4(t), according to Processing Rules 1 through 5 set forth in Fig.
  • this POS-based imaging system supports the adaptive control process illustrated in Figs. 27A through 27E, and in the illustrative embodiment of the present invention, operates generally according to System Mode No. 17, described hereinabove.
  • the "trigger signal" is generated from the automatic IR-based object presence and direction detection subsystem 12'.
  • the trigger signal can take on one of three possible values, namely: (1) that no object has been detected in the FOV of the system; (2) that an object has been detected in the FOV and is being moved therethrough in a "Pass-Through" manner; or (3) that an object has been detected in the FOV and is being moved therethrough in a "Presentation" manner (i.e. toward the imaging window).
  • trigger signal (1 ) above is deemed a "negative” trigger signal
  • trigger signals (2) and (3) are deemed “positive" trigger signals.
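The signal analyzer and control logic block 12B' can be pictured, in C, as a classifier over the four activation signals. Since Processing Rules 1 through 5 are not reproduced in this text, the timing heuristic below (sequential channel activation suggests pass-through motion; near-simultaneous activation suggests presentation motion), the threshold value, and all names are purely illustrative assumptions, not the patent's actual rules:

```c
typedef enum { TRIG_NONE = 1, TRIG_PASS_THROUGH = 2, TRIG_PRESENTATION = 3 } trigger_t;

/* t[0..3]: activation timestamps (ms) of channels 402A..402D; 0.0 means
 * "channel not activated". The 50 ms spread threshold is a made-up value. */
trigger_t classify_trigger(const double t[4])
{
    int fired = 0;
    double first = 1e30, last = 0.0;
    for (int i = 0; i < 4; i++) {
        if (t[i] > 0.0) {
            fired++;
            if (t[i] < first) first = t[i];
            if (t[i] > last)  last  = t[i];
        }
    }
    if (fired == 0)
        return TRIG_NONE;               /* "negative" trigger: no object detected */
    /* Channels activated one after another -> object sweeping laterally through
     * the fields ("Pass-Through"); near-simultaneous activation -> object
     * approaching the imaging window ("Presentation"). */
    if (fired >= 2 && (last - first) > 50.0)
        return TRIG_PASS_THROUGH;
    return TRIG_PRESENTATION;
}
```

The enum values 1 through 3 mirror the three trigger-signal values enumerated above.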
  • (1) the shutter mode parameter will be set to the "Video Mode" (illustrated in Fig. 2E); (2) the electronic gain of the image sensor will be set to a default value determined during factory calibration;
  • the illumination field type will be set to "narrow-area field"
  • the image capture mode parameter will be set to "narrow-area image capture"
  • the image processing mode will be set, for example, to a default value
  • the automatic object detection mode will be set to "ON". Also, the SCPR flag will be set to its FALSE value.
  • the SCPs would be initially configured as follows:
  • the shutter mode parameter will be set to the "Video Mode" (illustrated in Fig. 2E);
  • the illumination field type will be set to "wide-area field"
  • the image capture mode parameter will be set to "wide-area image capture"
  • the image processing mode will be set, for example, to a default value
  • the automatic object detection mode will be set to "ON". Also, the SCPR flag will be set to its FALSE value.
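The default System Configuration Parameters (SCPs) enumerated above, together with the SCPR flag, might be collected into a structure such as the following sketch. The field names, types, and layout are illustrative assumptions, not the patent's actual data representation:

```c
#include <stdbool.h>

typedef enum { SHUTTER_SNAPSHOT, SHUTTER_VIDEO } shutter_mode_t;
typedef enum { FIELD_NARROW_AREA, FIELD_WIDE_AREA } illum_field_t;
typedef enum { CAPTURE_NARROW_AREA, CAPTURE_WIDE_AREA } capture_mode_t;

/* System Configuration Parameters (SCPs) plus the SCP Reconfiguration flag. */
typedef struct {
    shutter_mode_t shutter_mode;
    int            sensor_gain;       /* default determined at factory calibration */
    illum_field_t  illum_field;
    capture_mode_t capture_mode;
    int            image_proc_mode;   /* 0 = default image processing mode */
    bool           auto_object_detect;
    bool           scpr_flag;         /* TRUE => reconfigure before next acquisition */
} scp_t;

/* Default SCPs for the pass-through (narrow-area) configuration listed above. */
scp_t scp_pass_through_defaults(int factory_gain)
{
    scp_t s = { SHUTTER_VIDEO, factory_gain, FIELD_NARROW_AREA,
                CAPTURE_NARROW_AREA, 0, true, false };
    return s;
}
```

The presentation-mode defaults would differ only in using the wide-area field and capture values.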
  • a "positive" trigger signal from subsystem 12', i.e. that an object has been detected in the FOV and is being moved therethrough in a "Pass-Through" manner, or that an object has been detected in the FOV and is being moved therethrough in a "Presentation" manner
  • the system will reconfigure itself only if the SCPR flag is TRUE; otherwise, the system will maintain its current SCPs.
  • the SCPR flag will be FALSE, and therefore the system will maintain its SCPs at their default settings.
  • trigger signal (2) was generated, indicative of Pass-Through object detection and movement.
  • the object will be continuously illuminated within a narrow-field of LED-based illumination produced by the illumination subsystem, and a sequence of narrow-area digital images will be captured by the image formation and detection subsystem and buffered to reconstruct 2D images, while the CMOS image sensing array is operated in its Video Mode of operation.
  • the reconstructed digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
  • the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next wide-area image acquisition cycle while the CMOS sensing array is operated in its Video Mode. Otherwise, the SCPs are maintained by the system.
  • the system attempts to read a 1D bar code symbol in the captured reconstructed 2D digital image.
  • If the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image). In the case of reconfiguration, the system might reset the SCPs as follows:
  • the illumination field type will be set to "narrow-area field"
  • the image capture mode parameter will be set to "narrow-area image capture"
  • the system captures a second 2D image using continuous LED illumination and the image sensing array configured in its Video Mode (illustrated in Fig. 27E), and recalculates the Exposure Quality Threshold Parameters; if the exposure quality does not satisfy the current Exposure Quality Threshold Parameters, then the system calculates new SCPs and sets the SCPR flag to TRUE. Otherwise, the system maintains the SCPs, and proceeds to attempt to decode a bar code symbol in the 2D reconstructed digital image captured using continuous LED illumination.
  • the object is illuminated using, for example, ambient illumination and captured at STEP 2, and at STEP 3, the captured/ reconstructed 2D image is analyzed for exposure quality, as described above.
  • the exposure quality measured in STEP 3 is compared with the Exposure Quality Threshold parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise the system maintains its current SCPs.
  • bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and then the system exits the control process at STEP 9.
  • trigger signal (3) was generated, indicative of Presentation object detection and movement.
  • the object will be continuously illuminated within a wide-Field of LED-based illumination produced by the illumination subsystem, and a sequence of wide-area (2D) digital images will be captured by the image formation and detection subsystem and buffered, while the CMOS image sensing array is operated in its Video Mode of operation.
  • the reconstructed digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
  • the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next wide-area image acquisition cycle while the CMOS sensing array is operated in its Video Mode. Otherwise, the SCPs are maintained by the system.
  • the system attempts to read a 1D bar code symbol in the captured wide-area digital image.
  • If the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image). In the case of reconfiguration, the system might reset the SCPs as follows:
  • the illumination field type will be set to "wide-area field"
  • the image capture mode parameter will be set to "wide-area image capture"
  • the system captures a second 2D image using continuous LED illumination and the image sensing array configured in its Video Mode (illustrated in Fig. 27E), and recalculates the Exposure Quality Threshold Parameters; if the exposure quality does not satisfy the current Exposure Quality Threshold Parameters, then the system calculates new SCPs and sets the SCPR flag to TRUE. Otherwise, the system maintains the SCPs, and proceeds to attempt to decode a bar code symbol in the 2D reconstructed digital image captured using continuous LED illumination.
  • the object is illuminated with ambient illumination and captured at STEP 2, and at STEP 3, the captured wide-area image is analyzed for exposure quality, as described above.
  • the exposure quality measured in STEP 3 is compared with the Exposure Quality Threshold parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise the system maintains its current SCPs.
  • bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and then the system exits the control process at STEP 9.
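The exposure-quality comparison and SCPR-flag handling that recurs in both the pass-through and presentation branches above can be sketched as a single step function. The quality metric, the threshold comparison, and the gain adjustment are illustrative stand-ins for the SCP recalculation the text leaves unspecified:

```c
#include <stdbool.h>

/* One pass of the "exposure quality analysis" step: compare measured exposure
 * quality against the Exposure Quality Threshold (EQT); when the threshold is
 * not met, new SCPs are calculated (here reduced to a gain bump) and the SCPR
 * flag is raised so the system reconfigures before the next acquisition. */
typedef struct {
    double eqt;         /* exposure quality threshold (illustrative scale 0..1) */
    int    sensor_gain;
    bool   scpr_flag;
} ctrl_state_t;

void exposure_quality_step(ctrl_state_t *st, double measured_quality)
{
    if (measured_quality < st->eqt) {
        st->sensor_gain += 1;   /* stand-in for "calculate new SCPs" */
        st->scpr_flag = true;   /* reconfigure before next image acquisition */
    } else {
        st->scpr_flag = false;  /* current SCPs are maintained */
    }
}
```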
  • a first alternative embodiment of a projection-type POS image-processing based bar code symbol reading system 250 is shown employing the digital image capture and processing engine 220 or 220'.
  • system 250 includes a housing 241 which may contain the engine housing shown in Fig. 55A1, or alternatively, it may support the subassemblies and components shown in Fig. 55A1.
  • a second illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system 260 employing the digital image capture and processing engine 220 or 220'.
  • system 260 includes a housing 261 which may contain the engine housing shown in Fig. 55A1, or alternatively, it may support the subassemblies and components shown in Fig. 55A1.
  • a third illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system 270 employing the digital image capture and processing engine 220 or 220'.
  • system 270 includes a housing portion 271 (containing engine 220 or 220'), and a base portion 272 for rotatably supporting housing portion 271.
  • Housing portion 271 may contain the engine housing shown in Fig. 55A1, or alternatively, it may support the subassemblies and components shown in Fig. 55A1.
  • the number of LEDs mounted on the illumination board 224 can be substantially greater than four (4), as shown in the illustrative embodiment in Fig. 55.
  • the exact number of LEDs used in the illumination will depend on the end-user application requirements at hand.
  • the IR-Based Object Presence And Range Detection Subsystem 12 employed therein may be used to detect the range of an object within the FOV
  • the LED-Based Illumination Subsystem 14 may include both long and short range wide-area LED illumination arrays, as disclosed hereinabove, for optimized illumination of long and short range regions of the FOV during image capture operations.
  • a price lookup unit (PLU) system 280 comprising: a housing 281 with mounting bracket; an LCD panel 282; a computing platform 283 with network interfaces, etc.; and a digital image capture and processing subsystem 220 or 220' of the present invention, for identifying bar-coded consumer products in retail store environments, and displaying the prices thereof on the LCD panel 282.
  • Referring now to Figs. 58 through 59C2, the method of and apparatus for extending the standard system features and functions within a digital image capture and processing system of the present invention will now be described. While it is understood that any of the digital image capture and processing systems described and disclosed herein could be referred to for purposes of illustrating the novel plug-in programming methodology of the present invention, described in Figs. 58 through 59C2, reference will be made to the digital imaging-based bar code reading system shown in Figs. 2A through 18 for purposes of illustration, and not limitation.
  • the first step involves the "system designer" of the Imaging-based Bar Code Symbol Reading System (having a multi-tier software architecture) determining which "features" of the system (implemented by Tasks called in the Application Layer), and which functions within any given feature, will be modifiable and/or extendable by end-users and/or third-party persons (other than the original designer and the manufacturer, e.g. VARs, end-users, customers et al.) without having detailed knowledge of the system's hardware platform, its communication interfaces with the outside environment, or its user interfaces.
  • This step by the system designer establishes constraints on system modification by others, yet provides degrees of freedom on how the system can be modified to meet custom requirements of end-user applications.
  • the system designer designs and makes the image-processing based bar code reading system of the present invention, wherein persons other than the system designer (e.g. end-users and third-parties) are permitted to modify and/or extend the system features and functionalities of the original product/system specified by the system designer (i.e. designer of the original product/system) in Block A.
  • each plug-in module stored within the Plug-In and Configuration File Library, shown in Fig. 10, consists of a set of software libraries (object modules) and configuration files. They can be downloaded to the Image-Processing Based Bar Code Symbol Reading System from an external host system, such as a Plug-in Development Platform implemented on a host PC, using various standard or proprietary communication protocols to communicate with the OS layer of the system. In the Image-Processing Based Bar Code Symbol Reading System, this operation is performed by the Metroset task or User Command Manager (see Software Block Diagram) upon reception of the appropriate command from the host system. Once the download is complete, the plug-in files are stored in the file system of the Image-Processing Based Bar Code Symbol Reading System.
  • the management of all plug-in modules is performed by the Plug-in Controller shown in Fig. 10.
  • the Plug-in Controller can perform operations such as: load (install) a plug-in module from the file system into the executable memory of the Image-Processing Based Bar Code Symbol Reading System and perform dynamic linking of the plug-in libraries with the Application; unload (uninstall) the plug-in module; provide the executable address of (i.e. place holder for) the plug-in module (i.e. third-party code) to the Application; and provide additional information about the plug-in module to the Application, such as the rules of the plug-in engagement as described in the plug-in configuration file.
  • Any task of the Image-Processing Based Bar Code Symbol Reading System can request information from the Plug-in Controller about a plug-in module and/or request an operation on it.
  • the Application tasks can request the Plug-in Controller to check the availability of a third-party plug-in module, and if such module is available, install it and provide its executable address as well as the rules of the plug-in engagement.
  • the tasks can then execute it either instead of, or along with, the "standard" module that implements the particular feature.
  • the rules of engagement of the plug-in module, i.e. the determination of whether the plug-in module should be executed as a replacement for, or a complement to, the "standard" module, can be unique to the particular feature.
  • the rules can also specify whether the complementary plug-in module should be executed before or after the "standard" module. Moreover, the plug-in module, if executed first, can indicate back to the device whether the "standard" module should also be called or not, thus allowing alteration of the device's behavior.
  • the programming interfaces are predefined for the features that allow the plug-in functionality, thus, enabling the third-parties to develop their own software for the device.
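Given that plug-in modules are shared libraries (.so files) on an Arm Linux platform, the Plug-in Controller's "load, link, and provide executable address" operations map naturally onto the POSIX dynamic-loading API. The sketch below is an assumption about one possible implementation, not code from the patent; the function-pointer signature is hypothetical:

```c
#include <dlfcn.h>
#include <stddef.h>

/* Minimal sketch of the Plug-in Controller's load-and-link operation: open the
 * plug-in shared library named in the configuration file and hand back the
 * executable address (place holder) of the named plug-in function. Returns
 * NULL when the module is unavailable, in which case the requesting task
 * falls back to the "standard" module. */
typedef int (*plugin_fn_t)(void *ctx);

plugin_fn_t plugin_load(const char *lib_path, const char *fn_name, void **handle_out)
{
    void *handle = dlopen(lib_path, RTLD_NOW);   /* install into executable memory */
    if (handle == NULL)
        return NULL;
    plugin_fn_t fn = (plugin_fn_t)dlsym(handle, fn_name);
    if (fn == NULL) {
        dlclose(handle);                         /* unload on resolution failure */
        return NULL;
    }
    *handle_out = handle;                        /* caller later dlclose()s to uninstall */
    return fn;
}
```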
  • Consider the Image Pre-Processing Plug-in described in Fig. 32A. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's "standard" Image Pre-Processing Module (i.e. "original product code" in executable binary format), which is normally executed by the Main Task at Block D in Fig. 32, after the system acquires an image at Block C.
  • the customer can provide its own image preprocessing software as a plug-in module (i.e. "third-party code") to the multi-tier software-based system.
  • the third-party code is typically expressed in executable binary format.
  • the plug-in can be described in an "Image Preprocessing Plug-in Configuration File", having a format, for example, as expressed below:
  • IMGPREPR_PROGMD: libimgprepr_plugin.so.1 -> PluginImgpreprProgmd
  • FIG. 59A illustrates the logic of the Image Preprocessing plug-in.
  • Consider the Image Processing and Barcode Decoding Plug-in described in Fig. 59B. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's "standard" Image Processing and Barcode Decoding Module, which is normally executed by the Main Task after the system acquires an image, as indicated in Fig. 59.
  • the customer can provide its own image processing and barcode decoding software as a plug-in module to the multi-tier software-based system.
  • the plug-in can be described in an "Image Processing and Barcode Decoding Plug-in Configuration File", having a format, for example, as expressed below:
  • DECODE: 0x02: libdecode_plugin.so.1 -> PluginDecode
wherein "DECODE" is a keyword identifying the image processing and barcode decoding plug-in; wherein "0x02" is the value identifying the plug-in's rules of engagement; wherein "libdecode_plugin.so.1" is the name of the plug-in library in the device's file system; and wherein "PluginDecode" is the name of the plug-in function that implements the customer-specific image processing and barcode decoding functionality.
  • The individual bits of the value "param", which is used as the value indicating the rules of this particular plug-in's engagement, each have a specific meaning.
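The replacement-versus-complementary semantics described earlier suggest a bit-mask interpretation of this call-mode value, but since the actual bit-assignment table is not reproduced in this text, the masks below are entirely hypothetical, chosen only to illustrate how a call-mode such as 0x02 could be tested:

```c
/* Hypothetical rule-of-engagement bit masks, modeled on the three properties
 * the text describes: replacement vs. complementary execution, ordering
 * relative to the "standard" module, and permission to suppress it. */
enum {
    PLUGIN_COMPLEMENTARY = 0x01, /* clear: replaces standard module; set: runs alongside */
    PLUGIN_RUN_BEFORE    = 0x02, /* complementary plug-in runs before the standard module */
    PLUGIN_MAY_SKIP_STD  = 0x04  /* plug-in may signal that the standard module be skipped */
};

int plugin_runs_before_standard(unsigned param)
{
    return (param & PLUGIN_RUN_BEFORE) != 0;
}
```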
  • FIG. 32B illustrates the logic of the Image Processing and Barcode Decoding plug-in.
  • Consider the Image Processing and Barcode Decoding Plug-in described in Fig. 59C1. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's "standard" Image Processing and Barcode Decoding Module, which is normally executed by the Main Task after the system acquires an image, as indicated in Fig. 59.
  • the customer can provide its own image processing and barcode decoding software as a plug-in module to the multi-tier software-based system.
  • the plug-in can be described in an "Image Processing and Barcode Decoding Plug-in Configuration File", having a format, for example, as expressed below:
  • The plug-ins described above provide a few examples of the many kinds of plug-ins (objects) that can be developed so that allowed features and functionalities of the system can be modified by persons other than the system designer, in accordance with the principles of the present invention.
  • Other system features and functionalities for which Plug-in modules can be developed and installed within the Image-Processing Based Bar Code Symbol Reading System include, but are not limited to, control over functions supported and performed by the following systems: the IR-based Object Presence and Range Detection Subsystem 12; the Multi-Mode Area-type Image Formation and Detection (i.e.
  • Having described the kinds of Plug-In Modules that can be created by persons other than the OEM system designer, it is now in order to describe an illustrative embodiment of the Plug-In Development Platform of the present invention with reference to Figs. 10 and 11.
  • the system designer/OEM of the system (e.g. Metrologic Focus™ 1690 Image-Processing Bar Code Reader)
  • a CD contains, for example, the following software tools:
  • This directory contains the Arm Linux cross-compiling toolchain package for IBM-compatible Linux PC.
  • This directory contains the Arm Linux cross-compiling toolchain package for IBM-compatible Windows PC.
  • the Cygwin software must be installed prior to the usage of this cross-compiling toolchain.
  • This directory contains sample plug-in development projects.
  • the plug-in software must be compiled on an IBM-compatible Linux PC using the Arm Linux Toolchain for Linux PC, or on a Windows PC with the Cygwin software installed, using the Arm Linux Toolchain for Cygwin.
  • This directory contains the installation package of the program FWZ Maker for Windows PC. This program is used to build the FWZ-files for downloading into the Focus 1690 scanner.
  • This directory contains the FWZ-file with the latest Metrologic® Focus™ scanner software.
  • the first step of the plug-in software development process involves configuring the plug-in developer platform by installing the above tools on the host/developer computer system.
  • the next step involves installing system software onto the Image-Processing Bar Code Reader, via the host plug-in developer platform, using a communications cable between the communication ports of the system and the plug-in developer computer shown in Figs. 10 and 11.
  • each line of the plug-in configuration file contains information about a plug-in function in the following format: plug-in type: parameter: filename -> function_name
  • wherein plug-in type is one of the supported plug-in type keywords, followed by the field separator ":"; wherein parameter is a number (decimal, or hex if preceded with 0x) having a specific and unique meaning for some plug-in functions.
  • the parameter is also called a "call-mode", for it can provide some specific information on how the plug-in should be called. The parameter is not required and can be omitted. If specified, the parameter must be followed by the field separator ":”; wherein filename is the name of the shared library, followed by the filename separator "->". The filename can contain a full-path to the library.
  • the library is assumed to be located in either the "/usr/local/lib" or "/usr/lib/" directory in the Focus scanner. It is therefore important to make sure that the shared library is loaded to the correct directory in the Focus scanner, as specified by the plug-in configuration file; and wherein function_name is the name of the corresponding plug-in C function.
  • configuration file can also contain single-line C-style comments.
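A parser for the configuration-line format just described might look like the following sketch. The struct layout, buffer sizes, and error handling are illustrative choices; only the line grammar (keyword, optional call-mode, "->" separator) comes from the text:

```c
#include <string.h>
#include <stdio.h>
#include <stdlib.h>

/* Parsed form of one line: plug-in type: [parameter:] filename -> function_name */
typedef struct {
    char type[32];
    unsigned long param;     /* call-mode; 0 when omitted */
    char filename[128];
    char function[64];
} plugin_cfg_t;

static void trim(char *s)
{
    size_t n = strlen(s);
    while (n > 0 && (s[n-1] == ' ' || s[n-1] == '\t')) s[--n] = '\0';
    size_t i = 0;
    while (s[i] == ' ' || s[i] == '\t') i++;
    memmove(s, s + i, strlen(s + i) + 1);
}

/* Returns 0 on success, -1 on a malformed line. */
int parse_plugin_line(const char *line, plugin_cfg_t *cfg)
{
    char buf[256];
    snprintf(buf, sizeof buf, "%s", line);

    char *arrow = strstr(buf, "->");          /* filename separator */
    if (arrow == NULL) return -1;
    *arrow = '\0';
    snprintf(cfg->function, sizeof cfg->function, "%s", arrow + 2);
    trim(cfg->function);

    char *colon1 = strchr(buf, ':');          /* field separator after keyword */
    if (colon1 == NULL) return -1;
    *colon1 = '\0';
    snprintf(cfg->type, sizeof cfg->type, "%s", buf);
    trim(cfg->type);

    char *rest = colon1 + 1;
    char *colon2 = strchr(rest, ':');
    if (colon2 != NULL) {                     /* optional call-mode present */
        *colon2 = '\0';
        cfg->param = strtoul(rest, NULL, 0);  /* base 0: decimal or 0x-prefixed hex */
        rest = colon2 + 1;
    } else {
        cfg->param = 0;
    }
    snprintf(cfg->filename, sizeof cfg->filename, "%s", rest);
    trim(cfg->filename);
    return 0;
}
```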
  • the plug-in developer decides which plug-in functions (of those supported by the system designer) should be included in the plug-in module (i.e. "object").
  • the plug-in developer can then generate the FWZ file and include the configuration file and the shared library in it using the FWZ Maker program on the Windows PC. Thereafter, the FWZ file can be downloaded to Metrologic's Focus™ image-processing bar code reader using, for example, the Flash Utility tool of Metrologic's Metroset program.
  • the configuration of the image-processing bar code reader of the present invention can be changed by scanning special programming barcodes, or by sending equivalent data to the reader from the host computer (i.e. plug-in development computer).
  • Programming barcodes are usually Code 128 symbols with the Fn3 codeword.
  • When scanning a programming barcode, the reader may or may not be in its so-called programming mode. When the reader is not in its programming mode, the effect of the programming barcode is supposed to be immediate. On the other hand, when the reader is in its programming mode, the effects of all the programming barcodes read during the programming mode should occur at the time the reader exits the programming mode.
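The immediate-versus-deferred semantics of programming barcodes can be sketched as follows; the settings array, the queue bound, and all names are illustrative, and real readers would persist the configuration rather than hold it in memory:

```c
#include <stdbool.h>

#define MAX_PENDING 16

typedef struct {
    bool in_programming_mode;
    int  settings[8];              /* live device configuration (illustrative) */
    int  pending_idx[MAX_PENDING]; /* deferred changes queued while in programming mode */
    int  pending_val[MAX_PENDING];
    int  pending_count;
} reader_t;

/* Called for each decoded programming barcode (e.g. Code 128 with Fn3). */
void on_programming_barcode(reader_t *r, int setting, int value)
{
    if (!r->in_programming_mode) {
        r->settings[setting] = value;                /* immediate effect */
    } else if (r->pending_count < MAX_PENDING) {
        r->pending_idx[r->pending_count] = setting;  /* deferred until mode exit */
        r->pending_val[r->pending_count] = value;
        r->pending_count++;
    }
}

/* All deferred effects take place when the reader exits programming mode. */
void exit_programming_mode(reader_t *r)
{
    for (int i = 0; i < r->pending_count; i++)
        r->settings[r->pending_idx[i]] = r->pending_val[i];
    r->pending_count = 0;
    r->in_programming_mode = false;
}
```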
  • the plug-in can be uninstalled by simply downloading an empty plug-in configuration file. For example, to uninstall a Decode plug-in, download an empty "decode.plugin" file into the "/usr" directory of the File system within the OS layer, shown in Fig. 10.
  • the purpose of the Decode Plug-in is to provide replacement or complementary barcode decoding software to the standard Focus barcode decoding.
  • the Decode Plug-in can have the following plug-in functions:
  • Image is represented in memory as a two-dimensional array of 8-bit pixels.
  • the first pixel of the array represents the upper-left corner of the image.
  • If p_cancel_flag is not NULL, it points to an integer flag (called the "Cancel flag") that indicates whether the decoding process should continue or should stop as soon as possible. If the flag is 0, the decoding process can continue. If the flag is not zero, the decoding process must stop as soon as possible. The reason for aborting the decoding process could be, for example, a time-out. It is recommended to check the Cancel flag often enough so that the latency in aborting the decoding process is as short as possible.
  • The Cancel flag is not the only way the Decoding plug-in (or any plug-in for that matter) can be aborted. Depending on the circumstances, the system can decide to abruptly kill the thread in which the Decoding plug-in is running, at any time.
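The recommended Cancel-flag discipline can be illustrated with a decode loop that polls the flag once per image row, so that abort latency is bounded by one row's worth of work. The function name and the per-row "work" are illustrative; only the flag semantics (NULL allowed, nonzero means stop) come from the text:

```c
#include <stddef.h>

/* Sketch of a Decode plug-in inner loop over an image stored as a
 * two-dimensional array of 8-bit pixels (first pixel = upper-left corner).
 * Returns the number of rows processed before completion or cancellation. */
int plugin_decode_rows(const unsigned char *image, int width, int height,
                       const volatile int *p_cancel_flag)
{
    int rows_processed = 0;
    for (int y = 0; y < height; y++) {
        /* p_cancel_flag may be NULL, meaning cancellation is not in use. */
        if (p_cancel_flag != NULL && *p_cancel_flag != 0)
            return rows_processed;       /* abort as soon as possible */
        /* ... scan row y (image + y * width) for bar code edges ... */
        (void)image; (void)width;
        rows_processed++;
    }
    return rows_processed;
}
```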
  • the structure DECODE_RESULT has the following format:
  • BC_POINT BCPts[4]; /* Coordinates of the 4 corners of the barcode */ } BC_BOUNDS;
  • The order of the array elements (i.e. corners) in the BC_BOUNDS structure is as follows:
  • The SymId member of the DECODE_RESULT structure can hold a null-terminated string of up to 31 characters describing the barcode symbology. It is used for informational purposes only. The following values are recommended for some known barcode symbologies.

Abstract

A digital image capture and processing system, and software development environment, that supports manufacturer-constrained system behavior modification and/or extension by end-users and third-parties through the development and installation/deployment of plug-in modules within the application layer of the system by persons other than the original system designers. By virtue of the present invention, the standard features and functionalities of such systems can now be flexibly modified and/or extended by end-users and third-parties (e.g. VARs, OEMs, etc.), and thus satisfy customized end-user application requirements, without possessing or acquiring detailed knowledge about the hardware platform of the system, its communication interfaces with the outside environment, and user-related interfaces.

Description

DIGITAL IMAGE CAPTURE AND PROCESSING SYSTEM PERMITTING MODIFICATION AND/OR EXTENSION OF SYSTEM FEATURES AND FUNCTIONS
Applicant: Metrologic Instruments, Inc.
BACKGROUND OF INVENTION
Technical Field
The present invention relates to hand-supportable and portable area-type digital bar code readers having diverse modes of digital image processing for reading one-dimensional (ID) and two- dimensional (2D) bar code symbols, as well as other forms of graphically-encoded intelligence.
Brief Description Of The State Of The Art
The state of the automatic-identification industry can be understood in terms of (i) the different classes of bar code symbologies that have been developed and adopted by the industry, and (ii) the kinds of apparatus developed and used to read such bar code symbologies in various user environments.
In general, there are currently three major classes of bar code symbologies, namely: one-dimensional (1D) bar code symbologies, such as UPC/EAN, Code 39, etc.; 1D stacked bar code symbologies, such as Code 49, PDF417, etc.; and two-dimensional (2D) data matrix symbologies.
One-dimensional optical bar code readers are well known in the art. Examples of such readers include readers of the Metrologic Voyager® Series Laser Scanner manufactured by Metrologic Instruments, Inc. Such readers include processing circuits that are able to read one-dimensional (1D) linear bar code symbologies, such as the UPC/EAN code, Code 39, etc., that are widely used in supermarkets. Such 1D linear symbologies are characterized by data that is encoded along a single axis, in the widths of bars and spaces, so that such symbols can be read from a single scan along that axis, provided that the symbol is imaged with a sufficiently high resolution along that axis.
In order to allow the encoding of larger amounts of data in a single bar code symbol, a number of 1D stacked bar code symbologies have been developed, including Code 49, as described in U.S. Pat. No. 4,794,239 (Allais), and PDF417, as described in U.S. Pat. No. 5,340,786 (Pavlidis, et al.). Stacked symbols partition the encoded data into multiple rows, each including a respective 1D bar code pattern, all or most of which must be scanned and decoded, then linked together to form a complete message. Scanning still requires relatively high resolution in one dimension only, but multiple linear scans are needed to read the whole symbol.

The third class of bar code symbologies, known as 2D matrix symbologies, offers orientation-free scanning and greater data densities and capacities than their 1D counterparts. In 2D matrix codes, data is encoded as dark or light data elements within a regular polygonal matrix, accompanied by graphical finder, orientation and reference structures. When scanning 2D matrix codes, the horizontal and vertical relationships of the data elements are recorded with about equal resolution.
In order to avoid having to use different types of optical readers to read these different types of bar code symbols, it is desirable to have an optical reader that is able to read symbols of any of these types, including their various subtypes, interchangeably and automatically. More particularly, it is desirable to have an optical reader that is able to read all three of the above-mentioned types of bar code symbols, without human intervention, i.e., automatically. This in turn requires that the reader have the ability to automatically discriminate between and decode bar code symbols, based only on information read from the symbol itself. Readers that have this ability are referred to as "auto-discriminating" or having an "auto-discrimination" capability.
If an auto-discriminating reader is able to read only 1D bar code symbols (including their various subtypes), it may be said to have a 1D auto-discrimination capability. Similarly, if it is able to read only 2D bar code symbols, it may be said to have a 2D auto-discrimination capability. If it is able to read both 1D and 2D bar code symbols interchangeably, it may be said to have a 1D/2D auto-discrimination capability. Often, however, a reader is said to have a 1D/2D auto-discrimination capability even if it is unable to discriminate between and decode 1D stacked bar code symbols.
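Auto-discrimination of this kind is commonly implemented as a decoder-dispatch loop that tries each candidate symbology decoder against the same captured image until one succeeds, so the reader needs no prior knowledge of which symbology is present. The sketch below illustrates the idea only; the decoder functions are hypothetical placeholders, not the decoders of any particular reader:

```python
# Illustrative sketch of 1D/2D auto-discrimination: candidate decoders
# are tried in sequence against the same captured image.
# All decoder functions below are hypothetical stand-ins.

def decode_code39(image):
    return None          # placeholder 1D decoder (no match)

def decode_pdf417(image):
    return None          # placeholder 1D-stacked decoder (no match)

def decode_data_matrix(image):
    return "ACME-12345"  # placeholder 2D-matrix decoder (match)

DECODERS = [decode_code39, decode_pdf417, decode_data_matrix]

def auto_discriminate(image):
    """Return the first successful decode result, or None."""
    for decoder in DECODERS:
        result = decoder(image)
        if result is not None:
            return result
    return None

print(auto_discriminate([[0, 1, 0], [1, 0, 1]]))  # prints ACME-12345
```

In practice each decoder would also report where in the image it found a symbol, and the dispatch order would be tuned so that the cheapest or most likely symbologies are tried first.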
Optical readers that are capable of 1D auto-discrimination are well known in the art. An early example of such a reader is the Metrologic VoyagerCG® Laser Scanner, manufactured by Metrologic Instruments, Inc.
Optical readers, particularly hand held optical readers, that are capable of 1D/2D auto-discrimination and based on the use of an asynchronously moving 1D image sensor, are described in U.S. Patent Nos. 5,288,985 and 5,354,977, which applications are hereby expressly incorporated herein by reference. Other examples of hand held readers of this type, based on the use of a stationary 2D image sensor, are described in U.S. Patent Nos. 6,250,551; 5,932,862; 5,932,741; 5,942,741; 5,929,418; 5,914,476; 5,831,254; 5,825,006; and 5,784,102, which are also hereby expressly incorporated herein by reference.
Optical readers, whether of the stationary or movable type, usually operate at a fixed scanning rate, which means that the readers are designed to complete some fixed number of scans during a given amount of time. This scanning rate generally has a value that is between 30 and 200 scans/sec for 1D readers. In such readers, the results of the successive scans are decoded in the order of their occurrence.
Imaging-based bar code symbol readers have a number of advantages over laser scanning based bar code symbol readers, namely: they are more capable of reading stacked 2D symbologies, such as the PDF417 symbology; more capable of reading matrix 2D symbologies, such as the Data Matrix symbology; more capable of reading bar codes regardless of their orientation; have lower manufacturing costs; and have the potential for use in other applications, which may or may not be related to bar code scanning, such as OCR, security systems, etc.
Prior art imaging-based bar code symbol readers suffer from a number of additional shortcomings and drawbacks.
Most prior art hand held optical reading devices can be reprogrammed by reading bar codes from a bar code programming menu or with use of a local host processor, as taught in U.S. Patent No. 5,929,418. However, these devices are generally constrained to operate within the modes in which they have been programmed to operate, either in the field or on the bench, before deployment to end-user application environments. Consequently, the statically-configured nature of such prior art imaging-based bar code reading systems has limited their performance as well as their capacity for easy integration into third-party products (i.e. systems and devices).
Prior art imaging-based bar code symbol readers with integrated illumination subsystems also support a relatively short optical depth of field. This limits the ability of such systems to read large or high-density bar code labels.
Prior art imaging-based bar code symbol readers generally require separate apparatus for producing a visible aiming beam to help the user to aim the camera's field of view at the bar code label on a particular target object.
Prior art imaging-based bar code symbol readers generally require capturing multiple frames of image data of a bar code symbol, and special apparatus for synchronizing the decoding process with the image capture process within such readers, as required in US Patent Nos. 5,932,862 and 5,942,741 assigned to Welch Allyn, Inc.
Prior art imaging-based bar code symbol readers generally require large arrays of LEDs in order to flood the field of view within which a bar code symbol might reside during image capture operations, oftentimes wasting large amounts of electrical power, which can be significant in portable or mobile imaging-based readers.
Prior art imaging-based bar code symbol readers generally require processing the entire pixel data set of captured images to find and decode bar code symbols represented therein. On the other hand, some prior art imaging systems use the inherent programmable (pixel) windowing feature within conventional CMOS image sensors to capture only partial image frames, so as to reduce pixel data set processing and enjoy improvements in image processing speed and thus imaging system performance.
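The windowing idea can be sketched in a few lines: rather than handing the decoder a full frame, only a band of central rows is read out. The 1280x1024 frame size and 32-row window below are illustrative assumptions, not a specification of any particular sensor:

```python
# Illustrative sketch of CMOS region-of-interest (ROI) windowing:
# instead of reading out and processing the full frame, only a band of
# central rows is captured, shrinking the pixel set handed to the
# decoder. Frame and window dimensions here are illustrative.

FULL_WIDTH, FULL_HEIGHT = 1280, 1024

def roi_window(frame, center_rows=32):
    """Return only the central `center_rows` rows of a full frame."""
    top = (len(frame) - center_rows) // 2
    return frame[top:top + center_rows]

# A full frame represented as a list of rows (stand-in for sensor readout).
frame = [[0] * FULL_WIDTH for _ in range(FULL_HEIGHT)]
narrow = roi_window(frame)

print(len(narrow) * FULL_WIDTH, "pixels instead of",
      FULL_HEIGHT * FULL_WIDTH)  # 40960 pixels instead of 1310720
```

The same mechanism underlies a narrow-area capture mode, where only a few central sensor rows are enabled, versus a wide-area mode, where all rows are enabled.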
Many prior art imaging-based bar code symbol readers also require the use of decoding algorithms that seek to find the orientation of bar code elements in a captured image by finding and analyzing the code words of 2-D bar code symbologies represented therein.
Some prior art imaging-based bar code symbol readers generally require the use of a manually-actuated trigger to actuate the image capture and processing cycle thereof.
Prior art imaging-based bar code symbol readers generally require separate sources of illumination for producing visible aiming beams and for producing visible illumination beams used to flood the field of view of the bar code reader. Prior art imaging-based bar code symbol readers also generally utilize, during a single image capture and processing cycle, a single decoding methodology for decoding bar code symbols represented in captured images.
Some prior art imaging-based bar code symbol readers require exposure control circuitry integrated with the image detection array for measuring the light exposure levels on selected portions thereof.
Also, many imaging-based readers require processing portions of captured images to detect the image intensities thereof and determine the reflected light levels at the image detection component of the system, and thereafter to control the LED-based illumination sources to achieve the desired image exposure levels at the image detector.
Prior art imaging-based bar code symbol readers employing integrated illumination mechanisms control image brightness and contrast by controlling the time the image sensing device is exposed to the light reflected from the imaged objects. While this method has been proven effective for CCD-based bar code scanners, it is not suitable for CMOS-based image sensing devices, which require a more sophisticated shuttering mechanism, leading to increased complexity, less reliability and, ultimately, more expensive bar code scanning systems.
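A simple software model of the illumination/exposure control described above is a proportional feedback loop: measure the mean pixel intensity of a captured frame and nudge the LED drive level toward a target value. The target and gain below are illustrative assumptions, not values taken from any particular reader:

```python
# Hedged sketch of closed-loop illumination control: the mean pixel
# value (in digital numbers, DN) of a frame is compared against a
# target, and the LED drive level is adjusted proportionally.
# TARGET_DN and GAIN are illustrative assumptions.

TARGET_DN = 128
GAIN = 0.01

def mean_intensity(frame):
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def adjust_illumination(drive_level, frame):
    """Return a new LED drive level in [0, 1] that moves the measured
    mean intensity toward TARGET_DN."""
    error = TARGET_DN - mean_intensity(frame)
    new_level = drive_level + GAIN * error / 255
    return min(1.0, max(0.0, new_level))

# A dim frame (mean 50 DN) causes the drive level to be raised slightly.
print(adjust_illumination(0.5, [[40, 60], [50, 50]]) > 0.5)  # prints True
```

A real controller would typically meter only selected portions of the frame and adjust exposure time as well as LED drive, but the feedback structure is the same.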
Prior art imaging-based bar code symbol readers generally require the use of tables and bar code menus to manage which decoding algorithms are to be used within any particular mode of system operation, which must itself be programmed by reading bar code symbols from a bar code menu.
Also, due to the complexity of the hardware platforms of such prior art imaging-based bar code symbol readers, end-users are not permitted to modify the features and functionalities of such systems to their customized application requirements, other than changing limited functions within the system by reading system-programming type bar code symbols, as disclosed in U.S. Patent Nos. 6,321,989; 5,965,863; 5,929,418; and 5,932,862, each being incorporated herein by reference.
Also, dedicated image-processing based bar code symbol reading devices usually have very limited resources, such as the amount of volatile and non-volatile memories. Therefore, they usually do not have the rich set of tools normally available to universal computer systems. Further, if a customer or third-party needs to enhance or alter the behavior of a conventional image-processing based bar code symbol reading system or device, they need to contact the device manufacturer and negotiate the necessary changes in the "standard" software or the ways to integrate their own software into the device, which usually involves the re-design or re-compilation of the software by the original equipment manufacturer (OEM). This software modification process is both costly and time consuming.
Also, as a result of limitations in the mechanical, electrical, optical, and software design of prior art imaging-based bar code symbol readers, such prior art readers generally: (i) fail to enable users to read high-density 1D bar codes with the ease and simplicity of laser scanning based bar code symbol readers, as well as 2D symbologies such as PDF417 and Data Matrix; and (ii) have not enabled end-users to modify the features and functionalities of such prior art systems without detailed knowledge about the hardware platform, communication interfaces and the user interfaces of such systems.
Also, control operations in prior art image-processing bar code symbol reading systems have not been sufficiently flexible or agile to adapt to the demanding lighting conditions presented in challenging retail and industrial work environments, where 1D and 2D bar code symbols need to be reliably read.
Thus, there is a great need in the art for an improved method of and apparatus for reading bar code symbols using image capture and processing techniques which avoid the shortcomings and drawbacks of prior art methods and apparatus.
DISCLOSURE OF THE PRESENT INVENTION
Accordingly, a primary object of the present invention is to provide a novel method of and apparatus for enabling the recognition of graphically-encoded information, including 1D and 2D bar code symbologies and alphanumerical character strings, using novel image capture and processing based systems and devices, which avoid the shortcomings and drawbacks of prior art methods and apparatus.
Another object of the present invention is to provide a digital image capture and processing system employing multi-layer software-based system architecture permitting modification of system features and functionalities by way of third party code plug-ins.
Another object of the present invention is to provide such an image capture and processing system that allows customers, VARs and third parties to modify and/or extend a set of standard features and functions of the system without needing to contact the system's OEM and negotiate ways of integrating their desired enhancements to the system.
Another object of the present invention is to provide such an image capture and processing system that allows customers, VARs and third parties to independently design their own software according to the OEM specifications, and plug this software into the system, thereby effectively changing the device's behavior, without detailed knowledge about the hardware platform of the system, its communications with the outside environment, and its user-related interfaces.
Another object of the present invention is to provide a customer of such an image capture and processing system, or any third-party thereof, with a way of and means for enhancing or altering the behavior of the system without interfering with underlying hardware, communications and user-related interfaces.
Another object of the present invention is to provide end-users of such an image capture and processing system, as well as third-parties, with a way of and means for designing, developing, and installing in the device, their own plug-in modules without a need for knowledge of details of the device's hardware.

Another object of the present invention is to provide original equipment manufacturers (OEM) with a way of and means for installing the OEM's plug-in modules into an image capture and processing system, without knowledge of the third-party's plug-in (software) modules that have been installed therein, provided established specifications for system features and functionalities for the third-party plug-ins are met.
Another object of the present invention is to provide customers of an image capture and processing system, and third-parties thereof, with a way of and means for installing their own modules to enhance or alter the "standard" behavior of the device according to their own needs and independently from each other.
Another object of the present invention is to provide an image capture and processing system that supports designer/manufacturer-constrained system behavior modification, without requiring detailed knowledge about the hardware platform of the system, its communications with the outside environment, and user-related interfaces.
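One common way to realize this kind of plug-in extensibility is a registry of named system features that maps each feature to a handler, letting a third-party module override a "standard" handler without touching the hardware or communication layers. The sketch below is a minimal illustration of that general pattern, not the patent's actual software architecture; all names are hypothetical:

```python
# Minimal sketch of a plug-in registry: named system features map to
# handlers, and a third-party module may override a "standard" handler
# without access to the underlying hardware or communication layers.
# All feature and handler names here are hypothetical.

class PluginRegistry:
    def __init__(self):
        self._handlers = {}

    def register(self, feature, handler):
        """Install (or override) the handler for a named feature."""
        self._handlers[feature] = handler

    def invoke(self, feature, *args):
        """Dispatch a feature call to its currently installed handler."""
        return self._handlers[feature](*args)

registry = PluginRegistry()

# "Standard" OEM behavior: pass decoded data through unchanged.
registry.register("format_output", lambda data: data)

# Third-party plug-in overriding the standard output formatter.
registry.register("format_output", lambda data: f"<symbol>{data}</symbol>")

print(registry.invoke("format_output", "0123456789"))
```

In a constrained design of this sort, the OEM publishes the set of feature names and handler signatures, so plug-ins can only modify behavior at the points the designer has exposed.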
Another object of the present invention is to provide a novel hand-supportable digital imaging-based bar code symbol reader capable of automatically reading 1D and 2D bar code symbologies using state-of-the-art imaging technology, and at the speed and with the reliability achieved by conventional laser scanning bar code symbol readers.
Another object of the present invention is to provide a novel hand-supportable digital imaging-based bar code symbol reader that is capable of reading stacked 2D symbologies such as PDF417, as well as Data Matrix.

Another object of the present invention is to provide a novel hand-supportable digital imaging-based bar code symbol reader that is capable of reading bar codes independent of their orientation with respect to the reader.

Another object of the present invention is to provide a novel hand-supportable digital imaging-based bar code symbol reader that utilizes an architecture that can be used in other applications, which may or may not be related to bar code scanning, such as OCR, OCV, security systems, etc.

Another object of the present invention is to provide a novel hand-supportable digital imaging-based bar code symbol reader that is capable of reading high-density bar codes, as simply and effectively as "flying-spot" type laser scanners do.
Another object of the present invention is to provide a hand-supportable imaging-based bar code symbol reader capable of reading 1D and 2D bar code symbologies in a manner as convenient to the end users as when using a conventional laser scanning bar code symbol reader.
BRIEF DESCRIPTION OF THE DRAWINGS OF PRESENT INVENTION
For a more complete understanding of how to practice the Objects of the Present Invention, the following Detailed Description of the Illustrative Embodiments can be read in conjunction with the accompanying Drawings, briefly described below:

Fig. 1A is a schematic representation of a digital image capture and processing system of the present invention, employing a multi-tier software system architecture capable of supporting various subsystems providing numerous standard system features and functions that can be modified and/or extended using the innovative plug-in programming methods of the present invention;
Fig. 1B is a schematic representation of the system architecture of the digital image capture and processing system of the present invention, represented in Fig. 1A;
Figs. 1C1-1C2, taken together, set forth a table indicating the features and functions supported by each of the subsystems provided in the system architecture of the digital image capture and processing system of the present invention, represented in Figs. 1A and 1B;
Fig. 1D is a schematic representation indicating that the digital image capture and processing system of the present invention, shown in Figs. 1A through 1C2, can be implemented using a digital camera board and a printed circuit (PC) board that are interfaced together;
Fig. 1E is a schematic representation indicating that the digital image capture and processing system of the present invention, shown in Figs. 1A through 1C2, can be implemented using a single hybrid digital camera/PC board;
Fig. 1F is a schematic representation illustrating that the digital image capture and processing system of the present invention, shown in Figs. 1A through 1E, can be integrated or embodied within third-party products, such as, for example, but not limited to, digital image-processing based bar code symbol reading systems, OCR systems, object recognition systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 2D or 3D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, personal identification systems and the like;
Fig. 2A is a rear perspective view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2B is a front perspective view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2C is an elevated left side view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2D is an elevated right side view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2E is an elevated rear view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2F is an elevated front view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention, showing components associated with its illumination subsystem and its image capturing subsystem;
Fig. 2G is a bottom view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;

Fig. 2H is a top rear view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2I is a first perspective exploded view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2J is a second perspective exploded view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2K is a third perspective exploded view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention;
Fig. 2L1 is a schematic block diagram representative of a system design for the hand-supportable digital imaging-based bar code symbol reading device illustrated in Figs. 2A through 2K, wherein the system design is shown comprising (1) a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which all rows of the image sensing array are enabled, (2) a Multi-Mode LED-Based Illumination Subsystem for producing narrow and wide area fields of narrowband illumination within the FOV of the Image Formation And Detection Subsystem during narrow and wide area modes of image capture, respectively, so that only light transmitted from the Multi-Mode Illumination Subsystem and reflected from the illuminated object and transmitted through a narrowband transmission-type optical filter realized within the hand-supportable housing (i.e. 
using a red-wavelength high-pass reflecting window filter element disposed at the light transmission aperture thereof and a low-pass filter before the image sensor) is detected by the image sensor and all other components of ambient light are substantially rejected, (3) an IR-based object presence and range detection subsystem for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem, (4) an Automatic Light Exposure Measurement and Illumination Control Subsystem for controlling the operation of the LED-Based Multi-Mode Illumination Subsystem, (5) an Image Capturing and Buffering Subsystem for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem, (6) a Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem for processing images captured and buffered by the Image Capturing and Buffering Subsystem and reading 1D and 2D bar code symbols represented therein, and (7) an Input/Output Subsystem for outputting processed image data and the like to an external host system or other information receiving or responding device, in which each said subsystem component is integrated about (8) a System Control Subsystem, as shown;
Fig. 2L2 is a schematic block representation of the Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem, realized using the three-tier computing platform illustrated in Fig. 2M;

Fig. 2M is a schematic diagram representative of a system implementation for the hand-supportable digital imaging-based bar code symbol reading device illustrated in Figs. 2A through 2L2, wherein the system implementation is shown comprising (1) an illumination board 33 carrying components realizing electronic functions performed by the Multi-Mode LED-Based Illumination Subsystem and the Automatic Light Exposure Measurement and Illumination Control Subsystem, (2) a CMOS camera board carrying a high-resolution (1280 x 1024, 7-bit, 6 micron pixel size) CMOS image sensor array running at a 25 MHz master clock, at 7 frames/second at 1280 x 1024 resolution, with randomly accessible region of interest (ROI) window capabilities, realizing electronic functions performed by the multi-mode area-type Image Formation and Detection Subsystem, (3) a CPU board (i.e. computing platform) including (i) an Intel Sabinal 32-bit microprocessor PXA210 running at 200 MHz (1.0 V core voltage) with a 16-bit 100 MHz external bus, (ii) an expandable (e.g. 7+ megabyte) Intel J3 asynchronous 16-bit Flash memory, (iii) 16 megabytes of 100 MHz SDRAM, (iv) a Xilinx Spartan II FPGA FIFO 39 running at a 50 MHz clock frequency and 60 MB/sec data rate, configured to control the camera timings and drive the image acquisition process, (v) a multimedia card socket for realizing the other subsystems of the system, (vi) a power management module for the MCU, adjustable by the system bus, and (vii) a pair of UARTs (one for an IrDA port and one for a JTAG port), (4) an interface board for realizing the functions performed by the I/O subsystem, and (5) an IR-based object presence and range detection circuit for realizing the IR-based Object Presence and Range Detection Subsystem;
Fig. 3A is a schematic representation showing the spatial relationships between the near and far and narrow and wide area fields of narrow-band illumination within the FOV of the Multi-Mode Image Formation and Detection Subsystem during narrow and wide area image capture modes of operation;
Fig. 3B is a perspective partially cut-away view of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment, showing the LED-Based Multi-Mode Illumination Subsystem transmitting visible narrow-band illumination through its narrow-band transmission-type optical filter system and illuminating an object with such narrow-band illumination, and also showing the image formation optics, including the low pass filter before the image sensing array, for collecting and focusing light rays reflected from the illuminated object, so that an image of the object is formed and detected using only the optical components of light contained within the narrow-band of illumination, while all other components of ambient light are substantially rejected before image detection at the image sensing array;
Fig. 3C is a schematic representation showing the geometrical layout of the optical components used within the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment, wherein the red-wavelength reflecting high-pass lens element is positioned at the imaging window of the device before the image formation lens elements, while the low-pass filter is disposed before the image sensor, between the image formation elements, so as to image the object at the image sensing array using only optical components within the narrow-band of illumination, while rejecting all other components of ambient light;

Fig. 3D is a schematic representation of the image formation optical subsystem employed within the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment, wherein all three lenses are made as small as possible (with a maximum diameter of 12 mm), all have spherical surfaces, and all are made from common glass, e.g. LAK2 (~LaK9), ZF10 (=SF8), LAF2 (~LaF3);
Fig. 3E is a schematic representation of the lens holding assembly employed in the image formation optical subsystem of the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment, showing a two-piece barrel structure which holds the lens elements, and a base structure which holds the image sensing array, wherein the assembly is configured so that the barrel structure slides within the base structure so as to focus the assembly;
Fig. 3F1 is a first schematic representation showing, from a side view, the physical position of the LEDs used in the Multi-Mode Illumination Subsystem, in relation to the image formation lens assembly and the image sensing array employed therein (e.g. a Motorola MCM20027 or National Semiconductor LM9638 CMOS 2-D image sensing array having a 1280x1024 pixel resolution (1/2" format), 6 micron pixel size, 13.5 MHz clock rate, with randomly accessible region of interest (ROI) window capabilities);
Fig. 3F2 is a second schematic representation showing, from an axial view, the physical layout of the LEDs used in the Multi-Mode Illumination Subsystem of the digital imaging-based bar code symbol reading device, shown in relation to the image formation lens assembly, and the image sensing array employed therein;
Fig. 4A1 is a schematic representation specifying the range of narrow-area illumination, near-field wide-area illumination, and far-field wide-area illumination produced from the LED-Based Multi-Mode Illumination Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
Fig. 4A2 is a table specifying the geometrical properties and characteristics of each illumination mode supported by the LED-Based Multi-Mode Illumination Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
Fig. 4B is a schematic representation illustrating the physical arrangement of LED light sources associated with the narrow-area illumination array and the near-field and far-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention, wherein the LEDs in the far-field wide-area illuminating arrays are located behind spherical lenses, the LEDs in the narrow-area illuminating array are disposed behind cylindrical lenses, and the LEDs in the near-field wide-area illuminating array are unlensed in the first illustrative embodiment of the Digital Imaging-Based Bar Code Reading Device;
Fig. 4C1 is a graphical representation showing the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the narrow-area illumination array in the Multi-Mode Illumination Subsystem of the present invention;

Fig. 4C2 is a graphical representation showing the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the narrow-area illumination array in the Multi-Mode Illumination Subsystem of the present invention;
Fig. 4C3 is a schematic representation of the cylindrical lenses used before the LEDs in the narrow-area (linear) illumination arrays in the digital imaging-based bar code symbol reading device of the present invention, wherein the first surface of the cylindrical lens is curved vertically to create a narrow-area (i.e. linear) illumination pattern, and the second surface of the cylindrical lens is curved horizontally to control the height of the narrow-area illumination pattern to produce a narrow-area (i.e. linear) illumination field;
Fig. 4C4 is a schematic representation of the layout of the pairs of LEDs and two cylindrical lenses used to implement the narrow-area (linear) illumination array employed in the digital imaging-based bar code symbol reading device of the present invention;
Fig. 4C5 is a set of six illumination profiles for the narrow-area (linear) illumination fields produced by the narrow-area (linear) illumination array employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 30, 40, 50, 80, 120, and 220 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the narrow-area illumination field begins to become substantially uniform at about 80 millimeters;
Fig. 4D1 is a graphical representation showing the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention;
Fig. 4D2 is a graphical representation showing the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the far-field and near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention;
Fig. 4D3 is a schematic representation of the plano-convex lenses used before the LEDs in the far-field wide-area illumination arrays in the illumination subsystem of the present invention;
Fig. 4D4 is a schematic representation of the layout of LEDs and plano-convex lenses used to implement the near-field and far-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention, wherein the illumination beam produced therefrom is aimed by positioning the lenses at angles before the LEDs in the near-field (and far-field) wide-area illumination arrays employed therein;
Fig. 4D5 is a set of six illumination profiles for the near-field wide-area illumination fields produced by the near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 10, 20, 30, 40, 60, and 100 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the near-field wide-area illumination field begins to become substantially uniform at about 40 millimeters;

Fig. 4D6 is a set of three illumination profiles for the far-field wide-area illumination fields produced by the far-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 100, 150 and 220 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the far-field wide-area illumination field begins to become substantially uniform at about 100 millimeters;
Fig. 4D7 is a table illustrating a preferred method of calculating the pixel intensity value for the center of the far-field wide-area illumination field produced from the Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device of the present invention, showing a significant signal strength (greater than 80 DN);
Fig. 5A1 is a schematic representation showing how the red-wavelength reflecting (high-pass) imaging window integrated within the hand-supportable housing of the digital imaging-based bar code symbol reading device, and the low-pass optical filter disposed before its CMOS image sensing array therewithin, cooperate to form a narrow-band optical filter subsystem for transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device, and rejecting all other optical wavelengths outside this narrow optical band, however generated (i.e. ambient light sources);
Fig. 5A2 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element disposed after the red-wavelength reflecting high-pass imaging window within the hand-supportable housing of the digital imaging-based bar code symbol reading device, but before its CMOS image sensing array, showing that optical wavelengths below 700 nanometers are transmitted and wavelengths above 700 nm are substantially blocked (e.g. absorbed or reflected);
Fig. 5A3 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the red-wavelength reflecting high-pass imaging window integrated within the hand-supportable housing of the digital imaging-based bar code symbol reading device of the present invention, showing that optical wavelengths above 620 nanometers are transmitted and wavelengths below 620 nm are substantially blocked (e.g. absorbed or reflected);
Fig. 5A4 is a schematic representation of the transmission characteristics of the narrow-band spectral filter subsystem integrated within the hand-supportable imaging-based bar code symbol reading device of the present invention, plotted against the spectral characteristics of the LED emissions produced from the Multi-Mode Illumination Subsystem of the illustrative embodiment of the present invention;
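The narrow-band filter subsystem described in Figs. 5A1 through 5A4 can be modeled as two binary transmission functions in series. The sketch below is an idealized illustration only: real filter elements roll off gradually, and the function names are the editor's, not the specification's; the 620 nm and 700 nm cutoffs are taken from the captions above.

```python
def window_transmits(wavelength_nm):
    # Red-wavelength (high-pass) imaging window: passes wavelengths above ~620 nm.
    return wavelength_nm >= 620.0

def lowpass_filter_transmits(wavelength_nm):
    # Low-pass filter element before the CMOS sensor: passes wavelengths below ~700 nm.
    return wavelength_nm <= 700.0

def narrowband_subsystem_transmits(wavelength_nm):
    # In series, the two elements form a ~620-700 nm band-pass filter,
    # matched to the red LED emission band and rejecting ambient light.
    return window_transmits(wavelength_nm) and lowpass_filter_transmits(wavelength_nm)
```

For example, a 660 nm LED emission passes both elements, while 550 nm ambient green light is blocked by the window and 800 nm infrared is blocked by the low-pass element.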
Fig. 6A is a schematic representation showing the geometrical layout of the spherical/parabolic light reflecting/collecting mirror and photodiode associated with the Automatic Light Exposure Measurement and Illumination Control Subsystem, and arranged within the hand-supportable digital imaging-based bar code symbol reading device of the illustrative embodiment, wherein incident illumination is collected from a selected portion of the center of the FOV of the system using a spherical light collecting mirror, and then focused upon a photodiode for detection of the intensity of reflected illumination and subsequent processing by the Automatic Light Exposure Measurement and Illumination Control Subsystem, so as to then control the illumination produced by the LED-based Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device of the present invention;
Fig. 6B is a schematic diagram of the Automatic Light Exposure Measurement and Illumination Control Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention, wherein illumination is collected from the center of the FOV of the system and automatically detected so as to generate a control signal for driving, at the proper intensity, the narrow-area illumination array as well as the near-field and far-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem, so that the CMOS image sensing array produces digital images of illuminated objects of sufficient brightness;
Figs. 6C1 and 6C2, taken together, set forth a schematic diagram of a hybrid analog/digital circuit designed to implement the Automatic Light Exposure Measurement and Illumination Control Subsystem of Fig. 6B employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
Fig. 6D is a schematic diagram showing that, in accordance with the principles of the present invention, the CMOS image sensing array employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, once activated by the System Control Subsystem (or directly by the trigger switch), and when all rows in the image sensing array are in a state of integration operation, automatically activates the Automatic Light Exposure Measurement and Illumination Control Subsystem which, in response thereto, automatically activates the LED illumination driver circuitry to automatically drive the appropriate LED illumination arrays associated with the Multi-Mode Illumination Subsystem in a precise manner and globally expose the entire CMOS image detection array with narrowly tuned LED-based illumination when all of its rows of pixels are in a state of integration, and thus have a common integration time, thereby capturing high quality images independent of the relative motion between the bar code reader and the object;
Figs. 6E1 and 6E2, taken together, set forth a flow chart describing the steps involved in carrying out the global exposure control method of the present invention, within the digital imaging-based bar code symbol reading device of the illustrative embodiments;
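The global exposure control method of Figs. 6D through 6E2 gates the LED flash so that it fires only while every row of the CMOS array shares a common integration window. A minimal simulation of that gating logic is sketched below; the toy sensor model and all class and method names are the editor's illustrative assumptions, not an API from the specification.

```python
class SimulatedCMOSArray:
    """Toy rolling-start sensor: rows enter integration one per clock tick."""
    def __init__(self, rows):
        self.rows = rows
        self.rows_integrating = 0

    def start_frame(self):
        self.rows_integrating = 0

    def tick(self):
        # One more row rolls into its integration period.
        if self.rows_integrating < self.rows:
            self.rows_integrating += 1

    def all_rows_integrating(self):
        return self.rows_integrating == self.rows


def global_exposure_cycle(sensor, flash_log):
    """Fire the LED flash only once every row is integrating, so all
    pixels see the same narrowly tuned illumination pulse."""
    sensor.start_frame()
    while not sensor.all_rows_integrating():
        sensor.tick()
    # The flash fires here, inside the common integration window.
    flash_log.append(sensor.rows_integrating)
    return sensor.all_rows_integrating()
```

Because the flash, not the row readout, defines the effective exposure instant, the captured image is largely independent of relative motion between reader and object.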
Fig. 7 is a schematic block diagram of the IR-based automatic Object Presence and Range Detection Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention, wherein a first range indication control signal is generated upon detection of an object within the near-field region of the Multi-Mode Illumination Subsystem, and wherein a second range indication control signal is generated upon detection of an object within the far-field region of the Multi-Mode Illumination Subsystem;

Fig. 8 is a schematic representation of the hand-supportable digital imaging-based bar code symbol reading device of the present invention, showing that its CMOS image sensing array is operably connected to its microprocessor through a FIFO (realized by way of an FPGA) and a system bus, and that its SDRAM is also operably connected to the microprocessor by way of the system bus, enabling the mapping of pixel data captured by the imaging array into the SDRAM under the control of the direct memory access (DMA) module within the microprocessor;
Fig. 9 is a schematic representation showing how the bytes of pixel data captured by the CMOS imaging array within the hand-supportable digital imaging-based bar code symbol reading device of the present invention, are mapped into the addressable memory storage locations of its SDRAM during each image capture cycle carried out within the device;
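The pixel-to-SDRAM mapping of Fig. 9 is, in essence, a row-major address computation of the kind a DMA controller performs during each image capture cycle. The sketch below is illustrative only; the base address, row width, and function name are assumptions by the editor, not values from the specification.

```python
def sdram_byte_address(row, col, base_address, pixels_per_row, bytes_per_pixel=1):
    # Row-major mapping: consecutive pixels of a row occupy consecutive
    # byte addresses, and each new row starts where the previous row ended.
    return base_address + (row * pixels_per_row + col) * bytes_per_pixel
```

For a hypothetical 1280-pixel-wide array buffered at base address 0x100000, pixel (1, 0) lands exactly one full row (1280 bytes) past the buffer start.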
Fig. 10 is a schematic representation showing the software modules associated with the three-tier software architecture of the hand-supportable digital imaging-based bar code symbol reading device of the present invention, namely: the Main Task module, the CodeGate Task module, the Narrow-Area Illumination Task module, the Metroset Task module, the Application Events Manager module, the User Commands Table module, the Command Handler module, Plug-In Controller, and Plug-In Libraries and Configuration Files, all residing within the Application layer of the software architecture; the Tasks Manager module, the Events Dispatcher module, the Input/Output Manager module, the User Commands Manager module, the Timer Subsystem module, the Input/Output Subsystem module and the Memory Control Subsystem module residing within the System Core (SCORE) layer of the software architecture; and the Linux Kernel module in operable communication with the Plug-In Controller, the Linux File System module, and Device Drivers modules residing within the Linux Operating System (OS) layer of the software architecture, and in operable communication with an external (host) Plug-In Development Platform via standard or proprietary communication interfaces;
Fig. 11 is a perspective view of an illustrative embodiment of a computer software development platform for developing plug-ins for tasks within the application layer of the imaging-based bar code reading system of the present invention;
Fig. 12A is a schematic representation of the Events Dispatcher software module which provides a means of signaling and delivering events to the Application Events Manager, including the starting of a new task, stopping a currently running task, doing something, or doing nothing and ignoring the event;
Fig. 12B is a table listing examples of system-defined events which can occur and be dispatched within the hand-supportable digital imaging-based bar code symbol reading device of the present invention, namely: SCORE_EVENT_POWER_UP, which signals the completion of system start-up and involves no parameters; SCORE_EVENT_TIMEOUT, which signals the timeout of the logical timer and involves the parameter "pointer to timer id"; SCORE_EVENT_UNEXPECTED_INPUT, which signals that unexpected input data is available and involves the parameter "pointer to connection id"; SCORE_EVENT_TRIG_ON, which signals that the user pulled the trigger switch and involves no parameters; SCORE_EVENT_TRIG_OFF, which signals that the user released the trigger switch and involves no parameters; SCORE_EVENT_OBJECT_DETECT_ON, which signals that an object is positioned under the bar code reader and involves no parameters; SCORE_EVENT_OBJECT_DETECT_OFF, which signals that the object is removed from the field of view of the bar code reader and involves no parameters; SCORE_EVENT_EXIT_TASK, which signals the end of task execution and involves the pointer UTID; and SCORE_EVENT_ABORT_TASK, which signals the aborting of a task during execution;
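The event table of Fig. 12B amounts to an enumeration plus a per-event parameter descriptor. The sketch below restates it as a data structure; the Python names are illustrative stand-ins for the SCORE identifiers, and the descriptor strings simply transcribe the table.

```python
from enum import Enum, auto

class ScoreEvent(Enum):
    POWER_UP = auto()
    TIMEOUT = auto()
    UNEXPECTED_INPUT = auto()
    TRIG_ON = auto()
    TRIG_OFF = auto()
    OBJECT_DETECT_ON = auto()
    OBJECT_DETECT_OFF = auto()
    EXIT_TASK = auto()
    ABORT_TASK = auto()

# Which events carry a parameter, per the table (None = no parameters).
EVENT_PARAMETER = {
    ScoreEvent.POWER_UP: None,
    ScoreEvent.TIMEOUT: "pointer to timer id",
    ScoreEvent.UNEXPECTED_INPUT: "pointer to connection id",
    ScoreEvent.TRIG_ON: None,
    ScoreEvent.TRIG_OFF: None,
    ScoreEvent.OBJECT_DETECT_ON: None,
    ScoreEvent.OBJECT_DETECT_OFF: None,
    ScoreEvent.EXIT_TASK: "pointer to the task's UTID",
    ScoreEvent.ABORT_TASK: None,
}
```

A dispatcher receiving one of these events can look up whether a parameter pointer must accompany it before signaling the Application Events Manager.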
Fig. 12C is a schematic representation of the Tasks Manager software module which provides a means for executing and stopping application specific tasks (i.e. threads);
Fig. 12D is a schematic representation of the Input/Output Manager software module (i.e. Input/Output Subsystem), which runs in the background and monitors activities of external devices and user connections, and signals appropriate events to the Application Layer when such activities are detected;
Figs. 12E1 and 12E2 set forth a schematic representation of the Input/Output Subsystem software module which provides a means for creating and deleting input/output connections, and communicating with external systems and devices;
Figs. 12F1 and 12F2 set forth a schematic representation of the Timer Subsystem which provides a means for creating, deleting, and utilizing logical timers;
Figs. 12G1 and 12G2 set forth a schematic representation of the Memory Control Subsystem which provides an interface for managing the thread-level dynamic memory within the device, fully compatible with standard dynamic memory management functions, as well as a means for buffering collected data;
Fig. 12H is a schematic representation of the User Commands Manager which provides a standard way of entering user commands, and executing application modules responsible for handling the same;
Fig. 12I is a schematic representation of the device driver software modules, which includes trigger switch drivers for establishing a software connection with the hardware-based manually-actuated trigger switch employed on the digital imaging-based bar code symbol reading device, an image acquisition driver for implementing image acquisition functionality aboard the digital imaging-based bar code symbol reading device, and an IR driver for implementing object detection functionality aboard the imaging-based bar code symbol reading device;
Fig. 13A is an exemplary flow chart representation showing how, when the user points the bar code reader towards a bar code symbol, the IR device drivers detect that object within the field, and then wake up the Input/Output Manager software module at the System Core Layer;
Fig. 13B is an exemplary flow chart representation showing how, upon detecting an object, the Input/Output Manager posts the SCORE_OBJECT_DETECT_ON event to the Events Dispatcher software module;

Fig. 13C is an exemplary flow chart representation showing how, in response to detecting an object, the Events Dispatcher software module passes the SCORE_OBJECT_DETECT_ON event to the Application Layer;
Fig. 13D is an exemplary flow chart representation showing how upon receiving the SCORE_OBJECT_DETECT_ON event at the Application Layer, the Application Events Manager executes an event handling routine which activates the narrow-area illumination array associated with the Multi-Mode Illumination Subsystem, and executes either the CodeGate Task described in Fig. 13E (when required by System Mode in which the Device is programmed) or the Narrow-Area Illumination Task described in Fig. 13M (when required by System Mode in which the Device is programmed);
Fig. 13E is an exemplary flow chart representation showing what operations are carried out when the CodeGate Task is (enabled and) executed within the Application Layer;
Fig. 13F is an exemplary flow chart representation showing how, when the user pulls the trigger switch on the bar code reader while the CodeGate Task is executing, the trigger device driver wakes up the Input/Output Manager at the System Core Layer;
Fig. 13G is an exemplary flow chart representation showing how, in response to waking up, the Input/Output Manager posts the SCORE_TRIGGER_ON event to the Events Dispatcher;
Fig. 13H is an exemplary flow chart representation showing how the Events Dispatcher passes on the SCORE_TRIGGER_ON event to the Application Events Manager at the Application Layer;
Figs. 13I1 and 13I2, taken together, set forth an exemplary flow chart representation showing how the Application Events Manager responds to the SCORE_TRIGGER_ON event by invoking a handling routine within the Task Manager at the System Core Layer which deactivates the narrow-area illumination array associated with the Multi-Mode Illumination Subsystem, cancels the CodeGate Task or the Narrow-Area Illumination Task (depending on which System Mode the Device is programmed in), and executes the Main Task;
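The trigger-on handling routine just described can be summarized as a small state transition. The sketch below is an editor's illustration only: the dictionary keys and task names are hypothetical stand-ins, not an API from the specification.

```python
def handle_trigger_on(state):
    """Sketch of SCORE_TRIGGER_ON handling: switch off narrow-area
    illumination, cancel the mode-dependent foreground task, and
    hand control to the Main Task."""
    state = dict(state)  # leave the caller's state untouched
    # Deactivate the narrow-area illumination array.
    state["narrow_area_illumination_on"] = False
    # Cancel whichever foreground task the current System Mode was running.
    if state.get("active_task") in ("CodeGate", "NarrowAreaIllumination"):
        state["cancelled_task"] = state["active_task"]
    # Execute the Main Task.
    state["active_task"] = "Main"
    return state
```

Called with the CodeGate Task active, the handler records the cancellation and leaves the Main Task as the sole foreground task.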
Fig. 13J is an exemplary flow chart representation showing what operations are carried out when the Main Task is (enabled and) executed within the Application Layer;
Fig. 13K is an exemplary flow chart representation showing what operations are carried out when the Data Output Procedure, called in the Main Task, is executed within the Input/Output Subsystem software module in the Application Layer;
Fig. 13L is an exemplary flow chart representation showing decoded symbol character data being sent from the Input/Output Subsystem to the Device Drivers within the Linux OS Layer of the system;
Fig. 13M is an exemplary flow chart representation showing what operations are carried out when the Narrow-Area Illumination Task is (enabled and) executed within the Application Layer;
Figs. 13M1 through 13M3, taken together, set forth a flow chart describing a novel method of generating wide-area illumination, for use during the Main Task routine, so as to illuminate objects with a wide-area illumination field in a manner which substantially reduces specular-type reflection at the CMOS image sensing array in the digital imaging-based bar code reading device of the present invention;
Fig. 14 is a table listing various bar code symbologies supported by the Multi-Mode Bar Code Symbol Reading Subsystem module employed within the hand-supportable digital imaging-based bar code reading device of the present invention;
Fig. 15 is a table listing the five primary modes in which the Multi-Mode Bar Code Symbol Reading Subsystem module can be programmed to operate, namely: the Automatic Mode, wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured frame of digital image data so as to search for one or more bar codes represented therein in an incremental manner, and to continue searching until the entire image is processed; the Manual Mode, wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured frame of digital image data, starting from the center or sweep spot of the image at which the user would have aimed the bar code reader, so as to search for (i.e. find) one or more bar code symbols represented therein, by searching in a helical manner through frames or blocks of extracted image feature data, marking the same, and processing the corresponding raw digital image data until a bar code symbol is recognized/read within the captured frame of image data; the ROI-Specific Mode, wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a specified "region of interest" (ROI) in a captured frame of digital image data so as to search for one or more bar codes represented therein, in response to coordinate data specifying the location of the bar code within the field of view of the multi-mode image formation and detection system; the NoFinder Mode, wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured narrow-area (linear) frame of digital image data, without the feature extraction and marking operations used in the Automatic and Manual Modes, so as to read one or more bar code symbols represented therein; and the Omniscan Mode, wherein the Multi-Mode Bar Code Symbol Reading Subsystem is configured to automatically process a captured frame of digital image data along any one or more predetermined virtual scan line orientations, without the feature extraction and marking operations used in the Automatic and Manual Modes, so as to read one or more bar code symbols represented therein;
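The programmed decode modes of Fig. 15 can be pictured as a dispatch over decode strategies. The sketch below is hypothetical: the strategy strings merely summarize the behaviors described above, and the function name is the editor's.

```python
def select_decode_strategy(mode, roi=None):
    """Illustrative dispatch over the programmed decode modes."""
    strategies = {
        "Automatic": "incremental search over the entire captured frame",
        "Manual": "helical search outward from the image center",
        "ROI-Specific": "search confined to the caller-supplied region of interest",
        "NoFinder": "decode the narrow-area line without feature extraction",
        "Omniscan": "decode along predetermined virtual scan line orientations",
    }
    # ROI-Specific mode depends on externally supplied coordinate data.
    if mode == "ROI-Specific" and roi is None:
        raise ValueError("ROI-Specific mode requires ROI coordinate data")
    return strategies[mode]
```

Note the asymmetry the table describes: only the ROI-Specific mode takes coordinate input, while NoFinder and Omniscan skip the feature extraction used by the Automatic and Manual modes.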
Fig. 16 is an exemplary flow chart representation showing the steps involved in setting up and cleaning up the software sub-Application entitled "Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem", once called from either (i) the CodeGate Task software module at the Block entitled READ BAR CODE(S) IN CAPTURED NARROW-AREA IMAGE indicated in Fig. 13E, or (ii) the Main Task software module at the Block entitled "READ BAR CODE(S) IN CAPTURED WIDE-AREA IMAGE" indicated in Fig. 13J;
Figs. 17A and 17B provide a table listing the primary Programmable Modes of Bar Code Reading Operation supported within the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device of the present invention, namely: Programmed Mode of System Operation No. 1--Manually-Triggered Single-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 2--Manually-Triggered Multiple-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 3--Manually-Triggered Single-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode And The Automatic Or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 4--Manually-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode And The Automatic Or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 5--Manually-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode And The Automatic Or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 6--Automatically-Triggered Single-Attempt 1D Single-Read Mode Employing The No-Finder Mode Of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 7--Automatically-Triggered Multi-Attempt 1D Single-Read Mode Employing The No-Finder Mode Of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 8--Automatically-Triggered Multi-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode and Manual and/or Automatic Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 9--Automatically-Triggered Multi-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode and Manual and/or Automatic Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmable Mode of System Operation No. 10--Automatically-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The Manual, Automatic or Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmed Mode of System Operation No. 11--Semi-Automatic-Triggered Single-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Automatic Or Manual Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmable Mode of System Operation No. 12--Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Automatic Or Manual Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmable Mode of Operation No. 13--Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode And The Automatic Or Manual Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmable Mode of Operation No. 14--Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode And The Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmable Mode of Operation No. 15--Continuously-Automatically-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The Automatic, Manual and/or Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem;

Programmable Mode of System Operation No. 16--Diagnostic Mode Of Imaging-Based Bar Code Reader Operation; and

Programmable Mode of System Operation No. 17--Live Video Mode Of Imaging-Based Bar Code Reader Operation;
Fig. 18 is a schematic representation specifying the four modes of illumination produced from the Multi-Mode Illumination Subsystem employed in the second illustrative embodiment of the Digital Imaging-Based Bar Code Symbol Reader of the present invention, which supports both near and far fields of narrow-area illumination generated during the narrow-area image capture mode of its Multi-Mode Image Formation and Detection Subsystem;
Fig. 19 is a schematic representation illustrating the physical arrangement of LEDs and light focusing lenses associated with the near and far field narrow-area and wide-area illumination arrays employed in the digital imaging-based bar code reading device according to the second illustrative embodiment of the present invention;
Fig. 20A is a first perspective view of a second illustrative embodiment of the portable POS digital imaging-based bar code reading device of the present invention, shown having a hand-supportable housing of a different form factor than that of the first illustrative embodiment, and configured for use in its hands-free/presentation mode of operation, supporting primarily wide-area image capture;
Fig. 20B is a second perspective view of the second illustrative embodiment of the portable POS digital imaging-based bar code reading device of the present invention, shown configured and operated in its hands-free/presentation mode of operation, supporting primarily wide-area image capture;
Fig. 20C is a third perspective view of the second illustrative embodiment of the portable digital imaging-based bar code reading device of the present invention, shown configured and operated in a hands-on type mode, supporting both narrow and wide area modes of image capture;
Fig. 21 is a perspective view of a third illustrative embodiment of the digital imaging-based bar code reading device of the present invention, realized in the form of a Multi-Mode Image Capture And Processing Engine that can be readily integrated into various kinds of information collection and processing systems, including wireless portable data terminals (PDTs), reverse-vending machines, retail product information kiosks and the like;
Fig. 22 is a schematic representation of a wireless bar code-driven portable data terminal embodying the imaging-based bar code symbol reading engine of the present invention, shown configured and operated in a hands-on mode;
Fig. 23 is a perspective view of the wireless bar code-driven portable data terminal of Fig. 22 shown configured and operated in a hands-on mode, wherein the imaging-based bar code symbol reading engine embodied therein is used to read a bar code symbol on a package and the symbol character data representative of the read bar code is being automatically transmitted to its cradle- providing base station by way of an RF-enabled 2-way data communication link;
Fig. 24 is a side view of the wireless bar code-driven portable data terminal of Figs. 22 and 23, shown configured and operated in a hands-free mode, wherein the imaging-based bar code symbol reading engine is configured in a wide-area image capture mode of operation, suitable for presentation-type bar code reading at point of sale (POS) environments;
Fig. 25 is a block schematic diagram showing the various subsystem blocks associated with a design model for the Wireless Hand-Supportable Bar Code Driven Portable Data Terminal System of Figs. 22, 23 and 24, shown interfaced with possible host systems and/or networks;
Fig. 26 is a schematic block diagram representative of a system design for the hand-supportable digital imaging-based bar code symbol reading device according to an alternative embodiment of the present invention, wherein the system design is similar to that shown in Fig. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, involving the real-time analysis of captured digital images for unacceptable spatial-intensity distributions;
Fig. 26A is a schematic representation of the system illustrated in Fig. 26, showing in greater detail how the current illumination duration determined by the Automatic Light Exposure Measurement and Illumination Control Subsystem is automatically overridden by the illumination duration computed by a software-implemented, image-processing based illumination metering program carried out within the Image-Processing Based Bar Code Symbol Reading Subsystem, and used to control the illumination produced during the next image frame captured by the system, in accordance with this enhanced auto-illumination control scheme of the present invention;
Fig. 26B is a flow chart setting forth the steps involved in carrying out the enhanced auto- illumination control scheme illustrated in Fig. 26A;
Figs. 27A and 27B, taken together, set forth a flow chart illustrating the steps involved in carrying out the adaptive method of controlling system operations (e.g. illumination, image capturing, image processing, etc.) within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, wherein the "exposure quality" of captured digital images is automatically analyzed in real-time and system control parameters (SCPs) are automatically reconfigured based on the results of such exposure quality analysis;
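The adaptive control loop of Figs. 27A and 27B can be sketched as two small functions: one scoring the exposure quality of a captured frame, one reconfiguring the SCPs for the next frame. The metric, thresholds, scale factors, and parameter names below are all illustrative assumptions by the editor, not values from the specification.

```python
def exposure_quality(pixels):
    # Crude exposure-quality score: mean brightness of an 8-bit frame,
    # normalized to the range [0, 1].
    return sum(pixels) / (255.0 * len(pixels))

def reconfigure_scps(scps, quality, under=0.25, over=0.75):
    """If the frame was under- or over-exposed, scale the (hypothetical)
    illumination-duration system control parameter for the next frame."""
    scps = dict(scps)
    if quality < under:
        # Under-exposed: lengthen the illumination pulse.
        scps["illumination_duration_us"] = int(scps["illumination_duration_us"] * 1.5)
    elif quality > over:
        # Over-exposed: shorten the illumination pulse.
        scps["illumination_duration_us"] = int(scps["illumination_duration_us"] * 0.67)
    return scps
```

A dark frame (mean brightness well below the lower threshold) thus yields a longer illumination duration on the following capture cycle, and a washed-out frame a shorter one.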
Fig. 27C is a schematic representation illustrating the Single Frame Shutter Mode of operation of the CMOS image sensing array employed within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, while the system is operated in its Global Exposure Mode of Operation illustrated in Figs. 6D through 6E2;

Fig. 27D is a schematic representation illustrating the Rolling Shutter Mode of operation of the CMOS image sensing array employed within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, while the system is operated according to its adaptive control method illustrated in Figs. 27A and 27B;
Fig. 27E is a schematic representation illustrating the Video Mode of operation of the CMOS image sensing array employed within the multi-mode image-processing based bar code symbol reader system of the illustrative embodiment of the present invention, while the system is operated according to its adaptive control method illustrated in Figs. 27A and 27B;
Fig. 28 is a perspective view of a hand-supportable image-processing based bar code symbol reader employing an image cropping zone (ICZ) targeting/marking pattern, and automatic post-image capture cropping methods to abstract the ICZ within which the targeted object to be imaged has been encompassed during illumination and imaging operations;
Fig. 29 is a schematic system diagram of the hand-supportable image-processing based bar code symbol reader shown in Fig. 28, shown employing an image cropping zone (ICZ) illumination targeting/marking source(s) operated under the control of the System Control Subsystem;
Fig. 30 is a flow chart setting forth the steps involved in carrying out the first illustrative embodiment of the image cropping zone targeting/marking and post-image capture cropping process of the present invention embodied within the bar code symbol reader illustrated in Figs. 28 and 29;
Fig. 31 is a perspective view of another illustrative embodiment of the hand-supportable image- processing based bar code symbol reader of the present invention, showing its visible illumination- based Image Cropping Pattern (ICP) being projected within the field of view (FOV) of its Multi-Mode Image Formation And Detection Subsystem;
Fig. 32 is a schematic block diagram representative of a system design for the hand-supportable digital imaging-based bar code symbol reading device illustrated in Fig. 31, wherein the system design is shown comprising (1) a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled, (2) a Multi-Mode LED-Based Illumination Subsystem for producing narrow and wide area fields of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem during narrow and wide area modes of image capture, respectively, so that only light transmitted from the Multi-Mode Illumination Subsystem and reflected from the illuminated object and transmitted through a narrow-band transmission-type optical filter realized within the hand-supportable housing (i.e. using a red-wavelength high-pass reflecting window filter element disposed at the light transmission aperture thereof and a low-pass filter before the image sensor) is detected by the image sensor and all other components of ambient light are substantially rejected, and an Image Cropping Pattern Generator for generating a visible illumination-based Image Cropping Pattern (ICP) projected within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem, (3) an IR-based object presence and range detection subsystem for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem, (4) an Automatic Light Exposure Measurement and Illumination Control Subsystem for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem, (5) an Image Capturing and Buffering Subsystem for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem, (6) an Image Processing and Cropped Image Locating Module for processing captured and buffered images to locate the image region corresponding to the region defined by the Image Cropping Pattern (ICP), (7) an Image Perspective Correction and Scaling Module for correcting the perspective of the cropped image region and scaling the corrected image to a predetermined (i.e. fixed) pixel image size suitable for decode-processing, (8) a Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem for processing cropped and scaled images generated by the Image Perspective Correction and Scaling Module and reading 1D and 2D bar code symbols represented therein, and (9) an Input/Output Subsystem for outputting processed image data and the like to an external host system or other information receiving or responding device, in which each said subsystem component is integrated about (10) a System Control Subsystem, as shown;
Fig. 33A is a schematic representation of a first illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention, comprising a VLD located at the symmetrical center of the focal plane of a pair of flat-convex lenses arranged before the VLD, and capable of generating and projecting a two (2) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Figs. 33B and 33C, taken together, provide a composite ray-tracing diagram for the first illustrative embodiment of the VLD-based Image Cropping Pattern Generator depicted in Fig. 33A, showing that the pair of flat-convex lenses focus naturally diverging light rays from the VLD into two substantially parallel beams of laser illumination which produce a two (2) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, wherein the distance between the two spots of illumination in the ICP is a function of distance from the pair of lenses;
Fig. 33D1 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 40mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 33D2 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 80mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 33D3 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 120mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;

Fig. 33D4 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 160mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 33D5 is a simulated image of the two dot Image Cropping Pattern produced by the ICP Generator of Fig. 33A, at a distance of 200mm from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 34A is a schematic representation of a second illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention, comprising a VLD located at the focus of a biconical lens (having a biconical surface and a cylindrical surface) arranged before the VLD, and four flat-convex lenses arranged in four corners, and which optical assembly is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Figs. 34B and 34C, taken together, provide a composite ray-tracing diagram for the second illustrative embodiment of the VLD-based Image Cropping Pattern Generator depicted in Fig. 34A, showing that the biconical lens enlarges naturally diverging light rays from the VLD in the cylindrical direction (but not the other) and thereafter, the four flat-convex lenses focus the enlarged laser light beam to generate four parallel beams of laser illumination which form a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, wherein the spacing between the four dots of illumination in the ICP is a function of distance from the flat-convex lenses;
Fig. 34D1 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 40mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 34D2 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 80mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 34D3 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 120mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 34D4 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 160mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 34D5 is a simulated image of the linear Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at a distance of 200mm from its flat-convex lens, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem;
Fig. 35 is a schematic representation of a third illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention, comprising a VLD and a light diffractive optical element (DOE) (e.g. volume holographic optical element) forming an optical assembly which is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, similar to that generated using the refractive optics based device shown in Fig. 34A;
Fig. 36 is a schematic representation of a digital image captured within the field of view (FOV) of the bar code symbol reader illustrated in Figs. 31 and 32, wherein the clusters of pixels indicated by reference characters (a,b,c,d) represent the four illumination spots (i.e. dots) associated with the Image Cropping Pattern (ICP) projected in the FOV;
Fig. 37 is a flow chart setting forth the steps involved in carrying out the second illustrative embodiment of the image cropping pattern targeting/marking and post-image capture cropping process of the present invention, embodied within the bar code symbol reader illustrated in Figs. 31 and 32;
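For readers outside the formal patent prose, the post-capture cropping and scaling steps recited for this process (locating the projected ICP dots, cropping the region they bound, and scaling it to a fixed pixel size) can be sketched roughly as follows. This is an illustrative simplification only: it assumes the four dot locations are already known in pixel coordinates, substitutes a simple axis-aligned bounding-box crop for the full perspective correction described in the disclosure, and uses hypothetical names throughout.

```python
def crop_and_scale(image, dots, out_w=400, out_h=400):
    """Crop the region bounded by the four ICP dots and scale it to a
    fixed pixel size using nearest-neighbor sampling.

    image : list of rows (lists) of pixel values, indexed image[y][x]
    dots  : four (x, y) pixel coordinates of the detected ICP dots
    """
    xs = [x for x, y in dots]
    ys = [y for x, y in dots]
    x0, x1 = min(xs), max(xs)          # axis-aligned bounding box
    y0, y1 = min(ys), max(ys)
    cropped_w = x1 - x0 + 1
    cropped_h = y1 - y0 + 1
    out = []
    for j in range(out_h):
        # Map each output row/column back to the nearest source pixel.
        src_y = y0 + (j * cropped_h) // out_h
        out.append([image[src_y][x0 + (i * cropped_w) // out_w]
                    for i in range(out_w)])
    return out
```

A real implementation would first correct the perspective of the quadrilateral defined by the four dots before rescaling; the bounding-box shortcut here only conveys the crop-then-scale flow.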
Fig. 38A is a first perspective view of an alternative housing design for use with the unitary PLIIM-based object identification and attribute acquisition subsystem of the present invention;
Fig. 38A1 is a schematic representation of a first illustrative embodiment of the bioptical PLIIM-based product dimensioning, analysis and identification system of the present invention, comprising a pair of PLIIM-based object identification and attribute acquisition subsystems, wherein each PLIIM-based subsystem employs visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB), and a 1-D (linear-type) CCD image detection array within the compact system housing to capture images of objects (e.g. produce) that are processed in order to determine the shape/geometry, dimensions and color of such products in diverse retail shopping environments;
Fig. 38B2 is a schematic representation of the bioptical PLIIM-based product dimensioning, analysis and identification system, showing its PLIIM-based subsystems and 2-D scanning volume in greater detail;
Fig. 38C1 is a schematic representation of a second illustrative embodiment of the bioptical PLIIM-based product dimensioning, analysis and identification system of the present invention, comprising a pair of PLIIM-based object identification and attribute acquisition subsystems, wherein each PLIIM-based subsystem employs visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB), and a 2-D (area-type) CCD image detection array within the compact system housing to capture images of objects (e.g. produce) that are processed in order to determine the shape/geometry, dimensions and color of such products in diverse retail shopping environments;
Fig. 38C2 is a schematic representation of the bioptical PLIIM-based product dimensioning, analysis and identification system of Fig. 38C1, showing its PLIIM-based subsystems and 3-D scanning volume in greater detail;
Fig. 39A is a perspective view of a first illustrative embodiment of the PLIIM-based hand-supportable linear imager of the present invention which contains within its housing, (1) a PLIIM-based image capture and processing engine comprising a dual-VLD PLIA and a 1-D (i.e. linear) image detection array with vertically-elongated image detection elements and configured within an optical assembly that operates in accordance with a method of speckle-pattern noise reduction of the present invention, (2) an LCD display panel for displaying images captured by said engine and information provided by a host computer system or other information supplying device, and (3) a manual data entry keypad for manually entering data into the imager during diverse types of information-related transactions supported by the PLIIM-based hand-supportable imager;
Fig. 39B is a schematic presentation of a transportable PLIIM-based 3-D digitization device ("3-D digitizer") of the present invention capable of producing 3-D digitized data models of scanned objects, for viewing on an LCD viewfinder integrated with the device housing (or on the display panel of an external computer graphics workstation), wherein the object under analysis is controllably rotated through a single planar laser illumination beam (PLIB) and a single amplitude modulated (AM) laser scanning beam generated by the 3-D digitization device so as to optically scan the object and automatically capture linear images and range-profile maps thereof relative to a coordinate reference system symbolically embodied within the 3-D digitization device, for subsequent reconstruction therein using computer-assisted tomographic (CAT) techniques to generate a 3-D digitized data model of the object for display, viewing and use in diverse applications;
Fig. 39C is a schematic presentation of a transportable PLIIM-based 3-D digitization device ("3-D digitizer") of the present invention capable of producing 3-D digitized data models of scanned objects, for viewing on an LCD viewfinder integrated with the device housing (or on the display panel of an external computer graphics workstation), wherein a single planar laser illumination beam (PLIB) and a single amplitude modulated (AM) laser scanning beam are generated by the 3-D digitization device and automatically swept through the 3-D scanning volume in which the object under analysis resides so as to optically scan the object and automatically capture linear images and range-profile maps thereof relative to a coordinate reference system symbolically embodied within the 3-D digitization device, for subsequent reconstruction therein using computer-assisted tomographic (CAT) techniques to generate a 3-D digitized data model of the object for display, viewing and use in diverse applications;
Fig. 39D is a perspective view of a "vertical-type" 3-D PLIIM-based CAT scanning system of the present invention capable of producing 3-D geometrical models of human beings, animals, and other objects, for viewing on a computer graphics workstation, wherein three orthogonal planar laser illumination beams (PLIBs) and three orthogonal amplitude modulated (AM) laser scanning beams are controllably transported vertically through the 3-D scanning volume disposed above the support platform of the system so as to optically scan the object under analysis and capture linear images and range-profile maps thereof relative to a global coordinate reference system, for subsequent reconstruction in the computer workstation using computer-assisted tomographic (CAT) techniques to generate a 3-D geometrical model of the object;
Fig. 40 is a perspective view of the digital image capture and processing engine of the present invention, showing the projection of a visible illumination-based Image Cropping Pattern (ICP) within the field of view (FOV) of the engine, during object illumination and image capture operations;
Fig. 41 is a close-up, perspective view of the digital image capture and processing engine of the present invention depicted in Fig. 40, showing the assembly of an illumination/targeting optics panel, an illumination board, a lens barrel assembly, a camera housing, and a camera board, into an ultra-compact form factor offering advantages of light-weight construction, excellent thermal management, and exceptional image capture performance;
Fig. 42 is a side perspective view of the digital image capture and processing engine of Fig. 40, showing how the various components are arranged with respect to each other;
Fig. 43 is an elevated front view of the digital image capture and processing engine of Fig. 40, taken along the optical axis of its image formation optics;
Fig. 44 is a bottom view of the digital image capture and processing engine of Fig. 40, showing the bottom of its mounting base for use in mounting the engine within diverse host systems;
Fig. 45 is a top view of the digital image capture and processing engine of Fig. 40;
Fig. 46 is a first side view of the digital image capture and processing engine of Fig. 40;
Fig. 47 is a second partially cut-away side view of the digital image capture and processing engine taken in Fig. 46, revealing the light conductive pipe used to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem, and direct it to the photo-detector associated with the Automatic Light Exposure Measurement and Illumination Control Subsystem;
Fig. 48 is a first cross-sectional view of the digital image capture and processing engine taken in Fig. 46, revealing the light conductive pipe used to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem;
Fig. 49 is a second cross-sectional view of the digital image capture and processing engine taken in Fig. 46, revealing the light conductive pipe used to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem;
Fig. 50 is an exploded, perspective view of the digital image capture and processing engine of Fig. 40, showing how the illumination/targeting optics panel, the illumination board, the lens barrel assembly, the camera housing, the camera board and its assembly pins are arranged and assembled with respect to each other in accordance with the principles of the present invention;
Fig. 51 is a perspective view of the illumination/targeting optics panel, the illumination board and the camera board of the digital image capture and processing engine of Fig. 40, shown assembled, with the lens barrel assembly and the camera housing removed for clarity of illustration;
Fig. 52 is a perspective view of the illumination/targeting optics panel and the illumination board of the engine of the present invention assembled together as a subassembly using the assembly pins;
Fig. 53 is a perspective view of the subassembly of Fig. 52 arranged in relation to the lens barrel assembly, the camera housing and the camera board of the engine of the present invention, and showing how these system components are assembled together to produce the digital image capture and processing engine of Fig. 40;

Fig. 54 is a schematic block diagram representative of a system design for the digital image capture and processing engine illustrated in Figs. 40 through 53, wherein the system design is shown comprising (1) a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled, (2) an LED-Based Illumination Subsystem for producing a wide area field of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem during the image capture mode, so that only light transmitted from the LED-Based Illumination Subsystem and reflected from the illuminated object and transmitted through a narrow-band transmission-type optical filter realized within the hand-supportable housing (i.e. 
using a red-wavelength high-pass reflecting window filter element disposed at the light transmission aperture thereof and a low-pass filter before the image sensor) is detected by the image sensor and all other components of ambient light are substantially rejected, and an Image Cropping Pattern Generator for generating a visible illumination-based Image Cropping Pattern (ICP) projected within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem, (3) an IR-based object presence and range detection subsystem for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem, (4) an Automatic Light Exposure Measurement and Illumination Control Subsystem for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem, during the image capture mode, (5) an Image Capturing and Buffering Subsystem for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem, (6) an Image Processing and Cropped Image Locating Module for processing captured and buffered images to locate the image region corresponding to the region defined by the Image Cropping Pattern (ICP), (7) an Image Perspective Correction and Scaling Module for correcting the perspective of the cropped image region and scaling the corrected image to a predetermined (i.e. fixed) pixel image size suitable for decode-processing, (8) a Multimode Image-Processing Based Bar Code Symbol Reading Subsystem for processing cropped and scaled images generated by the Image Perspective and Scaling Module and reading 1D and 2D bar code symbols represented, and (9) an Input/Output Subsystem for outputting processed image data and the like to an external host system or other information receiving or responding device, in which each said subsystem component is integrated about (10) a System Control Subsystem, as shown;
Fig. 55A1 is a perspective view of an alternative illustrative embodiment of the digital image capture and processing engine shown in Figs. 40 through 53, adapted for POS applications and reconfigured so that the illumination/aiming subassembly shown in Fig. 52 is mounted adjacent the light transmission window of the engine housing, whereas the remaining subassembly is mounted relative to the bottom of the engine housing so that the optical axis of the camera lens is parallel with the light transmission aperture, and a field of view (FOV) folding mirror is mounted beneath the illumination/aiming subassembly for directing the FOV of the system out through the central aperture formed in the illumination/aiming subassembly;
Fig. 55A2 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in Fig. 55A1, wherein the system design is similar to that shown in Fig. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, involving the real-time exposure quality analysis of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E;
Fig. 55B1 is a perspective view of an automatic imaging-based bar code symbol reading system of the present invention supporting a presentation-type mode of operation using wide-area illumination and video image capture and processing techniques, and employing the general engine design shown in Fig. 56A1;
Fig. 55B2 is a cross-sectional view of the system shown in Fig. 55B1;
Fig. 55B3 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in Fig. 55B1, wherein the system design is similar to that shown in Fig. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, performing the real-time exposure quality analysis of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E;
Fig. 55C1 is a perspective view of an automatic imaging-based bar code symbol reading system of the present invention supporting a pass-through mode of operation using narrow-area illumination and video image capture and processing techniques, as well as a presentation-type mode of operation using wide-area illumination and video image capture and processing techniques;
Fig. 55C2 is a schematic representation illustrating the system of Fig. 55C1 operated in its Pass-Through Mode of system operation;
Fig. 55C3 is a schematic representation illustrating the system of Fig. 55C1 operated in its Presentation Mode of system operation;
Fig. 55C4 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in Figs. 55C1 and 55C2, wherein the system design is similar to that shown in Fig. 2A1, except for the following differences: (1) the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, carrying out real-time quality analysis of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E; (2) the narrow-area field of illumination and image capture is oriented in the vertical direction with respect to the counter surface of the POS environment, to support the Pass-Through Mode of the system, as illustrated in Fig. 55C2; and (3) the IR-based object presence and range detection system employed in Fig. 55A2 is replaced with an automatic IR-based object presence and direction detection subsystem which comprises four independent IR-based object presence and direction detection channels;
Fig. 55C5 is a schematic block diagram of the automatic IR-based object presence and direction detection subsystem employed in the bar code reading system illustrated in Figs. 55C1 and 55C4, showing four independent IR-based object presence and direction detection channels which automatically generate activation control signals for four orthogonal directions within the FOV of the system, which are received and processed by a signal analyzer and control logic block;
Fig. 56A is a perspective view of a first illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system, employing the digital image capture and processing engine shown in Fig. 55;
Fig. 56B is a perspective view of a second illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system, employing the digital image capture and processing engine shown in Fig. 55;
Fig. 56C is a perspective view of a third illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system, employing the digital image capture and processing engine shown in Fig. 55;
Fig. 57 is a perspective view of a price lookup unit (PLU) system employing a digital image capture and processing subsystem of the present invention for identifying bar-coded consumer products in retail store environments, and displaying the price thereof on the LCD panel integrated in the system;
Fig. 58 is a high-level flow chart illustrating the steps involved in carrying out the method of the present invention, wherein the system behavior (i.e. features) of the imaging-based bar code symbol reading system of the present invention can be modified by the end-user, within a set of manufacturer-defined constraints (i.e. imposed on modifiable features and functions within features), by the end-user developing, installing/deploying and configuring "plug-in modules" (i.e. libraries) for any modifiable task within the Application Layer of the system, so as to allow the end user to flexibly modify and/or extend standard (i.e. prespecified) features and functionalities of the system, and thus satisfy customized end-user application requirements, but without requiring detailed knowledge about the hardware platform of the system, its communication with the environment, and/or its user interfaces;
Fig. 59 is an exemplary flow chart representation showing what operations are carried out when the "Modifiable" Main Task is (enabled and) executed within the Application Layer of the system;

Fig. 59A is an exemplary flow chart representation showing what operations are carried out when the system feature called "Image Preprocessing" is executed within the Image-Processing Based Bar Code Symbol Reading Subsystem software module in the Application Layer of the system;
Fig. 59B is an exemplary flow chart representation showing what operations are carried out when the system feature called "Image Processing and Bar Code Decoding" is executed within the Modifiable Main Task software module in the Application Layer of the system;
Fig. 59C is an exemplary flow chart representation showing what operations are carried out when the system feature called "Data Output Procedure" is executed within the Modifiable Main Task in the Application Layer of the system;
Fig. 59Cl is an exemplary flow chart representation showing what operations are carried out when the system feature called "Data Formatting Procedure" is executed within the Data Output Procedure software module in the Application Layer of the system; and
Fig. 59C2 is an exemplary flow chart representation showing what operations are carried out when the system feature called "Scanner Configuration Procedure" is executed within the Data Output Procedure software module in the Application Layer of the system.
DETAILED DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENTS OF THE PRESENT INVENTION
Referring to the figures in the accompanying Drawings, the various illustrative embodiments of the hand-supportable imaging-based bar code symbol reading system of the present invention will be described in great detail, wherein like elements will be indicated using like reference numerals.
Overview Of The Digital Image Capture And Processing System Of The Present Invention Employing Multi-Layer Software-Based System Architecture Permitting Modification And/Or Extension Of System Features And Functions By Way Of Third Party Code Plug-Ins
The present invention addresses the shortcomings and drawbacks of prior art digital image capture and processing systems and devices, including laser and digital imaging-based bar code symbol readers, by providing a novel system architecture, platform and development environment which enables VARs, OEMs and others (i.e. other than the original system designers) to modify and/or extend the standard system features and functions of a very broad class of digital image capture and processing systems and devices, without requiring such third-parties to possess detailed knowledge about the hardware platform of the system, its communications with the outside environment, and/or its user-related interfaces. This novel approach has numerous benefits and advantages for third parties wishing to employ, in their third party products, the digital image capture and processing technology of an expert digital imager designer and manufacturer, such as Applicants and their Assignee, Metrologic Instruments, Inc., without requiring that manufacturer to sacrifice or risk the disclosure of its valuable intellectual property and know-how during such system feature and functionality modification and/or extension processes, in order to meet the requirements of the end-user applications at hand.
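Although the disclosure describes this plug-in mechanism only at the architectural level, its underlying idea, namely third-party code replacing a manufacturer-defined default implementation of a modifiable task without access to the rest of the platform, can be sketched as follows. All class, task and function names here are hypothetical illustrations, not drawn from the patent:

```python
class PluginRegistry:
    """Maps modifiable task names to callables; third-party plug-ins
    may override the manufacturer's defaults without touching them."""

    def __init__(self):
        self._defaults = {}   # manufacturer-supplied implementations
        self._plugins = {}    # third-party overrides

    def register_default(self, task, func):
        self._defaults[task] = func

    def install_plugin(self, task, func):
        # A plug-in may only override tasks the manufacturer exposed,
        # mirroring the manufacturer-defined constraints described above.
        if task not in self._defaults:
            raise KeyError(f"task {task!r} is not modifiable")
        self._plugins[task] = func

    def run(self, task, *args):
        # Prefer the installed plug-in, else fall back to the default.
        return self._plugins.get(task, self._defaults[task])(*args)


registry = PluginRegistry()
registry.register_default("data_formatting", lambda s: s)       # default: pass-through
registry.install_plugin("data_formatting", lambda s: s.upper()) # third-party override
```

After installation, `registry.run("data_formatting", "abc")` dispatches to the plug-in rather than the default, while any attempt to override an unexposed task is rejected.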
As shown in Figs. 1A through 1B, the digital image capture and processing system of the present invention 1000 employs a multi-tier software system architecture capable of supporting various subsystems providing numerous standard system features and functions that can be modified and/or extended using the innovative plug-in programming methods of the present invention. In the illustrative embodiments of the present invention disclosed herein, such subsystems include: an object presence detection subsystem; an object range detection subsystem; an object velocity detection subsystem; an object dimensioning subsystem; a field of view (FOV) illumination subsystem; an imaging formation and detection (IFD) subsystem; a digital image processing subsystem; a sound indicator output subsystem; a visual indicator output subsystem; a power management subsystem; an image time/space stamping subsystem; a network (IP) address storage subsystem; a remote monitoring/servicing subsystem; an input/output subsystem; and a system control and/or coordination subsystem, generally integrated as shown.
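As a rough illustration of how such subsystems might be integrated about a system control subsystem, the following hypothetical sketch wires a few stub subsystems into a coordinating controller that runs one simplified capture cycle (detect, illuminate, capture, process). The subsystem names and method signatures are assumptions made for this example, not part of the disclosure:

```python
class ControlSubsystem:
    """Coordinates the attached subsystems through one capture cycle."""

    def __init__(self):
        self.subsystems = {}

    def attach(self, name, subsystem):
        self.subsystems[name] = subsystem

    def capture_cycle(self):
        # Simplified coordination: detect -> illuminate -> capture -> process.
        if not self.subsystems["object_presence"].detect():
            return None
        self.subsystems["illumination"].set_on(True)
        frame = self.subsystems["ifd"].capture()
        self.subsystems["illumination"].set_on(False)
        return self.subsystems["image_processing"].process(frame)


# Minimal stub subsystems, standing in for the real hardware-backed ones.
class StubPresence:
    def detect(self):
        return True

class StubIllumination:
    def __init__(self):
        self.on = False
    def set_on(self, state):
        self.on = state

class StubIFD:
    def capture(self):
        return [[0, 1], [2, 3]]  # a tiny stand-in "frame"

class StubProcessing:
    def process(self, frame):
        return sum(sum(row) for row in frame)
```

Because each subsystem is reached only through the controller's registry, a replacement subsystem (or a third-party plug-in behind one) can be attached without the caller knowing anything about the hardware beneath it.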
For the illustrative embodiments of the present invention disclosed herein, exemplary standard system features and functions are described in the table of Figs. 1C1 and 1C2. Such system features and functions are described below, in conjunction with the subsystem that generally supports the feature and function in the digital image capture and processing system of the present invention:
System Triggering Feature (i.e. Trigger Event Generation): Object Presence Detection Subsystem
Standard System Functions:
Automatic Triggering (i.e. IR Object Presence Detection) (e.g. ON, OFF)
Manual Triggering (e.g. ON, OFF)
Semi-Automatic Triggering (e.g. ON, OFF)
Object Range Detection Feature: Object Range Detection Subsystem
Standard System Functions:
(IR-Based) Long/Short Range Detection (e.g. ON, OFF)
(IR-Based) Quantized/Incremental Range Detection (e.g. ON, OFF)
Object Velocity Detection Feature: Object Velocity Detection Subsystem
Standard System Functions:
LIDAR-Based Object Velocity Detection (e.g. ON, OFF)
IR-PULSE-DOPPLER Object Velocity Detection (e.g. ON, OFF)
Object Dimensioning Feature: Object Dimensioning Subsystem
Standard System Functions:
LIDAR-Based Object Dimensioning (e.g. ON or OFF)
Structured-Laser Light Object Dimensioning (e.g. ON or OFF)
Field of View (FOV) Illumination Feature: Illumination Subsystem
Standard System Functions:
Illumination Mode (e.g. Ambient/OFF, LED Continuous, and LED Strobe/Flash)
Automatic Illumination Control (i.e. ON or OFF)
Illumination Field Type (e.g. Narrow-Area Near-Field Illumination, Wide-Area Far-Field Illumination, Narrow-Area Field Of Illumination, Wide-Area Field Of Illumination)

Imaging Formation and Detection Feature: Imaging Formation and Detection (IFD) Subsystem
Standard System Functions:
Image Capture Mode (e.g. Narrow-Area Image Capture Mode, Wide-Area Image Capture Mode)
Image Capture Control (e.g. Single Frame, Video Frames)
Electronic Gain Of The Image Sensing Array (e.g. 1-10,000)
Exposure Time For Each Image Frame Detected by The Image Sensing Array (e.g. programmable in increments of milliseconds)
Exposure Time For Each Block Of Imaging Pixels Within The Image Sensing Array (e.g. programmable in increments of milliseconds)
Field Of View Marking (e.g. One Dot Pattern; Two Dot Pattern; Four Dot Pattern; Visible Line Pattern; Four Dot And Line Pattern)
Digital Image Processing Feature: Digital Image Processing Subsystem
Standard System Functions:
Image Cropping Pattern on Image Sensing Array (e.g. x1,y1,x2,y2,x3,y3,x4,y4)
Pre-processing of Image frames (e.g. digital filter 1, digital filter 2, ... digital filter n)
Information Recognition Processing (e.g. Recognition of A-th Symbology, ..., Recognition of Z-th Symbology; Alphanumerical Character String Recognition using OCR 1, ..., Alphanumerical Character String Recognition using OCR n; and Text Recognition Processes)
Post-Processing (e.g. Digital Data Filter 1, Digital Data Filter 2, ...)
Object Feature/Characteristic Set Recognition (e.g. ON or OFF)
Sound Indicator Output Feature: Sound Indicator Output Subsystem
Standard System Functions:
Sound Loudness (e.g. High, Low, Medium Volume)
Sound Pitch (e.g. freq 1, freq 2, freq 3, ..., sound 1, ..., sound N)
Visual Indicator Output Feature: Visual Indicator Output Subsystem
Standard System Functions:
Indicator Brightness (e.g. High, Low, Medium Brightness)
Indicator Color (e.g. red, green, yellow, blue, white)
Power Management Feature: Power Management Subsystem
Standard System Functions:
Power Operation Mode (e.g. OFF, ON Continuous, ON Energy Savings)
Energy Savings Mode (e.g. Savings Mode No. 1, Savings Mode No. 2, ..., Savings Mode M)
Image Time/Space Stamping Feature: Image Time/Space Stamping Subsystem
Standard System Functions:
GPS-Based Time/Space Stamping (e.g. ON, OFF)
Network Server Time Assignment (e.g. ON, OFF)
Network (IP) Address Storage Feature: IP Address Storage Subsystem
Standard System Functions:
Manual IP Address Storage (e.g. ON, OFF)
Automatic IP Address Storage via DHCP (e.g. ON, OFF)
Remote Monitoring/Servicing Feature: Remote Monitoring/Servicing Subsystem
Standard System Functions:
TCP/IP Connection (e.g. ON, OFF)
SNMP Agent (e.g. ACTIVE or DEACTIVE)
Input/Output Feature: Input/Output Subsystem
Standard System Functions:
Data Communication Protocols (e.g. RS-232 Serial, USB, Bluetooth, etc.)
Output Image File Formats (e.g. JPG/EXIF, TIFF, PICT, PDF, etc.)
Output Video File Formats (e.g. MPEG, AVI, etc.)
Data Output Format (e.g. ASCII)
Keyboard Interface (e.g. ASCII)
Graphical Display (LCD) Interface (e.g. ACTIVE or DEACTIVE)
System Control and/or Coordination Feature: System Control and/or Coordination Subsystem
Standard System Functions:
Mode of System Operation (e.g. System Mode 1, System Mode 2, ..., System Mode N)
As indicated in Fig. 1D, the digital image capture and processing system of the present invention 1000, represented in Figs. 1A through 1C2, can be implemented using a digital camera board and a printed circuit (PC) board that are interfaced together. Alternatively, as shown in Fig. 1E, the digital image capture and processing system of the present invention 1000 can also be implemented using a single hybrid digital camera/PC board.
As shown in Fig. 1F, the digital image capture and processing system of the present invention can be integrated or embodied within third-party products, such as, for example, but not limited to, image-processing based bar code symbol reading systems, OCR systems, object recognition systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 2D or 3D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, personal identification systems, and the like.
In general, the digital image capture and processing system of the present invention has a set of standard features and functions as described above, and a set of custom features and functionalities that satisfy customized end-user application requirements, which typically aim to modify and/or extend such standard system features and functions for particular applications at hand.
In the illustrative embodiments described in detail below with reference to Figs. 2A through 57, the digital image capture and processing system of the present invention (regardless of the third-party product into which the system is integrated or embodied) generally comprises: a digital camera subsystem for projecting a field of view (FOV) upon an object to be imaged in said FOV, and detecting imaged light reflected off the object during illumination operations in an image capture mode in which one or more digital images of the object are formed and detected by said digital camera subsystem; a digital image processing subsystem for processing digital images and producing raw or processed output data or recognizing or acquiring information graphically represented therein, and producing output data representative of the recognized information; an input/output subsystem for transmitting said output data to an external host system or other information receiving or responding device; a system control system for controlling and/or coordinating the operation of the subsystems above; and a computing platform for supporting the implementation of one or more of the subsystems above, and the features and functions of the digital image capture and processing system.
The computing platform includes (i) memory for storing pieces of original product code written by the original designers of the digital image capture and processing system, and (ii) a microprocessor for running one or more Applications by calling and executing pieces of said original product code in a particular sequence, so as to support a set of standard features and functions which characterize a standard behavior of the digital image capture and processing system.
As will be described in greater detail with reference to Figs. 58 through 59C2, these pieces of original product code have a set of place holders into which third-party product code can be inserted or plugged by third parties, including value-added resellers (VARs), original equipment manufacturers (OEMs), and also end-users of the digital image capture and processing system.
In accordance with the novel principles of the present invention, one or more pieces of third- party code ("plug-ins") are inserted or plugged into the set of place holders, and operate to extend the standard features and functions of the digital image capture and processing system, and modify the standard behavior thereof into a custom behavior for the digital image capture and processing system.
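For illustration only, the place-holder mechanism described above can be sketched as a table of function pointers: the original product code always calls through the pointer, which defaults to the standard handler until a third-party plug-in overrides it, and standard behavior can be restored at any time. All names in this sketch are hypothetical and are not taken from the patent.

```c
#include <stddef.h>

/* Hypothetical sketch of the "place holder" concept: the original product
 * code calls image preprocessing through a function pointer that defaults
 * to the standard handler. A third-party plug-in overrides the entry to
 * customize behavior without permanently modifying the original code. */

typedef int (*image_preprocess_fn)(int pixel);

/* Standard behavior supplied by the original system designers. */
static int default_preprocess(int pixel) { return pixel; }

/* The place holder: points at the standard handler until a plug-in is installed. */
static image_preprocess_fn preprocess_hook = default_preprocess;

/* Plug-in installation API (illustrative, not from the patent).
 * Passing NULL restores the standard behavior. */
void install_preprocess_plugin(image_preprocess_fn fn)
{
    preprocess_hook = (fn != NULL) ? fn : default_preprocess;
}

/* Example third-party plug-in: inverts 8-bit pixel intensity. */
int invert_plugin(int pixel) { return 255 - pixel; }

/* Original product code always calls through the hook. */
int run_preprocess(int pixel) { return preprocess_hook(pixel); }
```

Because the original code never names the plug-in directly, the standard feature set remains intact when no plug-in is installed, matching the non-permanent modification described above.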
In most embodiments of the present invention, the digital image capture and processing system will further comprise a housing having a light transmission window, wherein the FOV is projected through the light transmission window and upon an object to be imaged in the FOV. Also, typically, these pieces of original product code as well as third-party product code are maintained in one or more libraries supported in the memory structure of the computing platform. In general, such memory comprises a memory architecture having different kinds of memory, each having different access speeds and performance characteristics.
In accordance with the principles of the present invention, the end-user, such as a value-added reseller (VAR) or original equipment manufacturer (OEM), can write such pieces of third-party code (i.e. plug-ins) according to specifications set by the original system designers, and these pieces of custom code can be plugged into the place holders, so as to modify and extend the features and functions of the digital image capture and processing system (or third-party product into which the system is integrated or embodied), and modify the standard behavior of the digital image capture and processing system into a custom behavior for the digital image capture and processing system, without permanently modifying the standard features and functions of the digital image capture and processing system.
In some illustrative embodiments of the present invention, the digital camera system comprises: a digital image formation and detection subsystem having (i) image formation optics for projecting the FOV through a light transmission window and upon the object to be imaged in the FOV, and (ii) an image sensing array for detecting imaged light reflected off the object during illumination operations in an image capture mode in which sensor elements in the image sensing array are enabled so as to detect one or more digital images of the object formed on the image sensing array; an illumination subsystem having an illumination array for producing and projecting a field of illumination through the light transmission window and within the FOV during the image capture mode; and an image capturing and buffering subsystem for capturing and buffering these digital images detected by the image formation and detection subsystem.
The image sensing array can be realized by a digital image sensing structure selected from the group consisting of an area-type image sensing array, and a linear-type image sensing array. Preferably, the memory employed in the computing platform of the system maintains system parameters used to configure the functions of the digital image capture and processing system. In the illustrative embodiments, the memory comprises a memory architecture that supports a three-tier modular software architecture characterized by an Operating System (OS) layer, a System CORE (SCORE) layer, and an Application layer, and that is responsive to the generation of a triggering event within the system. The OS layer includes one or more software modules selected from the group consisting of an OS kernel module, an OS file system module, and device driver modules. The SCORE layer includes one or more software modules selected from the group consisting of a tasks manager module, an events dispatcher module, an input/output manager module, a user commands manager module, a timer subsystem module, an input/output subsystem module and a memory control subsystem module. The Application layer includes one or more software modules selected from the group consisting of a code symbol decoding module, a function programming module, an application events manager module, a user commands table module, and a command handler module.
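As one hedged illustration of how an Application-layer module might interact with the SCORE-layer events dispatcher named above, the following minimal sketch registers a handler for a named system event and routes generated events to it. The registration interface, event names, and table size are assumptions for illustration; the patent does not specify this API.

```c
#include <string.h>

/* Minimal sketch of a SCORE-layer events dispatcher: Application-layer
 * modules register handlers for named system events, and the dispatcher
 * routes each generated event to its handler. */

#define MAX_HANDLERS 8

typedef void (*event_handler_fn)(void);

static struct {
    const char *event;
    event_handler_fn handler;
} handler_table[MAX_HANDLERS];
static int n_handlers = 0;

/* Called by an Application-layer module at start-up. */
int register_handler(const char *event, event_handler_fn h)
{
    if (n_handlers >= MAX_HANDLERS) return -1;   /* table full */
    handler_table[n_handlers].event = event;
    handler_table[n_handlers].handler = h;
    n_handlers++;
    return 0;
}

/* Called by the SCORE layer when a system event is generated. */
int dispatch(const char *event)
{
    for (int i = 0; i < n_handlers; i++) {
        if (strcmp(handler_table[i].event, event) == 0) {
            handler_table[i].handler();
            return 0;
        }
    }
    return -1;   /* no handler registered for this event */
}

/* Example Application-layer handler for a trigger event. */
int trigger_count = 0;
void on_trigger_event(void) { trigger_count++; }
```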
The field of illumination projected from the illumination subsystem can be narrow-band illumination produced from an array of light emitting diodes (LEDs). Also, the digital image processing subsystem is typically adapted to process captured digital images so as to read one or more code symbols graphically represented in the digital images, and produces output data in the form of symbol character data representative of the read one or more code symbols. Each code symbol can be a bar code symbol selected from the group consisting of a 1D bar code symbol, a 2D bar code symbol, and a data matrix type code symbol structure.
These and other aspects of the present invention will become apparent hereinafter and in the claims. It is, therefore, appropriate at this juncture to describe in detail, with reference to Figs. 2A through 57, the various illustrative embodiments of the digital image capture and processing system of the present invention depicted in Figs. 1A through 1F. In each of these illustrative embodiments shown in Figs. 2A through 57, the digital image capture and processing system 1000 of the present invention is either integrated or embodied into the structure, features and functionalities of the systems or products shown. After these illustrative embodiments have been described, the technical aspects of the plug-in programming methods of the present invention will be described in great detail with reference to Figs. 58 through 59C2.
Hand-Supportable Digital Imaging-Based Bar Code Reading Device Of The First Illustrative Embodiment Of The Present Invention
Referring to Figs. 2A through 2L, the hand-supportable digital imaging-based bar code symbol reading device of the first illustrative embodiment of the present invention 1 is shown in detail comprising a hand-supportable housing 2 having a handle portion 2A and a head portion 2B that is provided with a light transmission window 3 with a high-pass (red-wavelength reflecting) optical filter element 4A having the light transmission characteristics set forth in Fig. 5A2, in the illustrative embodiment. As will be described in greater detail hereinafter, high-pass optical filter element 4A cooperates with an interiorly mounted low-pass optical filter element 4B, characterized in Fig. 5A1. These high and low pass filter elements 4A and 4B cooperate to provide a narrow-band optical filter system 4 that integrates with the head portion of the housing and permits only a narrow band of illumination (e.g. 633 nanometers) to exit and enter the housing during imaging operations.
As best shown in Figs. 2I, 2J, and 2K, the hand-supportable housing 2 of the illustrative embodiment comprises: left and right housing handle halves 2A1 and 2A2; a foot-like structure 2A3 which is mounted between the handle halves 2A1 and 2A2; a trigger switch structure 2C which snap fits within and pivots within a pair of spaced apart apertures 2D1 and 2D2 provided in the housing halves; a light transmission window panel 5 through which light transmission window 3 is formed and supported within a recess formed by handle halves 2A1 and 2A2 when they are brought together, and which supports all LED illumination arrays provided by the system; an optical bench 6 for supporting electro-optical components, operably connected to an orthogonally-mounted PC board 7 which is mounted within the handle housing halves; a top housing portion 2B1 for connection with the housing handle halves 2A1 and 2A2 and enclosing the head portion of the housing; a light pipe lens element 7 for mounting over an array of light emitting diodes (LEDs) 9 and light pipe structures 10 mounted within the rear end of the head portion of the hand-supportable housing; and a front bumper structure 2E for holding together the top housing portion 2B1 and left and right handle halves 2A1 and 2A2 with the light transmission window panel 5 sandwiched therebetween, while providing a level of shock protection thereto.
In other embodiments of the present invention shown in Figs. 27 through 33 the form factor of the hand-supportable housing might be different. In yet other applications, the housing need not even be hand-supportable, but rather might be designed for stationary support on a desktop or countertop surface, or for use in a commercial or industrial application.
Schematic Block Functional Diagram As System Design Model For The Hand-Supportable Digital Image-Based Bar Code Reading Device Of The Present Invention
As shown in the system design model of Fig. 2L1, the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device 1 of the illustrative embodiment comprises: an IR-based Object Presence and Range Detection Subsystem 12; a Multi-Mode Area-Type Image Formation and Detection (i.e. camera) Subsystem 13 having a narrow-area mode of image capture, a near-field wide-area mode of image capture, and a far-field wide-area mode of image capture; a Multi-Mode LED-Based Illumination Subsystem 14 having a narrow-area mode of illumination, a near-field wide-area mode of illumination, and a far-field wide-area mode of illumination; an Automatic Light Exposure Measurement and Illumination Control Subsystem 15; an Image Capturing and Buffering Subsystem 16; a Multi-Mode Image-Processing Bar Code Symbol Reading Subsystem 17 having five modes of image-processing based bar code symbol reading indicated in Fig. 2L2 and to be described in detail hereinafter; an Input/Output Subsystem 18; a manually-actuatable trigger switch 2C for sending user-originated control activation signals to the device; a System Mode Configuration Parameter Table 70; and a System Control Subsystem 19 integrated with each of the above-described subsystems, as shown.
The primary function of the IR-based Object Presence and Range Detection Subsystem 12 is to automatically produce an IR-based object detection field 20 within the FOV of the Multi-Mode Image Formation and Detection Subsystem 13, detect the presence of an object within predetermined regions of the object detection field (20A, 20B), and generate control activation signals A1 which are supplied to the System Control Subsystem 19 for indicating when and where an object is detected within the object detection field of the system.
In the first illustrative embodiment, the Multi-Mode Image Formation and Detection (i.e. camera) Subsystem 13 has image formation (camera) optics 21 for producing a field of view (FOV) 23 upon an object to be imaged and a CMOS area-image sensing array 22 for detecting imaged light reflected off the object during illumination and image acquisition/capture operations.
In the first illustrative embodiment, the primary function of the Multi-Mode LED-Based Illumination Subsystem 14 is to produce a narrow-area illumination field 24, a near-field wide-area illumination field 25, and a far-field wide-area illumination field 26, each having a narrow optical-bandwidth and confined within the FOV of the Multi-Mode Image Formation and Detection Subsystem 13 during narrow-area and wide-area modes of imaging, respectively. This arrangement is designed to ensure that only light transmitted from the Multi-Mode Illumination Subsystem 14 and reflected from the illuminated object is ultimately transmitted through a narrow-band transmission-type optical filter subsystem 4 realized by (1) high-pass (i.e. red-wavelength reflecting) filter element 4A mounted at the light transmission aperture 3 immediately in front of panel 5, and (2) low-pass filter element 4B mounted either before the image sensing array 22 or anywhere after panel 5 as shown in Fig. 3C. Fig. 5A4 sets forth the resulting composite transmission characteristics of the narrow-band transmission spectral filter subsystem 4, plotted against the spectral characteristics of the emission from the LED illumination arrays employed in the Multi-Mode Illumination Subsystem 14.
The primary function of the narrow-band integrated optical filter subsystem 4 is to ensure that the CMOS image sensing array 22 only receives the narrow-band visible illumination transmitted by the three sets of LED-based illumination arrays 27, 28 and 29 driven by LED driver circuitry 30 associated with the Multi-Mode Illumination Subsystem 14, whereas all other components of ambient light collected by the light collection optics are substantially rejected at the image sensing array 22, thereby providing improved SNR thereat, thus improving the performance of the system.
The primary function of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is twofold: (1) to measure, in real-time, the power density [joules/cm] of photonic energy (i.e. light) collected by the optics of the system at about its image sensing array 22, and to generate Auto-Exposure Control Signals indicating the amount of exposure required for good image formation and detection; and (2) in combination with the Illumination Array Selection Control Signal provided by the System Control Subsystem 19, to automatically drive and control the output power of selected LED arrays 27, 28 and/or 29 in the Multi-Mode Illumination Subsystem, so that objects within the FOV of the system are optimally exposed to LED-based illumination and optimal images are formed and detected at the image sensing array 22.
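The first of these two functions amounts to an exposure feedback loop. A minimal sketch of one plausible proportional control step is shown below; the actual control law, target light level, and exposure limits used by Subsystem 15 are not specified in the text, so every constant here is an illustrative assumption.

```c
/* Sketch of one auto-exposure feedback step, assuming a simple
 * proportional rule: scale the current exposure time by the ratio of
 * the target light level to the measured light level, then clamp the
 * result to sensor limits. The rule and limits are illustrative. */
double next_exposure_ms(double current_ms, double measured_level, double target_level)
{
    if (measured_level <= 0.0)
        return current_ms;             /* no light measured: hold the setting */

    double next = current_ms * (target_level / measured_level);

    if (next < 0.1)   next = 0.1;      /* illustrative minimum exposure */
    if (next > 100.0) next = 100.0;    /* illustrative maximum exposure */
    return next;
}
```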
The primary function of the Image Capturing and Buffering Subsystem 16 is to (1) detect the entire 2-D image focused onto the 2D image sensing array 22 by the image formation optics 21 of the system, (2) generate a frame of digital pixel data 31 for either a selected region of interest of the captured image frame, or for the entire detected image, and then (3) buffer each frame of image data as it is captured. Notably, in the illustrative embodiment, a single 2D image frame (31) is captured during each image capture and processing cycle, or during a particular stage of a processing cycle, so as to eliminate the problems associated with image frame overwriting, and synchronization of image capture and decoding processes, as addressed in US Patent Nos. 5,932,862 and 5,942,741 assigned to Welch Allyn, and incorporated herein by reference.
The primary function of the Multi-Mode Imaging-Based Bar Code Symbol Reading Subsystem 17 is to process images that have been captured and buffered by the Image Capturing and Buffering Subsystem 16, during both narrow-area and wide-area illumination modes of system operation. Such image processing operation includes image-based bar code decoding methods illustrated in Figs. 14 through 25, and described in detail hereinafter.
The primary function of the Input/Output Subsystem 18 is to support standard and/or proprietary communication interfaces with external host systems and devices, and output processed image data and the like to such external host systems or devices by way of such interfaces. Examples of such interfaces, and technology for implementing the same, are given in US Patent No. 6,619,549, incorporated herein by reference in its entirety.
The primary function of the System Control Subsystem 19 is to provide some predetermined degree of control or management signaling services to each subsystem component integrated, as shown. While this subsystem can be implemented by a programmed microprocessor, in the illustrative embodiment, it is implemented by the three-tier software architecture supported on the computing platform shown in Fig. 2M, and as represented in Figs. 11A through 13L, and described in detail hereinafter.
The primary function of the manually-activatable Trigger Switch 2C integrated with the hand- supportable housing is to enable the user to generate a control activation signal upon manually depressing the Trigger Switch 2C, and to provide this control activation signal to the System Control Subsystem 19 for use in carrying out its complex system and subsystem control operations, described in detail herein.
The primary function of the System Mode Configuration Parameter Table 70 is to store (in nonvolatile/persistent memory) a set of configuration parameters for each of the available Programmable Modes of System Operation specified in the Programmable Mode of Operation Table shown in Figs. 26A and 26B, and which can be read and used by the System Control Subsystem 19 as required during its complex operations.
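One plausible realization of such a configuration parameter table is a read-only array of per-mode parameter records, looked up by mode number when the System Control Subsystem changes operating mode. The field names and values below are illustrative assumptions, not the patent's actual parameter set.

```c
#include <stddef.h>

/* Sketch of a System Mode Configuration Parameter Table: one record of
 * configuration parameters per Programmable Mode of System Operation,
 * looked up by mode number at run time. */
typedef struct {
    int mode_number;
    int illumination_mode;   /* 0 = ambient, 1 = LED continuous, 2 = LED strobe */
    int image_capture_mode;  /* 0 = narrow-area, 1 = wide-area */
    int decode_timeout_ms;
} mode_config_t;

static const mode_config_t mode_table[] = {
    { 1, 1, 0, 200 },   /* e.g. a hand-held triggered mode */
    { 2, 2, 1, 500 },   /* e.g. a presentation mode */
};

/* Returns the configuration record for a mode, or NULL if not found. */
const mode_config_t *lookup_mode(int mode_number)
{
    for (size_t i = 0; i < sizeof mode_table / sizeof mode_table[0]; i++)
        if (mode_table[i].mode_number == mode_number)
            return &mode_table[i];
    return NULL;
}
```

In a device like the one described, such a table would live in nonvolatile/persistent memory so that the configured behavior survives power cycles.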
The detailed structure and function of each subsystem will now be described in detail below.
Schematic Diagram As System Implementation Model For The Hand- Supportable Digital Imaging- Based Bar Code Reading Device Of The Present Invention
Fig. 2B shows a schematic diagram of a system implementation for the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device 1 illustrated in Figs. 2A through 2L. As shown in this system implementation, the bar code symbol reading device is realized using a number of hardware components comprising: an illumination board 33 carrying components realizing electronic functions performed by the LED-Based Multi-Mode Illumination Subsystem 14 and Automatic Light Exposure Measurement and Illumination Control Subsystem 15; a CMOS camera board 34 carrying a high resolution (1280 x 1024, 7-bit, 6 micron pixel size) CMOS image sensing array 22 running at a 25 MHz master clock, at 7 frames/second at 1280 x 1024 resolution with randomly accessible region of interest (ROI) window capabilities, realizing electronic functions performed by the Multi-Mode Image Formation and Detection Subsystem 13; a CPU board 35 (i.e. computing platform) including (i) an Intel Sabinal 32-bit Microprocessor PXA210 36 running at 200 MHz and 1.0 core voltage with a 16-bit 100 MHz external bus speed, (ii) an expandable (e.g. 7+ megabyte) Intel J3 Asynchronous 16-bit Flash memory 37, (iii) 16 Megabytes of 100 MHz SDRAM 38, (iv) a Xilinx Spartan II FPGA FIFO 39 running at a 50 MHz clock frequency and 60 MB/sec data rate, configured to control the camera timings and drive an image acquisition process, (v) a multimedia card socket 40 for realizing the other subsystems of the system, (vi) a power management module 41 for the MCU adjustable by the I2C bus, and (vii) a pair of UARTs 42A and 42B (one for an IRDA port and one for a JTAG port); an interface board 43 for realizing the functions performed by the I/O subsystem 18; and an IR-based object presence and range detection circuit 44 for realizing Subsystem 12, which includes a pair of IR LEDs and photodiodes 12A for transmitting and receiving a pencil-shaped IR-based object-sensing signal.
In the illustrative embodiment, the image formation optics 21 supported by the bar code reader provides a field of view of 103 mm at the nominal focal distance to the target, of approximately 70 mm from the edge of the bar code reader. The minimal size of the field of view (FOV) is 62 mm at the nominal focal distance to the target of approximately 10 mm. Preliminary tests of the parameters of the optics are shown on Fig. 4B (the distance on Fig. 4B is given from the position of the image sensing array 22, which is located inside the bar code symbol reader approximately 80 mm from the edge). As indicated in Fig. 4C, the depth of field of the image formation optics varies from approximately 69 mm for bar codes with a resolution of 5 mils per narrow module, to 181 mm for bar codes with a resolution of 13 mils per narrow module.
The Multi-Mode Illumination Subsystem 14 is designed to cover the optical field of view (FOV) 23 of the bar code symbol reader with sufficient illumination to generate high-contrast images of bar codes located at both short and long distances from the imaging window. The illumination subsystem also provides a narrow-area (thin height) targeting beam 24 having dual purposes: (a) to indicate to the user where the optical view of the reader is; and (b) to allow a quick scan of just a few lines of the image and attempt a super-fast bar code decoding if the bar code is aligned properly. If the bar code is not aligned for a linearly illuminated image to decode, then the entire field of view is illuminated with a wide-area illumination field 25 or 26 and the image of the entire field of view is acquired by Image Capture and Buffering Subsystem 16 and processed by Multi-Mode Bar Code Symbol Reading Subsystem 17, to ensure reading of a bar code symbol presented therein regardless of its orientation.
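The two-stage reading strategy just described (attempt a fast decode on the narrow-area targeting lines, then fall back to full wide-area capture only on failure) can be sketched as follows, with stand-in decode functions in place of the real image-processing subsystems.

```c
/* Sketch of the two-stage reading strategy: first attempt a fast decode
 * on a few narrow-area (linear) scan lines; only if that fails, capture
 * and process the entire wide-area field of view. */
typedef int (*decode_fn)(void);   /* returns 1 on a successful decode */

int read_symbol(decode_fn narrow_area_decode, decode_fn wide_area_decode)
{
    if (narrow_area_decode())     /* bar code aligned with the targeting beam */
        return 1;
    return wide_area_decode();    /* fall back: illuminate and image the full FOV */
}

/* Stand-in decoders for demonstration only. */
int always_fails(void)   { return 0; }
int always_decodes(void) { return 1; }
```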
The interface board 43 employed within the bar code symbol reader provides the hardware communication interfaces for the bar code symbol reader to communicate with the outside world. The interfaces implemented in system will typically include RS232, keyboard wedge, and/or USB, or some combination of the above, as well as others required or demanded by the particular application at hand.
Specification Of The Area-Type Image Formation And Detection (i.e. Camera) Subsystem During Its Narrow-Area (Linear) And Wide-Area Modes Of Imaging, Supported By The Narrow And Wide Area Fields Of Narrow-Band Illumination, Respectively
As shown in Figs. 3B through 3E, the Multi-Mode Image Formation and Detection (IFD) Subsystem 13 has a narrow-area image capture mode (i.e. where only a few central rows of pixels about the center of the image sensing array are enabled) and a wide-area image capture mode of operation (i.e. where all pixels in the image sensing array are enabled). The CMOS image sensing array 22 in the Image Formation and Detection Subsystem 13 has image formation optics 21 which provides the image sensing array with a field of view (FOV) 23 on objects to be illuminated and imaged. As shown, this FOV is illuminated by the Multi-Mode Illumination Subsystem 14 integrated within the bar code reader.
The Multi-Mode Illumination Subsystem 14 includes three different LED-based illumination arrays 27, 28 and 29 mounted on the light transmission window panel 5, and arranged about the light transmission window 3. Each illumination array is designed to illuminate a different portion of the FOV of the bar code reader during different modes of operation. During the narrow-area (linear) illumination mode of the Multi-Mode Illumination Subsystem 14, the central narrow-width portion of the FOV indicated by 23 is illuminated by the narrow-area illumination array 27, shown in Fig. 3A. During the near-field wide-area illumination mode of the Multi-Mode Illumination Subsystem 14, which is activated in response to the IR Object Presence and Range Detection Subsystem 12 detecting an object within the near-field portion of the FOV, the near-field wide-area portion of the FOV is illuminated by the near-field wide-area illumination array 28, shown in Fig. 3A. During the far-field wide-area illumination mode of the Multi-Mode Illumination Subsystem 14, which is activated in response to the IR Object Presence and Range Detection Subsystem 12 detecting an object within the far-field portion of the FOV, the far-field wide-area portion of the FOV is illuminated by the far-field wide-area illumination array 29, shown in Fig. 3A. In Fig. 3A, the spatial relationships are shown between these fields of narrow-band illumination and the far and near field portions of the FOV of the Image Formation and Detection Subsystem 13.
In Fig. 3B, the Multi-Mode LED-Based Illumination Subsystem 14 is shown transmitting visible narrow-band illumination through its narrow-band transmission-type optical filter subsystem 4, shown in Fig. 3C and integrated within the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device. The narrow-band illumination from the Multi-Mode Illumination Subsystem 14 illuminates an object within the FOV of the image formation optics of the Image Formation and Detection Subsystem 13, and light rays reflected and scattered therefrom are transmitted through the high-pass and low-pass optical filters 4A and 4B and are ultimately focused onto image sensing array 22 to form a focused detected image thereupon, while all other components of ambient light are substantially rejected before reaching image detection at the image sensing array 22. Notably, in the illustrative embodiment, the red-wavelength reflecting high-pass optical filter element 4A is positioned at the imaging window of the device before the image formation optics 21, whereas the low-pass optical filter element 4B is disposed before the image sensing array 22 between the focusing lens elements of the image formation optics 21. This forms narrow-band optical filter subsystem 4 which is integrated within the bar code reader to ensure that the object within the FOV is imaged at the image sensing array 22 using only spectral components within the narrow-band of illumination produced from Subsystem 14, while rejecting substantially all other components of ambient light outside this narrow range (e.g. 15 nm).
As shown in Fig. 3D, the Image Formation and Detection Subsystem 13 employed within the hand-supportable image-based bar code reading device comprises three lenses 21A, 21B and 21C, each made as small as possible (with a maximum diameter of 12 mm), having spherical surfaces, and made from common glass, e.g. LAK2 (~LaK9), ZF10 (=SF8), LAF2 (~LaF3). Collectively, these lenses are held together within a lens holding assembly 45, as shown in Fig. 3E, and form an image formation subsystem arranged along the optical axis of the CMOS image sensing array 22 of the bar code reader.
As shown in Fig. 3E, the lens holding assembly 45 comprises: a barrel structure 45A1, 45A2 for holding lens elements 21A, 21B and 21C; and a base structure 45B for holding the image sensing array 22; wherein the assembly is configured so that the barrel structure 45A slides within the base structure 45B so as to focus the fixed-focus lens assembly during manufacture.
In Figs. 3F1 and 3F2, the lens holding assembly 45 and image sensing array 22 are mounted along an optical path defined along the central axis of the system. In the illustrative embodiment, the image sensing array 22 has, for example, a 1280x1024 pixel resolution (1/2" format), 6 micron pixel size, with randomly accessible region of interest (ROI) window capabilities. It is understood, though, that many other kinds of image sensing devices (e.g. CCD) can be used to practice the principles of the present invention disclosed herein, without departing from the scope or spirit of the present invention.
Details regarding a preferred Method of Designing the Image Formation (i.e. Camera) Optics Within the Image-Based Bar Code Reader Of The Present Invention Using The Modulation Transfer Function (MTF) are described in Applicants' U.S. Application No. 10/712,787 filed November 13, 2003, supra. Also, a Method Of Theoretically Characterizing The DOF Of The Image Formation Optics Employed In The Imaging-Based Bar Code Reader Of The Present Invention is described in detail in Applicants' U.S. Application No. 10/712,787 filed November 13, 2003, supra.
Specification Of Multi-Mode LED-Based Illumination Subsystem Employed In The Hand-Supportable Image-Based Bar Code Reading System Of The Present Invention
In the illustrative embodiment, the LED-Based Multi-Mode Illumination Subsystem 14 comprises: narrow-area illumination array 27; near-field wide-area illumination array 28; and far-field wide-area illumination array 29. The three fields of narrow-band illumination produced by the three illumination arrays of subsystem 14 are schematically depicted in Fig. 4A1. As will be described hereinafter, with reference to Figs. 27 and 28, narrow-area illumination array 27 can be realized as two independently operable arrays, namely: a near-field narrow-area illumination array and a far-field narrow-area illumination array, which are activated when the target object is detected within the near and far fields, respectively, of the automatic IR-based Object Presence and Range Detection Subsystem 12 during wide-area imaging modes of operation. However, for purposes of illustration, the first illustrative embodiment of the present invention employs only a single-field narrow-area (linear) illumination array, which is designed to illuminate over substantially the entire working range of the system, as shown in Fig. 4A1.
As shown in Figs. 4B, 4C3 and 4C4, the narrow-area (linear) illumination array 27 includes two pairs of LED light sources 27A1 and 27A2 provided with cylindrical lenses 27B1 and 27B2, respectively, and mounted on the left and right portions of the light transmission window panel 5. During the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, the narrow-area (linear) illumination array 27 produces narrow-area illumination field 24 of narrow optical bandwidth within the FOV of the system. In the illustrative embodiment, narrow-area illumination field 24 has a height of less than 10 mm at the far field, creating the appearance of a substantially linear, or rather planar, illumination field.
The near-field wide-area illumination array 28 includes two sets of (flattop) LED light sources 28A1-28A6 and 28A7-28A13, without any lenses, mounted on the top and bottom portions of the light transmission window panel 5, as shown in Fig. 4B. During the near-field wide-area image capture mode of the Image Formation and Detection Subsystem 13, the near-field wide-area illumination array 28 produces a near-field wide-area illumination field 25 of narrow optical bandwidth within the FOV of the system.
As shown in Figs. 4B, 4D3 and 4D4, the far-field wide-area illumination array 29 includes two sets of LED light sources 29A1-29A6 and 29A7-29A13 provided with spherical (i.e. plano-convex) lenses 29B1-29B6 and 29B7-29B13, respectively, and mounted on the top and bottom portions of the light transmission window panel 5. During the far-field wide-area image capture mode of the Image Formation and Detection Subsystem 13, the far-field wide-area illumination array 29 produces a far-field wide-area illumination beam of narrow optical bandwidth within the FOV of the system.
Narrow-Area (Linear) Illumination Arrays Employed In the Multi-Mode Illumination Subsystem
As shown in Fig. 4A1, the narrow-area (linear) illumination field 24 extends from about 30 mm to about 200 mm within the working range of the system, and covers both the near and far fields of the system. The near-field wide-area illumination field 25 extends from about 0 mm to about 100 mm within the working range of the system. The far-field wide-area illumination field 26 extends from about 100 mm to about 200 mm within the working range of the system. The Table shown in Fig. 4A2 specifies the geometrical properties and characteristics of each illumination mode supported by the Multi-Mode LED-based Illumination Subsystem 14 of the present invention.
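The working-range boundaries above lend themselves to a simple range-to-field mapping. The following Python sketch is illustrative only: the field names and function are not part of the patent, and only the millimeter limits are taken from the text.

```python
# Illustrative sketch (not from the patent text): determining which
# illumination fields cover an object detected at a given distance,
# using the working-range boundaries of the illustrative embodiment.

# (field name, near limit in mm, far limit in mm) -- limits per Fig. 4A1
ILLUMINATION_FIELDS = [
    ("narrow-area (linear) field 24",  30, 200),
    ("near-field wide-area field 25",   0, 100),
    ("far-field wide-area field 26",  100, 200),
]

def fields_covering(distance_mm):
    """Return the names of the illumination fields whose working
    range covers an object detected at distance_mm."""
    return [name for name, near, far in ILLUMINATION_FIELDS
            if near <= distance_mm <= far]
```

For example, an object detected at 150 mm falls within both the narrow-area (linear) field and the far-field wide-area field, consistent with the overlapping ranges stated above.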
The narrow-area illumination array 27 employed in the Multi-Mode LED-Based Illumination Subsystem 14 is optically designed to illuminate a thin area at the center of the field of view (FOV) of the imaging-based bar code symbol reader, measured from the boundary of the left side of the field of view to the boundary of its right side, as specified in Fig. 4A1. As will be described in greater detail hereinafter, the narrow-area illumination field 24 is automatically generated by the Multi-Mode LED-Based Illumination Subsystem 14 in response to the detection of an object within the object detection field of the automatic IR-based Object Presence and Range Detection Subsystem 12. In general, the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV of the Image Formation and Detection Subsystem 13 are spatially co-extensive, and the object detection field spatially overlaps the FOV along the entire working distance of the imaging-based bar code symbol reader. The narrow-area illumination field 24, produced in response to the detection of an object, serves a dual purpose: it provides a visual indication to an operator of the location of the optical field of view of the bar code symbol reader, and thus serves as a field-of-view aiming instrument; and during the image acquisition mode, the narrow-area illumination beam is used to illuminate a thin area of the FOV within which an object resides, so that a narrow 2-D image of the object can be rapidly captured (by a small number of rows of pixels in the image sensing array 22), buffered and processed in order to read any linear bar code symbols that may be represented therewithin.
Fig. 4C1 shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the narrow-area illumination array 27 in the Multi-Mode Illumination Subsystem 14. Fig. 4C2 shows the Lambertian emittance versus polar angle characteristics of the same LEDs. Fig. 4C3 shows the cylindrical lenses used before the LEDs (633 nm InGaAlP) in the narrow-area (linear) illumination arrays in the illumination subsystem of the present invention. As shown, the first surface of the cylindrical lens is curved vertically to create a narrow-area (linear) illumination pattern, and the second surface of the cylindrical lens is curved horizontally to control the height of the linear illumination pattern so as to produce a narrow-area illumination pattern. Fig. 4C4 shows the layout of the pairs of LEDs and two cylindrical lenses used to implement the narrow-area illumination array of the illumination subsystem of the present invention. In the illustrative embodiment, each LED produces a total output power of about 11.7 mW under typical conditions. Fig. 4C5 sets forth a set of six illumination profiles for the narrow-area illumination fields produced by the narrow-area illumination arrays of the illustrative embodiment, taken at 30, 40, 50, 80, 120, and 220 millimeters along the field away from the imaging window (i.e. working distance) of the bar code reader of the present invention, illustrating that the spatial intensity of the narrow-area illumination field begins to become substantially uniform at about 80 millimeters. As shown, the narrow-area illumination beam is usable beginning at 40 mm from the light transmission/imaging window.
Near-Field Wide-Area Illumination Arrays Employed in the Multi-Mode Illumination Subsystem
The near-field wide-area illumination array 28 employed in the LED-Based Multi-Mode Illumination Subsystem 14 is optically designed to illuminate a wide area over a near-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in Fig. 4A1. As will be described in greater detail hereinafter, the near-field wide-area illumination field 25 is automatically generated by the LED-based Multi-Mode Illumination Subsystem 14 in response to: (1) the detection of any object within the near-field of the system by the IR-based Object Presence and Range Detection Subsystem 12; and (2) one or more of the following events, including, for example: (i) failure of the image processor to successfully decode-process a linear bar code symbol during the narrow-area illumination mode; (ii) detection of code elements such as control words associated with a 2-D bar code symbol; and/or (iii) detection of pixel data in the image which indicates that the object was captured in a state of focus.
In general, the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV of the Image Formation and Detection Subsystem 13 are spatially coextensive, and the object detection field spatially overlaps the FOV along the entire working distance of the imaging-based bar code symbol reader. The near-field wide-area illumination field 25, produced in response to one or more of the events described above, illuminates a wide area over a near-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in Fig. 5A, within which an object resides, and a 2-D image of the object can be rapidly captured by all rows of the image sensing array 22, buffered and decode-processed in order to read any 1D or 2-D bar code symbols that may be represented therewithin, at any orientation, and of virtually any bar code symbology. The intensity of the near-field wide-area illumination field during object illumination and image capture operations is determined by how the LEDs associated with the near-field wide-area illumination array 28 are electrically driven by the Multi-Mode Illumination Subsystem 14. The degree to which the LEDs are driven is determined by the intensity of reflected light measured near the image formation plane by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15. If the intensity of reflected light at the photo-detector of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is weak, indicating that the object exhibits low light reflectivity characteristics and that a more intense amount of illumination will need to be produced by the LEDs to ensure sufficient light exposure on the image sensing array 22, then the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 will drive the LEDs more intensely (i.e. at higher operating currents).
Fig. 4D1 shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the wide-area illumination arrays in the illumination subsystem of the present invention. Fig. 4D2 shows the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the near-field wide-area illumination arrays in the Multi-Mode Illumination Subsystem 14. Fig. 4D4 shows the geometrical layout of the LEDs used to implement the near-field wide-area illumination array of the Multi-Mode Illumination Subsystem 14, wherein the illumination beam produced therefrom is aimed by angling the lenses before the LEDs in the near-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem 14. Fig. 4D5 sets forth a set of six illumination profiles for the near-field wide-area illumination fields produced by the near-field wide-area illumination arrays of the illustrative embodiment, taken at 10, 20, 30, 40, 60, and 100 millimeters along the field away from the imaging window (i.e. working distance) of the imaging-based bar code symbol reader 1. These plots illustrate that the spatial intensity of the near-field wide-area illumination field begins to become substantially uniform at about 40 millimeters (i.e. center:edge = 2:1 max).
Far-Field Wide-Area Illumination Arrays Employed in The Multi-Mode Illumination Subsystem
The far-field wide-area illumination array 29 employed in the Multi-Mode LED-based Illumination Subsystem 14 is optically designed to illuminate a wide area over a far-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in Fig. 4A1. As will be described in greater detail hereinafter, the far-field wide-area illumination field 26 is automatically generated by the LED-Based Multi-Mode Illumination Subsystem 14 in response to: (1) the detection of any object within the far-field of the system by the IR-based Object Presence and Range Detection Subsystem 12; and (2) one or more of the following events, including, for example: (i) failure of the image processor to successfully decode-process a linear bar code symbol during the narrow-area illumination mode; (ii) detection of code elements such as control words associated with a 2-D bar code symbol; and/or (iii) detection of pixel data in the image which indicates that the object was captured in a state of focus. In general, the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV 23 of the Image Formation and Detection Subsystem 13 are spatially coextensive, and the object detection field 20 spatially overlaps the FOV 23 along the entire working distance of the imaging-based bar code symbol reader. The far-field wide-area illumination field 26, produced in response to one or more of the events described above, illuminates a wide area over a far-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in Fig. 5A, within which an object resides, and a 2-D image of the object can be rapidly captured (by all rows of the image sensing array 22), buffered and processed in order to read any 1D or 2-D bar code symbols that may be represented therewithin, at any orientation, and of virtually any bar code symbology.
The intensity of the far-field wide-area illumination field during object illumination and image capture operations is determined by how the LEDs associated with the far-field wide-area illumination array 29 are electrically driven by the Multi-Mode Illumination Subsystem 14. The degree to which the LEDs are driven (i.e. measured in terms of junction current) is determined by the intensity of reflected light measured near the image formation plane by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15. If the intensity of reflected light at the photo-detector of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is weak, indicating that the object exhibits low light reflectivity characteristics and that a more intense amount of illumination will need to be produced by the LEDs to ensure sufficient light exposure on the image sensing array 22, then the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 will drive the LEDs more intensely (i.e. at higher operating currents).
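The drive-current behavior described above (a weaker photo-detector reading calls for proportionally harder LED drive, up to a safe maximum operating current) can be sketched as follows. The numeric values and the function name are illustrative assumptions, not figures from the patent:

```python
def led_drive_current_ma(measured_reflected, nominal_ma=20.0,
                         reference_level=100.0, max_ma=60.0):
    """Drive the LEDs harder when the photo-detector reading is weak
    (a low-reflectivity object), clamped to a maximum operating
    current.  All numeric values here are illustrative assumptions."""
    if measured_reflected <= 0:
        # No detectable reflection: drive at the maximum allowed level.
        return max_ma
    # Inverse relationship: half the reflected signal -> twice the drive.
    return min(max_ma, nominal_ma * reference_level / measured_reflected)
```

With these assumed constants, an object returning the reference signal level is illuminated at the nominal current, while darker objects are illuminated at proportionally higher currents until the clamp is reached.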
During both near and far field wide-area illumination modes of operation, the Automatic Light Exposure Measurement and Illumination Control Subsystem (i.e. module) 15 measures and controls the time duration for which the Multi-Mode Illumination Subsystem 14 exposes the image sensing array 22 to narrow-band illumination (e.g. 633 nanometers, with approximately 15 nm bandwidth) during the image capturing/acquisition process, and automatically terminates the generation of such illumination when the computed time duration expires. In accordance with the principles of the present invention, this global exposure control process ensures that each and every acquired image has good contrast and is not saturated, two conditions essential for consistent and reliable bar code reading.
Fig. 4D1 shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the far-field wide-area illumination arrays 29 in the Multi-Mode Illumination Subsystem 14. Fig. 4D2 shows the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the same. Fig. 4D3 shows the plano-convex lenses used before the LEDs in the far-field wide-area illumination arrays in the Multi-Mode Illumination Subsystem 14. Fig. 4D4 shows a layout of LEDs and plano-convex lenses used to implement the far-field wide-area illumination array 29 of the illumination subsystem, wherein the illumination beam produced therefrom is aimed by angling the lenses before the LEDs in the far-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem 14. Fig. 4D6 sets forth a set of three illumination profiles for the far-field wide-area illumination fields produced by the far-field wide-area illumination arrays of the illustrative embodiment, taken at 100, 150 and 220 millimeters along the field away from the imaging window (i.e. working distance) of the imaging-based bar code symbol reader 1, illustrating that the spatial intensity of the far-field wide-area illumination field begins to become substantially uniform at about 100 millimeters. Fig. 4D7 shows a table illustrating a preferred method of calculating the pixel intensity value for the center of the far-field wide-area illumination field produced from the Multi-Mode Illumination Subsystem 14, showing a significant signal strength (greater than 80 DN at the far center field).
Specification Of The Narrow-Band Optical Filter Subsystem Integrated Within The Hand-Supportable Housing Of The Imager Of The Present Invention
As shown in Fig. 5A1, the hand-supportable housing of the bar code reader of the present invention has integrated within its housing narrow-band optical filter subsystem 4, for transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the narrow-band Multi-Mode Illumination Subsystem 14, and rejecting all other optical wavelengths outside this narrow optical band, however generated (e.g. by ambient light sources). As shown, narrow-band optical filter subsystem 4 comprises: red-wavelength reflecting (high-pass) imaging window filter 4A integrated within the light transmission aperture 3 formed on the front face of the hand-supportable housing; and low-pass optical filter 4B disposed before the CMOS image sensing array 22. These optical filters 4A and 4B cooperate to form the narrow-band optical filter subsystem 4 for the purpose described above. As shown in Fig. 5A2, the light transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element 4B indicate that optical wavelengths below 700 nanometers are transmitted therethrough, whereas optical wavelengths above 700 nm are substantially blocked (e.g. absorbed or reflected). As shown in Fig. 5A3, the light transmission characteristics (energy versus wavelength) associated with the high-pass imaging window filter 4A indicate that optical wavelengths above 620 nanometers are transmitted therethrough, thereby producing a red-color appearance to the user, whereas optical wavelengths below 620 nm are substantially blocked (e.g. absorbed or reflected) by optical filter 4A.
During system operation, spectral band-pass filter subsystem 4 greatly reduces the influence of the ambient light which falls upon the CMOS image sensing array 22 during image capturing operations. By virtue of the optical filter of the present invention, an optical shutter mechanism is eliminated in the system. In practice, the optical filter can reject more than 85% of incident ambient light, and in typical environments, the intensity of LED illumination is significantly greater than that of the ambient light on the CMOS image sensing array 22. Thus, while an optical shutter is required in most conventional CMOS imaging systems, the imaging-based bar code reading system of the present invention effectively manages the exposure time of narrow-band illumination onto its CMOS image sensing array 22 by simply controlling the illumination time of its LED-based illumination arrays 27, 28 and 29, using control signals generated by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 and the CMOS image sensing array 22, while controlling illumination thereto by way of the band-pass optical filter subsystem 4 described above. The result is a simple system design, without moving parts, and having a reduced manufacturing cost. While the band-pass optical filter subsystem 4 is shown comprising a high-pass filter element 4A and a low-pass filter element 4B, separated spatially from each other by other optical components along the optical path of the system, subsystem 4 may alternatively be realized as an integrated multi-layer filter structure installed in front of the image formation and detection (IFD) module 13, or before its image sensing array 22, without the use of the high-pass window filter 4A, or with the use thereof so as to obscure viewing within the imaging-based bar code symbol reader while creating an attractive red-colored protective window.
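A minimal numerical model of the cascaded filter pair, assuming idealized cut-on/cut-off edges at the 620 nm and 700 nm band limits stated above (the function names and the step-function idealization are illustrative, not from the patent):

```python
# Illustrative model of how two filter elements cascade into a
# narrow band-pass response; real filters have gradual roll-off.

def high_pass_4a(wavelength_nm, cut_on_nm=620):
    """Idealized red imaging-window filter: transmits above cut-on."""
    return 1.0 if wavelength_nm >= cut_on_nm else 0.0

def low_pass_4b(wavelength_nm, cut_off_nm=700):
    """Idealized low-pass element before the image sensing array."""
    return 1.0 if wavelength_nm <= cut_off_nm else 0.0

def subsystem_4(wavelength_nm):
    """Cascaded transmission: only the ~620-700 nm band passes,
    so 633 nm LED light reaches the array while most ambient
    spectral content is rejected."""
    return high_pass_4a(wavelength_nm) * low_pass_4b(wavelength_nm)
```

Evaluating the model at the 633 nm illumination wavelength yields full transmission, while wavelengths outside the 620-700 nm band are rejected, matching the behavior described for subsystem 4.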
Preferably, the red-color window filter 4A will have substantially planar surface characteristics to avoid focusing or defocusing of light transmitted therethrough during imaging operations.
Specification Of The Automatic Light Exposure Measurement And Illumination Control Subsystem Of The Present Invention
The primary function of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is to control the brightness and contrast of acquired images by (i) measuring light exposure at the image plane of the CMOS image sensing array 22 and (ii) controlling the time duration for which the Multi-Mode Illumination Subsystem 14 illuminates the target object with narrow-band illumination generated from the activated LED illumination array. Thus, the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 eliminates the need for a complex shuttering mechanism for the CMOS-based image sensing array 22. This novel mechanism ensures that the imaging-based bar code symbol reader of the present invention generates non-saturated images with enough brightness and contrast to guarantee fast and reliable image-based bar code decoding in demanding end-user applications.
During object illumination, narrow-band LED-based light is reflected from the target object (at which the hand-supportable bar code reader is aimed) and is accumulated by the CMOS image sensing array 22. Notably, the object illumination process must be carried out for an optimal duration so that the acquired image frame has good contrast and is not saturated. Such conditions are required for the consistent and reliable bar code decoding operation and performance. The Automatic Light Exposure Measurement and Illumination Control Subsystem 15 measures the amount of light reflected from the target object, calculates the maximum time that the CMOS image sensing array 22 should be kept exposed to the actively-driven LED-based illumination array associated with the Multi-Mode Illumination Subsystem 14, and then automatically deactivates the illumination array when the calculated time to do so expires (i.e. lapses).
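The exposure-time computation described above can be sketched as an inverse relationship between the measured reflected-light intensity and the allowed illumination time, clamped to the system's maximum exposure time. The target and limit values below are illustrative assumptions, not figures from the patent:

```python
def exposure_time_ms(measured_intensity, target_exposure=100.0,
                     max_time_ms=10.0):
    """Keep the active LED array on just long enough that the
    accumulated light (intensity x time) reaches the target level,
    never beyond the maximum exposure time allowed by the system.
    All numeric values here are illustrative assumptions."""
    if measured_intensity <= 0:
        # Nothing measured: allow the full maximum exposure window.
        return max_time_ms
    return min(max_time_ms, target_exposure / measured_intensity)
```

Under this sketch, a brightly reflecting object yields a short computed exposure and the illumination array is deactivated early, while a dim object is illuminated for the full maximum window.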
As shown in Fig. 6A of the illustrative embodiment, the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 comprises: a parabolic light-collecting mirror 55 mounted within the head portion of the hand-supportable housing, for collecting narrow-band LED-based light reflected from a central portion of the FOV of the system, which is then transmitted through the narrow-band optical filter subsystem 4, eliminating wide-band spectral interference; a light-sensing device (e.g. photo-diode) 56 mounted at the focal point of the light collection mirror 55, for detecting the filtered narrow-band optical signal focused thereon by the light collecting mirror 55; and electronic circuitry 57 for processing electrical signals produced by the photo-diode 56 indicative of the intensity of detected light exposure levels within the focal plane of the CMOS image sensing array 22. During light exposure measurement operations, incident narrow-band LED-based illumination is gathered from the center of the FOV of the system by the parabolic light collecting mirror 55 and narrow-band filtered by the narrow-band optical filter subsystem 4 before being focused upon the photodiode 56 for intensity detection. The photo-diode 56 converts the detected light signal into an electrical signal having an amplitude which directly corresponds to the intensity of the collected light signal.
As shown in Fig. 6B, the System Control Subsystem 19 generates an illumination array selection control signal which determines which LED illumination array (i.e. the narrow-area illumination array 27, or the near-field or far-field wide-area illumination array 28 or 29) will be selectively driven at any instant in time of system operation by the LED Array Driver Circuitry 64 in the Automatic Light Exposure Measurement and Illumination Control Subsystem 15. As shown, electronic circuitry 57 processes the electrical signal from photo-detector 56 and generates an auto exposure control signal for the selected LED illumination array. In turn, this auto exposure control signal is provided to the LED array driver circuitry 64, along with an illumination array selection control signal from the System Control Subsystem 19, for selecting and driving (i.e. energizing) one or more LED illumination array(s) so as to generate visible illumination at a suitable intensity level and for a suitable time duration, so that the CMOS image sensing array 22 automatically detects digital high-resolution images of illuminated objects, with sufficient contrast and brightness, while achieving the global exposure control objectives of the present invention disclosed herein. As shown in Figs. 6B and 7C3, the illumination array selection control signal is generated by the System Control Subsystem 19 in response to (i) reading the system mode configuration parameters from the system mode configuration parameter table 70, shown in Fig. 2A1, for the programmed mode of system operation at hand, and (ii) detecting the output from the automatic IR-based Object Presence and Range Detection Subsystem 12.
Notably, in the illustrative embodiment, there are three possible LED-based illumination arrays 27, 28 and 29 which can be selected for activation by the System Control Subsystem 19, and the upper and/or lower LED subarrays in illumination arrays 28 and 29 can be selectively activated or deactivated on a subarray-by-subarray basis, for various purposes taught herein, including automatic specular reflection noise reduction during wide-area image capture modes of operation.
Each one of these illumination arrays can be driven to different states depending on the auto-exposure control signal generated by electronic signal processing circuit 57, which will generally be a function of object distance, object surface reflectivity and the ambient light conditions sensed at photo-detector 56, and measured by signal processing circuit 57. The operation of signal processing circuitry 57 will now be detailed below. As shown in Fig. 6B, the narrow-band filtered optical signal that is produced by the parabolic light focusing mirror 55 is focused onto the photo-detector D1 56, which generates an analog electrical signal whose amplitude corresponds to the intensity of the detected optical signal. This analog electrical signal is supplied to the signal processing circuit 57 for various stages of processing. The first step of processing involves converting the analog electrical signal from a current-based signal to a voltage-based signal, which is achieved by passing it through a constant-current source buffer circuit 58, realized by one half of transistor Q1 (58). This inverted voltage signal is then buffered by the second half of the transistor Q1 (58) and is supplied as a first input to a summing junction 59. As shown in Fig. 7C, the CMOS image sensing array 22 produces, as output, a digital electronic rolling shutter (ERS) pulse signal 60, wherein the duration of this ERS pulse signal 60 is fixed to the maximum exposure time allowed in the system. The ERS pulse signal 60 is buffered through transistor Q2 61 and forms the other input to the summing junction 59. The outputs from transistors Q1 and Q2 thus form the inputs to the summing junction 59. A capacitor C5 is provided on the output of the summing junction 59 and provides a minimum integration time sufficient to reduce any voltage overshoot in the signal processing circuit 57. The output signal across the capacitor C5 is further processed by a comparator U1 62.
In the illustrative embodiment, the comparator reference voltage signal is set to 1.7 volts. This reference voltage signal sets the minimum threshold level for the light exposure measurement circuit 57. The output signal from the comparator 62 is inverted by inverter U3 63 to provide a positive logic pulse signal which is supplied, as the auto exposure control signal, to the input of the LED array driver circuit 64 shown in Fig. 7C.
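The comparator stage can be modeled logically as follows: the auto exposure control signal remains asserted only while the ERS pulse is active and the integrated light signal is still below the 1.7 V reference. This is a behavioral sketch, not a circuit simulation, and the function name is illustrative:

```python
def auto_exposure_pulse(light_voltage, ers_active, threshold_v=1.7):
    """Behavioral sketch of the comparator/inverter stage: the
    positive-logic auto exposure control signal is asserted while
    the ERS pulse is active and the integrated light signal has not
    yet reached the 1.7 V reference threshold."""
    return ers_active and (light_voltage < threshold_v)
```

Once the integrated signal crosses the reference (or the ERS pulse ends, bounding the maximum exposure time), the output de-asserts and the LED array driver circuit 64 terminates illumination.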
As will be explained in greater detail below, the LED array driver circuit 64 shown in Fig. 7C automatically drives an activated LED illumination array, and the operation of LED array driver circuit 64 depends on the mode of operation in which the Multi-Mode Illumination Subsystem 14 is configured. In turn, the mode of operation in which the Multi-Mode Illumination Subsystem 14 is configured at any moment in time will typically depend on (i) the state of operation of the Object Presence and Range Detection Subsystem 12 and (ii) the programmed mode of operation in which the entire Imaging-Based Bar Code Symbol Reading System is configured using system mode configuration parameters read from Table 70 shown in Fig. 2A1.
As shown in Fig. 7C, the LED array driver circuit 64 comprises analog and digital circuitry which receives two input signals: (i) the auto exposure control signal from signal processing circuit 57; and (ii) the illumination array selection control signal. The LED array driver circuit 64 generates, as output, digital pulse-width modulated (PWM) drive signals provided to either the narrow-area illumination array 27, the upper and/or lower LED subarrays employed in the near-field wide-area illumination array 28, and/or the upper and/or lower LED subarrays employed in the far-field wide-area illumination array 29. Depending on the mode of system operation in which the imaging-based bar code symbol reader has been configured, the LED array driver circuit 64 will drive one or more of the above-described LED illumination arrays during object illumination and imaging operations. As will be described in greater detail below, when all rows of pixels in the CMOS image sensing array 22 are in a state of integration (and thus have a common integration time), such LED illumination array(s) are automatically driven by the LED array driver circuit 64 at an intensity and for a duration computed (in an analog manner) by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15, so as to capture digital images having good contrast and brightness, independent of the light intensity of the ambient environment and the relative motion of the target object with respect to the imaging-based bar code symbol reader.
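The driver's two-input behavior can be sketched as a mapping from the selection and auto exposure signals onto per-array PWM duty cycles. The dictionary keys and the clamping behavior below are illustrative assumptions, not details from the patent:

```python
def led_drive_commands(selected_array, auto_exposure_duty):
    """Sketch of LED array driver circuit 64: route the auto exposure
    control signal (expressed here as a PWM duty cycle in [0, 1]) to
    the single array named by the selection control signal; all other
    arrays remain off.  Names and clamping are illustrative."""
    duty = max(0.0, min(1.0, auto_exposure_duty))  # clamp to [0, 1]
    drives = {"narrow_27": 0.0, "near_wide_28": 0.0, "far_wide_29": 0.0}
    key = {"narrow": "narrow_27",
           "near_wide": "near_wide_28",
           "far_wide": "far_wide_29"}[selected_array]
    drives[key] = duty
    return drives
```

For example, selecting the near-field wide-area array with an 80% duty cycle drives array 28 alone, leaving arrays 27 and 29 de-energized, which mirrors the one-array-at-a-time selection described above.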
Global Exposure Control Method Of The Present Invention Carried Out Using The CMOS Image Sensing Array
In the illustrative embodiment, the CMOS image sensing array 22 is operated in its Single Frame Shutter Mode (i.e. rather than its Continuous Frame Shutter Mode) as shown in Fig. 6D, and employs a novel exposure control method which ensures that all rows of pixels in the CMOS image sensing array 22 have a common integration time, thereby capturing high quality images even when the object is in a state of high speed motion. This novel exposure control technique shall be referred to as "the global exposure control method" of the present invention, and the flow chart of Figs. 6E1 and 6E2 describes clearly and in great detail how this method is implemented in the imaging-based bar code symbol reader of the illustrative embodiment. The global exposure control method will now be described in detail below.
As indicated at Block A in Fig. 6E1, Step A in the global exposure control method involves selecting the single frame shutter mode of operation for the CMOS image sensing array provided within an imaging-based bar code symbol reading system employing an automatic light exposure measurement and illumination control subsystem, a multi-mode illumination subsystem, and a system control subsystem integrated therewith, and image formation optics providing the CMOS image sensing array with a field of view into a region of space where objects to be imaged are presented.
As indicated in Block B in Fig. 6E1, Step B in the global exposure control method involves using the automatic light exposure measurement and illumination control subsystem to continuously collect illumination from a portion of the field of view, detect the intensity of the collected illumination, and generate an electrical analog signal corresponding to the detected intensity, for processing.
As indicated in Block C in Fig. 6E1, Step C in the global exposure control method involves activating (e.g. by way of the system control subsystem 19 or directly by way of trigger switch 2C) the CMOS image sensing array so that its rows of pixels begin to integrate photonically generated electrical charge in response to the formation of an image onto the CMOS image sensing array by the image formation optics of the system.
As indicated in Block D in Fig. 6E1, Step D in the global exposure control method involves the CMOS image sensing array 22 automatically (i) generating an electronic rolling shutter (ERS) digital pulse signal when all rows of pixels in the image sensing array are operated in a state of integration, and (ii) providing this ERS pulse signal to the Automatic Light Exposure Measurement And Illumination Control Subsystem 15 so as to activate light exposure measurement and illumination control functions/operations therewithin.
As indicated in Block E in Fig. 6E2, Step E in the global exposure control method involves, upon activation of light exposure measurement and illumination control functions within Subsystem 15, (i) processing the electrical analog signal being continuously generated therewithin, (ii) measuring the light exposure level within a central portion of the field of view 23 (determined by light collecting optics 55 shown in Fig. 6A), and (iii) generating an auto-exposure control signal for controlling the generation of visible field of illumination from at least one LED-based illumination array (27, 28 and/or 29) in the Multi-Mode Illumination Subsystem 14 which is selected by an illumination array selection control signal produced by the System Control Subsystem 19.
Finally, as indicated at Block F in Fig. 6E2, Step F in the global exposure control method involves using (i) the auto exposure control signal and (ii) the illumination array selection control signal to drive the selected LED-based illumination array(s) and illuminate the field of view of the CMOS image sensing array 22 in whatever image capture mode it may be configured, precisely when all rows of pixels in the CMOS image sensing array are in a state of integration, as illustrated in Fig. 6D, thereby ensuring that all rows of pixels in the CMOS image sensing array have a common integration time. By enabling all rows of pixels in the CMOS image sensing array 22 to have a common integration time, high-speed "global exposure control" is effectively achieved within the imaging-based bar code symbol reader of the present invention, and consequently, high quality images are captured independent of the relative motion between the bar code symbol reader and the target object.
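The six-step method above (Blocks A through F) can be sketched in simplified form as follows. This is an illustrative model only, not the patented analog circuitry: the function names, the averaging of the photodetector samples, and the linear clamp used to derive the drive strength are all assumptions made for clarity.

```python
def measure_light_level(analog_samples):
    """Steps E(i)-(ii) (assumed reduction): collapse the continuously
    sampled photodetector signal into one exposure level for the
    central portion of the field of view."""
    return sum(analog_samples) / len(analog_samples)

def auto_exposure_control_signal(light_level, target_level=0.5):
    """Step E(iii) (assumed model): derive a drive strength in [0, 1];
    dimmer ambient light calls for stronger LED illumination."""
    return max(0.0, min(1.0, target_level / max(light_level, 1e-6)))

def global_exposure_cycle(analog_samples, selected_array, all_rows_integrating):
    """Steps D-F: the selected LED array is driven only while the ERS
    pulse indicates every pixel row is integrating, which is what gives
    all rows a common integration time."""
    if not all_rows_integrating:      # no ERS pulse yet: do not fire LEDs
        return None
    level = measure_light_level(analog_samples)
    drive = auto_exposure_control_signal(level)
    return {"array": selected_array, "drive_strength": drive}
```

In this sketch the illumination array selection control signal is represented by `selected_array`, and the ERS pulse of Step D by the boolean `all_rows_integrating`.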
Specification Of The IR-Based Automatic Object Presence And Range Detection Subsystem Employed In The Hand-Supportable Digital Image-Based Bar Code Reading Device Of The Present Invention
As shown in Fig. 8A, the IR-wavelength based Automatic Object Presence and Range Detection Subsystem 12 is realized in the form of a compact optics module 76 mounted on the front portion of optics bench 6, as shown in Fig. 1J.
As shown in Fig. 7, the object presence and range detection module 12 of the illustrative embodiment comprises a number of subcomponents, namely: an optical bench 77 having an ultra-small footprint for supporting the optical and electro-optical components used to implement the subsystem 12; at least one IR laser diode 78 mounted on the optical bench 77, for producing a low power IR laser beam 79; IR beam shaping optics 80, supported on the optical bench for shaping the IR laser beam (e.g. into a pencil-beam like geometry) and directing the same into the central portion of the object detection field 20 defined by the field of view (FOV) of IR light collection/focusing optics 81 supported on the optical bench 77; an amplitude modulation (AM) circuit 82 supported on the optical bench 77, for modulating the amplitude of the IR laser beam produced from the IR laser diode at a frequency f0 (e.g. 75 MHz) with up to 7.5 milliwatts of optical power; an optical detector (e.g. an avalanche-type IR photo-detector) 83, mounted at the focal point of the IR light collection/focusing optics 81, for receiving the IR optical signal reflected off an object within the object detection field, and converting the received optical signal 84 into an electrical signal 85; an amplifier and filter circuit 86, mounted on the optical bench 77, for isolating the f0 signal component and amplifying it; a limiting amplifier 87, mounted on the optical bench, for maintaining a stable signal level; a phase detector 88, mounted on the optical bench 77, for mixing the reference signal component f0 from the AM circuit 82 and the received signal component f0 reflected from the packages, and producing a resulting signal which is equal to a DC voltage proportional to the cosine of the phase difference between the reference and the reflected f0 signals; an amplifier circuit 89, mounted on the optical bench 77, for amplifying the phase difference signal; a received signal strength indicator (RSSI) 90, mounted on the optical bench 77, for producing a voltage proportional to the LOG of the signal reflected from the target object, which can be used to provide additional information; a reflectance level threshold analog multiplexer 91 for rejecting information from weak signals; a 12-bit A/D converter 92, mounted on the optical bench 77, for converting the DC voltage signal from the RSSI circuit 90 into a sequence of time-based range data elements {Rn}, taken at discrete time instants nT, where each range data element Rn provides a measure of the distance of the object referenced from (i) the IR laser diode 78 to (ii) a point on the surface of the object within the object detection field 20; and range analysis circuitry 93 described below.
In general, the function of range analysis circuitry 93 is to analyze the digital range data from the A/D converter 92 and generate two control activation signals, namely: (i) an "object presence detection" type of control activation signal A1A, indicating simply whether an object is present in or absent from the object detection field, regardless of the mode of operation in which the Multi-Mode Illumination Subsystem 14 might be configured; and (ii) a "near-field/far-field" range indication type of control activation signal A1B, indicating whether a detected object is located in either the predefined near-field or far-field portions of the object detection field, which correspond to the near-field and far-field portions of the FOV of the Multi-Mode Image Formation and Detection Subsystem 13.
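The phase-measurement ranging principle and the two control activation signals described above can be sketched numerically. This is an illustrative model under stated assumptions: the conversion d = c·Δφ/(4π·f0) is the standard continuous-wave phase-shift ranging relation (not a formula quoted from the patent), and the boundary distances separating "absent", "far-field", and "near-field" are hypothetical values chosen for the example.

```python
import math

C = 3.0e8    # speed of light, m/s
F0 = 75e6    # AM modulation frequency from the text (75 MHz)

def phase_to_distance(phase_rad):
    """Round-trip phase shift of the f0 envelope -> one-way distance.
    d = c * phase / (4*pi*f0); unambiguous only up to c/(2*f0) = 2 m
    at 75 MHz, which comfortably covers a hand-held reader's FOV."""
    return C * phase_rad / (4 * math.pi * F0)

def range_analysis(range_samples, near_far_boundary_m=0.5,
                   absent_threshold_m=2.0):
    """Emit the two control activation signals described in the text:
    A1A (object present/absent) and A1B (near-field vs far-field).
    Thresholds are illustrative assumptions."""
    d = sum(range_samples) / len(range_samples)
    a1a = d < absent_threshold_m                 # object presence
    a1b = ("near" if d < near_far_boundary_m else "far") if a1a else None
    return {"A1A": a1a, "A1B": a1b, "distance_m": d}
```

A phase shift of π radians at 75 MHz, for instance, corresponds to a one-way distance of 1 m in this model.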
Various kinds of analog and digital circuitry can be designed to implement the IR-based Automatic Object Presence and Range Detection Subsystem 12. Alternatively, this subsystem can be realized using various kinds of range detection techniques as taught in US Patent No. 6,637,659, incorporated herein by reference in its entirety.
In the illustrative embodiment, Automatic Object Presence and Range Detection Subsystem 12 operates as follows. In System Modes of Operation requiring automatic object presence and/or range detection, Automatic Object Presence and Range Detection Subsystem 12 will be activated at system start-up and operational at all times of system operation, typically continuously providing the System Control Subsystem 19 with information about the state of objects within both the far and near portions of the object detection field 20 of the imaging-based symbol reader. In general, this Subsystem detects two basic states of presence and range, and therefore has two basic states of operation. In its first state of operation, the IR-based automatic Object Presence and Range Detection Subsystem 12 automatically detects an object within the near-field region of the FOV 20, and in response thereto generates a first control activation signal which is supplied to the System Control Subsystem 19 to indicate the occurrence of this first fact. In its second state of operation, the IR-based automatic Object Presence and Range Detection Subsystem 12 automatically detects an object within the far-field region of the FOV 20, and in response thereto generates a second control activation signal which is supplied to the System Control Subsystem 19 to indicate the occurrence of this second fact. As will be described in greater detail throughout this patent specification, these control activation signals are used by the System Control Subsystem 19 during particular stages of the system control process, such as determining (i) whether to activate the near-field and/or far-field LED illumination arrays, and (ii) how strongly these LED illumination arrays should be driven to ensure quality image exposure at the CMOS image sensing array 22.
Specification Of The Mapping Of Pixel Data Captured By The Imaging Array Into The SDRAM Under The Control Of The Direct Memory Access (DMA) Module Within The Microprocessor
As shown in Fig. 8, the CMOS image sensing array 22 employed in the digital imaging-based bar code symbol reading device hereof is operably connected to its microprocessor 36 through FIFO 39 (realized by way of an FPGA) and the system bus shown in Fig. 2M. As shown, SDRAM 38 is also operably connected to the microprocessor 36 by way of the system bus, thereby enabling the mapping of pixel data captured by the CMOS image sensing array 22 into the SDRAM 38 under the control of the direct memory access (DMA) module within the microprocessor 36.
Referring to Fig. 9, details will now be given on how the bytes of pixel data captured by CMOS image sensing array 22 are automatically mapped (i.e. captured and stored) into the addressable memory storage locations of its SDRAM 38 during each image capture cycle carried out within the hand-supportable imaging-based bar code reading device of the present invention.
In the implementation of the illustrative embodiment, the CMOS image sensing array 22 sends 7-bit gray-scale data bytes over a parallel data connection to FPGA 39, which implements a FIFO using its internal SRAM. The FIFO 39 stores the pixel data temporarily, and the microprocessor 36 initiates a DMA transfer from the FIFO (which is mapped to address 0X0C000000, chip select 3) to the SDRAM 38. In general, modern microprocessors have internal DMA modules, and in a preferred microprocessor design, the DMA module contains a 32-byte buffer. Without consuming any CPU cycles, the DMA module can be programmed to read data from the FIFO 39, store the read data bytes in the DMA's buffer, and subsequently write the data to the SDRAM 38. Alternatively, a DMA module can reside in FPGA 39 to directly write the FIFO data into the SDRAM 38. This is done by sending a bus request signal to the microprocessor 36, so that the microprocessor 36 releases control of the bus to the FPGA 39, which then takes over the bus and writes data into the SDRAM 38.
Below, a brief description will be given of where pixel data output from the CMOS image sensing array 22 is stored in the SDRAM 38, and how the microprocessor 36 (i.e. implementing a decode algorithm) accesses such stored pixel data bytes. Fig. 9F represents the memory space of the SDRAM 38. A reserved memory space of 1.3 MB is used to store the output of the CMOS image sensing array 22. This memory space is a 1:1 mapping of the pixel data from the CMOS image sensing array 22: each byte represents a pixel in the image sensing array 22, so the memory space is a mirror image of the sensor's pixel data. Thus, when the decode program running on microprocessor 36 accesses the memory, it is as if it is accessing the raw pixel image of the image sensing array 22. No time code is needed to track the data, since the modes of operation of the bar code reader guarantee that the microprocessor 36 is always accessing up-to-date data, and the pixel data sets are a true representation of the last optical exposure. To prevent data corruption, i.e. new data coming in while old data are still being processed, the reserved space is protected by disabling further DMA access once a whole frame of pixel data is written into memory. The DMA module is not re-enabled until either the microprocessor 36 has finished going through its memory, or a timeout has occurred.
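The frame-protection logic described above (disable DMA once a whole frame is written; re-enable only after the decoder finishes or a timeout occurs) can be sketched as a small state machine. The class and method names are hypothetical, and the frame size of 1280 × 1024 bytes is an assumption consistent with the 1.3 MB reserved space and 1280-byte rows mentioned in this section.

```python
class FrameBuffer:
    FRAME_BYTES = 1280 * 1024       # assumed full-frame size (~1.3 MB)

    def __init__(self):
        self.dma_enabled = True
        self.bytes_written = 0

    def dma_write(self, n):
        """Accept up to n bytes from the FIFO; lock the buffer once a
        whole frame has landed, so new data cannot corrupt old data."""
        if not self.dma_enabled:
            return 0                # old frame still being decoded
        n = min(n, self.FRAME_BYTES - self.bytes_written)
        self.bytes_written += n
        if self.bytes_written == self.FRAME_BYTES:
            self.dma_enabled = False    # protect the completed frame
        return n

    def decode_done_or_timeout(self):
        """Decoder finished (or timed out): release the buffer for
        the next frame."""
        self.dma_enabled = True
        self.bytes_written = 0
```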
During image acquisition operations, the image pixels are sequentially read out of the image sensing array 22. Although one may choose to read out the data either column-wise or row-wise for some CMOS image sensors, without loss of generality, the row-by-row read out of the data is preferred. The pixel image data set is arranged in the SDRAM 38 sequentially, starting at address 0XA0EC0000. Randomly accessing any pixel in the SDRAM 38 is then a straightforward matter: the pixel at row y, column x is located at address (0XA0EC0000 + y × 1280 + x).
As each image frame always has a frame start signal out of the image sensing array 22, that signal can be used to start the DMA process at address 0XA0EC0000, and the address is continuously incremented for the rest of the frame. The reading of each image frame is always started at address 0XA0EC0000 to avoid any misalignment of data. Notably, however, if the microprocessor 36 has programmed the CMOS image sensing array 22 to have a ROI window, then the starting address will be modified to (0XA0EC0000 + 1280 × R1), where R1 is the row number of the top left corner of the ROI.
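The two addressing formulas above can be expressed directly. The base address and row width are taken from the text; the function names are illustrative.

```python
BASE = 0xA0EC0000    # start of the reserved pixel space in SDRAM (from the text)
ROW_WIDTH = 1280     # pixels (one byte each) per row, from the text

def pixel_address(x, y):
    """Address of the pixel at row y, column x in the 1:1 SDRAM mapping."""
    return BASE + y * ROW_WIDTH + x

def frame_start_address(roi_top_row=0):
    """DMA start address for a frame; shifted down by whole rows when an
    ROI window whose top-left corner is at row R1 has been programmed
    into the sensor."""
    return BASE + ROW_WIDTH * roi_top_row
```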
Specification Of The Three-Tier Software Architecture Of The Hand-Supportable Digital Image-Based Bar Code Reading Device Of The Present Invention
As shown in Fig. 10, the hand-supportable digital imaging-based bar code symbol reading device of the present invention 1 is provided with a three-tier software architecture comprising the following software modules: (1) the Main Task module, the CodeGate Task module, the Metroset Task module, the Application Events Manager module, the User Commands Table module, the Command Handler module, the Plug-In Controller (Manager) and Plug-In Libraries and Configuration Files, each residing within the Application layer of the software architecture; (2) the Tasks Manager module, the Events Dispatcher module, the Input/Output Manager module, the User Commands Manager module, the Timer Subsystem module, the Input/Output Subsystem module and the Memory Control Subsystem module, each residing within the System Core (SCORE) layer of the software architecture; and (3) the Linux Kernel module, the Linux File System module, and Device Drivers modules, each residing within the Linux Operating System (OS) layer of the software architecture. While the operating system layer of the imaging-based bar code symbol reader is based upon the Linux operating system, it is understood that other operating systems can be used (e.g. Microsoft Windows, Mac OS X, Unix, etc.), and that the design preferably provides for independence between the main Application Software Layer and the Operating System Layer, and therefore enables the Application Software Layer to be potentially transported to other platforms. Moreover, the system design principles of the present invention provide extensibility of the system to other future products with extensive reuse of common software components, which should make the design of such products easier, decrease their development time, and ensure their robustness.
In the illustrative embodiment, the above features are achieved through the implementation of an event-driven, multi-tasking, potentially multi-user Application layer running on top of the System Core software layer, called SCORE. The SCORE layer is statically linked with the product Application software, and therefore runs in the Application Level or layer of the system. The SCORE layer provides a set of services to the Application in such a way that the Application would not need to know the details of the underlying operating system, although all operating system APIs are, of course, available to the application as well. The SCORE software layer provides a real-time, event-driven, OS-independent framework for the product Application to operate. The event-driven architecture is achieved by creating a means for detecting events (usually, but not necessarily, when hardware interrupts occur) and posting the events to the Application for processing in a real-time manner. The event detection and posting is provided by the SCORE software layer. The SCORE layer also provides the product Application with a means for starting and canceling the software tasks, which can be running concurrently; hence, the multi-tasking nature of the software system of the present invention.
Specification of Software Modules Within The SCORE Layer Of The System Software Architecture Employed In Imaging-Based Bar Code Reader Of The Present Invention
The SCORE layer provides a number of services to the Application layer.
The Tasks Manager provides a means for executing and canceling specific application tasks (threads) at any time during the product Application run.
The Events Dispatcher provides a means for signaling and delivering all kinds of internal and external synchronous and asynchronous events.
When events occur, synchronously or asynchronously to the Application, the Events Dispatcher dispatches them to the Application Events Manager, which acts on the events accordingly as required by the Application based on its current state. For example, based on the particular event and current state of the application, the Application Events Manager can decide to start a new task, or stop currently running task, or do something else, or do nothing and completely ignore the event.
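The dispatch-and-decide flow just described can be sketched as follows. This is a minimal illustrative model, not the SCORE implementation: the class names mirror the module names in the text, but the states, the recorded action strings, and the synchronous delivery are all assumptions made for the example.

```python
class ApplicationEventsManager:
    """Decides per event and current state: start a task, stop a
    running task, or ignore the event entirely (as the text describes)."""

    def __init__(self):
        self.state = "idle"
        self.actions = []          # record of decisions, for illustration

    def on_event(self, event):
        if event == "SCORE_EVENT_TRIG_ON" and self.state == "idle":
            self.state = "scanning"
            self.actions.append("start MAIN task")
        elif event == "SCORE_EVENT_TRIG_OFF" and self.state == "scanning":
            self.state = "idle"
            self.actions.append("cancel MAIN task")
        else:
            self.actions.append("ignore " + event)

class EventsDispatcher:
    """Delivers every signaled event to the Application Events Manager."""

    def __init__(self, manager):
        self.manager = manager

    def dispatch(self, event):
        self.manager.on_event(event)   # delivered synchronously here
```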
The Input/Output Manager provides a means for monitoring the activities of input/output devices and signaling appropriate events to the Application when such activities are detected. The Input/Output Manager software module runs in the background, monitors the activities of external devices and user connections, and signals appropriate events to the Application Layer when such activities are detected. The Input/Output Manager is a high-priority thread that runs in parallel with the Application and reacts to the input/output signals coming asynchronously from the hardware devices, such as the serial port, user trigger switch 2C, bar code reader, network connections, etc. Based on these signals and optional input/output requests (or lack thereof) from the Application, it generates appropriate system events, which are delivered through the Events Dispatcher to the Application Events Manager as quickly as possible, as described above.
The User Commands Manager provides a means for managing user commands; it utilizes the User Commands Table provided by the Application and executes the appropriate User Command Handler based on the data entered by the user.
The Input/Output Subsystem software module provides a means for creating and deleting input/output connections and communicating with external systems and devices.
The Timer Subsystem provides a means of creating, deleting, and utilizing all kinds of logical timers.
The Memory Control Subsystem provides an interface for managing the multi-level dynamic memory within the device, fully compatible with standard dynamic memory management functions, as well as a means for buffering collected data. The Memory Control Subsystem provides a means for thread-level management of dynamic memory. The interfaces of the Memory Control Subsystem are fully compatible with standard C memory management functions. The system software architecture is designed to provide connectivity of the device to potentially multiple users, which may have different levels of authority to operate the device.
The User Commands Manager provides a standard way of entering user commands and executing the application modules responsible for handling them. Each user command described in the User Commands Table is a task that can be launched by the User Commands Manager per user input, but only if the particular user's authority matches the command's level of security.
The Events Dispatcher software module provides a means of signaling and delivering events to the Application Events Manager, including the starting of a new task, stopping a currently running task, or doing something or nothing and simply ignoring the event.
Fig. 12B provides a Table listing examples of System-Defined Events which can occur and be dispatched within the hand-supportable digital imaging-based bar code symbol reading device of the present invention, namely: SCORE_EVENT_POWER_UP, which signals the completion of system start-up and involves no parameters; SCORE_EVENT_TIMEOUT, which signals the timeout of a logical timer and involves the parameter "pointer to timer id"; SCORE_EVENT_UNEXPECTED_INPUT, which signals that unexpected input data is available and involves the parameter "pointer to connection id"; SCORE_EVENT_TRIG_ON, which signals that the user pulled the trigger and involves no parameters; SCORE_EVENT_TRIG_OFF, which signals that the user released the trigger and involves no parameters; SCORE_EVENT_OBJECT_DETECT_ON, which signals that an object is positioned under the bar code reader and involves no parameters; SCORE_EVENT_OBJECT_DETECT_OFF, which signals that the object is removed from the field of view of the bar code reader and involves no parameters; SCORE_EVENT_EXIT_TASK, which signals the end of task execution and involves the pointer UTID; and SCORE_EVENT_ABORT_TASK, which signals the aborting of a task during execution.
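The event table of Fig. 12B can be summarized as a simple mapping from event name to its parameter (or None when the event carries no parameters). Rendering it as a data structure is an illustrative choice; the original table defines only the names and parameters.

```python
# System-defined SCORE events (Fig. 12B) mapped to their parameters.
# None means the event involves no parameters.
SCORE_EVENTS = {
    "SCORE_EVENT_POWER_UP":          None,
    "SCORE_EVENT_TIMEOUT":           "pointer to timer id",
    "SCORE_EVENT_UNEXPECTED_INPUT":  "pointer to connection id",
    "SCORE_EVENT_TRIG_ON":           None,
    "SCORE_EVENT_TRIG_OFF":          None,
    "SCORE_EVENT_OBJECT_DETECT_ON":  None,
    "SCORE_EVENT_OBJECT_DETECT_OFF": None,
    "SCORE_EVENT_EXIT_TASK":         "pointer UTID",
    "SCORE_EVENT_ABORT_TASK":        None,
}
```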
The imaging-based bar code symbol reading device of the present invention provides the user with a command-line interface (CLI), which can work over the standard communication lines, such as RS232, available in the bar code reader. The CLI is used mostly for diagnostic purposes, but can also be used for configuration purposes in addition to the MetroSet® and MetroSelect® programming functionalities. To send commands to the bar code reader utilizing the CLI, a user must first enter the User Command Manager by typing in a special character, which could actually be a combination of multiple and simultaneous keystrokes, such as Ctrl and S for example. Any standard and widely available software communication tool, such as Windows HyperTerminal, can be used to communicate with the bar code reader. The bar code reader acknowledges its readiness to accept commands by sending a prompt, such as "MTLG>", back to the user. The user can now type in any valid Application command. To quit the User Command Manager and return the scanner back to its normal operation, a user must enter another special character, which could actually be a combination of multiple and simultaneous keystrokes, such as Ctrl and R for example.
An example of a valid command is the "Save Image" command, which is used to upload an image from the bar code reader's memory to the host PC. This command has the following CLI format: save [ filename [ compr ] ], where
(1) save is the command name.
(2) filename is the name of the file the image gets saved in. If omitted, the default filename is "image.bmp".
(3) compr is the compression number, from 0 to 10. If omitted, the default compression number is 0, meaning no compression. The higher the compression number, the higher the image compression ratio and the faster the image transmission, but the more distorted the image becomes.
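A parser for the "save" command syntax and defaults described above could look like the following sketch. The function name and error handling are illustrative assumptions; the command format and default values come from the text.

```python
def parse_save_command(line):
    """Parse 'save [ filename [ compr ] ]', applying the documented
    defaults: filename 'image.bmp', compression number 0 (none)."""
    parts = line.split()
    if not parts or parts[0] != "save":
        raise ValueError("not a save command")
    filename = parts[1] if len(parts) > 1 else "image.bmp"
    compr = int(parts[2]) if len(parts) > 2 else 0
    if not 0 <= compr <= 10:
        raise ValueError("compression number must be 0..10")
    return filename, compr
```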
The imaging-based bar code symbol reader of the present invention can have numerous commands. All commands are described in a single table (User Commands Table shown in Fig. 10) contained in the product Applications software layer. For each valid command, the appropriate record in the table contains the command name, a short description of the command, the command type, and the address of the function that implements the command.
When a user enters a command, the User Command Manager looks for the command in the table. If found, it executes the function the address of which is provided in the record for the entered command. Upon return from the function, the User Command Manager sends the prompt to the user indicating that the command has been completed and the User Command Manager is ready to accept a new command.
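The table-driven lookup and dispatch just described can be sketched as follows. The record fields mirror those listed in the text (command name, description, type, handler); the handler body, the prompt strings, and the dictionary representation are illustrative assumptions.

```python
def cmd_save(args):
    """Hypothetical handler for the 'save' command."""
    return "saved " + (args[0] if args else "image.bmp")

# Each record holds the command name, a short description, the command
# type, and the handler function (the 'address of the function' in C terms).
USER_COMMANDS_TABLE = {
    "save": {"description": "upload an image to the host PC",
             "type": "diagnostic", "handler": cmd_save},
}

def user_command_manager(line):
    """Look the command up in the table; if found, run its handler and
    send the prompt to indicate readiness for the next command."""
    name, *args = line.split()
    record = USER_COMMANDS_TABLE.get(name)
    if record is None:
        return "MTLG> unknown command"
    record["handler"](args)
    return "MTLG> "
```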
Specification of Software Modules Within The Application Layer Of The System Software Architecture Employed In Imaging-Based Bar Code Reader Of The Present Invention
The image processing software employed within the system hereof performs its bar code reading function by locating and recognizing the bar codes within the frame of a captured image comprising pixel data. The modular design of the image processing software provides a rich set of image processing functions, which could be utilized in the future for other potential applications, related or not related to bar code symbol reading, such as: optical character recognition (OCR) and verification (OCV); reading and verifying directly marked symbols on various surfaces; facial recognition and other biometric identification; etc.
The CodeGate Task, in an infinite loop, performs the following task. It illuminates a "thin" narrow horizontal area at the center of the field-of-view (FOV) and acquires a digital image of that area. It then attempts to read bar code symbols represented in the captured frame of image data, using the image processing software facilities supported by the Image-Processing Bar Code Symbol Reading Subsystem 17 of the present invention, to be described in greater detail hereinafter. If a bar code symbol is successfully read, then Subsystem 17 saves the decoded data in the special Decode Data Buffer. Otherwise, it clears the Decode Data Buffer. Then, it continues the loop. The CodeGate Task routine never exits on its own. It can be canceled by other modules in the system when reacting to other events. For example, when a user pulls the trigger switch 2C, the event TRIGGER_ON is posted to the application. The Application software responsible for processing this event checks if the CodeGate Task is running, and if so, cancels it and then starts the Main Task. The CodeGate Task can also be canceled upon an OBJECT_DETECT_OFF event, posted when the user moves the bar code reader away from the object, or when the user moves the object away from the bar code reader. The CodeGate Task routine is enabled (with the Main Task) when "semi-automatic-triggered" system modes of programmed operation (Modes of System Operation Nos. 11-14 in Fig. 17A) are to be implemented on the illumination and imaging platform of the present invention.
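One iteration of the CodeGate loop described above can be sketched as follows. This is a simplified, non-looping model: the capture and decode facilities of Subsystem 17 are represented by caller-supplied stand-in functions, and the real task repeats this body until it is cancelled by an event such as TRIGGER_ON or OBJECT_DETECT_OFF.

```python
# Stand-in for the special Decode Data Buffer described in the text.
decode_data_buffer = []

def codegate_iteration(capture_narrow_area, try_decode):
    """One pass of the CodeGate loop: illuminate and image the thin
    horizontal strip, attempt a decode, and update the Decode Data
    Buffer (saving decoded data on success, clearing it otherwise)."""
    frame = capture_narrow_area()
    symbol = try_decode(frame)
    decode_data_buffer.clear()
    if symbol is not None:
        decode_data_buffer.append(symbol)   # available to the Main Task
    return symbol is not None
```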
The Narrow-Area Illumination Task illustrated in Fig. 13M is a simple routine which is enabled (with the Main Task) when "manually-triggered" system modes of programmed operation (Modes of System Operation Nos. 1-5 in Fig. 17A) are to be implemented on the illumination and imaging platform of the present invention. However, this routine is never enabled simultaneously with the CodeGate Task. As shown in the event flow chart of Fig. 13D, either the CodeGate Task or the Narrow-Area Illumination Task is enabled with the Main Task routine to realize the diverse kinds of system operation described herein.
Depending on the System Mode in which the imaging-based bar code symbol reader is configured, the Main Task will typically perform differently, but within the limits described in Fig. 13J. For example, when the imaging-based bar code symbol reader is configured in the Programmable Mode of System Operation No. 12 (i.e. Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode) to be described in greater detail hereinafter, the Main Task first checks if the Decode Data Buffer contains data decoded by the CodeGate Task. If so, then it immediately sends the data out to the user by executing the Data Output procedure and exits. Otherwise, in a loop, the Main Task does the following: it illuminates an entire area of the field-of-view and acquires a full-frame image of that area. It attempts to read a bar code symbol in the captured image. If it successfully reads a bar code symbol, then it immediately sends the data out to the user by executing the Data Output procedure and exits. Otherwise, it continues the loop. Notably, upon successful read and prior to executing the Data Output procedure, the Main Task analyzes the decoded data for a "reader programming" command or a sequence of commands. If necessary, it executes the MetroSelect functionality. The Main Task can be canceled by other modules within the system when reacting to other events. For example, the bar code reader of the present invention can be re-configured using standard Metrologic configuration methods, such as MetroSelect® and MetroSet®. The MetroSelect functionality is executed during the Main Task.
The MetroSet functionality is executed by the special MetroSet Task. When the Focus RS232 software driver detects a special NULL-signal on its communication lines, it posts the METROSET_ON event to the Application. The Application software responsible for processing this event starts the MetroSet task. Once the MetroSet Task is completed, the scanner returns to its normal operation.
The function of the Plug-In Controller (i.e. Manager) is to read configuration files, find plug-in libraries within the Plug-In and Configuration File Library, and install plug-ins into the memory of the operating system, which returns to the Plug-In Manager an address indicating where each plug-in has been installed, for future access. As will be described in greater detail hereinafter, the Plug-In Development Platform supports development of plug-ins that enhance, extend and/or modify the features and functionalities of the image-processing based bar code symbol reading system, and, once developed, the uploading of such plug-ins into the file system of the operating system layer, while storing the addresses of such plug-ins within the Plug-In and Configuration File Library in the Application Layer.
Modes of System Operation Nos. 6-10 in Fig. 17A can be readily implemented on the illumination and imaging platform of the present invention by making the following software system modifications: (1) an Auto-Read Task routine would be added to the system routine library (wherein the Auto-Read Task could be an infinite loop routine in which the primary operations of the CodeGate Task and Main Task are sequenced together, attempting first automatic narrow-area illumination and image capture and processing, followed by automatic wide-area illumination and image capture and processing, and repeating the wide-area operation in an infinite loop until the object is no longer detected within a particular predetermined time period); and (2) modifying the query block "Is CodeGate Task or Narrow-Area Illumination Task Enabled?" in the Object_Detect_On event handling routine shown in Fig. 13D to further ask whether the "Auto-Read Task Routine is enabled", and on the "Yes" control path, providing a block which starts the "Auto-Read Task" and then advances control to Return.
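Under these assumptions, the proposed Auto-Read Task sequencing could be sketched as follows (a hypothetical condensation; the callbacks for narrow/wide read attempts and object detection stand in for the real subsystem calls, and the attempt cap stands in for the predetermined time period):

```python
def auto_read_task(read_narrow, read_wide, object_present, max_wide_attempts=10):
    """One narrow-area read attempt, then repeated wide-area attempts
    while the object remains detected in the FOV."""
    data = read_narrow()                     # narrow-area capture/decode first
    if data is not None:
        return data
    attempts = 0
    while object_present() and attempts < max_wide_attempts:
        data = read_wide()                   # wide-area capture/decode loop
        if data is not None:
            return data
        attempts += 1
    return None                              # object left FOV or attempts exhausted
```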
Software Modules Within The Operating System Layer Of The System Software Architecture Employed In Imaging-Based Bar Code Reader Of The Present Invention
As shown in Fig. 12I, the Device Driver software modules include: trigger drivers for establishing a software connection with the hardware-based manually-actuated trigger switch 2C employed on the imaging-based bar code symbol reader of the present invention; an image acquisition driver for implementing image acquisition functionality aboard the imaging-based bar code symbol reader; and an IR driver for implementing object detection functionality aboard the imaging-based bar code symbol reader.
Basic System Operations Supported By The Three-Tier Software Architecture Of The Hand- Supportable Digital Imaging-Based Bar Code Reading Device Of The Present Invention
In Figs. 13A through 13L, the basic system operations supported by the three-tier software architecture of the digital imaging-based bar code symbol reader of the present invention are schematically depicted. Notably, these basic operations represent functional modules (or building blocks) within the system architecture of the present invention, which can be combined in various combinations to implement the numerous Programmable Modes of System Operation listed in Fig. 23 and described in detail below, using the image acquisition and processing platform disclosed herein. For purposes of illustration, and to avoid obfuscation of the present invention, these basic system operations will be described below with reference to Programmable Mode of System Operation "No. 12: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Manual Or Automatic Modes Of the Multi-Mode Bar Code Reading Subsystem 17".
Fig. 13A shows the basic operations carried out within the System Core Layer of the system when the user points the bar code reader towards a bar code symbol on an object. Such operations include the IR device drivers enabling automatic detection of the object within the field, and the waking up of the Input/Output Manager software module. As shown in Fig. 13B, the Input/Output Manager then posts the SCORE_OBJECT_DETECT_ON event to the Events Dispatcher software module in response to detecting an object. Then as shown in Fig. 13C, the Events Dispatcher software module passes the SCORE_OBJECT_DETECT_ON event to the Application Layer. Upon receiving the SCORE_OBJECT_DETECT_ON event at the Application Layer, the Application Events Manager executes an event handling routine (shown in Fig. 13D) which activates the narrow-area (linear) illumination array 27 (i.e. during narrow-area illumination and image capture modes), and then, depending on whether the presentation mode has been selected and whether the CodeGate Task or Narrow-Area Illumination Mode has been enabled during system configuration, this event handling routine executes either the Main Task described in Fig. 13J, the CodeGate Task described in Fig. 13E, or the Narrow-Area Illumination Task described in Fig. 13M. As shown in the flow chart of Fig. 13D, the system event handling routine first involves determining whether the Presentation Mode has been selected (i.e. enabled); then the event handling routine determines whether the CodeGate Task or Narrow-Area Illumination routines have been enabled (with Main Task). If the CodeGate Task has been enabled, then the Application Layer starts the CodeGate Task. If the Narrow-Area Illumination Task has been enabled, then the Application Layer starts the Narrow-Area Illumination Task, as shown.
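The driver → Input/Output Manager → Events Dispatcher → Application Events Manager chain can be illustrated with a toy dispatcher; the dict-of-handlers mechanism is an assumption made purely for illustration, though the event name matches the specification:

```python
class EventsDispatcher:
    """Toy model of the System Core Layer's Events Dispatcher: the
    Application Layer registers handlers, the Input/Output Manager posts."""
    def __init__(self):
        self.handlers = {}                   # event name -> Application handler

    def register(self, event, handler):
        self.handlers[event] = handler

    def post(self, event):                   # called by the Input/Output Manager
        handler = self.handlers.get(event)
        return handler(event) if handler else None

dispatcher = EventsDispatcher()
log = []
# Application Events Manager routine for object detection:
dispatcher.register("SCORE_OBJECT_DETECT_ON",
                    lambda e: log.append("start narrow-area illumination"))
# IR driver wakes the I/O Manager, which posts the event:
dispatcher.post("SCORE_OBJECT_DETECT_ON")
```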
As shown in Fig. 13E, the Application Layer executes the CodeGate Task by first activating the narrow-area image capture mode in the Multi-Mode Image Formation and Detection Subsystem 13 (i.e. by enabling a few middle rows of pixels in the CMOS sensor array 22), and then acquiring/capturing a narrow image at the center of the FOV of the Bar Code Reader. CodeGate Task then performs image processing operations on the captured narrow-area image using No-Finder Module which has been enabled by the selected Programmable Mode of System Operation No. 12. If the image processing method results in a successful read of a bar code symbol, then the Codegate Task saves the decoded symbol character data in the Codegate Data Buffer; and if not, then the task clears the Codegate Data Buffer, and then returns to the main block of the Task where image acquisition reoccurs.
As shown in Fig. 13F, when the user pulls the trigger switch 2C on the bar code reader while the CodeGate Task is executing, the trigger switch driver in the OS Layer automatically wakes up the Input/Output Manager at the System Core Layer. As shown in Fig. 13G, the Input/Output Manager, in response to being woken up by the trigger device driver, posts the SCORE_TRIGGER_ON event to the Events Dispatcher, also in the System Core Layer. As shown in Fig. 13H, the Events Dispatcher then passes on the SCORE_TRIGGER_ON event to the Application Events Manager at the Application Layer. As shown in Figs. 13I1 and 13I2, the Application Events Manager responds to the SCORE_TRIGGER_ON event by invoking a handling routine (Trigger On Event) within the Task Manager at the System Core Layer.
As shown in the flow chart of Figs. 13I1 and 13I2, the routine determines whether the Presentation Mode (i.e. Programmed Mode of System Operation No. 10) has been enabled, and if so, then the routine exits. If the routine determines that the Presentation Mode (i.e. Programmed Mode of System Operation No. 10) has not been enabled, then it determines whether the CodeGate Task is running; if it is running, then the routine first cancels the CodeGate Task, then deactivates the narrow-area illumination array 27 associated with the Multi-Mode Illumination Subsystem 14, and thereafter executes the Main Task. If however the routine determines that the CodeGate Task is not running, then it determines whether the Narrow-Area Illumination Task is running, and if it is not running, then the Main Task is started. However, if the Narrow-Area Illumination Task is running, then the routine increases the narrow-area illumination beam to full power and acquires a narrow-area image at the center of the field of view of the system, then attempts to read the bar code in the captured narrow-area image. If the read attempt is successful, then the decoded (symbol character) data is saved in the Decode Data Buffer, the Narrow-Area Illumination Task is canceled, the narrow-area illumination beam is stopped, and the routine starts the Main Task, as shown. If the read attempt is unsuccessful, then the routine clears the Decode Data Buffer, the Narrow-Area Illumination Task is canceled, the narrow-area illumination beam is stopped, and the routine starts the Main Task, as shown.
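The branching of this Trigger On Event handling routine can be summarized in a hedged sketch (the state flags and the action log are hypothetical; the real routine manipulates actual tasks and the illumination subsystem):

```python
def on_trigger(state, actions):
    """Condensed Trigger On Event decision logic of Figs. 13I1/13I2."""
    if state.get("presentation_mode"):
        return "exit"                        # Mode No. 10 enabled: ignore trigger
    if state.get("codegate_running"):
        actions.append("cancel CodeGate Task")
        actions.append("deactivate narrow-area illumination")
        actions.append("start Main Task")
        return "main"
    if not state.get("narrow_illum_running"):
        actions.append("start Main Task")    # neither task running
        return "main"
    # Narrow-Area Illumination Task running: full-power narrow read first,
    # saving or clearing the Decode Data Buffer before Main Task starts.
    actions.append("full-power narrow-area read attempt")
    actions.append("cancel Narrow-Area Illumination Task")
    actions.append("start Main Task")
    return "main"
```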
As shown in Fig. 13M, the Narrow-Area Illumination Task routine is an infinite loop routine that simply keeps a narrow-area illumination beam produced and directed at the center of the field of view of the system in a recursive manner (e.g. typically at half power or less in comparison with the full-power narrow-area illumination beam produced during the running of the CodeGate Task).
As shown in Fig. 13J, the first step performed in the Main Task by the Application Layer is to determine whether CodeGate Data is currently available (i.e. stored in the Decode Data Buffer); if such data is available, then the Main Task directly executes the Data Output Procedure described in Fig. 13K. However, if the Main Task determines that no such data is currently available, then it starts the Read Timeout Timer and acquires a wide-area image of the detected object within the time frame permitted by the Read Timeout Timer. Notably, this wide-area image acquisition process involves carrying out the following operations, namely: (i) first activating the wide-area illumination mode in the Multi-Mode Illumination Subsystem 14 and the wide-area capture mode in the CMOS image formation and detection module; (ii) determining whether the object resides in the near-field or far-field portion of the FOV (through object range measurement by the IR-based Object Presence and Range Detection Subsystem 12); (iii) then activating either the near-field illumination array 28 or the far-field illumination array 29 (or possibly both 28 and 29 in special programmed cases) to illuminate the object in the near-field or far-field portion of the FOV, at an intensity and duration determined by the Automatic Light Exposure Measurement and Control Subsystem 15; while (iv) sensing the spatial intensity of light imaged onto the CMOS image sensing array 22 in accordance with the Global Exposure Control Method of the present invention, described in detail hereinabove.
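Steps (i)-(iv) of this wide-area acquisition process might be condensed as follows, with each subsystem operation passed in as a callback (all names here are illustrative, not the actual subsystem API):

```python
def acquire_wide_area_image(set_wide_modes, in_near_field, measure_exposure,
                            drive_array, capture):
    """Condensed sketch of wide-area acquisition steps (i)-(iv)."""
    set_wide_modes()                         # (i) wide-area illumination + capture modes
    array = 28 if in_near_field() else 29    # (ii)-(iii) near-/far-field array choice
    drive_array(array, measure_exposure())   # (iii) intensity/duration from Subsystem 15
    return capture()                         # (iv) sense light at CMOS array 22
```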
Then the Main Task performs image processing operations on the captured image using either the Manual, ROI-Specific or Automatic Modes of operation (although it is understood that other image-processing based reading methods taught herein, such as OmniScan, as well as other suitable alternative decoding algorithms/processes not disclosed herein, can be used depending on which Programmed Mode of System Operation has been selected by the end user for the imaging-based bar code symbol reader of the present invention). Notably, in the illustrative embodiment shown in Fig. 13J, the time duration of each image acquisition/processing frame is set by the Start Read Timeout Timer and Stop Read Timeout Timer blocks shown therein, and within Programmed Mode of System Operation No. 12, the Main Task will support repeated (i.e. multiple) attempts to read a single bar code symbol so long as the trigger switch 2C is manually depressed by the operator and a single bar code has not yet been read. Upon successfully reading a (single) bar code symbol, the Main Task will then execute the Data Output Procedure. Notably, in other Programmed Modes of System Operation, in which a single attempt at reading a bar code symbol is enabled, the Main Task will be modified accordingly to support such system behavior. In such a case, an alternatively named Main Task (e.g. Main Task No. 2) would be executed to enable the required system behavior during run-time.
It should also be pointed out at this juncture that it is possible to enable and utilize several different kinds of symbol reading methods during the Main Task, and to apply particular reading methods based on the computational results obtained while processing the narrow-area image during the CodeGate Task, and/or while preprocessing the captured wide-area image during one of the image acquiring/processing frames or cycles running in the Main Task. The main point to be made here is that the selection and application of image-processing based bar code reading methods will preferably occur through the selective activation of the different modes available within the multi-mode image-processing based bar code symbol reading Subsystem 17, in response to information learned about the graphical intelligence represented within the structure of the captured image, and that such dynamic selection should occur in accordance with principles of dynamic adaptive learning commonly used in advanced image processing systems, speech understanding systems, and the like. This general approach is in marked contrast with the approaches used in prior art imaging-based bar code symbol readers, wherein permitted methods of bar code reading are pre-selected based on statically defined modes selected by the end user, and not in response to detected conditions discovered in captured images on a real-time basis.
As shown in Fig. 13K, the first step carried out by the Data Output Procedure, called in the Main Task, involves determining whether the symbol character data generated by the Main Task is for programming the bar code reader or not. If the data is not for programming the bar code symbol reader, then the Data Output Procedure sends the data out according to the bar code reader system configuration, and then generates the appropriate visual and audio indication to the operator, and then exits the procedure. If the data is for programming the bar code symbol reader, then the Data Output Procedure sets the appropriate elements of the bar code reader configuration (file) structure, and then saves the Bar Code Reader Configuration Parameters in non-volatile RAM (i.e. NOVRAM). The Data Output Procedure then reconfigures the bar code symbol reader and then generates the appropriate visual and audio indication to the operator, and then exits the procedure. As shown in Fig. 13L, decoded data is sent from the Input/Output Module at the System Core Layer to the Device Drivers within the Linux OS Layer of the system.
Wide-Area Illumination Control Method For Use During The Main Task System Control Routine So As To Illuminate Objects With Wide-Area Illumination In A Manner Which Substantially Reduces Specular-Type Reflection At The CMOS Image Sensing Array Of The Bar Code Symbol Reader
Referring to Figs. 13N1 through 13N3, the method of illuminating objects without specular reflection, according to the present invention, will now be described in detail. This control routine can be called during the acquisition of the wide-area image step in the Main Task routine, shown in Fig. 13J.

As indicated at Step A in Fig. 13N1, the first step of the illumination control method involves using the Automatic Light Exposure Measurement And Illumination Control Subsystem 15 to measure the ambient light level to which the CMOS image sensing array 22 is exposed prior to commencing each illumination and imaging cycle within the Bar Code Symbol Reading System.
As indicated at Step B, the illumination control method involves using the Automatic IR-based Object Presence and Range Detection Subsystem 12 to measure the presence and range of the object in either the near or far field portion of the field of view (FOV) of the System.
As indicated at Step C, the illumination control method involves using the detected range and the measured light exposure level to drive both the upper and lower LED illumination subarrays associated with either the near-field wide-area illumination array 28 or far-field wide-area illumination array 29.
As indicated at Step D, the illumination control method involves capturing a wide-area image at the CMOS image sensing array 22 using the illumination field produced during Step C.
As indicated at Step E, the illumination control method involves rapidly processing the captured wide-area image during Step D to detect the occurrence of high spatial-intensity levels in the captured wide-area image, indicative of a specular reflection condition.
As indicated at Step F, the illumination control method involves determining if a specular reflection condition is detected in the processed wide-area image, and if so then driving only the upper LED illumination subarray associated with either the near-field or far-field wide-area illumination array. Also, if a specular reflection condition is not detected in the processed wide-area image, then the detected range and the measured light exposure level is used to drive both the upper and lower LED subarrays associated with either the near-field or far-field wide-area illumination array.
As indicated at Step G, the illumination control method involves capturing a wide-area image at the CMOS image sensing array 22 using the illumination field produced during Step F.
As indicated at Step H, the illumination control method involves rapidly processing the captured wide-area image during Step G to detect the occurrence of high spatial-intensity levels in the captured wide-area image, indicative of a specular reflection condition.
As indicated at Step I, the illumination control method involves determining if a specular reflection condition is still detected in the processed wide-area image, and if so, then driving the other LED subarray associated with either the near-field or far-field wide-area illumination array. If a specular reflection condition is not detected in the processed wide-area image, then the detected range and the measured light exposure level are used to drive the same LED illumination subarray (as in Step C) associated with either the near-field wide-area illumination array 28 or far-field wide-area illumination array 29. As indicated at Step J, the illumination control method involves capturing a wide-area image at the CMOS image sensing array using the illumination field produced during Step I.
As indicated at Step K, the illumination control method involves rapidly processing the captured wide-area image during Step J to detect the absence of high spatial-intensity levels in the captured wide-area image, confirming the elimination of the earlier detected specular reflection condition.
As indicated at Step L, the illumination control method involves determining if no specular reflection condition is detected in the processed wide-area image at Step K, and if none is detected, then the wide-area image is processed using the mode(s) selected for the Multi-Mode Image-Processing Bar Code Reading Subsystem 17. If a specular reflection condition is still detected in the processed wide-area image, then the control process returns to Step A and repeats Steps A through K, as described above.
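Steps A through L amount to a retry loop over three illumination configurations (both subarrays, the upper subarray only, then the other subarray). A condensed, assumption-laden sketch follows; capture(config) and is_specular(img) are placeholders for the Subsystem 13/15 operations, and the loop cap stands in for the restart at Step A:

```python
def antispecular_capture(capture, is_specular, max_cycles=3):
    """Change which LED subarray drives the wide-area illumination until
    a captured frame is free of specular reflection."""
    configs = ("both", "upper", "lower")     # Steps C, F, I subarray choices
    for _ in range(max_cycles):              # Step L: return to Step A if needed
        for cfg in configs:
            img = capture(cfg)               # Steps D, G, J: capture wide-area image
            if not is_specular(img):         # Steps E, H, K: test for specularity
                return img, cfg              # hand image to Subsystem 17
    return None, None                        # persistent specular condition
```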
Specification Of Symbologies And Modes Supported By The Multi-Mode Bar Code Symbol Reading Subsystem Module Employed Within The Hand-Supportable Digital Image-Based Bar Code Reading Device Of The Present Invention
Fig. 14 lists the various bar code symbologies supported by the Multi-Mode Bar Code Symbol Reading Subsystem 17 employed within the hand-supportable digital imaging-based bar code symbol reading device of the present invention. As shown therein, these bar code symbologies include: Code 128; Code 39; I2of5; Code 93; Codabar; UPC/EAN; Telepen; UK-Plessey; Trioptic; Matrix 2of5; Airline 2of5; Straight 2of5; MSI-Plessey; Code 11; and PDF417.
Specification of the Various Modes Of Operation in the Multi-Mode Bar Code Symbol Reading Subsystem of the Present Invention
As shown in Fig. 15, the Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem 17 of the illustrative embodiment supports five primary modes of operation, namely: the Automatic Mode of Operation; the Manual Mode of Operation; the ROI-Specific Mode of Operation; the No-Finder Mode of Operation; and Omniscan Mode of Operation. As will be described in greater detail herein, various combinations of these modes of operation can be used during the lifecycle of the image-processing based bar code reading process of the present invention.
Fig. 16 is an exemplary flow chart representation showing the steps involved in setting up and cleaning up the software sub-Application entitled "Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem 17", once called from either (i) the CodeGate Task software module at the Block entitled "READ BAR CODE(S) IN CAPTURED NARROW-AREA IMAGE" indicated in Fig. 13E, or (ii) the Main Task software module at the Block entitled "READ BAR CODE(S) IN CAPTURED WIDE-AREA IMAGE" indicated in Fig. 13J.
The Automatic Mode of Multi-Mode Bar Code Symbol Reading Subsystem
In its Automatic Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically start processing a captured frame of digital image data, prior to the complete buffering thereof, so as to search for one or more bar codes represented therein in an incremental manner, and to continue searching until the entire image is processed.
This mode of image-based processing enables bar code locating and reading when no prior knowledge about the location of, or the orientation of, or the number of bar codes that may be present within an image, is available. In this mode of operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 starts processing the image from the top-left corner and continues until it reaches the bottom-right corner, reading any potential bar codes as it encounters them.
The Manual Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
In its Manual Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data, starting from the center or sweep spot of the image at which the user would have aimed the bar code reader, so as to search for (i.e. find) at least one bar code symbol represented therein. Unlike the Automatic Mode, this is done by searching in a helical manner through frames or blocks of extracted image feature data, and then marking the same and image-processing the corresponding raw digital image data until a bar code symbol is recognized/read within the captured frame of image data.
This mode of image processing enables bar code locating and reading when the maximum number of bar codes that could be present within the image is known a priori and when portions of the primary bar code have a high probability of spatial location close to the center of the image. The Multi- Mode Bar Code Symbol Reading Subsystem 17 starts processing the image from the center, along rectangular strips progressively further from the center and continues until either the entire image has been processed or the programmed maximum number of bar codes has been read.
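The center-outward strip ordering described above can be modeled by sorting image blocks by their Chebyshev distance from the central block, so that each successive "ring" forms a rectangular strip progressively further from the center (the block grid and the distance metric are illustrative choices, not the patented algorithm itself):

```python
def center_out_order(cols, rows):
    """Return image-block coordinates ordered from the center block
    outward, ring by ring (ring = rectangular strip around the center)."""
    cx, cy = cols // 2, rows // 2
    blocks = [(x, y) for y in range(rows) for x in range(cols)]
    # Ring index = Chebyshev distance from the center block; a stable
    # sort keeps blocks within a ring in raster order.
    return sorted(blocks, key=lambda b: max(abs(b[0] - cx), abs(b[1] - cy)))
```

Processing would then visit blocks in this order until the whole image is covered or the programmed maximum number of bar codes has been read.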
The ROI-Specific Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
In its ROI-Specific Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data, starting from the region of interest (ROI) in the captured image, specified by coordinates acquired during a previous mode of operation within the Multi-Mode Bar Code Symbol Reading Subsystem 17. Unlike the Manual Mode, this is done by analyzing the received ROI-specified coordinates, derived during either a previous No-Finder Mode, Automatic Mode, or Omniscan Mode of operation, then immediately processing image feature data, and image-processing the corresponding raw digital image data until a bar code symbol is recognized/read within the captured frame of image data. Thus, typically, the ROI-Specific Mode is used in conjunction with other modes of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
This mode of image processing enables bar code locating and reading when the maximum number of bar codes that could be present within the image is known a priori and when portions of the primary bar code have a high probability of spatial location close to the specified ROI in the image. The Multi-Mode Bar Code Symbol Reading Subsystem starts processing the image from these initially specified image coordinates, then progressively further in a helical manner from the ROI-specified region, and continues until either the entire image has been processed or the programmed maximum number of bar codes has been read.
The No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
In its No-Finder Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured narrow-area (linear) frame of digital image data, without the feature extraction and marking operations used in the Automatic, Manual and ROI-Specific Modes, so as to read one or more bar code symbols represented therein.
This mode enables bar code reading when it is known, a priori, that the image contains at most one (1-dimensional) bar code symbol, portions of which have a high likelihood of spatial location close to the center of the image, and when the bar code is known to be oriented at zero degrees relative to the horizontal axis. Notably, this is typically the case when the bar code reader is used in a hand-held mode of operation, where the bar code symbol reader is manually pointed at the bar code symbol to be read. In this mode, the Multi-Mode Bar Code Symbol Reading Subsystem 17 starts at the center of the image, skips all bar code location steps, and filters the image at zero (0) degrees and 180 degrees relative to the horizontal axis. Using the "bar-and-space-count" data generated by the filtration step, it reads the potential bar code symbol.
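The "bar-and-space-count" data produced by the filtration step can be illustrated as a run-length encoding of a binarized scan line through the image center (the binarization threshold and the actual symbol decode are outside this sketch):

```python
def bar_space_counts(scanline):
    """Reduce a 0/1 scan line (1 = bar, 0 = space) to run lengths of
    alternating bars and spaces; return (first_value, run_lengths)."""
    if not scanline:
        return None, []
    runs, current = [], 1
    for prev, cur in zip(scanline, scanline[1:]):
        if cur == prev:
            current += 1                 # extend the current bar/space run
        else:
            runs.append(current)         # element boundary: emit the run
            current = 1
    runs.append(current)                 # emit the final run
    return scanline[0], runs
```

A decoder would then match these run-length ratios against the symbology's encodation patterns.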
The Omni-Scan Mode of the Multi-Mode Bar Code Reading Subsystem
In its Omniscan Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data along any one or more predetermined virtual scan line orientations, without feature extraction and marking operations used in the Automatic, Manual and ROI-Specific Modes, so as to read a single bar code symbol represented in the processed image.
This mode enables bar code reading when it is known, a priori, that the image contains at most one (1-dimensional) bar code, portions of which have a high likelihood of spatial location close to the center of the image but which could be oriented in any direction. The Multi-Mode Bar Code Symbol Reading Subsystem 17 starts at the center of the image, skips all bar code location steps, and filters the image at different start-pixel positions and at different scan-angles. Using the bar-and-space-count data generated by the filtration step, the Omniscan Mode reads the potential bar code symbol.
Programmable Modes Of Bar Code Reading Operation Within The Hand-Supportable Digital Image- Based Bar Code Reading Device Of The Present Invention
As indicated in Fig. 26, the imaging-based bar code symbol reader of the present invention has at least seventeen (17) Programmable System Modes of Operation, namely: Programmed Mode of System Operation No. 1—Manually-Triggered Single-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 2—Manually-Triggered Multiple-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 3—Manually-Triggered Single-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode And The Automatic Or Manual Modes of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 4—Manually-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode And The Automatic Or Manual Modes of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 5—Manually-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode And The Automatic Or Manual Modes of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 6—Automatically-Triggered Single-Attempt 1D Single-Read Mode Employing The No-Finder Mode Of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 7—Automatically-Triggered Multi-Attempt 1D Single-Read Mode Employing The No-Finder Mode Of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 8—Automatically-Triggered Multi-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode and Manual and/or Automatic Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 9—Automatically-Triggered Multi-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode and Manual and/or Automatic Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmable Mode of System Operation No. 10—Automatically-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The Manual, Automatic or Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmed Mode of System Operation No. 11—Semi-Automatic-Triggered Single-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Automatic Or Manual Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmable Mode of System Operation No. 12—Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Automatic Or Manual Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmable Mode of Operation No. 13—Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode And The Automatic Or Manual Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmable Mode of Operation No. 14—Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode And The Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmable Mode of Operation No. 15—Continuously-Automatically-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The Automatic, Manual Or Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem; Programmable Mode of System Operation No. 16—Diagnostic Mode Of Imaging-Based Bar Code Reader Operation; and Programmable Mode of System Operation No. 17—Live Video Mode Of Imaging-Based Bar Code Reader Operation.
Preferably, these Modes of System Operation can be programmed by reading a sequence of bar code symbols from a programming menu as taught, for example, in US Patent No. 6,565,005, which describes a bar code scanner programming technology developed by Metrologic Instruments, Inc., and marketed under the name MetroSelect® Single Line Configuration Programming Method.
These Programmable System Modes of Operation will be described in detail hereinbelow. Alternatively, the MetroSet® Graphical User Interface (GUI) can be used to view and change configuration parameters in the bar code symbol reader using a PC. Alternatively, a Command Line Interface (CLI) may also be used to view and change configuration parameters in the bar code symbol reader.
Each of these programmable modes of bar code reader operation shall now be described in greater detail with reference to the other components of the system that are configured together to implement the same in accordance with the principles of the present invention.
Overview of the Imaging-Based Bar Code Reader Start-Up Operations
When the bar code reader hereof boots up, its FPGA is automatically programmed with 12.5/50/25 MHz clock firmware, and all required device drivers are also installed automatically. Login to the Operating System is also performed automatically for the user "root", and the user is automatically directed to the /root/ directory. For nearly all programmable modes of system operation employing automatic object detection, the IR object detection software driver is installed automatically. Also, for all Programmable System Modes of operation employing the narrow-area illumination mode, the narrow-area illumination software drivers are automatically installed, so that a Pulse Width Modulator (PWM) is used to drive the narrow-area LED-based illumination array 27. To start bar code reader operation, the operating system first changes to the /tmp/ directory ("cd /tmp"), and then runs the focusapp program located in the /root/ directory. The /root/ directory is located in Flash ROM, whereas the /tmp/ directory is located in RAM; /tmp/ is made the current directory so that captured images in transition to the host are stored there.
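Purely for illustration, the start-up sequence described above can be sketched as follows. Only the two directory paths are taken from the specification; the run() callable and the function name start_reader are hypothetical stand-ins for launching the application process.

```python
import os

# Illustrative sketch only: make /tmp/ (RAM) the current directory, so that
# captured images in transition to the host are buffered there, then launch
# the focusapp program stored in the /root/ directory (Flash ROM).

def start_reader(run):
    os.chdir("/tmp")          # RAM-backed working directory for image buffering
    run("/root/focusapp")     # focusapp resides in Flash ROM under /root/
```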
Operating The Hand-Supportable Image-Processing Bar Code Symbol Reader of the Present Invention in a Manually-Triggered Mode of Operation
The hand-supportable image-processing bar code symbol reader of the present invention can be programmed to operate in any one of a number of different "manually-triggered" modes of system operation, as identified in Nos. 1 through 5 in Fig. 17A. However, during each of these manually-triggered modes of operation, the image-processing bar code symbol reader controls and coordinates its subsystem components in accordance with a generalized method of manually-triggered operation.
In particular, upon automatic detection of an object within its IR-based object detection field, the IR-based object presence detection subsystem automatically generates an object detection event, and in response thereto, the multi-mode LED-based illumination subsystem automatically produces a narrow-area field of narrow-band illumination within the FOV of said image formation and detection subsystem.
Then, upon the generation of the trigger event by the user depressing the manually-actuatable trigger, the following operations are automatically carried out:
(i) the image capturing and buffering subsystem automatically captures and buffers a narrow-area digital image of the object using the narrow-area field of narrow-band illumination within the FOV, during the narrow-area image capture mode of said multi-mode image formation and detection subsystem; and
(ii) the image-processing bar code symbol reading subsystem automatically processes the narrow-area digital image in an effort to read a 1D bar code symbol represented therein, and upon successfully decoding a 1D bar code symbol therein, automatically produces symbol character data representative thereof.
Then, upon said multi-mode image-processing bar code symbol reading subsystem failing to successfully read the 1D bar code symbol represented in the narrow-area digital image, the following operations are automatically carried out:
(i) the multi-mode LED-based illumination subsystem automatically produces a wide-area field of narrow-band illumination within the FOV of the multi-mode image formation and detection subsystem,
(ii) the image capturing and buffering subsystem captures and buffers a wide-area digital image during the wide-area image capture mode of the image capturing and buffering subsystem, and
(iii) the image-processing bar code symbol reading subsystem processes the wide-area digital image in an effort to read a 1D or 2D bar code symbol represented therein, and upon successfully decoding a 1D or 2D bar code symbol therein, automatically produces symbol character data representative thereof.
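By way of illustration only, the generalized narrow-area-then-wide-area method set forth above may be sketched as follows. Every function name is a hypothetical stand-in for the corresponding subsystem operation; each decoder is assumed to return symbol character data on success and None on failure.

```python
# Sketch of the generalized manually-triggered read cycle described above.
# capture_narrow/capture_wide stand in for the image capturing and buffering
# subsystem; decode_1d/decode_1d_2d stand in for the image-processing bar
# code symbol reading subsystem.

def manually_triggered_read_cycle(capture_narrow, decode_1d,
                                  capture_wide, decode_1d_2d):
    # Steps (i)-(ii): narrow-area illumination, capture, and 1D decode.
    data = decode_1d(capture_narrow())
    if data is not None:
        return data
    # Fallback steps (i)-(iii): wide-area illumination, capture, 1D/2D decode.
    return decode_1d_2d(capture_wide())
```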
Programmed Mode of System Operation No. 1: Manually-Triggered Single-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 1 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
During this mode of system operation, when a user pulls the trigger switch 2C, the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17. The captured image is then processed using the No-Finder Mode. If the single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If the single cycle of programmed image processing does not result in a successful reading of a 1D bar code symbol, then the cycle is terminated, all subsystems are deactivated, and the bar code reader returns to its sleep mode of operation and waits for the next event (e.g. manually pulling trigger switch 2C) that will trigger the system into active operation.
Programmed Mode Of System Operation No. 2: Manually-Triggered Multiple-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 2 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
During this mode of system operation, when a user pulls the trigger switch 2C, the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured narrow-area image is then processed using the No-Finder Mode. If the single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem for use by the host system. If the cycle of programmed image processing does not produce a successful read, then the system automatically enables successive cycles of illumination/capture/processing so long as the trigger switch 2C is being pulled, or until the system reads a bar code symbol within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code symbol reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can easily be changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
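The multiple-attempt behavior and the 500 ms default decode timeout just described can be sketched, under assumed function names, roughly as follows. trigger_pulled and attempt_read are hypothetical stand-ins for polling trigger switch 2C and for one complete illumination/capture/processing cycle.

```python
import time

# Illustrative sketch: re-attempt one illumination/capture/processing cycle
# at most every decode_timeout seconds while the trigger remains pulled.

def multiple_attempt_read(trigger_pulled, attempt_read, decode_timeout=0.5):
    while trigger_pulled():
        started = time.monotonic()
        data = attempt_read()
        if data is not None:
            return data               # successful read: report symbol data
        remaining = decode_timeout - (time.monotonic() - started)
        if remaining > 0:
            time.sleep(remaining)     # wait out the decode timeout
    return None                       # trigger released: back to sleep mode
```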
Programmed Mode Of System Operation No. 3: Manually-Triggered Single-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode And The Automatic, Manual Or ROI-Specific Modes of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 3 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmable mode of system operation, the bar code reader is idle (in its sleep mode) until a user points the bar code reader towards an object with a bar code label, and then pulls the trigger switch 2C. When this event occurs, the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14 (i.e. drives the narrow-area illumination array 27), the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured narrow-area image is then processed using the No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17. Then the bar code reader illuminates the target object using both near-field and far-field wide-area illumination, captures a wide-area image of the target object, and launches the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
The captured wide-area image is then processed using the Manual, ROI-Specific or Automatic Mode. If this single cycle of programmed image processing results in the successful reading of a 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the subsystem 19 deactivates all subsystems and then returns to its sleep mode, and waits for an event that will cause it to re-enter its active mode of operation.
Programmed Mode of System Operation No. 4: Manually-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode And The Automatic, Manual Or ROI-Specific Modes of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 4 involves configuration of the system as follows: disabling the IR-based object detection subsystem 12; and enabling the use of manual-trigger activation, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes of the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, when a user pulls the trigger switch 2C, the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured narrow-area image is then processed using the No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17. Then, the bar code reader illuminates the target object using both near-field and far-field wide-area illumination, captures a wide-area image of the target object, and launches the Manual (or Automatic) Mode of the Multi-Mode Bar Code Reading Subsystem. The captured wide-area image is then processed using the Manual Mode of bar code symbol reading.
If this single cycle of programmed processing results in the successful reading of a 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read of a single 1D or 2D bar code symbol, then the Subsystem 19 automatically enables successive cycles of wide-area illumination/wide-area image capture and processing so long as the trigger switch 2C is being pulled, or until the system reads a single 1D or 2D bar code symbol within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can easily be changed by programming. This default decode timeout setting ensures that while the trigger switch is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
Programmed Mode of System Operation No. 5: Manually-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode And The Automatic, Manual Or ROI-Specific Modes of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 5 involves configuration of the system as follows: disabling the IR-based Object Presence and Range Detection Subsystem 12; and enabling the use of manual-trigger activation, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes of the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this mode of system operation, when a user pulls the trigger switch 2C, the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem. Then, the bar code reader illuminates the target object using narrow-area illumination, captures a narrow-area image of the target object, and launches the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem. The captured narrow-area image is then processed using the No-Finder Mode. If this single cycle of programmed processing results in the successful decoding of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17. Then, the bar code reader illuminates the target object using both near-field and far-field wide-area illumination, captures a wide-area image of the target object, and launches the Manual (ROI-Specific and/or Automatic) Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured wide-area image is then processed using the Manual Mode of reading.
If this single cycle of programmed processing results in the successful reading of a 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful reading of one or more 1D and/or 2D bar code symbols, then the system automatically enables successive cycles of wide-area illumination/wide-area image capture/image processing so long as the trigger switch is being pulled, or until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can easily be changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
Programmed Mode of System Operation No. 6: Automatically-Triggered Single-Attempt 1D Single-Read Mode Employing The No-Finder Mode Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 6 involves configuration of the system as follows: disabling the use of manual-trigger activation; and enabling IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode only within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode only in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user where the area targeted by the bar code reader is, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, the system captures/acquires a narrow-area image, which is then processed using the Bar Code Symbol Reading Subsystem 17 configured in its No-Finder Mode of operation. If this single cycle of programmed decode processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates all subsystems, causing the bar code reader to return to its sleep mode of operation and wait for the next event that will trigger the system into active operation.
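For illustration, the single-attempt automatically-triggered cycle just described can be reduced to the following sketch. All names are assumed stand-ins for the subsystems identified above, not part of the specification.

```python
# Sketch: on an IR object-detection event, perform exactly one narrow-area
# illumination/capture/decode attempt. A successful 1D read is forwarded to
# the Input/Output Subsystem; otherwise the reader returns to sleep.

def on_object_detected(capture_narrow, decode_1d, send_to_host):
    data = decode_1d(capture_narrow())   # single No-Finder Mode attempt
    if data is not None:
        send_to_host(data)               # symbol character data to host
        return True                      # successful read
    return False                         # deactivate subsystems; sleep
```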
Programmed Mode of System Operation No. 7: Automatically-Triggered Multi-Attempt 1D Single-Read Mode Employing The No-Finder Mode Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 7 involves configuration of the system as follows: disabling the use of manual-trigger activation; and enabling IR-based Object Presence And Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the bar code reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user where the area targeted by the bar code reader is, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, the system captures/acquires a narrow-area image, which is then processed using the No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful decode, then the system automatically enables successive cycles of narrow-area illumination/narrow-area image capture/processing so long as the object continues to be detected, or until the system reads a single 1D bar code symbol within a captured image of the target object; only thereafter, or when the object is moved out of the FOV of the bar code reader, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can easily be changed by programming. This default decode timeout setting ensures that while the object is being detected by the bar code reader, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
Programmed Mode of System Operation No. 8: Automatically-Triggered Multi-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 8 involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the scanner, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user where the area targeted by the bar code reader is, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, the system captures/acquires a narrow-area image, which is then processed using the No-Finder Mode of operation. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
Then, the bar code symbol reader illuminates the target object using either near-field or far-field wide-area illumination (depending on the detected range of the target object), captures a wide-area image of the target object, and launches the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured wide-area image is then processed using the Manual Mode of reading. If this cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful reading of a single 1D or 2D bar code symbol, then the system automatically enables successive cycles of wide-area illumination/wide-area image capture/processing so long as the target object is being detected, or until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user moves the object out of the FOV of the bar code reader, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can easily be changed by programming. This default decode timeout setting ensures that while the object is being detected by the bar code reader, the bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
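The range-dependent choice between near-field and far-field wide-area illumination mentioned above can be sketched as follows. The threshold value and both names are purely hypothetical placeholders, since the specification does not state the near-field/far-field boundary.

```python
# Illustrative sketch: select which wide-area LED illumination field to
# drive from the object range reported by the IR-based Object Presence and
# Range Detection Subsystem 12. The boundary value is an assumption.

NEAR_FIELD_LIMIT_MM = 100  # hypothetical near-field/far-field boundary

def select_wide_area_illumination(object_range_mm):
    if object_range_mm <= NEAR_FIELD_LIMIT_MM:
        return "near-field"
    return "far-field"
```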
Programmed Mode of System Operation No. 9: Automatically-Triggered Multi-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 9 involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual or Automatic Modes of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the reader towards an object with a bar code label. Once the object is under the field-of-view of the bar code reader, and the object is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user where the area targeted by the bar code reader is, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, the system captures/acquires a narrow-area image, which is then processed using the No-Finder Mode. If this single cycle of programmed processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
Then, the bar code reader illuminates the target object using either near-field or far-field wide-area illumination (depending on the detected range of the target object), captures a wide-area image of the target object, and launches the Manual (ROI-Specific or Automatic) Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured wide-area image is then processed using the Manual Method of decoding. If this cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read of a single 1D or 2D bar code symbol, then the system automatically enables successive cycles of wide-area-illumination/wide-area-image-capture/processing so long as the target object is being detected, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user moves the object out of the FOV of the bar code symbol reader, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can be simply changed by programming. This default decode timeout setting ensures that while the object is being detected by the bar code reader, the bar code reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
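The automatically-triggered read cycle described above — a single narrow-area attempt, followed by repeated wide-area cycles while the object remains in the FOV — can be sketched as follows. This is a simplified, hypothetical model: the helper callbacks (detect_object, narrow_area_read, wide_area_read) are illustrative stand-ins for the IR-based detection, illumination, image capture and decoding subsystems, not actual APIs defined in this disclosure.

```python
# Sketch of the Programmed Mode No. 9 control cycle. The 500 ms decode
# timeout paces each re-attempt in the real system; here the loop is
# simply bounded by max_cycles so the sketch stays self-contained.

DECODE_TIMEOUT_MS = 500  # default decode timeout; changeable by programming

def run_mode_9(detect_object, narrow_area_read, wide_area_read, max_cycles=100):
    """Attempt one narrow-area read, then cycle wide-area reads while the
    object remains detected. Returns decoded symbol data, or None."""
    if not detect_object():
        return None  # reader stays asleep until an object appears
    # Phase 1: narrow-area illumination / capture / No-Finder decode
    data = narrow_area_read()
    if data is not None:
        return data  # 1D symbol read on the first narrow-area attempt
    # Phase 2: successive wide-area cycles, re-attempted every
    # DECODE_TIMEOUT_MS (at most) while the object stays in the FOV
    for _ in range(max_cycles):
        if not detect_object():
            return None  # object moved away: return to sleep mode
        data = wide_area_read()
        if data is not None:
            return data  # symbol data forwarded to the I/O subsystem
    return None
```

In use, the callbacks would poll the Object Presence and Range Detection Subsystem and drive the illumination/capture/decode subsystems; the sketch only captures the narrow-to-wide escalation logic.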
Programmable Mode of System Operation No. 10: Automatically-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The Manual, ROI-Specific, Automatic or Omniscan Modes Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 10 involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific, Automatic or Omniscan Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user presents an object bearing a bar code symbol under the field-of-view of the bar code reader. Once the object is automatically detected, the bar code reader "wakes up" and the system activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and either the Manual, ROI-Specific, Automatic or Omniscan Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a wide area of the target object within the field-of-view (FOV) of the bar code reader with far-field or near-field wide-area illumination (depending on the detected range of the target object), and capture/acquire a wide-area image, which is then processed using either the Manual, ROI-Specific, Automatic or Omniscan Method of reading. If this single cycle of programmed processing results in the successful reading of a 1D or 2D bar code symbol (when the Manual, ROI-Specific and Automatic Methods are used), then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system automatically enables successive cycles of wide-area-illumination/wide-area-image-capture/processing so long as the target object is being detected, and then until the system reads a single 1D and/or 2D bar code symbol within a captured image of the target object; only thereafter, or when the user moves the object out of the FOV of the bar code reader, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can be simply changed by programming.
This default decode timeout setting ensures that while the object is being detected by the bar code reader, the bar code reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
Programmed Mode of System Operation No. 11: Semi-Automatic-Triggered Single-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Automatic, ROI-Specific Or Manual Modes Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 11 involves configuration of the system as follows: disabling the use of manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the reader towards an object bearing a bar code label. Once the object is under the field-of-view of the bar code reader and is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user the area targeted by the bar code reader, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, the system captures/acquires a narrow-area image, which is then processed using the No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
Then, if the user pulls the trigger switch 2C during narrow-area illumination and image capture and continues to do so, the bar code reader will automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and launch the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Symbol Reading Subsystem 17. The captured wide-area image is then processed using the Manual, ROI-Specific or Automatic Mode/Method of bar code reading. If this single cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful reading of a single 1D or 2D bar code symbol, then subsystem 19 automatically deactivates all subsystems, causing the bar code reader to return to its sleep mode of operation and wait for the next event that will trigger the system into active operation.
Programmable Mode of System Operation No. 12: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Automatic, ROI-Specific Or Manual Modes Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 12 involves configuration of the system as follows: disabling the use of manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the reader towards an object bearing a bar code label. Once the object is under the field-of-view of the bar code reader and is automatically detected, the bar code reader "wakes up" and the system activates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user the area targeted by the bar code reader, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, the system captures/acquires a narrow-area image, which is then processed using the No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
Then, if the user pulls the trigger switch 2C during narrow-area illumination and image capture and continues to do so, the bar code reader will automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and launch the Manual, ROI-Specific or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured wide-area image is then processed using the Manual Mode of reading. If this single cycle of programmed image processing results in the successful reading of a single 1D or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful decode of a single 1D or 2D bar code symbol, then the system automatically enables successive cycles of wide-area-illumination/wide-area-image-capture/processing so long as the trigger switch 2C is being pulled, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can be simply changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the imaging-based bar code symbol reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
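The trigger-held retry behavior described above differs from the automatically-triggered modes in its gating condition: wide-area cycles continue only while the trigger switch remains pulled. A minimal hypothetical sketch (the callbacks trigger_pulled and wide_area_read are illustrative stand-ins, not APIs from this disclosure):

```python
# Sketch of the trigger-held wide-area retry cycle of Programmed Mode
# No. 12. Each pass models one wide-area illumination/capture/decode
# attempt, re-tried at most every 500 ms in the real system; the loop
# here is bounded by max_cycles to keep the sketch self-contained.

def run_trigger_held_cycles(trigger_pulled, wide_area_read, max_cycles=100):
    """Cycle wide-area reads while the trigger stays pulled; return
    decoded symbol data, or None if the trigger is released first."""
    for _ in range(max_cycles):
        if not trigger_pulled():
            return None  # trigger released: reader returns to sleep mode
        data = wide_area_read()
        if data is not None:
            return data  # symbol data forwarded to the I/O subsystem
    return None
```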
Implementation of Programmable Mode of System Operation No. 12
When the Focus IR module detects an object in front of object detection field 20, it posts the OBJECT_DETECT_ON event to the Application Layer. The Application Layer software responsible for processing this event starts the CodeGate Task. When the user pulls the trigger switch 2C, the TRIGGER_ON event is posted to the Application Layer. The Application Layer software responsible for processing this event checks if the CodeGate Task is running, and if so, it cancels it and then starts the Main Task. When the user releases the trigger switch 2C, the TRIGGER_OFF event is posted to the Application Layer. The Application Layer software responsible for processing this event checks if the Main Task is running, and if so, it cancels it. If the object is still within the object detection field 20, the Application Layer starts the CodeGate Task again.
When the user moves the bar code reader away from the object (or the object away from the bar code reader), the OBJECT_DETECT_OFF event is posted to the Application Layer. The Application Layer software responsible for processing this event checks if the CodeGate Task is running, and if so, it cancels it. The CodeGate Task runs in an infinite loop, doing the following: it activates the narrow-area illumination array 27, which illuminates a "narrow" horizontal area at the center of the field-of-view; the Image Formation and Detection Subsystem 13 then acquires an image of that narrow area (i.e. a few rows of pixels on the CMOS image sensing array 22) and attempts to read a bar code symbol represented in the image. If the read is successful, it saves the decoded data in the special CodeGate data buffer; otherwise, it clears the CodeGate data buffer. Then it continues the loop. The CodeGate Task never exits on its own; it can be canceled by other modules of the Focus software when reacting to other events.
When a user pulls the trigger switch 2C, the TRIGGER_ON event is posted to the Application Layer. The Application Layer software responsible for processing this event checks if the CodeGate Task is running, and if so, it cancels it and then starts the Main Task. The CodeGate Task can also be canceled upon the OBJECT_DETECT_OFF event, posted when the user moves the bar code reader away from the object, or the object away from the bar code reader.
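The event handling described above can be summarized as a small state machine. The sketch below is a hypothetical model only: the real Focus software manages actual CodeGate and Main tasks, while here task lifetimes are reduced to boolean flags so the dispatch rules stand out.

```python
# Hypothetical model of the Application Layer event dispatch for
# Programmed Mode No. 12. Event names (OBJECT_DETECT_ON, TRIGGER_ON,
# TRIGGER_OFF, OBJECT_DETECT_OFF) follow the text above.

class ApplicationLayer:
    def __init__(self):
        self.codegate_running = False
        self.main_running = False
        self.object_present = False

    def on_event(self, event):
        if event == "OBJECT_DETECT_ON":
            self.object_present = True
            self.codegate_running = True       # start the CodeGate Task
        elif event == "TRIGGER_ON":
            if self.codegate_running:
                self.codegate_running = False  # cancel the CodeGate Task
            self.main_running = True           # then start the Main Task
        elif event == "TRIGGER_OFF":
            if self.main_running:
                self.main_running = False      # cancel the Main Task
            if self.object_present:
                self.codegate_running = True   # restart the CodeGate Task
        elif event == "OBJECT_DETECT_OFF":
            self.object_present = False
            if self.codegate_running:
                self.codegate_running = False  # cancel the CodeGate Task
```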
Programmable Mode of Operation No. 13: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode And The Automatic, ROI-Specific Or Manual Modes Of the Multi-Mode Bar Code Reading Subsystem
Programmed Mode of System Operation No. 13 involves configuration of the system as follows: disabling the use of manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and Manual, ROI-Specific and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the reader towards an object bearing a bar code label. Once the object is under the field-of-view of the bar code reader and is automatically detected by the Object Presence and Range Detection Subsystem 12, the bar code reader "wakes up" and the system activates the narrow-area illumination mode in the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the system to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user the area targeted by the bar code reader, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, the system captures/acquires a narrow-area image, which is then processed using the No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful read, then the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17, and then activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Manual and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17.
Then, if the user pulls the trigger switch 2C during narrow-area illumination and image capture and continues to do so, the bar code reader will automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and invoke the Manual, ROI-Specific and/or Automatic Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured wide-area image is then processed using the Manual, ROI-Specific or Automatic Mode of reading. If this single cycle of programmed image processing results in the successful reading of one or more 1D and/or 2D bar code symbols, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed decode image processing does not produce a successful reading of one or more 1D and/or 2D bar code symbols, then the system automatically enables successive cycles of wide-area-illumination/wide-area-image-capture/image-processing so long as the trigger switch 2C is being pulled, and then until the system reads one or more 1D and/or 2D bar code symbols within a captured image of the target object; only thereafter, or when the user releases the trigger switch 2C, will the bar code reader return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can be simply changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the Imaging-Based Bar Code Symbol Reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch 2C is manually released.
Programmable Mode of Operation No. 14: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The No-Finder Mode And The Omniscan Mode Of the Multi-Mode Bar Code Symbol Reading Subsystem
Programmed Mode of System Operation No. 14 involves configuration of the system as follows: disabling the use of manual-trigger activation during the system activation phase of operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the narrow-area and wide-area illumination modes within the Multi-Mode Illumination Subsystem 14, the narrow-area and wide-area image capture modes in the Image Formation and Detection Subsystem 13, and the No-Finder Mode and OmniScan Mode of the Multi-Mode Bar Code Reading Subsystem 17.
During this programmed mode of system operation, the bar code reader is idle until a user points the reader towards an object bearing a bar code label. Once the object is under the field-of-view of the bar code reader and is automatically detected by the Object Presence and Range Detection Subsystem 12, the bar code reader "wakes up" and the system activates the narrow-area illumination mode in the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode in the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17. This causes the narrow-area illumination array 27 to illuminate a "narrow" horizontal area of the target object at the center of the field-of-view (FOV) of the bar code reader, indicating to the user the area targeted by the bar code reader, and thus enabling the user to position and align the narrow-area illumination beam on the target bar code. Then, Subsystem 13 captures/acquires a narrow-area image, which is then processed by Subsystem 17 using its No-Finder Mode. If this single cycle of programmed image processing results in the successful reading of a 1D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system, and the system then deactivates all subsystems and resumes its sleep state of operation. If this cycle of programmed image processing does not produce a successful read, it may nevertheless produce one or more code fragments indicative of the symbology represented in the image (e.g. PDF 417).
In this case, the system deactivates the narrow-area illumination mode within the Multi-Mode Illumination Subsystem 14, the narrow-area image capture mode of the Image Formation and Detection Subsystem 13, and the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem 17; then, if the user is pulling the trigger switch 2C at about this time, the system activates the wide-area illumination mode within the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode of the Image Formation and Detection Subsystem 13, and the Omniscan Mode of the Multi-Mode Bar Code Reading Subsystem 17 if code fragments have been found indicating a 2D code format (e.g. PDF format code) within the image, perhaps at a particular orientation. The bar code reader then proceeds to automatically illuminate the target object using wide-area illumination, capture a wide-area image of the target object, and invoke the Omniscan Mode of the Multi-Mode Bar Code Reading Subsystem 17. The captured wide-area image is first processed using the Omniscan Mode at a first processing direction (e.g. 0 degrees), and the Omniscan Mode of reading is then sequentially advanced through different angular orientations (e.g. 6 possible directions/orientations) until a single bar code symbol is successfully read. If this single cycle of programmed decode processing (using the Omniscan Mode) results in the successful decoding of a single 1D and/or 2D bar code symbol, then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system. If this cycle of programmed image processing does not produce a successful reading of a single 1D and/or 2D bar code symbol, then the system automatically enables successive cycles of wide-area-illumination/wide-area-image-capture/processing so long as the trigger switch 2C is being pulled, and then until the system reads a single 1D and/or 2D bar code symbol within a captured image of the target object.
Only thereafter, or when the user releases the trigger switch 2C, will the system return to its sleep mode of operation and wait for the next event that will trigger the system into active operation. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can be simply changed by programming. This default decode timeout setting ensures that while the trigger switch 2C is being pulled by the user, the Imaging-Based Bar Code Symbol Reader will re-attempt reading every 500 ms (at most) until it either succeeds or the trigger switch is manually released.
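The Omniscan orientation sweep described above — decode at a first processing direction, then advance through the remaining orientations until a symbol reads — can be sketched as follows. The 30-degree step is an assumption inferred from the six directions mentioned in the text, and decode_at is an illustrative stand-in for the Omniscan decoder applied at one orientation.

```python
# Hypothetical sketch of the Omniscan Mode orientation sweep. The sweep
# starts at 0 degrees and tries each remaining orientation in turn.

def omniscan(decode_at, orientations=(0, 30, 60, 90, 120, 150)):
    """Apply the decoder at each processing direction in turn; return
    (angle, symbol data) on the first successful read, else None."""
    for angle in orientations:
        data = decode_at(angle)  # one decode attempt at this orientation
        if data is not None:
            return (angle, data)
    return None  # no orientation yielded a read in this cycle
```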
Programmable Mode of Operation No. 15: Continuously-Automatically-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing The Automatic, Manual, ROI-Specific Or Omniscan Modes Of the Multi-Mode Bar Code Reading Subsystem
Programmed Mode of System Operation No. 15, typically used for testing purposes, involves configuration of the system as follows: disabling the use of manual-trigger activation during all phases of system operation; and enabling the IR-based Object Presence and Range Detection Subsystem 12, the wide-area illumination mode in the Multi-Mode Illumination Subsystem 14, the wide-area image capture mode in the Image Formation and Detection Subsystem 13, and the Manual, ROI-Specific, Automatic or OmniScan Modes of the Multi-Mode Bar Code Reading Subsystem 17. During this programmed mode of system operation, the bar code reader continuously and sequentially illuminates a wide area of the target object within the field-of-view (FOV) of the bar code reader with both far-field and near-field wide-area illumination, captures a wide-area image thereof, and then processes the same using either the Manual, ROI-Specific, Automatic or Omniscan Modes of operation. If any cycle of programmed image processing results in the successful reading of a 1D or 2D bar code symbol (when the Manual, ROI-Specific and Automatic Modes are used), then the resulting symbol character data is sent to the Input/Output Subsystem 18 for use by the host system (i.e. typically a test measurement system). If any cycle of programmed image processing does not produce a successful read, the system automatically enables successive cycles of wide-area-illumination/wide-area-image-capture/processing. In the illustrative embodiment, the default decode timeout is set to 500 ms, which can be simply changed by programming. This default decode timeout setting ensures that while the object is being detected by the bar code reader, the bar code reader will re-attempt reading every 500 ms (at most) until it either succeeds or the object is moved away from the FOV of the bar code reader.
Diagnostic Mode Of Imaging-Based Bar Code Reader Operation: Programmable Mode of System Operation No. 16
Programmed Mode of System Operation No. 16 is a Diagnostic Mode. An authorized user can send a special command to the bar code reader to launch a Command Line Interface (CLI) with the bar code reader. When the bar code reader receives such a request from the user, it sends the prompt "MTLG>" back to the user as a handshaking indication that the scanner is ready to accept user commands. The user can then enter any valid command to the bar code reader and view the results of its execution. To communicate with the reader in diagnostic mode over a communication line such as RS-232, the user can use any standard communication program, such as Windows HyperTerminal, for example. This mode of operation can be used to test/debug newly introduced features or to view/change the bar code reader configuration parameters. It can also be used to download images and/or a backlog of previously decoded bar code data from the reader memory to the host computer.
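A minimal sketch of the diagnostic-mode exchange is shown below. The command names (get/set of a configuration parameter) are purely illustrative assumptions — the disclosure specifies only the "MTLG>" handshaking prompt, not a command set.

```python
# Hypothetical sketch of the Diagnostic Mode command handler: each line
# received over the serial link is parsed, executed, and answered with
# the result followed by the "MTLG>" prompt.

PROMPT = "MTLG>"

def handle_cli_line(line, params):
    """Process one diagnostic-mode command against a configuration
    parameter store and return the reply text (ending in the prompt)."""
    parts = line.strip().split()
    if not parts:
        return PROMPT  # empty line: just re-issue the prompt
    cmd = parts[0].lower()
    if cmd == "get" and len(parts) == 2:
        reply = str(params.get(parts[1], "UNDEFINED"))
    elif cmd == "set" and len(parts) == 3:
        params[parts[1]] = parts[2]
        reply = "OK"
    else:
        reply = "ERROR: unknown command"
    return reply + "\n" + PROMPT
```

In a real deployment, these lines would arrive over the RS-232 link from a terminal program such as HyperTerminal; the sketch models only the parse/execute/prompt cycle.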
Automatic ("Live") Video Mode Of Imaging-Based Bar Code Reader Operation: Programmable Mode of System Operation No. 17
In Programmed Mode of System Operation No. 17, automatic IR-based object presence detection is enabled, and the CMOS imaging array is operated in its Video Mode as illustrated in Fig. 27E. When a trigger signal is automatically generated in response to the automatic detection of an object in the field of view (FOV) of the system, frames of digital images are automatically captured by the CMOS imaging array and are processed by subsystem 17 et al. in accordance with the principles of the present invention. Such captured frames of digital video data can be transmitted to the host computer in real-time, along with the results of image-processing (i.e. symbol character data) by subsystem 17 (if such results are available). This mode of system operation is well suited for use in point-of-sale (POS) applications, as shown in Figs. 55B and 55C, where bar coded objects are either presented to or passed by the bar code reading system. Also, Programmed Mode of System Operation No. 17 can be used in combination with any other supported imaging modes.
Second Illustrative Embodiment of Digital Imaging-Based Bar Code Symbol Reading Device Of the Present Invention, Wherein Four Distinct Modes of Illumination Are Provided
In the first illustrative embodiment described above, the Multi-Mode Illumination Subsystem 14 had three primary modes of illumination: (1) narrow-area illumination mode; (2) near-field wide-area illumination mode; and (3) far-field wide-area illumination mode.
In a second alternative embodiment of the digital imaging-based bar code symbol reading device of the present invention shown in Figs. 27A, 27B and 28, the Multi-Mode Illumination Subsystem 14 is modified to support four primary modes of illumination: (1) near-field narrow-area illumination mode; (2) far-field narrow-area illumination mode; (3) near-field wide-area illumination mode; and (4) far-field wide-area illumination mode. In general, these near-field and far-field narrow-area illumination modes of operation are conducted during the narrow-area image capture mode of the Multi-Mode Image Formation and Detection Subsystem 13, and are supported by a near-field narrow-area illumination array 27A and a far-field narrow-area illumination array 27B illustrated in Fig. 19, and as shown in Fig. 2A1. In the second illustrative embodiment, each of these illumination arrays 27A, 27B is realized using at least a pair of LEDs, each having a cylindrical lens of appropriate focal length to focus the resulting narrow-area (i.e. linear) illumination beam into the near-field portion 24A and far-field portion 24B of the field of view of the system, respectively.
One of the advantages of using a pair of independent illumination arrays to produce narrow-area illumination fields over the near and far field portions of the FOV is that it is possible to more tightly control the production of a relatively "narrow" or "narrowly-tapered" narrow-area illumination field along its widthwise dimension. For example, as shown in Fig. 26BB, during bar code menu reading applications, the near-field narrow-area illumination array 27A can be used to generate (over the near-field portion of the FOV) an illumination field 24A that is narrow along both its widthwise and height-wise dimensions, to enable the user to easily align the illumination field (beam) with a single bar code symbol to be read from a bar code menu of one type or another, thereby avoiding inadvertent reads of two or more bar code symbols or simply the wrong bar code symbol. At the same time, the far-field narrow-area illumination array 27B can be used to generate (over the far-field portion of the FOV) an illumination field 24B that is sufficiently wide along its widthwise dimension, to enable the user to easily read elongated bar code symbols in the far-field portion of the field of view of the bar code reader, by simply moving the object towards the far portion of the field.
Third Illustrative Embodiment of Digital Imaging-Based Bar Code Symbol Reading Device Of the Present Invention
Alternatively, the imaging-based bar code symbol reading device of the present invention can have virtually any type of form factor that would support the reading of bar code symbols in diverse application environments. One alternative form factor for the bar code symbol reading device of the present invention is shown in Figs. 29A through 29C, wherein a portable digital imaging-based bar code symbol reading device of the present invention 1" is shown from various perspective views, while arranged in a Presentation Mode (i.e. configured in Programmed System Mode No. 12).
The Digital Imaging-Based Bar Code Reading Device of The Present Invention
As shown in Fig. 21, the digital imaging-based bar code symbol reading device of the present invention 1', 1" can also be realized in the form of a Digital Imaging-Based Bar Code Reading Engine 100 that can be readily integrated into various kinds of information collection and processing systems. Notably, trigger switch 2C shown in Fig. 21 is symbolically represented on the housing of the engine design, and it is understood that this trigger switch 2C, or a functionally equivalent device, will typically be integrated with the housing of the resultant system into which the engine is embedded, so that the user can interact with and actuate the same. Such Engines according to the present invention can be realized in various shapes and sizes and be embedded within various kinds of systems and devices requiring diverse image capture and processing functions as taught herein. Details regarding one illustrative embodiment of the Digital Imaging-Based Bar Code Reading Engine of the present invention are shown beginning with Fig. 34, and will be described in detail hereinafter.
Illustrative Embodiment of A Wireless Bar Code-Driven Portable Data Terminal (PDT) System of The Present Invention
Figs. 22, 23, and 24 show a Wireless Bar Code-Driven Portable Data Terminal (PDT) System 140 according to the present invention which comprises: a Bar Code Driven PDT 150 embodying the Digital Imaging-Based Bar Code Symbol Reading Engine of the present invention 100, described herein; and a cradle-providing Base Station 155.
As shown in Figs. 22 and 23, the Digital Imaging-Based Bar Code Symbol Reading Engine 100 can be used to read bar code symbols on packages and the symbol character data representative of the read bar code can be automatically transmitted to the cradle-providing Base Station 155 by way of an RF-enabled 2-way data communication link 170. At the same time, robust data entry and display capabilities are provided on the PDT 150 to support various information based transactions that can be carried out using System 140 in diverse retail, industrial, educational and other environments.
As shown in Fig. 23, the Wireless Bar Code Driven Portable Data Terminal System 140 comprises: a hand-supportable housing 151; Digital Imaging-Based Bar Code Symbol Reading Engine 100 as shown in Fig. 21, and described hereinabove, mounted within the head portion of the hand-supportable housing 151; a user control console 151A; a high-resolution color LCD display panel 152 and drivers mounted below the user control console 151A and integrated with the hand-supportable housing, for displaying, in a real-time manner, captured images, data being entered into the system, and graphical user interfaces (GUIs) generated by the end-user application running on the virtual machine of the wireless PDT; and PDT computing subsystem 180 contained within the PDT housing, for carrying out system control operations according to the requirements of the end-user application to be implemented upon the hardware and software platforms of the wireless PDT 2B of this illustrative embodiment.
As shown in the block schematic diagram of Fig. 25, a design model for the Wireless Hand-Supportable Bar Code Driven Portable Data Terminal System 140 shown in Figs. 31 and 32, and its cradle-supporting Base Station 155 interfaced with possible host systems 173 and/or networks 174, comprises a number of subsystems integrated about a system bus, namely: a data transmission circuit 156 for realizing the PDT side of the electromagnetic-based wireless 2-way data communication link 170; program memory (e.g. DRAM) 158; non-volatile memory (e.g. SRAM) 159; Digital Imaging-Based Bar Code Symbol Reading Engine 100 for optically capturing narrow and wide area images and reading bar code symbols recognized therein; a manual data entry device such as a membrane-switching type keypad 160; LCD panel 152; an LCD controller 161; LCD backlight brightness control circuit 162; and a system processor 163 integrated with the system bus (e.g. data, address and control buses). Also, a battery power supply circuit 164 is provided for supplying regulated power supplies to the various subsystems, at particular voltages determined by the technology used to implement the PDT device.
As shown in Fig. 25, the Base Station 155 also comprises a number of integrated subsystems, namely: a data receiver circuit 165 for realizing the base side of the electromagnetic-based wireless 2-way data communication link 170; a data transmission subsystem 171 including a communication control module; and a base station controller 172 (e.g. programmed microcontroller) for controlling the operations of the Base Station 155. As shown, the data transmission subsystem 171 interfaces with the host system 173 or network 174 by way of the USB or RS232 communication interfaces, TCP/IP, AppleTalk or the like, well known in the art. Taken together, data transmission and reception circuits 156 and 165 realize the wireless electromagnetic 2-way digital data communication link 170 employed by the wireless PDT of the present invention.
Notably, the Wireless Hand-Supportable Bar Code Driven Portable Data Terminal System 140, as well as the POS Digital Imaging-Based Bar Code Symbol Reader 1" shown in Figs. 20A through 20C, each have two primary modes of operation: (1) a hands-on mode of operation, in which the PDT 150 or POS Reader 1" is removed from its cradle and used as a bar code driven transaction terminal or simply a bar code symbol reader; and (2) a hands-free mode of operation, in which the PDT 150 or POS Reader 1" remains in its cradle-providing Base Station 155, and is used as a presentation-type bar code symbol reader, as required in most retail point-of-sale (POS) environments. Such hands-on and hands-free modes of system operation are described in greater detail in copending US Patent Application No. 10/684,273 filed on October 11, 2003, and incorporated herein by reference in its entirety. In such hands-on and hands-free kinds of applications, the trigger switch 2C employed in the digital imaging-based bar code symbol reading device of the present invention can be readily modified, and augmented with a suitable stand-detection mechanism, which is designed to automatically configure and invoke the PDT 150 and its Engine 100 into its Presentation Mode (i.e. System Mode of Operation No. 12) or other suitable system mode when the PDT is placed in its Base Station 155 as shown in Fig. 24. Then, when the PDT 150 is picked up and removed from its cradle-providing Base Station 155 as shown in Figs. 22 and 23, the trigger switch 2C and stand-detection mechanism can be arranged so as to automatically configure and invoke the PDT 150 and its Engine 100 into a suitable hands-on mode of system operation.
Similarly, the trigger switch 2C employed in the POS Digital Imaging Bar Code Symbol Reading Device 1" can be readily modified, and augmented with a stand-detection mechanism, which is designed to automatically configure and invoke the POS Reader 1" into its Presentation Mode (i.e. System Mode of Operation No. 12) or other suitable system mode, when the Reader 1" is resting on a countertop surface, as shown in Figs. 20A and 20B. Then, when the POS Reader 1" is picked up off the countertop surface, for use in its hands-on mode of operation, the trigger switch 2C and stand-detection mechanism will automatically configure and invoke the Reader 1" into a suitable hands-on mode of system operation, as shown in Fig. 20C. In such embodiments, the stand-detection mechanism can employ a physical contact switch, or an IR object-sensing switch, which is actuated when the device is picked up off the countertop surface. Such mechanisms will become apparent in view of the teachings disclosed herein.
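The stand-detection behavior described above can be sketched as follows; the mode names and detector interface are illustrative stand-ins for the disclosed contact-switch or IR-switch mechanism, not limitations of the system.

```python
PRESENTATION_MODE = "System Mode of Operation No. 12 (Presentation Mode)"
HANDS_ON_MODE = "hands-on mode of system operation"

class StandDetector:
    """Models the stand-detection mechanism: a physical contact switch or
    IR object-sensing switch reporting whether the reader is cradled
    (or resting on a countertop surface)."""
    def __init__(self, in_stand: bool):
        self.in_stand = in_stand

def select_system_mode(detector: StandDetector) -> str:
    """Automatically configure and invoke the system mode from the stand state."""
    return PRESENTATION_MODE if detector.in_stand else HANDS_ON_MODE
```

When the detector reports the device has been lifted, the same call simply yields the hands-on mode, mirroring the automatic reconfiguration described for the PDT 150 and POS Reader 1".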
Hand-Supportable Digital Imaging-Based Bar Code Symbol Reading Device Employing an Automatic Light Exposure Measurement and Illumination Control Subsystem and a Software-Based Illumination Metering Program
In the system shown in Figs. 1 through 25, automatic illumination control is provided by precisely controlling the duration of LED illumination during exposure, thereby capturing well-illuminated images. However, in some circumstances, greater degrees of illumination control may be required, and the method shown in Figs. 26 through 26B may be helpful.
In Figs. 26 through 26B, an enhanced auto-illumination control scheme is embodied within the hand-held image-processing bar code reader of the present invention. According to this alternative illumination control scheme, the illumination level of a captured image is first (i.e. initially) determined by measuring the actual light illumination level at a central portion of the image detection array, and then computing an appropriate illumination duration based on this measurement. Then, after an image is captured using this initial illumination duration, a software illumination metering program is used to analyze the spatial intensity distribution of the captured image and determine whether a new illumination duration should be calculated for use in subsequent image illumination and capture operations, to produce more finely-tuned images. If the light/illumination level represented in a captured digital image is determined to be unacceptable by the software-based illumination metering program, then the program automatically (i) calculates a corrected illumination duration (count) for use by the Automatic Light Exposure Measurement and Illumination Control Subsystem, and (ii) provides the corrected illumination duration thereto. Then the Automatic Light Exposure Measurement and Illumination Control Subsystem uses this corrected illumination duration to control the illumination delivered to the field of view (FOV) during the next object illumination and image capturing operation supported by the system. By using this enhanced auto-illumination control method, the image-processing based bar code symbol reader of the present invention is provided additional flexibility in its ability to capture, in real-time, finely-tuned images having optimal illumination levels.
Fig. 26 schematically illustrates the hand-supportable digital imaging-based bar code symbol reading device of the present invention, wherein a Software-Based Illumination Metering Program is used to help the Automatic Light Exposure Measurement and Illumination Control Subsystem control the operation of the LED-Based Multi-Mode Illumination Subsystem. Fig. 26A illustrates in greater detail this enhanced method of automatic illumination control, namely how the current illumination duration (determined by the Automatic Light Exposure Measurement and Illumination Control Subsystem) is automatically over-written by the illumination duration computed by a software-implemented, image-processing-based Illumination Metering Program carried out within the Image-Processing Based Bar Code Symbol Reading Subsystem. This over-written illumination duration is then used by the Automatic Light Exposure Measurement and Illumination Control Subsystem to control the amount of LED illumination produced and delivered to the CMOS image detection array during the next image frame captured by the system, in accordance with this Enhanced Auto-Illumination Control Scheme of the present invention.
Fig. 26B is a flow chart setting forth the steps involved in carrying out the enhanced auto-illumination control scheme/method illustrated in Fig. 26A. As indicated at Block A in Fig. 26B, the first step of the method involves using the Automatic Light Exposure Measurement and Illumination Control Subsystem to automatically (i) measure the illumination level at a particular (e.g. central) portion of the field of view of the CMOS image sensing array and (ii) determine the illumination duration (i.e. time count) necessary to achieve a desired spatial intensity in the captured image.
As indicated at Block B in Fig. 26B, the Automatic Light Exposure Measurement and Illumination Control Subsystem uses this computed/determined illumination duration to drive the LED- based illumination subsystem and capture a digital image of the object within the field of view of the Image Formation and Detection Subsystem.
As indicated at Block C in Fig. 26B, the Image-Processing Bar Code Reading Subsystem (e.g. image processor) analyzes and measures in real-time the spatial intensity distribution of the captured image and determines whether or not a corrected illumination duration is required or desired when capturing the next or subsequent frames of image data, during the current or subsequent image capture cycle. As indicated at Block D in Fig. 26B, within the Automatic Light Exposure Measurement and Illumination Control Subsystem, the previously determined illumination duration (used to capture the analyzed image) is automatically over-written with the corrected illumination duration (count) determined at Block C above.
As indicated at Block E in Fig. 26B, the Automatic Light Exposure Measurement and Illumination Control Subsystem then uses the corrected illumination duration (computed by the software-based Illumination Metering Program) to drive the LED-based Illumination Subsystem and capture a subsequent digital image of the illuminated object within the field of view of the system.
As indicated in Fig. 26B, the steps indicated at Blocks C through E can be repeated a number of times in a recursive manner, each image capture cycle, to finally produce a digital image having an optimized spatial intensity level with excellent image contrast.
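The recursive metering loop of Blocks A through E above can be sketched as follows. The sensor interface, target intensity, and proportional correction rule are assumptions made for illustration; the disclosure specifies only that the metering program analyzes the spatial intensity distribution and over-writes the illumination duration.

```python
def metered_capture(sensor, target_mean=128, max_cycles=5, tolerance=16):
    """Capture an image, then recursively refine the LED illumination
    duration (Blocks C-E repeated) until the image's mean intensity is
    close to the target, or the cycle budget is exhausted."""
    duration = sensor.measure_initial_duration()   # Block A: measure light level
    image = sensor.capture(duration)               # Block B: initial capture
    for _ in range(max_cycles):
        mean = sum(image) / len(image)             # Block C: intensity analysis
        if abs(mean - target_mean) <= tolerance:
            break                                  # exposure quality acceptable
        # Block D: over-write duration in proportion to the brightness error
        duration = max(1, int(duration * target_mean / max(mean, 1)))
        image = sensor.capture(duration)           # Block E: subsequent capture
    return image, duration
```

A brighter-than-target frame shortens the next duration, and a darker frame lengthens it, converging on a well-exposed image over a few cycles.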
Adaptive Method Of Controlling Object Illumination And Image Capturing Operations Within The Multi-Mode Image-Processing Based Bar Code Symbol Reader System Of The Illustrative Embodiment Of The Present Invention
In Figs. 6D through 6E2, the Global Exposure Control Method of the present invention was described in connection with the automatic illumination measurement and control subsystem of the present invention. Also, in Figs. 26 through 26B, an Enhanced Auto-Illumination Control Scheme was described for use in connection with the automatic illumination measurement and control subsystem of the present invention, wherein software-based illumination metering is employed. However, while these techniques provide numerous advantages and benefits, there are many end-user applications and operating environments in which it would be beneficial for the system of the present invention to provide a higher degree of adaptability to ambient illumination levels having great dynamic range. Such challenges are addressed by the adaptive control method set forth in Figs. 27A and 27B, wherein object illumination and image capturing operations are dynamically controlled within the multi-mode image-processing based bar code symbol reader system of the present invention, by analyzing the exposure quality of captured digital images and reconfiguring system control parameters based on the results of such exposure quality analysis. Figs. 27C through 27E illustrate the three basic modes of operation of the CMOS image sensing array employed in the illustrative embodiment (i.e. Single Frame Shutter Mode, Rolling Shutter Mode and Video Mode), which are dynamically and automatically controlled within the system in accordance with the adaptive system control method of the present invention.
The details of the adaptive system control method of the present invention will be generally described below in the context of a multi-mode image-capturing and processing system with bar code reading capabilities.
As indicated at Block A in Fig. 27A, upon the occurrence of the "power-up" event within the system (i.e. STEP 0), the following three basic operations are performed: (a) Initialize the system using the default System Control Parameters (SCPs), such as:
(1) shutter mode of the image sensing array (e.g. Single Frame Shutter Mode illustrated in Fig. 27C, and Rolling Shutter Mode illustrated in Fig. 27D);
(2) electronic gain of image sensing array;
(3) programmable exposure time for each block of pixels in the image sensing array;
(4) illumination mode (e.g., off, continuous and strobe/flash);
(5) automatic illumination control (e.g. ON or OFF);
(6) illumination field type (e.g. narrow-area near-field illumination, wide-area far-field illumination, narrow-area field of illumination, and wide-area field of illumination);
(7) image capture mode (e.g. narrow-area image capture, and wide-area image capture);
(8) image capture control (e.g. single frame, video frames);
(9) image processing mode; and
(10) automatic object detection mode (e.g. ON or OFF).
(b) Reset the SCP Reconfiguration (SCPR) flag to the value "FALSE".
(c) Calculate and Set the Exposure Quality Threshold (EQT) parameters or criteria (e.g. brightness level, image saturation, etc.).
Then, at Block B, upon the occurrence of the "trigger signal" event within the system, the following control process is executed within the system as generally described below:
STEP 1 : If the system needs to be reconfigured (i.e. SCPR flag=TRUE), then configure the system using new SCPs. Otherwise, maintain the system using current SCPs.
STEP 2: Illuminate an object using the method of illumination indicated by the Illumination Mode parameter, and capture a digital image thereof.
STEP 3: Analyze the captured digital image for exposure quality.
In connection with the practice of the present invention, exposure quality is a quantitative measure of the quality of the image brightness. Setting system control parameters (SCPs), such as the type and the intensity of the object illumination, value of the image sensor gain, and the type and the value of the image sensor exposure parameters, will affect the image brightness. The value of the exposure quality can be presented in the range from 0 to 100, with 0 being an extremely poor exposure that would generally be fruitless to process (in cases when the image is too dark or too bright), and 100 being an excellent exposure. It is almost always worthwhile to process an image when the value of the exposure quality is close to 100. Conversely, it is almost never worthwhile to process an image when the value of the exposure quality is as low as 0. As will be explained in greater detail below, for the latter case where the computed exposure quality is as low as 0, the system control parameters (SCPs) will need to be dynamically re-evaluated and set to the proper values in accordance with the principles of the present invention.
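By way of illustration only, one way to compute such a 0-to-100 exposure quality figure is to blend a mid-scale brightness score with a penalty for clipped (near-black or saturated) pixels. The disclosure does not prescribe a formula, so the particular metric below is an assumption.

```python
def exposure_quality(pixels, lo=8, hi=247):
    """Return an exposure quality score from 0 to 100: near 100 for
    mid-scale brightness with no clipping, approaching 0 for images
    that are nearly all dark or nearly all saturated."""
    n = len(pixels)
    mean = sum(pixels) / n
    # fraction of pixels clipped at either end of the 8-bit scale
    clipped = sum(1 for p in pixels if p <= lo or p >= hi) / n
    # 1.0 when the mean sits at mid-scale (127.5), falling off linearly
    brightness_score = 1.0 - abs(mean - 127.5) / 127.5
    return round(100 * brightness_score * (1.0 - clipped))
```

An all-black or all-white frame scores 0 (fruitless to process), while a frame centered at mid-scale scores near 100, matching the qualitative extremes described above.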
STEP 4: If the exposure quality measured in STEP 3 does not satisfy the Exposure Quality Threshold (EQT) parameters set in STEP 0, then calculate new SCPs for the system and set the SCPR flag to TRUE indicating that system must be reconfigured prior to acquiring a digital image during the next image acquisition cycle. Otherwise, maintain the current SCPs for the system.
STEP 5: If barcode decoding is required in the application at hand, then attempt to process the digital image and decode a barcode symbol represented therein.

STEP 6: If barcode decoding fails, or if barcode decoding was not required but the exposure quality did not satisfy the Exposure Quality Threshold parameters, go to STEP 1.
STEP 7: If barcode decoding succeeded, then transmit the results to the host system.
STEP 8: If necessary, transmit the digital image to the host system, or store the image in internal memory.
STEP 9: EXIT.
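The control process of STEPS 1 through 9 can be condensed into the following sketch. The capture, exposure-analysis, and decode hooks are placeholders; only the loop structure (reconfigure on the SCPR flag, EQT comparison, retry on decode failure) is taken from the text above.

```python
def adaptive_control_loop(reader, eqt=50, max_cycles=10):
    """Sketch of STEPS 1-9 of Figs. 27A/27B for a single trigger event."""
    scp = reader.default_scp()                   # STEP 0: default SCPs
    scpr_flag = False                            # SCPR flag reset to FALSE
    for _ in range(max_cycles):                  # loop while trigger persists
        if scpr_flag:                            # STEP 1: reconfigure if flagged
            scp = reader.new_scp()
            scpr_flag = False
        image = reader.capture(scp)              # STEP 2: illuminate and capture
        quality = reader.exposure_quality(image) # STEP 3: analyze exposure
        if quality < eqt:                        # STEP 4: below EQT -> new SCPs
            scpr_flag = True
        result = reader.decode(image)            # STEP 5: attempt barcode decode
        if result is not None:
            return result                        # STEPS 7-9: transmit and exit
        # STEP 6: decoding failed -> go back to STEP 1
    return None
```

On each failed pass, the flag set at STEP 4 causes STEP 1 of the next pass to reconfigure the system before the next acquisition cycle, exactly as the adaptive method prescribes.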
Notably, the system control process is intended for practice during any "system mode" of any digital image capture and processing system, including the bar code symbol reader of the illustrative embodiments, with its various modes of system operation described in Figs. 17A and 17B. Also, as this control method is generally described in Figs. 27A and 27B, it is understood that its principles will be used to modify particular system control processes that might be supported in any particular digital image capture and processing system. The salient features of this adaptive control method involve using (i) automated real-time analysis of the exposure quality of captured digital images, and (ii) automated reconfiguration of system control parameters (particularly illumination and exposure control parameters) based on the results of such exposure quality analysis, so as to achieve improved system functionality and/or performance in diverse environments.
At this juncture, it will be helpful to describe how the adaptive control process of Figs. 27A and 27B can be practiced in systems having diverse modes of "system operation" as well as "subsystem operation", as in the case of the multi-mode image-processing bar code reading system of the illustrative embodiment. For illustration purposes, it will be helpful to consider this bar code symbol reading system when it is configured with system control parameters (SCPs) associated with the Programmed Modes of System Operation Nos. 8 through 12, described in Figs. 17A and 17B. In any of these Programmed Modes of System Operation, in response to a "trigger event" (automatically or manually generated), the system will be able to automatically generate (i) a narrow-area field of illumination during the narrow-area image capture mode of the system; and if the system fails to read a bar code symbol during this mode, then the system will automatically generate (ii) a wide-area field of illumination during its wide-area image capture mode. In the context of such modes of system operation, the adaptive control method described in Figs. 27A and 27B will now be described below as an illustrative embodiment of the control method. It is understood that there are many ways to practice this control method, and in each instance, a system with different operation or behavior can and will typically result.
For illustrative purposes, two (2) different modes of system operation will be considered below in detail to demonstrate the breadth of applicability of the adaptive system control method of the present invention.

Case 1: System Operated in Programmed Mode of System Operation No. 8: Automatically-Triggered Multi-Attempt 1D/2D Single-Read Mode Employing The No-Finder and Manual and/or Automatic Modes of Operation
In the first example, upon "power up" of the system, at STEP 0, the system control parameters (SCPs) will be configured to implement the selected Programmed Mode of System Operation. For System Mode No. 8, the SCPs would be initially configured as follows:
(1) the shutter mode parameter will be set to the "single frame shutter mode" (illustrated in Fig. 27C, for implementing the Global Illumination/Exposure Method of the present invention described in Figs. 6D through 6E2);
(2) the electronic gain of the image sensor will be set to a default value determined during factory calibration;
(3) the exposure time for blocks of image sensor pixels will be set to a default determined during factory calibration;
(4) the illumination mode parameter will be set to "flash/strobe";
(5) the automatic illumination control parameter will be set to "ON";
(6) the illumination field type will be set to "narrow-area field";
(7) the image capture mode parameter will be set to "narrow-area image capture";
(8) the image capture control parameter will be set to "single frame";
(9) the image processing mode will be set, for example, to a default value; and
(10) the automatic object detection mode will be set to ON. Also, the SCPR flag will be set to its FALSE value.
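For reference, the ten default SCPs listed above for System Mode No. 8 can be transcribed into a simple table of key/value pairs; the entries marked "factory-calibrated" stand in for device-specific default values not given in the text.

```python
# Default System Control Parameters (SCPs) for System Mode No. 8,
# as enumerated in items (1) through (10) above.
DEFAULT_SCP_MODE_8 = {
    "shutter_mode": "single frame shutter",
    "electronic_gain": "factory-calibrated",
    "pixel_block_exposure_time": "factory-calibrated",
    "illumination_mode": "flash/strobe",
    "automatic_illumination_control": "ON",
    "illumination_field_type": "narrow-area field",
    "image_capture_mode": "narrow-area image capture",
    "image_capture_control": "single frame",
    "image_processing_mode": "default",
    "automatic_object_detection": "ON",
}
SCPR_FLAG = False  # reset at power-up; set TRUE when reconfiguration is needed
```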
Upon the occurrence of a trigger signal from the system (e.g. generated by automatic object detection by the IR object presence and range detection subsystem in System Modes No. 8-10, or by manually pulling the activation switch in System Modes No. 11-12), the system will reconfigure itself only if the SCPR flag is TRUE; otherwise, the system will maintain its current SCPs. During the first pass through STEP 1, the SCPR flag will be FALSE, and therefore the system will maintain its SCPs at their default settings.
Then at STEP 2 in Fig. 27A, the object will be illuminated within a narrow-field of LED-based illumination produced by the illumination subsystem, and a narrow-area digital image will be captured by the image formation and detection subsystem.
At STEP 3 in Fig. 27B, the narrow-area digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
At STEP 4, if the measured/calculated exposure quality values do not satisfy the exposure quality threshold (EQT) parameters, then the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next image acquisition cycle. Otherwise, the SCPs are maintained by the system.
At STEP 5, the system attempts to read a 1D bar code symbol in the captured narrow-area image. At STEP 6, if the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image). In the case of reconfiguration, the system might reset the SCPs as follows:
(1) the shutter mode parameter -- set to the "Rolling Shutter Mode" illustrated in Fig. 27D;
(2) the electronic gain of the image sensor -- set to the value calculated during STEP 4;
(3) the exposure time for blocks of image sensor pixels -- set to values determined during STEP 4;
(4) the illumination mode parameter -- set to "off";
(5) the automatic illumination control parameter will be set to "OFF";
(6) the illumination field type will be set to "narrow-area field";
(7) the image capture mode parameter will be set to "narrow-area image capture";
(8) the image capture control parameter will be set to "single frame";
(9) the image processing mode will be set to the default value; and
(10) the automatic object detection mode will be set to ON.
Then at STEPS 2-4, the system captures a second narrow-area image using ambient illumination and the image sensing array configured in its rolling shutter mode (illustrated in Fig. 27D), and re-evaluates the exposure quality; if the exposure quality does not satisfy the current Exposure Quality Threshold parameters, then the system calculates new SCPs (possibly including switching to the wide-area image capture mode) and sets the SCPR flag to TRUE. Otherwise, the system maintains the current SCPs and proceeds to attempt to decode a bar code symbol in the narrow-area digital image captured using ambient illumination.
If at STEPS 5 and 6, bar code decoding is successful, then at STEP 7 the system transmits the results (i.e. symbol character data) to the host system, and/or at STEP 8, transmits the captured digital image to the host system for storage or processing, or to internal memory for storage, and then exits the control process at STEP 9.
If at STEPS 5 and 6 in Block B2 in Fig. 27B, bar code decoding fails, then the system returns to STEP 1 and reconfigures for wide-area illumination and image capture. If, while operating in its narrow-area illumination and image capture modes of operation, the image captured by the system had an "exposure quality" which did not satisfy the Exposure Quality Threshold parameters, indicating that the light exposure was still too bright and saturated, and the recalculated SCPs required switching to a new level of electronic gain to reduce the exposure brightness level of the analyzed image, then at STEP 1 the SCPs are reconfigured using the SCPs previously computed at STEP 4. Thereafter, the object is illuminated with ambient illumination and captured at STEP 2, and at STEP 3, the captured image is analyzed for exposure quality, as described above. At STEP 4, the exposure quality measured in STEP 3 is compared with the Exposure Quality Threshold parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise the system maintains its current SCPs. At STEPS 5 and 6, bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and then the system exits the control process at STEP 9. If bar code decoding fails, then the system returns to STEP 1 to repeat the STEPS within Blocks B1 and B2 of Figs. 27A and 27B, provided that the trigger signal is still persistent. During this second pass through the control loop of Blocks B1 and B2, the system will reconfigure itself as determined by the exposure quality analysis performed at STEP 3, and the calculations performed at STEP 4.
Notably, such calculations could involve calculating new SCPs that activate system modes using wide-area LED illumination during the wide-area image capture mode, should the exposure quality analysis so require, in accordance with the adaptive control process of the present invention. Recycling through this control loop will reoccur as long as a bar code symbol has not been successfully read and the trigger signal is persistently generated.
CASE 2: Programmable Mode of System Operation No. 17: Live Video Mode of Imaging-Based Bar Code Reader Operation
In this second example, upon "power up" of the system, at STEP 0, the system control parameters (SCPs) will be configured to implement the selected Programmed Mode of System Operation. For System Mode No. 17, wherein the digital imaging system of the present invention might be used as a POS-based imager for reading bar code symbols, the SCPs would be initially configured as follows:
(1) the shutter mode parameter will be set to the "Video Mode" (illustrated in Fig. 27E);
(2) the electronic gain of the image sensor will be set to a default value determined during factory calibration;
(3) the exposure time for blocks of image sensor pixels will be set to a default determined during factory calibration;
(4) the illumination mode parameter will be set to "continuous";
(5) the automatic illumination control parameter will be set to "ON";
(6) the illumination field type will be set to "wide-area field";
(7) the image capture mode parameter will be set to "wide-area image capture";
(8) the image capture control parameter will be set to "video frame";
(9) the image processing mode will be set, for example, to a default value; and
(10) the automatic object detection mode will be set to ON. Also, the SCPR flag will be set to its FALSE value.
Upon the occurrence of a trigger signal from the system (i.e. generated by automatic object detection by IR object presence and range detection subsystem), the system will reconfigure itself only if the SCPR flag is TRUE; otherwise, the system will maintain its current SCPs. During the first pass through STEP 1, the SCPR flag will be FALSE, and therefore the system will maintain its SCPs at their default settings. Then at STEP 2 in Fig. 27A, the object will be continuously illuminated within a wide-field of LED-based illumination produced by the illumination subsystem, and a wide-area digital image will be captured by the image formation and detection subsystem, while the CMOS image sensing array is operated in its Video Mode of operation.
At STEP 3 in Fig. 27B, the wide-area digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
At STEP 4, if the measured/calculated exposure quality values do not satisfy the exposure quality threshold (EQT) parameters, then the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next image acquisition cycle while the CMOS sensing array is operated in its Video Mode. Otherwise, the SCPs are maintained by the system.
At STEP 5, the system attempts to read a 1D bar code symbol in the captured wide-area digital image.
At STEP 6, if the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image). In the case of reconfiguration, the system might reset the SCPs as follows:
(1) the shutter mode parameter will be set to "Video Mode", illustrated in Fig. 27E;
(2) the electronic gain of the image sensor will be set to the value calculated during STEP 4;
(3) the exposure time for blocks of image sensor pixels will be set to the values determined during STEP 4;
(4) the illumination mode parameter will be set to "continuous";
(5) the automatic illumination control parameter will be set to "ON";
(6) the illumination field type will be set to "wide-area field";
(7) the image capture mode parameter will be set to "wide-area image capture";
(8) the image capture control parameter will be set to "video frame";
(9) the image processing mode will be set to the default value; and
(10) the automatic object detection mode will be set to ON.
Then at STEPS 2-4, the system captures a second wide-area image using continuous LED illumination and the image sensing array configured in its Video Mode (illustrated in Fig. 27E), and re-measures exposure quality; if the exposure quality does not satisfy the current Exposure Quality Threshold Parameters, then the system calculates new SCPs and sets the SCPR flag to TRUE. Otherwise, the system maintains the SCPs, and proceeds to attempt to decode a bar code symbol in the wide-area digital image captured using continuous LED illumination.
If at STEPS 5 and 6, bar code decoding is successful, then at STEP 7 the system transmits the results (i.e. symbol character data) to the host system, and/or at STEP 8, transmits the captured digital image to the host system for storage or processing, or to internal memory for storage, and then exits the control process at STEP 9.
If at STEPS 5 and 6 in Block B2 in Fig. 27B, bar code decoding fails, then the system returns to STEP 1, and reconfigures for wide-area illumination and image capture. If, while operating in its wide-area illumination and image capture modes of operation, the image captured by the system had an "exposure quality" which did not satisfy the Exposure Quality Threshold Parameters and indicated that the light exposure was still too bright and saturated, and the recalculated SCPs required switching to a new level of electronic gain (or illumination control) to reduce exposure brightness, then at STEP 1 the SCPs are reconfigured using the SCPs previously computed at STEP 4. Thereafter, the object is illuminated and captured at STEP 2, and at STEP 3, the captured image is analyzed for exposure quality, as described above. At STEP 4, the exposure quality measured at STEP 3 is compared with the Exposure Quality Threshold Parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise the system maintains its current SCPs. At STEPS 5 and 6, bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and then the system exits the control process at STEP 9. If bar code decoding fails, then the system returns to STEP 1 to repeat the STEPS within Blocks B1 and B2 of Figs. 27A and 27B, provided that the automatic trigger signal is still persistent (indicative that the object is still within the field of view of the digital imager). During this second pass through the control loop of Blocks B1 and B2, the system will reconfigure itself as determined by the exposure quality analysis performed at STEP 3, and the calculations performed at STEP 4.
Notably, such calculations could involve calculating new SCPs that require adjusting illumination and/or image sensing array parameters during the wide-area image capture mode, that is, as the analysis of the facts may require, according to the adaptive control process of the present invention. Recycling through this control loop will recur as long as a bar code symbol has not been successfully read, and the automatic trigger signal is persistently generated by the IR-based automatic object detecting subsystem.
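The control flow described above (Blocks B1 and B2, STEPS 1 through 9) can be sketched as a single loop. The callable arguments below are hypothetical stand-ins for the actual subsystems (illumination, image capture, decoding, I/O); this illustrates the control flow only, not the patented implementation.

```python
# A minimal control-loop sketch of the adaptive process of Figs. 27A and 27B.
# All callables are hypothetical stand-ins for the actual subsystems.

def adaptive_control_loop(capture, analyze_quality, recalc_scps, reconfigure,
                          decode, transmit, object_detected,
                          default_scps, max_cycles=10):
    scps, scpr = dict(default_scps), False       # STEP 1: default SCPs, SCPR = FALSE
    for _ in range(max_cycles):
        if not object_detected():                # automatic IR trigger still present?
            return None
        if scpr:                                 # reconfigure only when flagged
            reconfigure(scps)
            scpr = False
        image = capture(scps)                    # STEP 2: illuminate and capture
        if not analyze_quality(image):           # STEPS 3-4: exposure quality test
            scps, scpr = recalc_scps(scps, image), True
        symbol = decode(image)                   # STEP 5: attempt bar code read
        if symbol is not None:                   # STEP 6: decoding succeeded?
            transmit(symbol, image)              # STEPS 7-8: output symbol/image data
            return symbol                        # STEP 9: exit control process
    return None
```

A first over-exposed frame fails decoding, triggers SCP recalculation, and the reconfigured second pass succeeds, mirroring the two-pass behavior described in the text.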
The adaptive control method of the present invention described above can be applied to any of the System Modes of Operation specified in Figs. 17A and 17B, as well as to any system modes not specified herein. In each such illustrative embodiment, the particular SCPs that will be set in a given system will depend on the structure of, and functionalities supported by, the system. In each such system, there will be SCPs that relate to the image sensing array of the system, and SCPs that relate to the illumination subsystem thereof, as well as SCPs that relate to other aspects of the system. The subsystems within the system may have single or multiple modes of suboperation, depending on the nature of the system design. In accordance with the principles of the present invention, each system will involve (i) automated real-time analysis of the exposure quality of captured digital images and (ii) automated reconfiguring of system control parameters (particularly illumination and exposure control parameters) based on the results of such exposure quality analysis, so as to achieve improved system functionality and/or performance in diverse environments.
First Illustrative Embodiment of the Hand-Supportable Digital Image-Processing Based Bar Code Symbol Reader of the Present Invention, Employing An Image Cropping Zone (ICZ) Framing Pattern, And An Automatic Post-Image Capture Cropping Method
The hand-held image-processing bar code symbol readers described hereinabove employ a narrow-area illumination beam which provides a visual indication to the user of the vicinity of the narrow-area field of view of the system. However, while operating the system during its wide-area image capture modes of operation, it may be desirable in particular applications to provide a visual indication of the wide-area field of view of the system. While various techniques are known in the art to provide such targeting/marking functions, a novel method of operation will be described below with reference to Figs. 28 through 30.
Fig. 28 shows a hand-supportable image-processing based bar code symbol reader of the present invention 1' employing an image cropping zone (ICZ) framing pattern, and an automatic post-image capture cropping method involving the projection of the ICZ within the field of view (FOV) of the reader and onto a targeted object to be imaged during object illumination and imaging operations. As shown in Fig. 29, this hand-supportable image-processing based bar code symbol reader 1' is similar to the designs described above in Figs. 1B through 14, except that it includes one or more image cropping zone (ICZ) illumination framing source(s) operated under the control of the System Control Subsystem. Preferably, these ICZ framing sources are realized using four relatively bright LEDs indicating the corners of the ICZ in the FOV, which will be cropped during post-image capture operations. Alternatively, the ICZ framing source could be a VLD that produces a visible laser beam transmitted through a light diffractive element (e.g. volume transmission hologram) to produce four beamlets indicating the corners of the ICZ, or bright lines that appear in the captured image. The ICZ frame created by such corner points or border lines (formed thereby) can be located using edge-tracing algorithms, and then the corners of the ROI can be identified from the traced border lines.
Referring to Fig. 30, the ICZ Framing and Post-Image Capture Cropping Process of the present invention will now be described.
As indicated at Block A in Fig. 30, the first step of the method involves projecting an ICZ framing pattern within the FOV of the system during wide-area illumination and image capturing operations.
As indicated at Block B in Fig. 30, the second step of the method involves the user visually aligning the object to be imaged within the ICZ framing pattern (however it might be realized).
As indicated at Block C in Fig. 30, the third step of the method involves the Image Formation and Detection Subsystem and the Image Capture and Buffering Subsystem forming and capturing the wide-area image of the entire FOV of the system, which embraces (i.e. spatially encompasses) the ICZ framing pattern aligned about the object to be imaged.
As indicated at Block D in Fig. 30, the fourth step of the method involves using an automatic software-based image cropping algorithm, implemented within the Image-Processing Bar Code Reading Subsystem, to automatically crop the pixels within the spatial boundaries defined by the ICZ, from those pixels contained in the entire wide-area image frame captured at Block C. Due to the fact that image distortion may exist in the captured image of the ICZ framing pattern, the cropped rectangular image may partially contain the ICZ framing pattern itself and some neighboring pixels that may fall outside the ICZ framing pattern.
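Under the simplifying assumption that the four located ICZ corner points are cropped as an axis-aligned bounding rectangle, the Block D cropping step might look like the following sketch (function and variable names are hypothetical, not from the patented implementation):

```python
# Hypothetical sketch of the Block D cropping step: given the pixel
# coordinates of the four located ICZ corner points, crop the axis-aligned
# bounding rectangle from the full wide-area frame (a 2-D list of rows).

def crop_icz(frame, corners):
    """frame: 2-D list of pixel values; corners: four (x, y) corner points
    of the located ICZ framing pattern. Returns the cropped sub-image,
    which, as noted in the text, may include the framing pattern itself
    and a few neighboring pixels."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    return [row[x0:x1 + 1] for row in frame[y0:y1 + 1]]
```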
As indicated at Block E in Fig. 30, the fifth step of the method involves the Image-Processing Bar Code Reading Subsystem automatically decode-processing the image represented by the cropped image pixels in the ICZ so as to read a 1D or 2D bar code symbol graphically represented therein.
As indicated at Block F in Fig. 30, the sixth step of the method involves the Image-Processing Bar Code Reading Subsystem outputting (to the host system) the symbol character data representative of the decoded bar code symbol.
Notably, in prior art FOV targeting methods, the user captures an image that only somewhat coincides with what he intended to capture. This situation is analogous to a low-cost point-and-shoot camera, wherein the field of view of the viewfinder and camera lens only substantially coincide with each other. In the proposed scheme employing the above-described ICZ framing and post-processing pixel cropping method, the user captures an image that is exactly what s/he framed with the ICZ framing pattern. The advantage of this system over prior art FOV targeting methods is analogous to the advantage of an SLR camera over a point-and-shoot camera, namely: accuracy and reliability.
Another advantage of using the ICZ framing and post-processing pixel cropping method is that the ICZ framing pattern (however realized) does not have to coincide with the field of view of the Image Formation And Detection Subsystem. The ICZ framing pattern and the FOV also do not have to have parallel optical axes. The only basic requirement of this method is that the ICZ framing pattern fall within the field of view (FOV) of the Image Formation And Detection Subsystem, along the working distance of the system.
However, one may design the ICZ framing pattern and the optical axis angle of the system such that when the ICZ framing pattern does not fall completely inside the camera's field of view (i.e. the ICZ framing pattern does not fall within the complete acquired image), this visually implies to the user that the captured and cropped image is outside the depth of focus of the imaging system. Thus, the imager can provide a visual or audio feedback to the user so that he may repeat the image acquisition process at a more appropriate distance.
Second Illustrative Embodiment of the Hand-Supportable Digital Image-Processing Based Bar Code Symbol Reader of the Present Invention, Employing An Image Cropping Pattern (ICP), And An Automatic Post-Image Capture Cropping Method
Referring to Figs. 31 through 37B, another novel method of operation will be described for use in a hand-held digital image-processing bar code symbol reader operating during its wide-area image capture modes of operation.
As shown in Fig. 31, during object illumination and wide-area image capture modes of operation, the hand-supportable image-processing based bar code symbol reader 1" is provided with the capacity to generate and project a visible illumination-based Image Cropping Pattern (ICP) 200 within the field of view (FOV) of the reader. During these modes of bar code reader operation, the operator will align the visibly projected ICP onto the object (or graphical indicia) to be imaged so that the graphical indicia generally falls within, or is framed by, the outer boundaries covered by the ICP. The object to be imaged may be perfectly planar in geometry, or it may have a particular degree of surface curvature. The angle of the object surface may also be inclined with respect to the bar code symbol reader, which may produce "keystone" type effects during the projection process. In either event, during object illumination and image capture operations, the operator will then proceed to use the reader to illuminate the object using its multi-mode illumination subsystem 14, and capture an image of the graphical indicia and the ICP aligned therewith using the multi-mode image formation and detection subsystem 13. After the image has been captured and buffered within the image capturing and buffering subsystem 16, it is then transferred to the ICP locating/finding module 201 for image processing that locates the features and elements of the ICP and determines therefrom an image region (containing the graphical indicia) to be cropped for subsequent processing. The coordinate/pixel locations of the ICP elements relative to each other in the captured image are then analyzed using computational analysis to determine whether or not the captured image has been distorted due to rotation or tilting of the object relative to the bar code reader during image capture operations.
If this condition is indicated, then the cropped image will be transferred to the image perspective correction and scaling module 202 for several stages of image processing. The first stage of image processing will typically involve correction of image "perspective", i.e. where the cropped image requires processing to correct for perspective distortion caused by rotation or tilting of the object during imaging. Perspective distortion is also known as the keystone effect. The perspective/tilt corrected image is then cropped. Thereafter, the cropped digital image is processed to scale (i.e. magnify or minify) the corrected digital image so that it has a predetermined pixel size (e.g. NxM) optimized for image processing by the image processing based bar code symbol reading module 17. Such digital image scaling, prior to decode processing, enables most conventional image-based decode processing algorithms to operate on the digital images. The details of this bar code reading method of the present invention will be described in greater detail hereinafter, after the system architecture of the bar code symbol reader is described below.
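The perspective-correction stage described above can be illustrated with the standard direct linear transform (DLT): from the four located pattern points and the four corners of the desired undistorted rectangle, a 3x3 projective (homography) matrix is estimated and then applied to image coordinates. This is a hedged sketch of the general technique, not the module 202 implementation; all names are hypothetical and NumPy is assumed to be available.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 projective transform mapping the four src points
    (e.g. the distorted pattern corners found in the image) onto the four
    dst points (the corners of an undistorted rectangle), via the standard
    direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, dtype=float)
    _, _, vt = np.linalg.svd(A)       # null-space vector of A is the solution
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Apply the homography to one (x, y) point in homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

Warping every pixel of the cropped region through the inverse of such a matrix yields the perspective/tilt corrected image described in the text.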
In most respects, the digital image-processing based bar code symbol reader 1" shown in Fig. 31 is very similar to the system 1 shown in Figs. 1B through 14, with the exception of a few additional subcomponents indicated below.
As shown in Fig. 32, the digital imaging-based bar code symbol reading device depicted in Fig. 31 comprises the following system components: a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem 13 having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array 22 for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled; a Multi-Mode LED-Based Illumination Subsystem 14 for producing narrow and wide area fields of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem 13 during narrow and wide area modes of image capture, respectively, so that only light transmitted from the Multi-Mode Illumination Subsystem 14 and reflected from the illuminated object and transmitted through a narrow-band transmission-type optical filter realized within the hand-supportable housing (i.e. 
using a red-wavelength high-pass reflecting window filter element disposed at the light transmission aperture thereof and a low-pass filter before the image sensor) is detected by the image sensor and all other components of ambient light are substantially rejected; an Image Cropping Pattern Generator 203 for generating a visible illumination-based Image Cropping Pattern (ICP) 200 projected within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem 13; an IR-based object presence and range detection subsystem 12 for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem 13; an Automatic Light Exposure Measurement and Illumination Control Subsystem 15 for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem 14; an Image Capturing and Buffering Subsystem 16 for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem 13; an Image Processing and Cropped Image Locating Module 201 for processing captured and buffered images to locate the image region corresponding to the region defined by the Image Cropping Pattern (ICP) 200; an Image Perspective Correction and Scaling Module 202 for correcting the perspective of the cropped image region and scaling the corrected image to a predetermined (i.e. fixed) pixel image size suitable for decode-processing; (8) a Multi-mode Image-Processing Based Bar Code Symbol Reading Subsystem 17 for processing cropped and scaled images generated by the Image Perspective Correction and Scaling Module
202 and reading 1D and 2D bar code symbols represented therein; and (9) an Input/Output Subsystem 18 for outputting processed image data and the like to an external host system or other information receiving or responding device, in which each said subsystem component is integrated about a System Control Subsystem 19, as shown.
In general, there are many possible ways of realizing the Image Cropping Pattern Generator
203 employed in the system of Fig. 31. In Figs. 33A through 34D5, several refractive-based designs are disclosed for generating an image cropping pattern (ICP) 200, from a simple two-dot pattern to a more complex four-dot pattern. While the four-dot ICP is a preferred pattern, in some applications the two-dot pattern may be suitable for the requirements at hand, where 1D bar code symbols are primarily employed. Also, as shown in Fig. 35, light diffractive technology (e.g. volume holograms, computer generated holograms (CGHs), etc.) can be used in conjunction with a VLD and a light focusing lens to generate an image cropping pattern (ICP) having diverse characteristics. It is appropriate at this juncture to describe these various embodiments of the Image Cropping Pattern Generator of the present invention.
In Fig. 33A, a first illustrative embodiment of the VLD-based Image Cropping Pattern Generator 203A is shown comprising: a VLD 205 located at the symmetrical center of the focal plane of a pair of flat-convex lenses 206A and 206B arranged before the VLD 205, and capable of generating and projecting a two (2) dot image cropping pattern (ICP) 200 within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem 13. In Figs. 33B and 33C, a composite ray-tracing diagram is provided for the VLD-based Image Cropping Pattern Generator depicted in Fig. 33A. As shown, the pair of flat-convex lenses 206A and 206B focus naturally diverging light rays from the VLD 205 into two substantially parallel beams of laser illumination which produce a two (2) dot image cropping pattern (ICP) 200 within the field of view (FOV) of the Multi-Mode Area-type Image Formation and Detection Subsystem. Notably, the distance between the two spots of illumination in the ICP is a function of distance from the pair of lenses 206A and 206B. Figs. 33D1 through 33D5 are simulated images of the two-dot Image Cropping Pattern produced by the ICP Generator 203A of Fig. 33A, at distances of 40mm, 80mm, 120mm, 160mm and 200mm, respectively, from its pair of flat-convex lenses, within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem.
In Fig. 34A, a second illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention 203B is shown comprising: a VLD 206 located at the focus of a biconical lens 207 (having a biconical surface and a cylindrical surface) arranged before the VLD 206, and four flat-convex lenses 208A, 208B, 208C and 208D arranged in four corners. This optical assembly is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem. Figs. 34B and 34C show a composite ray-tracing diagram for the second illustrative embodiment of the VLD-based Image Cropping Pattern Generator depicted in Fig. 34A. As shown, the biconical lens 207 enlarges naturally diverging light rays from the VLD 206 in the cylindrical direction (but not the other), and thereafter the four flat-convex lenses 208A through 208D focus the enlarged laser light beam to generate four parallel beams of laser illumination which form a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem. The spacing between the four dots of illumination in the ICP is a function of distance from the flat-convex lenses 208A through 208D. Figs. 34D1 through 34D5 are simulated images of the four (4) dot Image Cropping Pattern produced by the ICP Generator of Fig. 34A, at distances of 40mm, 80mm, 120mm, 160mm and 200mm, respectively, from its flat-convex lenses, within the field of view of the Multi-Mode Image Formation and Detection Subsystem 13.
In Fig. 35, a third illustrative embodiment of the VLD-based Image Cropping Pattern Generator of the present invention 203C is shown comprising: a VLD 210, focusing optics 211, and a light diffractive optical element (DOE) 212 (e.g. volume holographic optical element), together forming an ultra-compact optical assembly. This optical assembly is capable of generating and projecting a four (4) dot image cropping pattern (ICP) within the field of view of the Multi-Mode Area-type Image Formation and Detection Subsystem, similar to that generated using the refractive optics based device shown in Fig. 34A.
Hand-Supportable Digital Image-Processing Based Bar Code Symbol Reader of the Present Invention Employing A Second Method of Digital Image Capture and Processing Using An Image Cropping Pattern (ICP) And Automatic Post-Image Capture Cropping and Processing Methods
Referring to Figs. 36 and 37, the second illustrative embodiment of the method of digital image capture and processing will now be described in connection with the bar code symbol reader illustrated in Figs. 31 and 32.
As indicated at Block A in Fig. 37, the bar code symbol reader during wide-area imaging operations, projects an illumination-based Image Cropping Pattern (ICP) 200 within the field of view (FOV) of the system, as schematically illustrated in Fig. 36.
As indicated at Block B in Fig. 37, the operator aligns an object to be imaged within the projected Image Cropping Pattern (ICP) of the system.
As indicated at Block C in Fig. 37, during the generation of the Image Cropping Pattern, the bar code symbol reader captures a wide-area digital image of the entire FOV of the system.
As indicated at Block D in Fig. 37, the bar code symbol reader uses module 201 to process the captured digital image and locate/find features and elements (e.g. illumination spots) associated with the Image Cropping Pattern (ICP) 200 within the captured digital image. As shown in the schematic representation of Fig. 36, the clusters of pixels indicated by reference characters (a,b,c,d) represent the four illumination spots (i.e. dots) associated with the Image Cropping Pattern (ICP) projected in the FOV. The coordinates associated with such features and elements of the ICP are located/found using module 201 during this step of the image processing method of the present invention.
As indicated at Block E in Fig. 37, the bar code symbol reader uses module 201 to analyze the coordinates of the located image features (a,b,c,d), determine the geometrical relationships among certain of such features (e.g. whether the vertices of the ICP have been distorted during projection and imaging due to tilt angles, rotation of the object, etc.), and reconstruct an undistorted image cropping pattern (ICP), independent of the object tilt angle (or perspective), computed therefrom. Module 201 supports real-time computational analysis to analyze the coordinates of the pixel locations of the ICP elements relative to each other in the captured image, and determine whether or not the captured image has been distorted due to rotation or tilting of the object relative to the bar code reader during image capture operations. If this condition is indicated, then the digital image will be transferred to the image perspective correction and scaling module 202 for several stages of image processing. The first stage of image processing performed by module 202 will typically involve correction of image "perspective", i.e. where the cropped image requires processing to correct for perspective distortion caused by rotation or tilting of the object during imaging. Perspective distortion is also known as the keystone effect.
As indicated at Block F in Fig. 37, the bar code symbol reader uses module 202 to crop a set of pixels from the corrected digital image that corresponds to the ICP projected in the FOV of the system.
As indicated at Block G in Fig. 37, the bar code symbol reader uses module 202 to carry out a digital zoom algorithm to process the cropped and perspective-corrected ICP region and produce a scaled digital image having a predetermined pixel size independent of object distance. This step involves processing the cropped perspective-corrected image so as to scale (i.e. magnify or minify) it so that it has a predetermined pixel size (e.g. NxM) optimized for image processing by the image processing based bar code symbol reading module 17. Such image scaling, prior to decode processing, enables conventional image-based decode processing algorithms to operate on digital images of constant size.
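The digital-zoom step can be sketched with a simple nearest-neighbor resampler that maps any cropped region to a fixed N x M size. The names are hypothetical, and a production implementation would more likely use bilinear or bicubic interpolation; this only illustrates the idea of scaling to a predetermined pixel size independent of object distance.

```python
# A minimal nearest-neighbor sketch of the Block G digital-zoom step:
# scale the cropped, perspective-corrected region (a 2-D list of pixel
# values) to a fixed n_rows x n_cols size regardless of object distance.

def scale_to_fixed_size(image, n_rows, n_cols):
    rows, cols = len(image), len(image[0])
    return [[image[r * rows // n_rows][c * cols // n_cols]  # nearest source pixel
             for c in range(n_cols)]
            for r in range(n_rows)]
```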
As indicated at Block H in Fig. 37, the bar code symbol reader transmits the scaled perspective- corrected digital image to the decode processing module 17 (and optionally, a visual display).
As indicated at Block I in Fig. 37, the bar code symbol reader decode-processes the scaled digital image so as to read 1D or 2D bar code symbols represented therein and generate symbol character data representative of a decoded bar code symbol.
As indicated at Block J in Fig. 37, the input/output subsystem 18 of the bar code symbol reader outputs the generated symbol character data to a host system.
PLIIM-Based Object Identification And Attribute Acquisition System Of The Present Invention
In Fig. 38A, there is shown a PLIIM-based object identification and attribute acquisition system of the present invention, having a housing of unity design. As shown, the housing 1540 has the same light transmission apertures of the housing design shown in Figs. 12A and 12B of WIPO Publication No. WO 02/43195, incorporated herein by reference in its entirety, but has no housing panels disposed about the light transmission apertures 1541A, 1541B and 1542, through which planar laser illumination beams (PLIBs) and the field of view (FOV) of the PLIIM-based subsystem extend, respectively. This feature of the present invention provides a region of space (i.e. housing recess) into which an optional device (not shown) can be mounted for carrying out a speckle-noise reduction solution within a compact box that fits within said housing recess, in accordance with the principles of the present invention. Light transmission aperture 1543 enables the AM laser beams 1167A/1167B from the LDIP subsystem 1122 to project out from the housing.
Bioptical PLIIM-Based Product Dimensioning. Analysis And Identification System Of The First Illustrative Embodiment Of The Present Invention
As shown in Fig. 38B1, a pair of PLIIM-based package identification (PID) systems 25' of Figs. 3E4 through 3E8 of WIPO Publication No. WO 02/43195 are modified and arranged within a compact POS housing 1581 having bottom and side light transmission apertures 1582 and 1583 (beneath bottom and side imaging windows 1584 and 1585, respectively), to produce a bioptical PLIIM-based product identification, dimensioning and analysis (PIDA) system 1580 according to a first illustrative embodiment of the present invention. As shown, the bioptical PIDA system 1580 comprises: a bottom PLIIM-based unit 1586A mounted within the bottom portion of the housing 1581; a side PLIIM-based unit 1586B mounted within the side portion of the housing 1581; an electronic product weigh scale 1587, mounted beneath the bottom PLIIM-based unit 1586A, in a conventional manner; and a local data communication network 1588, mounted within the housing, and establishing a high-speed data communication link between the bottom and side units 1586A and 1586B, the electronic weigh scale 1587, and a host computer system (e.g. cash register) 1589.
In order that the bioptical PLIIM-based PIDA system 1580 is capable of capturing and analyzing color images, and thus enabling, in supermarket environments, "produce recognition" on the basis of color as well as dimensions and geometrical form, each PLIIM-based subsystem 25' employs (i) a plurality of visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB) from the side and bottom light transmission apertures 1582 and 1583, and also (ii) a 1-D (linear-type) CCD image detection array for capturing color images of objects (e.g. produce) as the objects are manually transported past the imaging windows 1584 and 1585 of the bioptical system, along the direction of the indicator arrow, by the user or operator of the system (e.g. retail sales clerk).
Bioptical PLIIM-Based Product Identification, Dimensioning and Analysis System Of The Second Illustrative Embodiment Of The Present Invention
As shown in Figs. 38C1 and 38C2, a pair of PLIIM-based package identification (PID) systems 25" of Figs. 6D1 through 6E3 in WIPO Publication No. WO 02/43195, supra, are modified and arranged within a compact POS housing 1601 having bottom and side light transmission windows 1602 and 1603 (beneath bottom and side imaging windows 1604 and 1605, respectively), to produce a bioptical PLIIM-based product identification, dimensioning and analysis (PIDA) system 1600 according to a second illustrative embodiment of the present invention. As shown, the bioptical PIDA system 1600 comprises: a bottom PLIIM-based unit 1606A mounted within the bottom portion of the housing 1601; a side PLIIM-based unit 1606B mounted within the side portion of the housing 1601; an electronic product weigh scale 1589, mounted beneath the bottom PLIIM-based unit 1606A, in a conventional manner; and a local data communication network 1588, mounted within the housing, and establishing a high-speed data communication link between the bottom and side units 1606A and 1606B, and the electronic weigh scale 1589.
In order that the bioptical PLIIM-based PIDA system 1600 is capable of capturing and analyzing color images, and thus enabling, in supermarket environments, "produce recognition" on the basis of color as well as dimensions and geometrical form, each PLIIM-based subsystem 25" employs (i) a plurality of visible laser diodes (VLDs) having different color producing wavelengths to produce a multi-spectral planar laser illumination beam (PLIB) from the bottom and side imaging windows 1604 and 1605, and also (ii) a 2-D (area-type) CCD image detection array for capturing color images of objects (e.g. produce) as the objects are presented to the imaging windows of the bioptical system by the user or operator of the system (e.g. retail sales clerk).
Hand-Supportable Planar Laser Illumination And Imaging (PLIIM) Devices Employing Linear Image Detection Arrays And Optically-Combined Planar Laser Illumination Beams (PLIBs) Produced From A Multiplicity Of Laser Diode Sources To Achieve A Reduction In Speckle-Pattern Noise Power In Said Devices
In Fig. 39A, there is shown a first illustrative embodiment of the PLIIM-based hand-supportable imager of the present invention 1200. As shown, the PLIIM-based imager 1200 comprises: a hand-supportable housing 1201; a PLIIM-based image capture and processing engine 1202 contained therein, for projecting a planar laser illumination beam (PLIB) 1203 through its imaging window 1204 in coplanar relationship with the field of view (FOV) 1205 of the linear image detection array 1206 employed in the engine; a LCD display panel 1207 mounted on the upper top surface 1208 of the housing in an integrated manner, for displaying, in a real-time manner, captured images, data being entered into the system, and graphical user interfaces (GUIs) required in the support of various types of information-based transactions; a data entry keypad 1209 mounted on the middle top surface of the housing 1210 for enabling the user to manually enter data into the imager required during the course of such information-based transactions; and an embedded-type computer and interface board 1211 contained within the handle of the housing, for carrying out image processing operations such as, for example, bar code symbol decoding operations, signature image processing operations, optical character recognition (OCR) operations, and the like, in a high-speed manner, as well as enabling a high-speed data communication interface 1212 with a digital communication network 1213, such as a LAN or WAN supporting a networking protocol such as TCP/IP, AppleTalk or the like.
A First Illustrative Embodiment Of The Transportable PLIIM-Based 3-D Digitization Device ("3-D Digitizer") Of The Present Invention
In Fig. 78A, a first illustrative embodiment of the transportable PLIIM-based 3-D digitization device ("3-D digitizer") 2830 of the present invention is shown comprising: a transportable housing 2831 of lightweight construction, having a handle 2832 on its top portion for transporting the device from one location to another, and four rubber feet 2834 on its base portion for supporting the device on any stable surface, indoors and outdoors alike; and a PLIIM-based imaging and profiling subsystem 120 as described above, contained within the transportable housing 2831, and including a PLIIM-based camera subsystem 25' and a LDIP subsystem 122, both described in detail in WIPO Publication No. 02/43195, supra.
A Second Illustrative Embodiment Of The Transportable PLIIM-Based 3-D Digitization Device ("3-D Digitizer") Of The Present Invention
In Fig. 79A, a second illustrative embodiment of the transportable PLIIM-based 3-D digitization device ("3-D digitizer") of the present invention 2850 is shown comprising: a transportable housing 2851 of lightweight construction, having a handle 2852 on its top portion for transporting the device from one location to another, and four rubber feet 2853 on its base portion for supporting the device on any stable surface, indoors and outdoors alike; and a PLIIM-based imaging and profiling subsystem 2855, contained within the transportable housing, and including a PLIIM-based camera subsystem 25" with a 2-D area CCD image detection array as shown in Figs. 6Dl through 6D5 and described above, and a LDIP subsystem 122 as described in detail in WIPO Publication No. 02/43195, supra.
A "Vertical-Type" 3-D PLIIM-Based CAT Scanning System Of The Present Invention
In Fig. 39D, a "vertical-type" 3-D PLIIM-based CAT scanning system of the present invention 2800 is shown comprising: a support base 2801 for supporting a human or animal subject during imaging operations; a pair of vertically extending rail structures 2802A and 2802B supported from the support base 2801; a motorized carriage 2803 supported on and adapted to travel along the length of each rail structure 2802A and 2802B at a programmably controlled velocity; a PLIIM-based imaging and profiling subsystem 120 mounted to each motorized carriage 2803 for producing a pair of amplitude modulated (AM) laser scanning beams 2804 and a single planar laser illumination beam (PLIB) 2805, wherein the sets of PLIBs are orthogonal to each other; and a computer workstation 2806 with LCD monitor 2807, operably connected to each PLIIM-based imaging and profiling subsystem 120, for collecting and storing both linear image slices and 3-D range data profiles of the subject generated during scanning operations, so that the workstation can reconstruct a 3-D geometrical model of the subject using computer-assisted tomographic (CAT) techniques applied to the collected data, as described in detail in WIPO Publication No. 02/43195, supra.
A Hand-Supportable Mobile-Type PLIIM-Based 3-D Digitization Device Of The Present Invention
In Figs. 39El and 39E2, a hand-supportable mobile-type PLIIM-based 3-D digitization device 2810 of the present invention is shown comprising: a hand-supportable housing 2811 having a handle structure 2812; a PLIIM-based camera subsystem 25' (or 25) mounted in the hand-supportable housing; a miniature version of LDIP subsystem 122 mounted in the hand-supportable housing 2811; a set of optically isolated light transmission apertures 2813A and 2813B for transmission of the PLIBs from the PLIIM-based camera subsystem mounted therein, and a light transmission aperture 2814 for transmission of the FOV of the PLIIM-based camera subsystem, during object imaging operations; a light transmission aperture 2815, optically isolated from light transmission apertures 2813A, 2813B and 2814, for transmission of the AM laser beam transmitted from the LDIP subsystem 122 during object profiling operations; and a LCD view finder 2816 integrated with the housing, for displaying 3-D digital data models and 3-D geometrical models of laser scanned objects. The mobile laser scanning 3-D digitization device 2810 of Fig. 39El also has an Ethernet data communications port 2817 for communicating information files with other computing machines on a LAN to which the mobile device is connected, as described in detail in WIPO Publication No. 02/43195, supra.
Digital Image Capture and Processing Engine of the Present Invention Employing Linear Optical Waveguide Technology For Collecting and Conducting LED-Based Illumination In the Automatic Light Exposure Measurement and Illumination Control Subsystem During Object Illumination and Image Capture Modes of Operation
Referring to Figs. 40 through 54, it is appropriate at this juncture to describe the digital image capture and processing engine of the present invention 220 employing light-pipe technology 221 for collecting and conducting LED-based illumination in the automatic light exposure measurement and illumination control subsystem 15 during object illumination and image capture modes of operation.
As shown in Fig. 40, the digital image capture and processing engine 220 is shown generating and projecting a visible illumination-based Image Cropping Pattern (ICP) 200 within the field of view (FOV) of the engine, during object illumination and image capture operations, as described in connection with Figs. 31 through 37B. Typically, as shown, the digital image capture and processing engine 220 will be embedded or integrated within a host system 222 which uses the digital output generated from the digital image capture and processing engine 220. The host system 222 can be any system that requires the kind of information that the digital image capture and processing engine 220 can capture and process.
As shown in Figs. 41 and 47, the digital image capture and processing engine 220 depicted in Fig. 40 is shown comprising: an assembly of an illumination/targeting optics panel 223; an illumination board 224; a lens barrel assembly 225; a camera housing 226; a camera board 227; and an image processing board 230. As shown, these components are assembled into an ultra-compact form factor offering advantages of light-weight construction, excellent thermal management, and exceptional image capture and processing performance. Also, camera housing 226 has a pair of integrated engine mounting projections 226A and 226B, each provided with a hole through which a mounting screw can be passed to fix the engine relative to an optical bench or other support structure within the housing of the host system or device.
In Fig. 47, the digital image capture and processing engine 220 shown in Fig. 46 reveals the integration of a linear optical waveguide (i.e. light conductive pipe) component 221 within the engine housing. Preferably, optical waveguide 221 is made from a plastic material having high light transmission characteristics, and low energy absorption characteristics over the optical band of the engine (which is tuned to the spectral characteristics of the LED illumination arrays and band-pass filter employed in the engine design). The function of optical waveguide 221 is to collect and conduct light energy from the FOV of the Multi-Mode Area-Type Image Formation and Detection Subsystem 13, and direct it to the photo-detector 228 mounted on the camera board 227, and associated with the Automatic Light Exposure Measurement and Illumination Control Subsystem 15. Notably, in the engine design of the illustrative embodiment, the optical waveguide 221 replaces the parabolic light collecting mirror 55 which is employed in the system design shown in Fig. 6A. Use of the optical waveguide 221 in subsystem 15 offers the advantage of ultra-small size and tight integration within the miniature housing of the digital image capture and processing engine. Upon assembling the engine components, the optical waveguide 221 aligns with the photodiode 228 on the camera board which supports subsystem 15, specified in great detail in Figs. 6B through 6C2.
In Fig. 50, an exploded, perspective view of the digital image capture and processing engine 220 is provided to show how the illumination/targeting optics panel 223, the illumination board 224, the lens barrel assembly 225, the camera housing 226, the camera board 227, and assembly pins 231A through 231D are easily arranged and assembled with respect to each other in accordance with the principles of the present invention.
As shown in Fig. 50, the illumination board 224 of the illustrative embodiment supports four (4) LEDs 238A through 238D, along with driver circuitry, as generally taught in Figs. 6Cl and 6C2. Also, illumination/targeting optics panel 223 supports light focusing lenses 239A through 239D, for the LEDs in the illumination array supported on the illumination board 224. Optical principles and techniques for specifying lenses 239A through 239D are taught in Figs. 4B through 4D7, and corresponding disclosure here. While a wide-area near/far field LED illumination array is shown used in the digital image capture and processing engine of the illustrative embodiment 220, it is understood that the illumination array can be readily modified to support separate wide-area near field illumination and wide-area far field illumination, as well as narrow-area far and near fields of illumination, as taught in great detail herein with respect to systems disclosed in Figs. 1 through 39C2.
In Fig. 51, the illumination/targeting optics panel 223, the illumination board 224 and the camera board 227 of digital image capture and processing engine 220 are shown assembled, with the lens barrel assembly 225 and the camera housing 226 removed for clarity of illustration. In Fig. 52, the illumination/targeting optics panel 223 and the illumination board 224 are shown assembled together as a subassembly 232 using the assembly pins. In Fig. 53, the subassembly 232 of Fig. 52 is arranged in relation to the lens barrel assembly 225, the camera housing 226, the camera board 227 and the image processing board 230, showing how these system components are assembled together to produce the digital image capture and processing engine 220 of Fig. 40.
In Fig. 54, the digital image capture and processing engine 220 illustrated in Figs. 40 through 53, is shown comprising: a Multi-Mode Area-Type Image Formation and Detection (i.e. Camera) Subsystem 13 having image formation (camera) optics for producing a field of view (FOV) upon an object to be imaged and a CMOS or like area-type image sensing array 22 for detecting imaged light reflected off the object during illumination operations in either (i) a narrow-area image capture mode in which a few central rows of pixels on the image sensing array are enabled, or (ii) a wide-area image capture mode in which substantially all rows of the image sensing array are enabled; a LED-Based Illumination Subsystem 14 for producing a wide area field of narrow-band illumination within the FOV of the Image Formation And Detection Subsystem 13 during the image capture mode, so that only light transmitted from the LED-Based Illumination Subsystem 14 and reflected from the illuminated object and transmitted through a narrow-band transmission-type optical filter realized within the hand-supportable housing (i.e. 
using a red-wavelength high-pass reflecting window filter element disposed at the light transmission aperture thereof and a low-pass filter before the image sensor) is detected by the image sensor, while all other components of ambient light are substantially rejected; an Image Cropping Pattern Generator 203 for generating a visible illumination-based Image Cropping Pattern (ICP) 200 projected within the field of view (FOV) of the Multi-Mode Area-Type Image Formation and Detection Subsystem 13; an IR-Based Object Presence And Range Detection Subsystem 12 for producing an IR-based object detection field within the FOV of the Image Formation and Detection Subsystem 13; an Automatic Light Exposure Measurement and Illumination Control Subsystem 15 for measuring illumination levels in the FOV and controlling the operation of the LED-Based Multi-Mode Illumination Subsystem 14 during the image capture mode; an Image Capturing and Buffering Subsystem 16 for capturing and buffering 2-D images detected by the Image Formation and Detection Subsystem 13; an Image Processing and Cropped Image Locating Module 201 for processing captured and buffered images to locate the image region corresponding to the region defined by the Image Cropping Pattern (ICP) 200; an Image Perspective Correction and Scaling Module 202 for correcting the perspective of the cropped image region and scaling the corrected image to a predetermined (i.e. fixed) pixel image size suitable for decode-processing; a Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem 17 for processing cropped and scaled images generated by the Image Perspective Correction and Scaling Module 202 and reading 1D and 2D bar code symbols represented therein; and an Input/Output Subsystem 18 for outputting processed image data and the like to an external host system or other information receiving or responding device, in which each said subsystem component is integrated about a System Control Subsystem 19, as shown.
Notably, use of FOV folding mirror 236 can help to achieve a wider FOV beyond the light transmission window, while using a housing having narrower depth dimensions. Also, use of the linear optical waveguide 221 obviates the need for large aperture light collection optics which require significant space within the housing.
Digital Image Capture and Processing Engine of the Present Invention Employing Curved Optical Waveguide Technology For Collecting and Conducting LED-Based Illumination In the Automatic Light Exposure Measurement and Illumination Control Subsystem During Object Illumination and Image Capture Modes of Operation
In Fig. 55A, an alternative embodiment of the digital image capture and processing engine 220 of the present invention is shown reconfigured in such a way that the illumination/aiming subassembly 232 (depicted in Fig. 52) is detached from the camera housing 226 and mounted adjacent the light transmission window 233 of the engine housing 234. The remaining subassembly, including the lens barrel assembly 225, the camera housing 226, the camera board 227 and the image processing board 230, is mounted relative to the bottom of the engine housing 234 so that the optical axis of the camera lens assembly 225 is parallel with the light transmission aperture 233. A curved optical waveguide 221 is used to collect light from a central portion of the field of view of the engine, and guide the collected light to photodiode 228 on the camera board 227. In addition, a field of view (FOV) folding mirror 236 is mounted beneath the illumination/aiming subassembly 232 for directing the FOV of the system out through the central aperture 237 formed in the illumination/aiming subassembly 232. Use of the FOV folding mirror 236 in this design can help to achieve a wider FOV beyond the light transmission window, while using a housing having narrower depth dimensions. Also, use of the curved optical waveguide 221 obviates the need for large aperture light collection optics which require significant space within the housing.
Automatic Imaging-Based Bar Code Symbol Reading System of the Present Invention Supporting Presentation-Type Modes of Operation Using Wide-Area Illumination and Video Image Capture and Processing Techniques
In Figs. 55Bl, 55B2 and 55B3, a presentation-type imaging-based bar code symbol reading system 300 is shown constructed using the general components of the digital image capture and processing engine of Fig. 55A. As shown, the illumination/aiming subassembly 232' of Fig. 52 is mounted adjacent the light transmission window 233' of the system housing 301. The remaining subassembly, including the lens barrel assembly 225', the camera housing 226', the camera board 227' and the image processing board 230, is mounted relative to the bottom of the engine housing 234' so that the optical axis of the camera lens is parallel with the light transmission aperture 233'. In addition, a field of view (FOV) folding mirror 236' is mounted beneath the illumination/aiming subassembly 232' for directing the FOV of the system out through the central aperture formed in the illumination/aiming subassembly 232'.
Automatic Imaging-Based Bar Code Symbol Reading System Of The Present Invention Supporting a Pass-Through Mode Of Operation Using Narrow-Area Illumination and Video Image Capture And Processing Techniques, and a Presentation-Type Mode Of Operation Using Wide-Area Illumination and Video Image Capture and Processing Techniques
In Figs. 55Cl through 55C4, there is shown an automatic imaging-based bar code symbol reading system of the present invention 400 supporting a pass-through mode of operation illustrated in Fig. 55C2 using narrow-area illumination and video image capture and processing techniques, and a presentation-type mode of operation illustrated in Fig. 55C3 using wide-area illumination and video image capture and processing techniques. As shown in Figs. 55Cl through 55C4, the POS-based imaging system 400 employs a digital image capture and processing engine similar in design to that shown in Figs. 55Bl and 55B2 and that shown in Fig. 2Al, except for the following differences:
(1) the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem 14 in cooperation with the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem 17, employing software for performing real-time "exposure quality analysis" of captured digital images in accordance with the adaptive system control method of the present invention, illustrated in Figs. 27A through 27E;
(2) the substantially-coplanar narrow-area field of illumination and narrow-area FOV 401 are oriented in the vertical direction (i.e. oriented along the up and down directions) with respect to the counter surface of the POS environment, so as to support the "pass-through" imaging mode of the system, as illustrated in Fig. 55C2; and
(3) the IR-based object presence and range detection system 12 employed in Fig. 55A2 is replaced with an automatic IR-based object presence and direction detection subsystem 12' comprising: four independent IR-based object presence and direction detection channels (i.e. fields) 402A, 402B, 402C and 402D, generated by IR LED and photodiode pairs 12A1, 12A2, 12A3 and 12A4, respectively, which automatically produce activation control signals A1(t), A2(t), A3(t) and A4(t) upon detecting an object moving through the object presence and direction detection fields; and a signal analyzer and control logic block 12B' for receiving and processing these activation control signals A1(t), A2(t), A3(t) and A4(t), according to Processing Rules 1 through 5 set forth in Fig. 55C4, so as to generate a control activation signal indicating that the detected object is being moved either in a "pass-through" direction (e.g. L→R, R→L, U→D, or D→U), or in a "presentation" direction (i.e. towards the imaging window of the system).
Preferably, this POS-based imaging system supports the adaptive control process illustrated in Figs. 27A through 27E, and in the illustrative embodiment of the present invention, operates generally according to System Mode No. 17, described hereinabove. In this POS-based imaging system, the "trigger signal" is generated from the automatic IR-based object presence and direction detection subsystem 12'. In the illustrative embodiment, the trigger signal can take on one of three possible values, namely: (1) that no object has been detected in the FOV of the system; (2) that an object has been detected in the FOV and is being moved therethrough in a "Pass-Through" manner; or (3) that an object has been detected in the FOV and is being moved therethrough in a "Presentation" manner (i.e. toward the imaging window). For purposes of the explanation below, trigger signal (1) above is deemed a "negative" trigger signal, whereas trigger signals (2) and (3) are deemed "positive" trigger signals.
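The classification of object motion from the four IR channel activation signals can be illustrated with a simplified sketch. The channel names, timing threshold, and rule structure below are assumptions for illustration only; they are not the actual Processing Rules 1 through 5 of Fig. 55C4, which are not reproduced in this text.

```python
# Hypothetical sketch: mapping activation timestamps from four IR
# channels (assumed arranged Left, Right, Up, Down around the imaging
# window) onto the three trigger signal values described above.

NO_OBJECT, PASS_THROUGH, PRESENTATION = 1, 2, 3

def classify_trigger(activations):
    """activations: dict of channel name -> activation time in seconds,
    or None if that channel has not fired."""
    times = {ch: t for ch, t in activations.items() if t is not None}
    if not times:
        return NO_OBJECT      # negative trigger signal (1)
    # Near-simultaneous activation of all four channels suggests motion
    # toward the imaging window (illustrative 50 ms window).
    if len(times) == 4 and max(times.values()) - min(times.values()) < 0.05:
        return PRESENTATION   # positive trigger signal (3)
    # Opposite channels firing at different times suggests lateral motion
    # through the detection fields (L->R, R->L, U->D, or D->U).
    for a, b in (("L", "R"), ("U", "D")):
        if a in times and b in times and times[a] != times[b]:
            return PASS_THROUGH   # positive trigger signal (2)
    return NO_OBJECT
```

The signal analyzer and control logic block 12B' would evaluate such rules continuously as the activation signals A1(t) through A4(t) arrive.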
In the event that the "Pass-Through" Mode (illustrated in Fig. 55C2) is enabled in response to detected movement of the object in the FOV from L→R or R→L, then the SCPs would be initially configured as follows:
(1) the shutter mode parameter will be set to the "Video Mode" (illustrated in Fig. 2E);
(2) the electronic gain of the image sensor will be set to a default value determined during factory calibration;
(3) the exposure time for blocks of image sensor pixels will be set to a default value determined during factory calibration;
(4) the illumination mode parameter will be set to "continuous";
(5) the automatic illumination control parameter will be set to "ON";
(6) the illumination field type will be set to "narrow-area field";
(7) the image capture mode parameter will be set to "narrow-area image capture";
(8) the image capture control parameter will be set to "video frame";
(9) the image processing mode will be set, for example, to a default value; and
(10) the automatic object detection mode will be set to "ON". Also, the SCPR flag will be set to its FALSE value.
On the other hand, in the event that the "Presentation" Mode (illustrated in Fig. 55C3) is enabled in response to detected movement of the object in the FOV towards the imaging window of the system, then the SCPs would be initially configured as follows:
(1) the shutter mode parameter will be set to the "Video Mode" (illustrated in Fig. 2E);
(2) the electronic gain of the image sensor will be set to a default value determined during factory calibration;
(3) the exposure time for blocks of image sensor pixels will be set to a default value determined during factory calibration;
(4) the illumination mode parameter will be set to "continuous";
(5) the automatic illumination control parameter will be set to "ON";
(6) the illumination field type will be set to "wide-area field";
(7) the image capture mode parameter will be set to "wide-area image capture";
(8) the image capture control parameter will be set to "video frame";
(9) the image processing mode will be set, for example, to a default value; and
(10) the automatic object detection mode will be set to "ON". Also, the SCPR flag will be set to its FALSE value.
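The two default SCP configurations listed above differ only in the illumination field type and image capture mode parameters. They can be expressed compactly as a minimal sketch in Python dictionaries; the parameter names are paraphrased from the numbered lists above and are not taken from the actual system software.

```python
# Illustrative encoding of the default System Configuration Parameters
# (SCPs) for the "Pass-Through" and "Presentation" modes. Field names
# are paraphrases of the lists above, not actual software identifiers.

COMMON_DEFAULTS = {
    "shutter_mode": "video",                    # item (1): Video Mode
    "electronic_gain": "factory_calibrated",    # item (2)
    "exposure_time": "factory_calibrated",      # item (3)
    "illumination_mode": "continuous",          # item (4)
    "automatic_illumination_control": "ON",     # item (5)
    "image_capture_control": "video_frame",     # item (8)
    "image_processing_mode": "default",         # item (9)
    "automatic_object_detection": "ON",         # item (10)
    "scpr_flag": False,                         # no reconfiguration pending
}

# Items (6) and (7) are the only mode-dependent parameters.
PASS_THROUGH_DEFAULTS = dict(
    COMMON_DEFAULTS,
    illumination_field_type="narrow-area field",
    image_capture_mode="narrow-area image capture",
)

PRESENTATION_DEFAULTS = dict(
    COMMON_DEFAULTS,
    illumination_field_type="wide-area field",
    image_capture_mode="wide-area image capture",
)
```

On this reading, switching between the two modes amounts to swapping just two SCP values while the remaining defaults are shared.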
Adaptive (Camera) System Control During Pass-Through Mode of Operation
Upon the generation of a "positive" trigger signal from subsystem 12' (i.e. that an object has been detected in the FOV and is being moved therethrough in either a "Pass-Through" or a "Presentation" manner), the system will reconfigure itself only if the SCPR flag is TRUE; otherwise, the system will maintain its current SCPs. During the first pass through STEP 1, the SCPR flag will be FALSE, and therefore the system will maintain its SCPs at their default settings. For purposes of illustration, assume that trigger signal (2) was generated, indicative of Pass-Through object detection and movement. Then at STEP 2 in Fig. 27A, the object will be continuously illuminated within a narrow field of LED-based illumination produced by the illumination subsystem, and a sequence of narrow-area digital images will be captured by the image formation and detection subsystem and buffered to reconstruct 2D images, while the CMOS image sensing array is operated in its Video Mode of operation.
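The buffering of narrow-area images into a reconstructed 2D image can be sketched minimally as stacking the successive scan lines captured while the object moves through the FOV. The function below is an illustrative assumption, not the system's actual image capturing and buffering subsystem.

```python
# Minimal sketch (assumed behavior): successive narrow-area scan lines,
# captured as the object passes through the FOV, are stacked row by row
# into a single reconstructed 2-D image buffer.

def reconstruct_2d(scan_lines):
    """scan_lines: list of equal-length rows of pixel values,
    in capture order. Returns the reconstructed 2-D image."""
    if not scan_lines:
        return []
    width = len(scan_lines[0])
    assert all(len(row) == width for row in scan_lines), "rows must align"
    return [list(row) for row in scan_lines]   # rows stacked top to bottom

# Three buffered narrow-area captures become a 3-row reconstructed image.
image = reconstruct_2d([[0, 255, 0], [255, 0, 255], [0, 255, 0]])
```

In practice the object's transport speed past the imaging window would determine how many scan lines are buffered per reconstructed frame.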
At STEP 3 in Fig. 27B, the reconstructed digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
At STEP 4, if the measured/calculated exposure quality values do not satisfy the exposure quality threshold (EQT) parameters, then the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next wide-area image acquisition cycle while the CMOS sensing array is operated in its Video Mode. Otherwise, the SCPs are maintained by the system.
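STEPS 3 and 4 above can be sketched as follows. The Exposure Quality Threshold (EQT) values and the single-parameter gain adjustment are illustrative assumptions, since the text does not specify the actual thresholds or SCP recalculation rules.

```python
# Hedged sketch of the STEP 3 / STEP 4 exposure-quality test.
# Threshold values and the gain-step rule are illustrative only.

EQT = {"min_brightness": 40, "max_brightness": 220,
       "max_saturated_fraction": 0.05}

def analyze_exposure(pixels, eqt=EQT):
    """STEP 3: measure brightness level and saturation of a captured image."""
    mean = sum(pixels) / len(pixels)
    saturated = sum(1 for p in pixels if p >= 255) / len(pixels)
    ok = (eqt["min_brightness"] <= mean <= eqt["max_brightness"]
          and saturated <= eqt["max_saturated_fraction"])
    return ok, mean, saturated

def step4(pixels, scps, eqt=EQT):
    """STEP 4: maintain current SCPs if exposure quality satisfies the EQT
    parameters; otherwise recalculate SCPs (here, only the electronic gain)
    and set the SCPR flag to request reconfiguration on the next pass."""
    ok, mean, saturated = analyze_exposure(pixels, eqt)
    if ok:
        return dict(scps, scpr_flag=False)   # maintain current SCPs
    too_bright = (mean > eqt["max_brightness"]
                  or saturated > eqt["max_saturated_fraction"])
    step = -1 if too_bright else 1           # lower gain if overexposed
    return dict(scps, electronic_gain=scps["electronic_gain"] + step,
                scpr_flag=True)
```

A real implementation would also recalculate exposure time and illumination control settings, per the adaptive control method of Figs. 27A through 27E.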
At STEP 5, the system attempts to read a 1D bar code symbol in the captured reconstructed 2D digital image.
At STEP 6, if the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image). In the case of reconfiguration, the system might reset the SCPs as follows:
(1) the shutter mode parameter — set to the "Video Mode" illustrated in Fig. 27E;
(2) the electronic gain of the image sensor — set to the value calculated during STEP 4;
(3) the exposure time for blocks of image sensor pixels — set to values determined during STEP 4;
(4) the illumination mode parameter — set to "continuous";
(5) the automatic illumination control parameter will be set to "ON";
(6) the illumination field type will be set to "narrow-area field";
(7) the image capture mode parameter will be set to "narrow-area image capture";
(8) the image capture control parameter will be set to "video frame";
(9) the image processing mode will be set to the default value; and
(10) the automatic object detection mode will be set to ON.
Then at STEPS 2-4, the system captures a second 2D image using continuous LED illumination and the image sensing array configured in its Video Mode (illustrated in Fig. 27E), and re-analyzes exposure quality; if the exposure quality does not satisfy the current Exposure Quality Threshold Parameters, then the system calculates new SCPs and sets the SCPR flag to TRUE. Otherwise, the system maintains the SCPs, and proceeds to attempt to decode a bar code symbol in the 2D reconstructed digital image captured using continuous LED illumination.
If at STEPS 5 and 6, bar code decoding is successful, then at STEP 7 the system transmits the results (i.e. symbol character data) to the host system, and/or at STEP 8, transmits the captured digital image to the host system for storage or processing, or to internal memory for storage, and then exits the control process at STEP 9.
If at STEPS 5 and 6 in Block B2 in Fig. 27B, bar code decoding fails, then the system returns to STEP 1, and reconfigures for narrow-area illumination and image capture. If, while operating in its narrow-area illumination and image capture modes of operation, the image captured by the system had an "exposure quality" which did not satisfy the Exposure Quality Threshold Parameters, indicating that the light exposure was still too bright and saturated, and the recalculated SCPs required switching to a new level of electronic gain (or illumination control) to reduce exposure brightness, then at STEP 1 the SCPs are reconfigured using the SCPs previously computed at STEP 4. Thereafter, the object is illuminated using, for example, ambient illumination and captured at STEP 2, and at STEP 3, the captured/reconstructed 2D image is analyzed for exposure quality, as described above. At STEP 4, the exposure quality measured in STEP 3 is compared with the Exposure Quality Threshold Parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise, the system maintains its current SCPs. At STEPS 5 and 6, bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and the system exits the control process at STEP 9. If bar code decoding fails, then the system returns to STEP 1 to repeat the STEPS within Blocks Bl and B2 of Figs. 27A and 27B, provided that the automatic trigger signal (2) is still persistent (indicating that the object is still within the field of view of the digital imager). During this second pass through the control loop of Blocks Bl and B2, the system will reconfigure itself as determined by the exposure quality analysis performed at STEP 3 and the calculations performed at STEP 4.
Notably, such calculations could involve calculating new SCPs that require adjusting illumination and/or image sensing array parameters during the narrow-area image capture mode, as the analysis of the facts may require, according to the adaptive control process of the present invention. This control loop will recycle as long as a bar code symbol has not been successfully read and the automatic trigger signal (2) is persistently generated by the IR-based automatic object detection subsystem 12'.
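The overall STEP 1 through STEP 9 control flow described above can be condensed into a short sketch, with hypothetical callables standing in for the capture, exposure analysis, SCP recalculation, decoding, and transmission subsystems; their names and signatures are assumptions, not the actual system interfaces.

```python
# Skeletal sketch of the STEP 1-9 adaptive control loop of Figs. 27A/27B.

def adaptive_control_loop(capture, analyze, recalc, decode, transmit,
                          object_present, scps, max_cycles=100):
    scpr_flag = False
    for _ in range(max_cycles):
        if not object_present():          # trigger signal no longer persistent
            return None
        if scpr_flag:                     # STEP 1: reconfigure SCPs if required
            scps, scpr_flag = recalc(scps), False
        image = capture(scps)             # STEP 2: illuminate object, capture
        if not analyze(image):            # STEPS 3-4: exposure quality vs. EQT
            scpr_flag = True              # reconfigure on next pass
        symbol = decode(image)            # STEPS 5-6: attempt bar code decode
        if symbol is not None:
            transmit(symbol)              # STEPS 7-8: send symbol character data
            return symbol                 # STEP 9: exit the control process
    return None
```

The loop recycles exactly as the text describes: reconfiguration happens only when the SCPR flag was set by a failed exposure-quality test on the previous pass, and the process exits either on a successful decode or when the trigger signal ceases.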
Adaptive (Camera) System Control During Presentation Mode of Operation
In the event that trigger signal (3) was generated, indicative of Presentation object detection and movement, then at STEP 2 in Fig. 27A, the object will be continuously illuminated within a wide field of LED-based illumination produced by the illumination subsystem, and a sequence of wide-area (2D) digital images will be captured by the image formation and detection subsystem and buffered, while the CMOS image sensing array is operated in its Video Mode of operation.
At STEP 3 in Fig. 27B, the reconstructed digital image will be analyzed for exposure quality (e.g. brightness level, saturation etc.).
At STEP 4, if the measured/calculated exposure quality values do not satisfy the exposure quality threshold (EQT) parameters, then the system recalculates new SCPs and sets the SCPR flag to TRUE, indicating that the system must be reconfigured prior to acquiring a digital image during the next wide-area image acquisition cycle while the CMOS sensing array is operated in its Video Mode. Otherwise, the SCPs are maintained by the system.
At STEP 5, the system attempts to read a 1D bar code symbol in the captured wide-area digital image.
At STEP 6, if the system is incapable of reading the bar code symbol (i.e. decoding fails), then the system returns to STEP 1 and reconfigures its SCPs if the SCPR flag is set to TRUE (i.e. indicative of unsatisfactory exposure quality in the captured image). In the case of reconfiguration, the system might reset the SCPs as follows:
(1) the shutter mode parameter — set to "Video Mode" illustrated in Fig. 27E;
(2) the electronic gain of the image sensor — set to the value calculated during STEP 4;
(3) the exposure time for blocks of image sensor pixels — set to the values determined during STEP 4;
(4) the illumination mode parameter — set to "continuous";
(5) the automatic illumination control parameter will be set to "ON";
(6) the illumination field type will be set to "wide-area field";
(7) the image capture mode parameter will be set to "wide-area image capture";
(8) the image capture control parameter will be set to "video frame";
(9) the image processing mode will be set to the default value; and
(10) the automatic object detection mode will be set to ON.
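The STEP 3/STEP 4 exposure-quality check described above can be sketched in C. This is an illustrative sketch only: the type and function names (`scp_t`, `mean_brightness`, `check_exposure`), the single-gain-step adjustment, and the threshold window are assumptions made for the example, not details from the disclosed system.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical container for the System Configuration Parameters (SCPs)
   relevant to this sketch. */
typedef struct {
    int electronic_gain;   /* image sensor electronic gain level */
    int scpr_flag;         /* TRUE (1) when SCPs must be reconfigured */
} scp_t;

/* Mean brightness of an 8-bit image buffer (0..255), a simple stand-in
   for the exposure-quality measurement of STEP 3. */
static int mean_brightness(const unsigned char *pix, size_t n)
{
    unsigned long sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += pix[i];
    return n ? (int)(sum / n) : 0;
}

/* STEP 4: compare the measured exposure against an Exposure Quality
   Threshold (EQT) window; if too bright, step the gain down and set the
   SCPR flag; if too dark, step it up; otherwise maintain current SCPs. */
static void check_exposure(scp_t *scp, int brightness, int eqt_low, int eqt_high)
{
    if (brightness > eqt_high) {          /* saturated: reduce gain */
        if (scp->electronic_gain > 0)
            scp->electronic_gain--;
        scp->scpr_flag = 1;
    } else if (brightness < eqt_low) {    /* too dark: increase gain */
        scp->electronic_gain++;
        scp->scpr_flag = 1;
    } else {
        scp->scpr_flag = 0;               /* maintain current SCPs */
    }
}
```

On each pass through Blocks B1 and B2, the system would call the check after image analysis and, when the flag is set, apply the new SCPs at STEP 1 of the next cycle.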
Then at STEPS 2-4, the system captures a second 2D image using continuous LED illumination, with the image sensing array configured in its Video Mode (illustrated in Fig. 27E), and re-evaluates the exposure quality; if the exposure quality does not satisfy the current Exposure Quality Threshold Parameters, then the system calculates new SCPs and sets the SCPR flag to TRUE. Otherwise, the system maintains the SCPs, and proceeds to attempt to decode a bar code symbol in the 2D reconstructed digital image captured using continuous LED illumination.
If at STEPS 5 and 6, bar code decoding is successful, then at STEP 7 the system transmits the results (i.e. symbol character data) to the host system, and/or at STEP 8, transmits the captured digital image to the host system for storage or processing, or to internal memory for storage, and then exits the control process at STEP 9.
If at STEPS 5 and 6 in Block B2 in Fig. 27B, bar code decoding fails, then the system returns to STEP 1 and reconfigures for wide-area illumination and image capture. If, while operating in its wide-area illumination and image capture modes of operation, the image captured by the system had an "exposure quality" which did not satisfy the Exposure Quality Threshold Parameters and indicated that the light exposure was still too bright and saturated, and the recalculated SCPs required switching to a new level of electronic gain (or illumination control) to reduce exposure brightness, then at STEP 1 the SCPs are reconfigured using the SCPs previously computed at STEP 4. Thereafter, the object is illuminated with ambient illumination and captured at STEP 2, and at STEP 3, the captured wide-area image is analyzed for exposure quality, as described above. At STEP 4, the exposure quality measured in STEP 3 is compared with the Exposure Quality Threshold Parameters, and if it does not satisfy these parameters, then new SCPs are calculated and the SCPR flag is set to TRUE. Otherwise, the system maintains the current SCPs. At STEPS 5 and 6, bar code decoding is attempted, and if it is successful, then at STEPS 7 and 8, symbol character data and image data are transmitted to the host system, and then the system exits the control process at STEP 9. If bar code decoding fails, then the system returns to STEP 1 to repeat the STEPS within Blocks B1 and B2 of Figs. 27A and 27B, provided that the automatic trigger signal (3) is still persistent (indicative that the object is still within the field of view of the digital imager). During this second pass through the control loop of Blocks B1 and B2, the system will reconfigure itself as determined by the exposure quality analysis performed at STEP 3 and the calculations performed at STEP 4.
Notably, such calculations could involve calculating new SCPs that require adjusting illumination and/or image sensing array parameters during the wide-area image capture mode, as the analysis of the facts may require, according to the adaptive control process of the present invention. Recycling of this control loop will recur as long as a bar code symbol has not been successfully read and the automatic trigger signal (3) is persistently generated by the IR-based automatic object detecting subsystem 12'.
By virtue of the intelligent automatic pass-through/presentation digital image capture and processing system of the present invention, it is now possible for operators to move objects past the imager in either a pass-through or presentation type manner, and the system will automatically adapt and reconfigure itself to optimally support the method of image-based scanning chosen by the operator.
Alternative Embodiments of Imaging-Based Bar Code Symbol Reading System Of The Present Invention
In Fig. 56A, a first alternative embodiment of a projection-type POS image-processing based bar code symbol reading system 250 is shown employing the digital image capture and processing engine 220 or 220'. As shown, system 250 includes a housing 241 which may contain the engine housing shown in Fig. 55A1, or alternatively, it may support the subassemblies and components shown in Fig. 55A1.
In Fig. 56B, a second illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system 260 is shown employing the digital image capture and processing engine 220 or 220'. As shown, system 260 includes a housing 261 which may contain the engine housing shown in Fig. 55A1, or alternatively, it may support the subassemblies and components shown in Fig. 55A1.
In Fig. 56C, a third illustrative embodiment of a projection-type POS image-processing based bar code symbol reading system 270 is shown employing the digital image capture and processing engine 220 or 220'. As shown, system 270 includes a housing portion 271 (containing engine 220 or 220'), and a base portion 272 for rotatably supporting housing portion 271. Housing portion 271 may contain the engine housing shown in Fig. 55A1, or alternatively, it may support the subassemblies and components shown in Fig. 55A1.
In each of the POS-based systems disclosed in Figs. 56A, 56B and 56C, the number of VLDs mounted on the illumination board 224 can be substantially greater than four (4), as shown in the illustrative embodiment in Fig. 55. The exact number of LEDs used in the illumination subsystem will depend on the end-user application requirements at hand. Also, the IR-Based Object Presence And Range Detection Subsystem 12 employed therein may be used to detect the range of an object within the FOV, and the LED-Based Illumination Subsystem 14 may include both long and short range wide-area LED illumination arrays, as disclosed hereinabove, for optimized illumination of long and short range regions of the FOV during image capture operations.
In Fig. 57, a price lookup unit (PLU) system 280 is shown comprising: a housing 281 with mounting bracket; an LCD panel 282; a computing platform 283 with network interfaces, etc.; and a digital image capture and processing subsystem 220 or 220' of the present invention, for identifying bar coded consumer products in retail store environments, and displaying the price thereof on the LCD panel 282.
Method Of And Apparatus For Modifying and/or Extending System Features And Functions Within A Digital Image Capture and Processing System In Accordance With Principles Of The Present Invention
Referring now to Figs. 58 through 59C2, the method of and apparatus for extending the standard system features and functions within a digital image capture and processing system of the present invention will now be described. While it is understood that any of the digital image capture and processing systems described and disclosed herein could be referred to for purposes of illustrating the novel plug-in programming methodology of the present invention, described in Figs. 58 through 59C2, reference will be made to the digital imaging based bar code reading system shown in Figs. 2A through 18 for purposes of illustration, and not limitation.
As indicated in Block A of Fig. 58, the first step involves the "system designer" of the Imaging-based Bar Code Symbol Reading System (having a multi-tier software architecture) determining which "features" of the system (implemented by Tasks called in the Application Layer), and which functions within any given feature, will be modifiable and/or extendable by end-users and/or third-party persons (other than the original designer and the manufacturer, e.g. VARs, end-users, customers et al.) without having detailed knowledge of the system's hardware platform, its communication interfaces with the outside environment, or its user interfaces. This step by the system designer establishes constraints on system modification by others, yet provides degrees of freedom on how the system can be modified to meet custom requirements of end-user applications.
As indicated in Block B of Fig. 58, based on such determinations, the system designer designs and makes the image-processing based bar code reading system of the present invention, wherein persons other than the system designer (e.g. end-users and third parties) are permitted to modify and/or extend the system features and functionalities of the original product/system specified by the system designer in Block A.
As indicated in Block C of Fig. 58, persons other than the system designer, then determine which modifiable and/or extendable system features and functions they wish to modify and/or extend to meet a particular set of end-user application requirements.
As indicated in Block D of Fig. 58, for each modifiable feature/function to be modified in the system, persons other than the system designer develop a "plug-in module" (third-party code or "software object") to implement the designed custom system feature, and thereafter they install the plug-in module (i.e. third-party code) within the suitable Library(ies) in the Application Layer of the multi-tier system.
As indicated in Block E of Fig. 58, persons other than the system designer reconfigure the functions associated with each modifiable and/or extendible feature within the system by either sending communications from a host system, or by reading function-reconfiguring bar code symbols.
Having provided a brief overview on the system feature/functionality modification methodology of the present invention, it is now in order to describe these method steps in greater detail referring to Fig. 10, and Figs. 58 through 59C2, in particular.
In the illustrative embodiment, each plug-in module, stored within the Plug-In and Configuration File Library shown in Fig. 10, consists of a set of software libraries (object modules) and configuration files. They can be downloaded to the Image-Processing Based Bar Code Symbol Reading System from an external host system, such as a Plug-in Development Platform implemented on a host PC, using various standard or proprietary communication protocols to communicate with the OS layer of the system. In the Image-Processing Based Bar Code Symbol Reading System, this operation is performed by the Metroset task or User Command Manager (see Software Block Diagram) upon reception of the appropriate command from the host system. Once the download is complete, the plug-in files are stored in the file system of the Image-Processing Based Bar Code Symbol Reading System.
The management of all plug-in modules (i.e. third-party code) is performed by the Plug-in Controller shown in Fig. 10. The Plug-in Controller can perform operations such as: load (install) plug- in module from the file system to the executable memory of the Image-Processing Based Bar Code Symbol Reading System and perform dynamic linking of the plug-in libraries with the Application; unload (uninstall) the plug-in module; provide executable address of (i.e. Place Holder for) the plug-in module (i.e. third-party code) to the Application; provide additional information about the plug-in module to the Application, such as the rules of the plug-in engagement as described in the plug-in configuration file.
Any task of the Image-Processing Based Bar Code Symbol Reading System can request information from the Plug-in Controller about a plug-in module and/or request an operation on it. For a set of predetermined features, the Application tasks can request the Plug-in Controller to check the availability of a third-party plug-in module and, if such a module is available, install it and provide its executable address as well as the rules of the plug-in engagement. The tasks can then execute it either instead of, or along with, the "standard" module that implements the particular feature. The rules of engagement of the plug-in module, i.e. the determination whether the plug-in module should be executed as a replacement for or a complement to the "standard" module, can be unique to the particular feature. The rules can also specify whether the complementary plug-in module should be executed first, prior to the "standard" module, or after. Moreover, the plug-in module, if executed first, can indicate back to the device whether the "standard" module should also be called or not, thus allowing the alteration of the device's behavior. The programming interfaces are predefined for the features that allow the plug-in functionality, thus enabling third parties to develop their own software for the device.
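The engagement logic just described (replacement vs. complementary, before vs. after, and a first-run plug-in suppressing the standard module) can be sketched in C. All names here (`feature_fn`, `run_feature`, the `ENGAGE_*` flags) are illustrative assumptions, not identifiers from the disclosed system.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical signature shared by a "standard" module and its plug-in. */
typedef int (*feature_fn)(void *ctx);

enum {
    ENGAGE_REPLACE = 0x01,   /* plug-in replaces the standard module */
    ENGAGE_AFTER   = 0x02    /* complementary plug-in runs after standard */
};

/* Counters so the tests below can observe which modules actually ran. */
static int std_calls, plg_calls;
static int std_fn(void *ctx) { (void)ctx; std_calls++; return 1; }
static int plg_fn(void *ctx) { (void)ctx; plg_calls++; return 1; }

/* Dispatch a feature per the rules of engagement: a replacement plug-in
   runs alone; a complementary one runs before or after the standard
   module; a plug-in that runs first may return 0 to indicate the
   standard module should be skipped. */
static int run_feature(feature_fn standard, feature_fn plugin,
                       int rules, void *ctx)
{
    if (!plugin)
        return standard(ctx);           /* no plug-in installed */
    if (rules & ENGAGE_REPLACE)
        return plugin(ctx);
    if (rules & ENGAGE_AFTER) {
        int r = standard(ctx);
        plugin(ctx);
        return r;
    }
    if (plugin(ctx) == 0)               /* plug-in suppressed the standard */
        return 0;
    return standard(ctx);
}
```

In the actual system this decision would be made by each Application task using the engagement rules reported by the Plug-in Controller.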
Consider, as a first and very simple example, the Image Pre-Processing Plug-in described in Fig. 59A. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's "standard" Image Pre-Processing Module (i.e. "original product code" in executable binary format), which is normally executed by the Main Task at Block D in Fig. 32, after the system acquires an image at Block C. In accordance with the principles of the present invention, the customer can provide its own image preprocessing software as a plug-in module (i.e. "third-party code") to the multi-tier software-based system. Notably, the third-party code is typically expressed in executable binary format. The plug-in can be described in an "Image Preprocessing Plug-in Configuration File", having a format, for example, as expressed below:
// Image Preprocessing Configuration File
//type param library function
IMGPREPR: libimgprepr_plugin.so.1->PluginImgprepr
IMGPREPR_PROGMD: libimgprepr_plugin.so.1->PluginImgpreprProgmd
IMGPREPR_PROGBC: libimgprepr_plugin.so.1->PluginImgpreprProgbc
The block-diagram set forth in Fig. 59A illustrates the logic of the Image Preprocessing plug-in.
Consider, as a second, more interesting example, the Image Processing and Barcode Decoding Plug-in described in Fig. 59B. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's "standard" Image Processing and Barcode Decoding Module, which is normally executed by the Main Task after the system acquires an image, as indicated in Fig. 59. In accordance with the principles of the present invention, the customer can provide its own image processing and barcode decoding software as a plug-in module to the multi-tier software-based system. The plug-in can be described in an "Image Processing and Barcode Decoding Plug-in Configuration File", having a format, for example, as expressed below:
// Decode Plug-in Configuration File
//type param library function
DECODE: 0x02: libdecode_plugin.so.1->PluginDecode
wherein "DECODE" is a keyword identifying the image processing and barcode decoding plug-in; wherein "0x02" is the value identifying the plug-in's rules of engagement; wherein "libdecode_plugin.so.1" is the name of the plug-in library in the device's file system; and wherein "PluginDecode" is the name of the plug-in function that implements the customer-specific image processing and barcode decoding functionality.
The individual bits of the value "param", which is used as the value indicating the rules of this particular plug-in's engagement, can have the following meaning:

bit 0: 0 = complement standard; 1 = replace standard
bit 1: (if bit 0 == 0) 0 = call before standard func; 1 = call after standard func
bit 2: reserved
The value "0x02", therefore, means that the customer plug-in is a complementary, not a replacement, module (bit "0" is 0), and it should be executed after the execution of the standard module (bit "1" is 1).
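The bit-combinatory "param" value can be tested with simple masks, as in the sketch below. Only the bit positions come from the table above; the macro and helper names are invented for illustration.

```c
#include <assert.h>

/* Illustrative names for the engagement bits described in the text. */
#define PLUGIN_REPLACE_STANDARD 0x01  /* bit 0: 1 = replace standard module */
#define PLUGIN_CALL_AFTER       0x02  /* bit 1: 1 = call after standard (only if bit 0 == 0) */

/* True if the plug-in replaces the standard module outright. */
static int is_replacement(int param)
{
    return (param & PLUGIN_REPLACE_STANDARD) != 0;
}

/* True if the plug-in is complementary and runs after the standard module. */
static int runs_after_standard(int param)
{
    return !is_replacement(param) && (param & PLUGIN_CALL_AFTER) != 0;
}
```

For the configuration value 0x02 above, `is_replacement` yields false and `runs_after_standard` yields true, matching the prose interpretation.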
The block-diagram set forth in Fig. 59B illustrates the logic of the Image Processing and Barcode Decoding plug-in.
Consider, as a third example, the Data Formatting Plug-in described in Fig. 59C1. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's "standard" Data Formatting Module, which is normally executed by the Main Task after the system acquires an image, as indicated in Fig. 59. In accordance with the principles of the present invention, the customer can provide its own data formatting software as a plug-in module to the multi-tier software-based system. The plug-in can be described in a "Data Formatting Plug-in Configuration File", having a format, for example, as expressed below:
// Data Formatting Plug-in Configuration File
//type param library function
PREFORMAT: libformat_plugin.so.1->PluginPreformat
FORMAT_PROGMD: libformat_plugin.so.1->PluginFormatProgmd
FORMAT_PROGBC: libformat_plugin.so.1->PluginFormatProgbc

The block-diagram set forth in Fig. 59C1 illustrates the logic of the Data Formatting Procedure plug-in.
The Plug-Ins described above provide a few examples of the many kinds of plug-ins (objects) that can be developed so that permitted features and functionalities of the system can be modified by persons other than the system designer, in accordance with the principles of the present invention. Other system features and functionalities for which Plug-in modules can be developed and installed within the Image-Processing Based Bar Code Symbol Reading System include, but are not limited to, control over functions supported and performed by the following subsystems: the IR-based Object Presence and Range Detection Subsystem 12; the Multi-Mode Area-type Image Formation and Detection (i.e. camera) Subsystem 13; the Multi-Mode LED-Based Illumination Subsystem 14; the Automatic Light Exposure Measurement and Illumination Control Subsystem 15; the Image Capturing and Buffering Subsystem 16; the Multi-Mode Image-Processing Bar Code Symbol Reading Subsystem 17; the Input/Output Subsystem 18; the manually-actuatable trigger switch 2C; the System Mode Configuration Parameter Table 70; the System Control Subsystem 18; and any other subsystems which may be integrated within the Image-Processing Based Bar Code Symbol Reading System.
Having described the structure and function of Plug-In Modules that can be created by persons other than the OEM system designer, it is now in order to describe an illustrative embodiment of the Plug-In Development Platform of the present invention with reference to Figs. 10 and 11.
In the illustrative embodiment, the system designer/OEM of the system (e.g. Metrologic Focus™1690 Image-Processing Bar Code Reader) will provide the plug-in developer with a CD that contains, for example, the following software tools:
Arm Linux Toolchain for Linux PC
This directory contains the Arm Linux cross-compiling toolchain package for IBM-compatible Linux PC.
Arm Linux Toolchain for Cygwin
This directory contains the Arm Linux cross-compiling toolchain package for IBM-compatible Windows PC. The Cygwin software must be installed prior to the usage of this cross-compiling toolchain.
Plug-in Samples
This directory contains sample plug-in development projects. The plug-in software must be compiled on the IBM-compatible Linux PC using the Arm Linux Toolchain for Linux PC or on Windows PC with installed Cygwin software using Arm Linux Toolchain for Cygwin.
FWZ Maker
This directory contains the installation package of the program FWZ Maker for Windows PC. This program is used to build the FWZ-files for downloading into the Focus 1690 scanner.
Latest Metrologic® Focus™ Software
This directory contains the FWZ-file with the latest Metrologic® Focus™ scanner software.
The first step of the plug-in software development process involves configuring the plug-in developer platform by installing the above tools on the host/developer computer system. The next step involves installing system software onto the Image-Processing Bar Code Reader, via the host plug-in developer platform, using a communications cable between the communication ports of the system and the plug-in developer computer shown in Figs. 10 and 11.
To develop plug-in software, a corresponding shared library is developed on the plug-in developer platform (i.e. the Linux PC) or in Windows Cygwin, together with the proper plug-in configuration file. The plug-in configuration file is then loaded to the "/usr" directory in the case of developing a plug-in for, for example, an image capture and receiving device, such as Metrologic's Focus™ image-processing bar code reader. In this illustrative embodiment, each line of the plug-in configuration file contains information about a plug-in function in the following format:

plug-in type: parameter: filename -> function_name
wherein plug-in type is one of the supported plug-in type keywords, followed by the field separator ":"; wherein parameter is a number (decimal, or hex if preceded with 0x) having a specific and unique meaning for some plug-in functions. The parameter is also called a "call-mode", for it can provide specific information on how the plug-in should be called. The parameter is not required and can be omitted; if specified, the parameter must be followed by the field separator ":"; wherein filename is the name of the shared library, followed by the filename separator "->". The filename can contain a full path to the library. If the path is omitted, the library is assumed to be located in either the "/usr/local/lib" or "/usr/lib/" directory in the Focus scanner. It is therefore important to make sure that the shared library is loaded to the correct directory in the Focus scanner, as specified by the plug-in configuration file; and wherein function_name is the name of the corresponding plug-in C function.
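A configuration line in the format just described could be parsed as sketched below. The parser (`parse_cfg_line`, `plugin_cfg_t`) is a hypothetical illustration with simplified error handling and arbitrary field widths, not code from the Focus system.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical parsed form of one "type: [param:] filename->function" line. */
typedef struct {
    char type[32];
    long param;        /* optional call-mode; 0 when omitted */
    char library[128];
    char function[64];
} plugin_cfg_t;

static int parse_cfg_line(const char *line, plugin_cfg_t *cfg)
{
    char buf[256];
    memset(cfg, 0, sizeof(*cfg));
    strncpy(buf, line, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';

    char *arrow = strstr(buf, "->");       /* filename separator */
    if (!arrow) return -1;
    *arrow = '\0';
    strncpy(cfg->function, arrow + 2, sizeof(cfg->function) - 1);

    char *colon = strchr(buf, ':');        /* first field separator */
    if (!colon) return -1;
    *colon = '\0';
    strncpy(cfg->type, buf, sizeof(cfg->type) - 1);

    char *rest = colon + 1;
    char *colon2 = strchr(rest, ':');      /* present only when param is given */
    if (colon2) {
        *colon2 = '\0';
        cfg->param = strtol(rest, NULL, 0); /* base 0 accepts decimal and 0x hex */
        rest = colon2 + 1;
    }
    while (*rest == ' ') rest++;            /* trim spaces around the filename */
    strncpy(cfg->library, rest, sizeof(cfg->library) - 1);
    size_t len = strlen(cfg->library);
    while (len > 0 && cfg->library[len - 1] == ' ')
        cfg->library[--len] = '\0';
    return 0;
}
```

Note the use of `strtol` with base 0, which accepts both plain decimal values and "0x"-prefixed hex values, matching the parameter convention described above.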
Notably, the configuration file can also contain single-line C-style comments.
It is within the discretion of the plug-in developer to decide which plug-in functions (of those supported by the system designer) should be included in the plug-in module (i.e. "object"). Once the shared library is built and the configuration file is prepared on the plug-in development platform (illustrated in Figs. 10 and 11), the plug-in developer can then generate the FWZ file and include the configuration file and the shared library in it using the FWZ Maker program on the Windows PC. Thereafter, the FWZ file can be downloaded to Metrologic's Focus™ image-processing bar code reader using, for example, the Flash Utility tool of Metrologic's Metroset program.
In the case of installing plug-in software for Metrologic's Focus™ image-processing bar code reader, it is recommended not to use dynamic memory allocation, but rather to use static buffers. As far as the filesystem is concerned, if it is necessary to store data in a file, then locations such as "/usr/" and "/usr/local" are recommended for storing data in non-volatile Flash memory; the "/tmp" directory can be used to store data in RAM.
Programming Barcodes and Programming Modes
In the illustrative embodiment, the configuration of the image-processing bar code reader of the present invention can be changed by scanning special programming barcodes, or by sending equivalent data to the reader from the host computer (i.e. plug-in development computer). Programming barcodes are usually Code 128 symbols with the FNC3 codeword.
When scanning a programming barcode, the reader may or may not be in its so-called programming mode. When the reader is not in its programming mode, the effect of the programming barcode is supposed to be immediate. On the other hand, when the reader is in its programming mode, the effect of all the programming barcodes read during the programming mode should occur at the time when the reader exits the programming mode.
There is a special set of programming barcodes reserved for plug-in software configuration purposes. These barcodes have at least 4 data characters, and the first three data characters are "990". It is recommended (but not required) that the Decode Plug-in use programming barcodes 6 characters long, starting with "9900xx". It is recommended (but not required) that the Image Preprocessing Plug-in use programming barcodes 6 characters long, starting with "9901xx". It is recommended (but not required) that the Formatting Plug-in use programming barcodes 6 characters long, starting with "9902xx".
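The prefix conventions above might be applied as in the following sketch; the classifier function and its return codes are hypothetical illustrations of the recommended "990xxx" routing, not part of the disclosed firmware.

```c
#include <string.h>

/* Illustrative routing targets for plug-in programming barcodes. */
enum { PB_NONE = 0, PB_DECODE, PB_IMGPREPR, PB_FORMAT, PB_OTHER };

/* Classify a decoded programming barcode by the reserved "990" prefix:
   "9900xx" -> Decode Plug-in, "9901xx" -> Image Preprocessing Plug-in,
   "9902xx" -> Formatting Plug-in; any other "990..." code is reserved. */
static int classify_plugin_barcode(const char *data)
{
    size_t n = strlen(data);
    if (n < 4 || strncmp(data, "990", 3) != 0)
        return PB_NONE;                    /* not a plug-in programming barcode */
    if (n == 6 && strncmp(data, "9900", 4) == 0) return PB_DECODE;
    if (n == 6 && strncmp(data, "9901", 4) == 0) return PB_IMGPREPR;
    if (n == 6 && strncmp(data, "9902", 4) == 0) return PB_FORMAT;
    return PB_OTHER;                       /* reserved "990..." barcode */
}
```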
Once a plug-in module has been developed in accordance with the principles of the present invention, the plug-in can be uninstalled by simply downloading an empty plug-in configuration file. For example, to uninstall a Decode plug-in, download an empty "decode.plugin" file into the "/usr" directory of the File system within the OS layer, shown in Fig. 10.
Details about the Decode Plug-in of the Illustrative Embodiment
The purpose of the Decode Plug-in is to provide replacement or complementary barcode decoding software to the standard Focus barcode decoding. The Decode Plug-in can have the following plug-in functions:
DECODE; DECODE_ENABLE2D; DECODE_PROGMD; DECODE_PROGBC.
DECODE Plug-in Function
This function is called to perform a barcode decoding from the given image in memory. Image is represented in memory as a two-dimensional array of 8-bit pixels. The first pixel of the array represents the upper-left corner of the image.
Function prototype:

int                                  /* Return: number of decoded barcodes; negative if error */
(*PLUGIN_DECODE)(
    void *p_image,                   /* Input: pointer to the image */
    int size_x,                      /* Input: number of columns */
    int size_y,                      /* Input: number of rows */
    int pitch,                       /* Input: row size, in bytes */
    DECODE_RESULT *p_decode_results, /* Output: decode results */
    int max_decodes,                 /* Input: maximum decode results allowed */
    int *p_cancel_flag);             /* Input: if not NULL, pointer to the cancel flag */

Note that p_decode_results points to the location in memory where the Decode plug-in function should store one or more results of barcode decoding (if, of course, the plug-in successfully decodes one or more barcodes in the given image) in the form of an array of DECODE_RESULT structures. The maximum number of allowed decode results (i.e. the size of the array) is given in max_decodes. The plug-in must return the number of successfully decoded barcodes (i.e. the number of populated elements in the array p_decode_results), or a negative number in case of an error.
If p_cancel_flag is not NULL, it points to the integer flag (called "Cancel flag") that indicates whether the decoding process should continue or should stop as soon as possible. If the flag is 0, the decoding process can continue. If the flag is not zero, the decoding process must stop as soon as possible. The reason for aborting the decoding process could be, for example, a time out. It is recommended to check the Cancel flag often enough so that the latency on aborting the decoding process would be as short as possible.
Note that the Cancel flag is not the only way the Decoding plug-in (or any plug-in for that matter) can be aborted. Depending on the circumstances, the system can decide to abruptly kill the thread in which the Decoding plug-in is running, at any time.
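A minimal plug-in conforming to the PLUGIN_DECODE calling contract might look like the following stub: it addresses pixel rows through the pitch, polls the Cancel flag on each row, and follows the result-indexing rules, but omits any actual symbol decoding. The trimmed DECODE_RESULT stand-in is a simplification for this sketch and keeps only the members used here.

```c
#include <stddef.h>

/* Simplified stand-in for DECODE_RESULT, for illustration only. */
typedef struct {
    int decode_result_index;
    int num_decode_results;
    unsigned char Data[4096];
    int Length;
} DECODE_RESULT;

/* Skeleton of a customer-supplied decode plug-in: walks the image row by
   row (first pixel is the upper-left corner; rows are `pitch` bytes
   apart), checks the Cancel flag so it can abort promptly, and in this
   stub reports zero decoded barcodes. */
static int PluginDecode(void *p_image, int size_x, int size_y, int pitch,
                        DECODE_RESULT *p_decode_results, int max_decodes,
                        int *p_cancel_flag)
{
    const unsigned char *pix = (const unsigned char *)p_image;
    int found = 0;

    for (int row = 0; row < size_y; row++) {
        if (p_cancel_flag && *p_cancel_flag)
            return found;                       /* abort as soon as possible */
        const unsigned char *line = pix + (size_t)row * pitch;
        (void)line; (void)size_x;               /* real decoding would scan `line` */
        /* ... locate and decode symbols, appending to p_decode_results ... */
    }

    /* Each populated result carries its 0-based index and the 0-based total. */
    for (int i = 0; i < found && i < max_decodes; i++) {
        p_decode_results[i].decode_result_index = i;
        p_decode_results[i].num_decode_results = found - 1;
    }
    return found;   /* number of decoded barcodes; negative on error */
}
```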
Structure DECODE_RESULT
The structure DECODE_RESULT has the following format:
#define MAX_DECODED_DATA_LEN 4096
#define MAX_SUPPL_DATA_LEN 128

typedef struct {
    int x;
    int y;
} BC_POINT;

typedef struct {
    BC_POINT BCPts[4];   /* Coordinates of the 4 corners of the barcode */
} BC_BOUNDS;
The order of the array elements (i.e. corners) in the BC_BOUNDS structure is as follows:

0 - top left
1 - top right
2 - bottom right
3 - bottom left
typedef struct {
    int decode_result_index;   /* index of the decode result, starting from 0 */
    int num_decode_results;    /* total number of decode results minus 1 (i.e. 0-based) */
    char SymId[32];            /* the symbology identifier characters */
    int Symbology;             /* the decoded barcode's symbology identifier number */
    int Modifier;              /* additional information of the decoded barcode */
    int DecId;                 /* reserved */
    int Class;                 /* 1 for 1D, 2 for 2D */
    unsigned char Data[MAX_DECODED_DATA_LEN];   /* decoded data - may contain null chars */
    int Length;                /* number of characters in the decoded barcode */
    unsigned char SupplData[MAX_SUPPL_DATA_LEN]; /* supplemental code's data */
    int SupplLength;           /* number of characters in the supplemental code's data */
    unsigned char LinkedData[MAX_DECODED_DATA_LEN];
    int LinkedLength;
    BC_BOUNDS C_Bounds;        /* Bounds for the primary barcode */
    BC_BOUNDS S_Bounds;        /* Bounds for the supplemental barcode */
} DECODE_RESULT;
The first two members of each populated DECODE_RESULT structure must contain a zero-based index of the decode result in the array (i.e. the first decode result must have decode_result_index = 0, the second must have decode_result_index = 1, and so on) and the zero-based total number of successfully decoded barcodes (which should equal the returned value minus 1).
The SymId member of the DECODE_RESULT structure can have a string of up to 31 null-terminated characters describing the barcode symbology. It is used for informational purposes only. The following values are recommended for some known barcode symbologies.
"AZTEC"      Aztec
"CBR"        Codabar
"CBK_A"      Codablock A
"CBK_F"      Codablock F
"C11"        Code 11
"C128"       Code 128
"C39"        Code 39
"C93"        Code 93
"DM"         Datamatrix
"S2O5"       Straight 2 of 5
"I2O5"       Interleaved 2 of 5
"MC"         MaxiCode
"PDF"        Code PDF
"QR"         Code QR
"RSS-E"      Code RSS-E
"RSS-EST"    Code RSS-EST
"RSS14-LIM"  Code RSS Limited
"RSS14"      Code RSS-14
"RSS14-ST"   Code RSS-ST
"UPC"        Code UPC/EAN
The Symbology member of the DECODE_RESULT structure must contain the id of the decoded barcode symbology. The following symbology ids must be used for the known barcode symbologies:
MBCD_SYM_C128       Code 128
MBCD_SYM_C39        Code 39
MBCD_SYM_ITF        Interleaved 2 of 5
MBCD_SYM_C93        Code 93
MBCD_SYM_CBR        Codabar
MBCD_SYM_UPC        Code UPC/EAN
MBCD_SYM_TPEN       Telepen
MBCD_SYM_RSS14      Code RSS-14
MBCD_SYM_RSSE       Code RSS-E
MBCD_SYM_RSSL       Code RSS Limited
MBCD_SYM_MTF        Matrix 2 of 5
MBCD_SYM_ATF        Airline 2 of 5
MBCD_SYM_STF        Straight 2 of 5
MBCD_SYM_MPLY       MSI Plessey
MBCD_SYM_C11        Code 11
MBCD_SYM_PDF        Code PDF
MBCD_SYM_PN         Postnet
MBCD_SYM_DM         Datamatrix
MBCD_SYM_MC         MaxiCode
MBCD_SYM_QR         Code QR
MBCD_SYM_AZ         Aztec
MBCD_SYM_MICROPDF   MicroPDF
MBCD_SYM_CBLA       Codablock A
MBCD_SYM_CBLF       Codablock F
MBCD_SYM_UNKNOWN    User-defined symbology
The Modifier member of the DECODE_RESULT structure contains additional information about the decoded barcode. The values of the Modifier are usually bit-combinatory. They are unique for different symbologies, and many symbologies don't use it at all. If the Modifier is not used, it should be set to 0. For some symbologies that support the Modifier, the possible values are presented below.
Coupon Modifier
MBCD_MODIFIER_COUP      Coupon code

UPC Modifier Bit Flag Constants
MBCD_MODIFIER_UPCA      UPC-A
MBCD_MODIFIER_UPCE      UPC-E
MBCD_MODIFIER_EAN8      EAN-8
MBCD_MODIFIER_EAN13     EAN-13
MBCD_MODIFIER_SUPP2     2-digit supplement
MBCD_MODIFIER_SUPP5     5-digit supplement

Code 128 Modifier Bit Flag Constants
MBCD_MODIFIER_C128A     Code 128 with A start character
MBCD_MODIFIER_C128B     Code 128 with B start character
MBCD_MODIFIER_C128C     Code 128 with C start character, but not an EAN 128
MBCD_MODIFIER_EAN128    EAN-128
MBCD_MODIFIER_PROG      Programming label (overrides all other considerations)
MBCD_MODIFIER_AIM_AI    Code 128 with AIM Application Indicator
Code 39 Modifier Bit Flag Constants
MBCD_MODIFIER_ITPHARM Italian Pharmaceutical
Codabar Modifier Bit Flag Constants
MBCD_MODIFIER_CBR_DF Double-Field Codabar
POSTNET Modifier Bit Flag Constants
MBCD_MODIFIER_PN POSTNET
MBCD_MODIFIER_JAP Japan Post
MBCD_MODIFIER_AUS Australia Post
MBCD_MODIFIER_PLANET PLANET
MBCD_MODIFIER_RM Royal Mail
MBCD_MODIFIER_KIX KIX Code
MBCD_MODIFIER_UPU57 UPU (57-bar)
MBCD_MODIFIER_UPU75 UPU (75-bar)
Datamatrix Modifier Bit Flag Constants
MBCD_MODIFIER_ECC140 ECC 000-140
MBCD_MODIFIER_ECC200 ECC 200
MBCD_MODIFIER_FNC15 ECC 200, FNC1 in first or fifth position
MBCD_MODIFIER_FNC26 ECC 200, FNC1 in second or sixth position
MBCD_MODIFIER_ECI ECC 200, ECI protocol implemented
MBCD_MODIFIER_FNC15_ECI ECC 200, FNC1 in first or fifth position, ECI protocol
MBCD_MODIFIER_FNC26_ECI ECC 200, FNC1 in second or sixth position, ECI protocol
MBCD_MODIFIER_RP Reader Programming Code
MaxiCode Modifier Bit Flag Constants
MBCD_MODIFIER_MZ Symbol in Mode 0
MBCD_MODIFIER_M45 Symbol in Mode 4 or 5
MBCD_MODIFIER_M23 Symbol in Mode 2 or 3
MBCD_MODIFIER_M45_ECI Symbol in Mode 4 or 5, ECI protocol
MBCD_MODIFIER_M23_ECI Symbol in Mode 2 or 3, ECI protocol
The Decld member of the DECODE_RESULT structure is currently not used and should be set to 0.
The Class member of the DECODE_RESULT structure must be set either to 1 or 2. If the decoded barcode is a regular linear barcode, such as UPC, Code 39, RSS, etc., the Class should be set to 1. If the decoded barcode is a 2D symbology, such as Code PDF, Datamatrix, Aztec, MaxiCode, etc., the Class should be set to 2.
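The Class rule above (1 for regular linear symbologies, 2 for 2D symbologies) can be captured in a small helper. This is a sketch: the MBCD_SYM_* enum values shown are placeholders, and symbology_class is a hypothetical name, not part of the SDK.

```c
/* Placeholder ids for illustration; real values come from the SDK header. */
enum {
    MBCD_SYM_C39 = 1, MBCD_SYM_UPC, MBCD_SYM_RSS14,
    MBCD_SYM_PDF, MBCD_SYM_DM, MBCD_SYM_MC,
    MBCD_SYM_QR, MBCD_SYM_AZ, MBCD_SYM_MICROPDF
};

/* Return the Class value required by DECODE_RESULT:
 * 2 for 2D symbologies, 1 for everything else (linear). */
static int symbology_class(int symbology)
{
    switch (symbology) {
    case MBCD_SYM_PDF:
    case MBCD_SYM_DM:
    case MBCD_SYM_MC:
    case MBCD_SYM_QR:
    case MBCD_SYM_AZ:
    case MBCD_SYM_MICROPDF:
        return 2;               /* 2D symbology */
    default:
        return 1;               /* regular linear symbology */
    }
}
```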
The Data member of the DECODE_RESULT structure contains the decoded data. It can contain up to MAX_DECODED_DATA_LEN bytes of data.
The Length member of the DECODE_RESULT structure specifies how many bytes of decoded data are stored in Data.
The SupplData member of the DECODE_RESULT structure contains the data decoded in a supplemental part of the barcode, such as a coupon. It can contain up to MAX_DECODED_DATA_LEN bytes of data.
The SupplLength member of the DECODE_RESULT structure specifies how many bytes of decoded data are stored in SupplData.
The LinkedData member of the DECODE_RESULT structure contains the data decoded in a secondary part of the composite barcode, such as RSS/PDF composite. It can contain up to MAX_DECODED_DATA_LEN bytes of data.
The LinkedLength member of the DECODE_RESULT structure specifies how many bytes of decoded data are stored in LinkedData.
The C_Bounds and S_Bounds members of the DECODE_RESULT structure are currently not used.
DECODE Plug-in Call-Mode
The DECODE plug-in can have the following call-mode values: bit value
0 <-- 0 = complement standard; 1 = replace standard
1 <-- (if bit0==0) 0 = call before standard function; 1 = call after standard function
The default call-mode value is 0, meaning that by default, the DECODE plug-in is considered a complementary module to the standard Focus barcode decoding software and is executed before the standard function. In this case, the standard function will be called only if the result returned from the DECODE plug-in is not negative and less than max_decodes.
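A minimal sketch of how firmware might interpret these two call-mode bits; the macro and function names here are illustrative, not SDK names:

```c
/* Bit 0: does the plug-in replace the standard decoder entirely? */
#define CALLMODE_REPLACE_STANDARD 0x01
/* Bit 1: if bit 0 is clear, call the plug-in after the standard function. */
#define CALLMODE_CALL_AFTER       0x02

static int replaces_standard(int call_mode)
{
    return call_mode & CALLMODE_REPLACE_STANDARD;
}

/* Bit 1 is only meaningful when bit 0 is clear (complement mode). */
static int called_after_standard(int call_mode)
{
    return !replaces_standard(call_mode) &&
           (call_mode & CALLMODE_CALL_AFTER);
}
```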
DECODE ENABLE2D Plug-in Function
This function is called to notify the plug-in that the scanner enters a mode of operation in which decoding of 2D symbologies (such as PDF417, Datamatrix, etc.) should be either allowed or disallowed. By default, the decoding of 2D symbologies is allowed.
Function prototype: void
(*PLUGIN_ENABLE2D)(int enable); /* Input: 0 = disable; 1 = enable */
For example, when the Focus scanner is configured to work in linear mode (as opposed to omnidirectional mode), the decoding of 2D symbologies is disallowed.
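A plug-in's ENABLE2D handler typically just records the permission in a flag that its DECODE function consults later. A sketch, with the hypothetical name my_enable2d matching the (*PLUGIN_ENABLE2D) prototype:

```c
/* 2D decoding is allowed by default, per the SDK contract. */
static int g_2d_enabled = 1;

/* Matches the (*PLUGIN_ENABLE2D)(int enable) prototype:
 * 0 = disable 2D symbologies, nonzero = enable them. */
static void my_enable2d(int enable)
{
    g_2d_enabled = (enable != 0);
}
```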
DECODE PROGMD Plug-in Function
This function is called to notify the plug-in that the scanner enters a programming mode. Function prototype: void
(*PLUGIN_PROGMD)(int progmd); /* Input: 1 = enter; 0 = normal exit; (-1) = abort */
DECODE PROGBC Plug-in Function
This function is called to notify the plug-in that the scanner just scanned a programming barcode, which can be used by the plug-in for its configuration purposes.
Function prototype: int /* Return: 1 if successful; 0 if barcode is invalid; negative if error */
(*PLUGIN_PROGBC)(unsigned char *bufferptr, int data_len);
Details About The Image Preprocessing Plug-In of the Illustrative Embodiment of the Present Invention
The purpose of the Image Preprocessing Plug-in is to allow the plug-in to perform some special image processing right after the image acquisition and prior to the barcode decoding. The Image Preprocessing Plug-in can have the following plug-in functions:
IMGPREPR; IMGPREPR_PROGMD; IMGPREPR_PROGBC.
IMGPREPR Plug-in Function
This function is called to perform an image preprocessing. The image is represented in memory as a two-dimensional array of 8-bit pixels. The first pixel of the array represents the upper-left corner of the image.
Function prototype: int /* Return: 1 if preprocessing is done; 0 if not; neg. if error */
(*PLUGIN_IMGPREPR)(
void *p_image, /* Input: pointer to the image */
int size_x, /* Input: number of columns */
int size_y, /* Input: number of rows */
int pitch, /* Input: row size, in bytes */
void **pp_new_image, /* Output: pointer to the new image */
int *p_new_size_x, /* Output: new number of columns */
int *p_new_size_y, /* Output: new number of rows */
int *p_new_pitch); /* Output: new row size, in bytes */
If the IMGPREPR plug-in function is successful, it should return 1 and store the address of the new image in the location in memory pointed to by pp_new_image. The new image dimensions should be stored in the locations pointed to by p_new_size_x, p_new_size_y, and p_new_pitch.
If the preprocessing is not performed for whatever reason, the IMGPREPR plug-in function must return 0.
A negative return value indicates an error.
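As a concrete sketch, the following IMGPREPR handler inverts an 8-bit grayscale image into a newly allocated, tightly packed buffer and reports the new geometry through the output parameters. The name my_imgprepr is hypothetical, and how the scanner later reclaims the new buffer is assumed to be defined by the plug-in SDK.

```c
#include <stdlib.h>

/* Matches the (*PLUGIN_IMGPREPR) prototype. Inverts each 8-bit pixel
 * into a new buffer; returns 1 on success, negative on error. */
static int my_imgprepr(void *p_image, int size_x, int size_y, int pitch,
                       void **pp_new_image, int *p_new_size_x,
                       int *p_new_size_y, int *p_new_pitch)
{
    unsigned char *src = (unsigned char *)p_image;
    unsigned char *dst = malloc((size_t)size_x * (size_t)size_y);
    if (dst == NULL)
        return -1;                    /* negative value signals an error */

    for (int y = 0; y < size_y; y++)
        for (int x = 0; x < size_x; x++)
            dst[y * size_x + x] = (unsigned char)(255 - src[y * pitch + x]);

    *pp_new_image = dst;
    *p_new_size_x = size_x;           /* dimensions are unchanged */
    *p_new_size_y = size_y;
    *p_new_pitch  = size_x;           /* new image is tightly packed */
    return 1;                         /* preprocessing was performed */
}
```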
IMGPREPR PROGMD Plug-in Function
This function is called to notify the plug-in that the scanner enters a programming mode. Function prototype: void
(*PLUGIN_PROGMD)(int progmd); /* Input: 1 = enter; 0 = normal exit; (-1) = abort */
IMGPREPR PROGBC Plug-in Function
This function is called to notify the plug-in that the scanner just scanned a programming barcode, which can be used by the plug-in for its configuration purposes. Function prototype: int /* Return: 1 if successful; 0 if barcode is invalid; negative if error */
(*PLUGIN_PROGBC)(unsigned char *bufferptr, int data_len);
Details about Formatting Plug-in of the Illustrative Embodiment
The purpose of the Formatting Plug-in is to provide replacement or complementary software for the standard Focus data formatting software. The Formatting Plug-in configuration file must have the name "format.plugin" and be loaded in the "/usr" directory in the Focus scanner.
The Formatting Plug-in can currently have the following plug-in functions: PREFORMAT; FORMAT_PROGMD; FORMAT_PROGBC.
PREFORMAT Plug-in Function
This function is called to perform a necessary transformation of the decoded barcode data prior to the data being actually formatted and sent out.
Function prototype: int /* Return: 1 if preformat is done; 0 if not; neg. if error */
(*PLUGIN_PREFORMAT)(
DECODE_RESULT *decode_results, /* Input: decode results */
DECODE_RESULT *new_decode_results); /* Output: preformatted decode results */
If the PREFORMAT plug-in function is successful, it should return 1 and store the new decode result in the location in memory pointed to by new_decode_results.
If the preformatting is not performed for whatever reason, the PREFORMAT plug-in function must return 0.
A negative return value indicates an error.
For the details about the DECODE_RESULT structure, please refer to the section DECODE Plug-in Function.
FORMAT PROGMD Plug-in Function
This function is called to notify the plug-in that the scanner enters a programming mode.
Function prototype: void
(*PLUGI"N_PROGMD)(int progmd); /* Input: 1 = enter; 0 = normal exit; (-1 ) = abort */
FORMAT PROGBC Plug-in Function
This function is called to notify the plug-in that the scanner just scanned a programming barcode, which can be used by the plug-in for its configuration purposes.
Function prototype: int /* Return: 1 if successful; 0 if barcode is invalid; negative if error */
(*PLUGIN_PROGBC)(unsigned char *bufferptr, int data_len);
The method of system feature/functionality modification described above can be practiced in diverse application environments which are not limited to image-processing based bar code symbol reading systems described hereinabove. In general, any image capture and processing system or device that supports an application software layer and at least an image capture mechanism and an image processing mechanism would be suitable for the practice of the present invention. Thus, image-capturing cell phones, digital cameras, video cameras, and portable or mobile computing terminals and portable data terminals (PDTs) are all suitable systems in which the present invention can be practiced.
Also, it is understood that the application layer of the image-processing bar code symbol reading system of the present invention, illustrated in Fig. 10, with the above-described facilities for modifying system features and functionalities using the plug-in development techniques described above, can be ported over to execute on conventional mobile computing devices, PDAs, pocket personal computers (PCs), and other portable devices supporting image capture and processing functions, and being provided with suitable user and communication interfaces.
The Image Capture and Processing System of the present invention described above can be implemented on various hardware computing platforms such as Palm®, PocketPC®, MobilePC®, JVM®, etc. equipped with CMOS sensors, trigger switches etc. In such illustrative embodiments, the 3-tier system software architecture of the present invention can be readily modified by replacing the low-tier Linux OS (described herein) with any operating system (OS), such as Palm, PocketPC, Apple OSX, etc. Furthermore, provided that the mid-tier SCORE subsystem described hereinabove supports a specific hardware platform equipped with an image sensor, trigger switch of one form or another etc., and that the same (or similar) top-tier "Bar Code Symbol Reading System" Application is compiled for that platform, any universal (mobile) computing device can be transformed into an Image Acquisition and Processing System having the bar code symbol reading functionalities of the system shown in Figs. 2A through 18, and described in detail hereinabove. In such alternative embodiments of the present invention, third-party customers can be permitted to write their own software plug-ins to enhance or modify the behavior of the Image Acquisition and Processing Device, realized on the universal mobile computing platform, without any required knowledge of the underlying hardware platform, communication protocols and/or user interfaces.
Some Modifications Which Readily Come To Mind
In alternative embodiments of the present invention, illumination arrays 27, 28 and 29 employed within the Multi-Mode Illumination Subsystem 14 may be realized using solid-state light sources other than LEDs, such as, for example, visible laser diodes (VLDs) taught in great detail in WIPO Publication No. WO 02/43195 A2, published on May 30, 2002, assigned to Metrologic Instruments, Inc., and incorporated herein by reference in its entirety as if set forth fully herein. However, when using VLD-based illumination techniques in the imaging-based bar code symbol reader of the present invention, great care must be taken to eliminate or otherwise substantially reduce speckle-noise generated at the image detection array 22 when using a coherent illumination source during object illumination and imaging operations. WIPO Publication No. WO 02/43195 A2, supra, provides diverse methods of and apparatus for eliminating or substantially reducing speckle-noise during image formation and detection when using VLD-based illumination arrays.
While CMOS image sensing array technology was described as being used in the preferred embodiments of the present invention, it is understood that in alternative embodiments, CCD-type image sensing array technology, as well as other kinds of image detection technology, can be used.
The bar code reader design described in great detail hereinabove can be readily adapted for use as an industrial or commercial fixed-position bar code reader/imager, having the interfaces commonly used in the industrial world, such as Ethernet TCP/IP for instance. By providing the system with an Ethernet TCP/IP port, a number of useful features will be enabled, such as, for example: multi-user access to such bar code reading systems over the Internet; control of multiple bar code reading systems on the network from a single user application; efficient use of such bar code reading systems in live video operations; web-servicing of such bar code reading systems, i.e. controlling the system or a network of systems from an Internet Browser; and the like.
While the illustrative embodiments of the present invention have been described in connection with various types of bar code symbol reading applications involving 1-D and 2-D bar code structures, it is understood that the present invention can be used to read (i.e. recognize) any machine-readable indicia, dataform, or graphically-encoded form of intelligence, including, but not limited to, bar code symbol structures, alphanumeric character recognition strings, handwriting, and diverse dataforms currently known in the art or to be developed in the future. Hereinafter, the term "code symbol" shall be deemed to include all such information carrying structures and other forms of graphically-encoded intelligence.
Also, imaging-based bar code symbol readers of the present invention can also be used to capture and process various kinds of graphical images including photos and marks printed on driver licenses, permits, credit cards, debit cards, or the like, in diverse user applications.
It is understood that the image capture and processing technology employed in bar code symbol reading systems of the illustrative embodiments may be modified in a variety of ways which will become readily apparent to those skilled in the art having the benefit of the novel teachings disclosed herein. All such modifications and variations of the illustrative embodiments thereof shall be deemed to be within the scope and spirit of the present invention as defined by the Claims to Invention appended hereto.

Claims

CLAIMS TO INVENTION:
1. A digital image capture and processing system having a set of standard features and functions, and a set of custom features and functionalities that satisfy customized end-user application requirements, said digital image capture and processing system comprising: a digital camera subsystem for projecting a field of view (FOV) upon an object to be imaged in said FOV, and detecting imaged light reflected off the object during illumination operations in an image capture mode in which one or more digital images of the object are formed and detected by said digital camera subsystem; a digital image processing subsystem for processing said one or more digital images and producing raw or processed data, or recognizing or acquiring information graphically represented therein, and producing output data representative of said recognized information; an input/output subsystem for transmitting said output data to an external host system or other information receiving or responding device; a system control system for controlling and/or coordinating the operation of said subsystems above; and a computing platform for supporting the implementation of one or more of said subsystems above, and the features and functions of said digital image capture and processing system; said computing platform including (i) memory for storing pieces of original product code written by the original designers of said digital image capture and processing system, and (ii) a microprocessor for running one or more applications by calling and executing pieces of said original product code in a particular sequence, so as to support a set of standard features and functions which characterize a standard behavior of said digital image capture and processing system; wherein said one or more pieces of original product code have a set of place holders into which third-party product code can be inserted or plugged by third parties, including value-added resellers (VARs), original equipment manufacturers
(OEMs), and also end-users of said digital image capture and processing system; and wherein one or more pieces of third-party code that have been plugged into said set of place holders, operate to extend the features and functions of said digital image capture and processing system, and modify the standard behavior of said digital image capture and processing system into a custom behavior for said digital image capture and processing system.
2. The digital image capture and processing system of claim 1, which further comprises a housing having a light transmission window, wherein said FOV is projected through said light transmission window and upon an object to be imaged in said FOV.
3. The digital image capture and processing system of claim 2, wherein said housing contains said subsystems.
4. The digital image capture and processing system of claim 3, wherein when said pieces of third-party code are plugged into said place holders, the features and functions of said digital image capture and processing system are modified and/or extended, and the standard behavior of said digital image capture and processing system is modified into a custom behavior for said digital image capture and processing system.
5. The digital image capture and processing system of claim 4, wherein said one or more pieces of original product code and said third-party product code are maintained in one or more libraries.
6. The digital image capture and processing system of claim 5, wherein said memory comprises a memory architecture having different kinds of memory, each having a different access speed and performance characteristics.
7. The digital image capture and processing system of claim 1, wherein an end-user, such as a value-added reseller (VAR) or original equipment manufacturer (OEM), can write said one or more pieces of third-party code according to specifications set by said original system designers, and said one or more pieces of custom code can be plugged into said place holders, so as to extend the features and functions of said digital image capture and processing system, and modify the standard behavior of said digital image capture and processing system into said custom behavior for said digital image capture and processing system, without permanently modifying the standard features and functions of said digital image capture and processing system.
8. The digital image capture and processing system of claim 1, which is integrated or embodied into a third-party product.
9. The digital image capture and processing system of claim 8, wherein said third-party product is selected from the group consisting of image-processing based bar code symbol reading systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 1D or 2D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, and personal identification systems.
10. The digital image capture and processing system of claim 8, which is selected from the group consisting of image-processing based bar code symbol reading systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 1D or 2D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, and personal identification systems.
11. The digital image capture and processing system of claim 1, which has the form factor of a digital imaging engine module that can be integrated into a third-party product selected from the group consisting of image-processing based bar code symbol reading systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 1D or 2D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, and personal identification systems.
12. The digital image capture and processing system of claim 11, which further comprises at least one printed circuit board that can be installed within a housing of a third-party product with said digital imaging engine module, interfaced with said digital imaging engine module and with at least one component within the housing of said third-party product.
13. The digital image capture and processing system of claim 1, wherein said original product code and said third-party code each comprises executable binary code.
14. The digital image capture and processing system of claim 1, wherein said digital camera subsystem comprises: a digital image formation and detection subsystem having (i) image formation optics for projecting said FOV through a light transmission window and upon said object to be imaged in said FOV, and (ii) an image sensing array for detecting imaged light reflected off the object during illumination operations in an image capture mode in which sensor elements in said image sensing array are enabled so as to detect one or more digital images of the object formed on said image sensing array; an illumination subsystem having an illumination array for producing and projecting a field of illumination through said light transmission window and within said FOV during the image capture mode; and an image capturing and buffering subsystem for capturing and buffering said one or more digital images detected by said image formation and detection subsystem.
15. The digital image capture and processing system of claim 14, wherein said image sensing array comprises an image sensing structure selected from the group consisting of an area-type image sensing array, and a linear-type image sensing array.
16. The digital image capture and processing system of claim 1, wherein said memory maintains system parameters used to configure said functions of said digital image capture and processing system.
17. The digital image capture and processing system of claim 1, wherein said computing platform implements said digital image processing subsystem, said input/output subsystem and said system control subsystem.
18. The digital image capture and processing system of claim 1, wherein said memory comprises a memory architecture that supports a three-tier modular software architecture characterized by an Operating System (OS) layer, a System CORE (SCORE) layer, and an Application layer, and responsive to the generation of a triggering event within said digital image capture and processing system.
19. The digital image capture and processing system of claim 18, wherein said OS layer includes one or more software modules selected from the group consisting of an OS kernel module, an OS file system module, and device driver modules; wherein said SCORE layer includes one or more software modules selected from the group consisting of a tasks manager module, an events dispatcher module, an input/output manager module, a user commands manager module, a timer subsystem module, an input/output subsystem module and a memory control subsystem module; wherein said application layer includes one or more software modules selected from the group consisting of a code symbol decoding module, a function programming module, an application events manager module, a user commands table module, and a command handler module.
20. The digital image capture and processing system of claim 19, wherein, prior to capturing one or more digital images of the object, said microprocessor rapidly initializes said micro-computing platform by performing the following operations:
(1) accessing one or more software modules from said OS layer and executing code contained therein;
(2) accessing one or more software modules from said SCORE layer and executing code contained therein; and
(3) accessing one or more software modules from said Application layer and executing code contained therein.
21. The digital image capture and processing system of claim 1, wherein said field of illumination comprises narrow-band illumination produced from an array of light emitting diodes (LEDs).
22. The digital image capture and processing system of claim 10, wherein said digital image processing subsystem processes said one or more digital images so as to read one or more code symbols graphically represented therein, and produces output data in the form of symbol character data representative of said read one or more code symbols.
23. The digital image capture and processing system of claim 22, wherein each said code symbol is a bar code symbol selected from the group consisting of a 1D bar code symbol, a 2D bar code symbol, and a data matrix type code symbol structure.
24. The digital image capture and processing system of claim 1, wherein said digital camera subsystem further comprises an illumination control subsystem for controlling the operation of said illumination subsystem during said image capture mode.
25. The digital image capture and processing system of claim 15, wherein said computing platform implements said digital image processing subsystem, said input/output subsystem and said system control subsystem, and wherein said application layer includes said one or more libraries and said one or more libraries include one or more software modules selected from the group consisting of a code symbol decoding module, a function programming module, an application events manager module, a user commands table module, and a command handler module.
26. The digital image capture and processing system of claim 1, which further comprises an automatic object detection subsystem for automatically detecting the presence of the object in said FOV, and in response thereto, generating a trigger signal indicative of a triggering event.
27. The digital image capture and processing system of claim 1, which further comprises a trigger manually actuatable by an operator of said digital image capturing and processing system so as to generate a trigger signal indicating a triggering event.
28. The digital image capture and processing system of claim 1, which further comprises an automatic object detection subsystem disposed in said housing, for automatically detecting the presence of the object in said FOV, and in response thereto, generating a trigger signal indicating a triggering event.
29. The digital image capture and processing system of claim 1, wherein said housing is a hand-supportable housing.
30. The digital image capture and processing system of claim 1, wherein said digital image processing subsystem, said input/output subsystem and said system control system are implemented using said computing platform.
31. The digital image capture and processing system of claim 30, wherein said digital camera subsystem is implemented as an electro-optical module.
32. The digital image capture and processing system of claim 31, wherein said computing platform is implemented on a printed circuit (PC) board, and wherein said electro-optical module and said PC board are interfaced and contained in a housing having a light transmission window, through which said FOV is projected.
33. The digital image capture and processing system of claim 1, wherein said digital image processing subsystem, said input/output subsystem and said system control system are implemented using said computing platform, and wherein said digital camera subsystem is implemented on a camera board.
34. The digital image capture and processing system of claim 33, wherein said printed circuit (PC) board and said camera board are electrically interfaced.
35. The digital image capture and processing system of claim 34, wherein said printed circuit (PC) board and said camera board are mounted within a housing having a light transmission window, through which said FOV is projected.
36. The digital image capture and processing system of claim 34, wherein a host computer system, operated by said third-party, can be interfaced with said input/output subsystem so as to load said one or more pieces of third-party code into said memory and plug them into said set of place holders, so as to extend the features and functions of said digital image capture and processing system, and modify the standard behavior of said digital image capture and processing system into said custom behavior for said digital image capture and processing system.
37. The digital image capture and processing system of claim 1, wherein a host computer system, operated by said third-party, can be interfaced with said input/output subsystem so as to load said one or more pieces of third-party code into said memory and plug them into said set of place holders, so as to extend the features and functions of said digital image capture and processing system, and modify the standard behavior of said digital image capture and processing system into said custom behavior for said digital image capture and processing system.
38. The digital image capture and processing system of claim 1, wherein a host computer system, operated by said third-party, can be interfaced with said input/output subsystem so as to load said one or more pieces of third-party code into said memory and plug them into said set of place holders, so as to extend the features and functions of said digital image capture and processing system, and modify the standard behavior of said digital image capture and processing system into said custom behavior for said digital image capture and processing system, without permanently modifying the standard features and functions of said digital image capture and processing system.
39. The digital image capture and processing system of claim 1, which further comprises an object presence detection subsystem.
40. The digital image capture and processing system of claim 39, wherein said object presence detection (triggering) subsystem supports standard functions selected from the group consisting of automatic triggering, manual triggering, and semi-automatic triggering.
41. The digital image capture and processing system of claim 40, wherein said automatic triggering comprises IR-based object presence detection.
42. The digital image capture and processing system of claim 1 , which further comprises an object range detection subsystem.
43. The digital image capture and processing system of claim 42, wherein said object range detection subsystem supports standard functions selected from the group consisting of long/short range detection and quantized/incremental range detection.
44. The digital image capture and processing system of claim 43, wherein said object range detection subsystem employs IR-based long/short range detection.
45. The digital image capture and processing system of claim 42, wherein said object range detection subsystem employs IR-based quantized/incremental range detection.
46. The digital image capture and processing system of claim 1, which further comprises an object velocity detection subsystem.
47. The digital image capture and processing system of claim 46, wherein said object velocity detection subsystem supports standard functions selected from the group consisting of LIDAR-based object velocity detection, and pulse-Doppler based object velocity detection.
48. The digital image capture and processing system of claim 1, which further comprises an object dimensioning subsystem.
49. The digital image capture and processing system of claim 48, wherein said object dimensioning subsystem supports standard functions selected from the group consisting of LIDAR-based object dimensioning, and structured-light based object dimensioning.
50. The digital image capture and processing system of claim 1, wherein said digital camera subsystem comprises an illumination subsystem for illuminating said FOV.
51. The digital image capture and processing system of claim 50, wherein said illumination subsystem supports functions selected from the group consisting of illumination mode, automatic illumination control, and illumination field type.
51. The digital image capture and processing system of claim 51, wherein said illumination mode includes ambient illumination, continuous LED-based illumination, and strobe or flash based LED illumination.
52. The digital image capture and processing system of claim 51, wherein said automatic illumination control comprises measuring the intensity of light reflected off an illuminated object, in a particular region of said FOV.
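The automatic illumination control of claim 52 can be sketched as a closed loop: measure the mean reflected intensity within a particular region of the FOV, then nudge the illumination drive level toward a target. The region bounds, target value, and step size below are invented for illustration.

```python
def region_intensity(frame, x0, y0, x1, y1):
    """Mean pixel value over a rectangular region of a 2D frame (list of rows)."""
    rows = [row[x0:x1] for row in frame[y0:y1]]
    values = [v for row in rows for v in row]
    return sum(values) / len(values)


def adjust_illumination(level, measured, target=128, step=8, lo=0, hi=255):
    """One control step: steer the LED drive level toward the target intensity."""
    if measured < target:
        return min(hi, level + step)
    if measured > target:
        return max(lo, level - step)
    return level


frame = [[100] * 8 for _ in range(8)]      # synthetic dim 8x8 frame
m = region_intensity(frame, 2, 2, 6, 6)    # measure the central region only
new_level = adjust_illumination(120, m)    # dim reading -> brighten
```

Restricting the measurement to a sub-region of the FOV (rather than the whole frame) is what lets exposure track the illuminated object instead of the background.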
53. The digital image capture and processing system of claim 51, wherein said illumination field type includes types of illumination fields selected from the group consisting of narrow-area near-field illumination, wide-area far-field illumination, narrow-area field of illumination, and wide-area field of illumination.
54. The digital image capture and processing system of claim 14, wherein said image formation and detection subsystem supports standard functions selected from the group consisting of image capture mode, image capture control, electronic gain of the image sensing array, image frame exposure control, programmable exposure time for each block of imaging pixels within the image sensing array, programmable exposure time for each image frame detected by image sensing array and field of view (FOV) marking.
55. The digital image capture and processing system of claim 54, wherein said image capture mode is either a narrow-area image capture mode or a wide-area image capture mode.
56. The digital image capture and processing system of claim 54, wherein said image capture control is either single frame control or video frame control.
57. The digital image capture and processing system of claim 54, wherein said electronic gain of said image sensing array is programmable.
58. The digital image capture and processing system of claim 54, wherein said image frame exposure control is programmable.
59. The digital image capture and processing system of claim 54, wherein said exposure time for each image frame detected by said image sensing array is programmable.
60. The digital image capture and processing system of claim 54, wherein said exposure time for each block of imaging pixels within the image sensing array is programmable.
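Claims 59 and 60 recite exposure time programmable both per frame and per block of imaging pixels. A minimal sketch, under the assumption that a block without its own setting falls back to the frame-wide exposure (names and microsecond units are invented):

```python
class ExposureMap:
    """Illustrative per-block exposure table with a frame-wide default."""

    def __init__(self, frame_exposure_us):
        self.frame_exposure_us = frame_exposure_us
        self._blocks = {}  # (block_row, block_col) -> exposure in microseconds

    def set_block(self, block_row, block_col, exposure_us):
        """Program a distinct exposure time for one block of imaging pixels."""
        self._blocks[(block_row, block_col)] = exposure_us

    def exposure_for(self, block_row, block_col):
        """Per-block setting if programmed, otherwise the frame-wide exposure."""
        return self._blocks.get((block_row, block_col), self.frame_exposure_us)


emap = ExposureMap(frame_exposure_us=500)
emap.set_block(0, 1, 250)                 # shorter exposure for a bright block
bright = emap.exposure_for(0, 1)
default = emap.exposure_for(3, 3)         # falls back to frame-wide setting
```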
61. The digital image capture and processing system of claim 54, wherein said field of view marking is selected from the group consisting of a one dot pattern, a two dot pattern, a four dot pattern, a visible line pattern, and a composite four dot and line pattern.
62. The digital image capture and processing system of claim 14, wherein said digital image processing subsystem supports standard functions selected from the group consisting of an image cropping pattern on said image sensing array, pre-processing of image frames, information recognition processing, post-processing of detected said one or more digital images, and object feature/characteristic set recognition.
63. The digital image capture and processing system of claim 62, wherein said image cropping pattern on said image sensing array is defined in terms of x and y coordinates.
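The cropping pattern of claim 63 amounts to selecting a coordinate-bounded sub-region of the sensing array so that only those pixels are passed on for further processing. A minimal sketch, with a frame stored as a list of pixel rows (coordinate conventions are assumptions):

```python
def crop(frame, x0, y0, x1, y1):
    """Return the sub-image covering columns [x0, x1) and rows [y0, y1)."""
    return [row[x0:x1] for row in frame[y0:y1]]


# Synthetic 6x4 frame whose pixel value encodes its (x, y) position.
frame = [[x + 10 * y for x in range(6)] for y in range(4)]
roi = crop(frame, 1, 1, 4, 3)   # 3 columns x 2 rows around the pattern
```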
64. The digital image capture and processing system of claim 62, wherein said pre-processing of image frames comprises a plurality of digital filters.
65. The digital image capture and processing system of claim 62, wherein said information recognition processing comprises one or more processes selected from the group consisting of code symbology recognition processes, alphanumerical character string recognition processes, and text recognition processes.
66. The digital image capture and processing system of claim 62, wherein said post-processing of said digital images comprises one or more of a plurality of digital data filters.
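The pre- and post-processing of claims 64 and 66 each comprise a plurality of digital filters. One common way to realize this is a filter chain applied in sequence; the individual filters below (threshold, invert) are placeholder examples, not the filters used by the actual system.

```python
def threshold(frame, t=128):
    """Binarize: pixels at or above t become 255, the rest become 0."""
    return [[255 if v >= t else 0 for v in row] for row in frame]


def invert(frame):
    """Invert an 8-bit image."""
    return [[255 - v for v in row] for row in frame]


def run_filter_chain(frame, filters):
    """Apply each digital filter in order, feeding each output to the next."""
    for f in filters:
        frame = f(frame)
    return frame


frame = [[10, 200], [130, 90]]
out = run_filter_chain(frame, [threshold, invert])
```

A chain structured this way is also a natural seam for the third-party plug-ins recited elsewhere in the claims, since filters can be appended or replaced without touching the others.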
67. The digital image capture and processing system of claim 62, wherein said object feature/characteristic set recognition comprises automatic produce recognition for deployment at retail point of sale (POS) stations.
68. The digital image capture and processing system of claim 14, wherein said input/output subsystem supports standard functions selected from the group consisting of data communication protocols, output image file formats, output video file formats, data output format, keyboard interface, and graphical display interface.
69. The digital image capture and processing system of claim 14, which further comprises a sound indicator output subsystem which supports standard functions selected from the group consisting of sound loudness, and sound pitch.
70. The digital image capture and processing system of claim 14, which further comprises a visual indicator output subsystem which supports standard functions selected from the group consisting of indicator brightness and indicator color.
71. The digital image capture and processing system of claim 14, which further comprises a power management subsystem which supports standard functions selected from the group consisting of a power operation mode and an energy savings mode.
72. The digital image capture and processing system of claim 71, wherein said power operation mode includes states selected from the group consisting of OFF, continuously ON, and energy savings mode ON.
73. The digital image capture and processing system of claim 72, wherein said energy savings mode comprises a plurality of different energy savings modes.
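Claims 71 to 73 describe a power operation mode with OFF, continuously ON, and energy-savings states. One step of such a power-management policy might look like the following; the mode names, idle threshold, and transition rule are invented for illustration.

```python
IDLE_LIMIT_S = 30  # assumed idle time before entering an energy-savings mode


def next_power_mode(mode, idle_seconds):
    """One evaluation step of a simple power-management policy."""
    if mode == "OFF":
        return "OFF"                 # stays off until explicitly switched on
    if idle_seconds >= IDLE_LIMIT_S:
        return "ENERGY_SAVINGS"      # one of several possible savings modes
    return "ON"


m1 = next_power_mode("ON", idle_seconds=5)
m2 = next_power_mode("ON", idle_seconds=45)
```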
74. The digital image capture and processing system of claim 14, which further comprises an image time/space stamping subsystem which supports standard functions selected from the group consisting of GPS-based time/space stamping and network server time assignment stamping.
75. The digital image capture and processing system of claim 14, which further comprises a network address storage subsystem which supports standard functions selected from the group consisting of manual network address storage, and automatic IP address storage via DHCP.
76. The digital image capture and processing system of claim 14, which further comprises a remote monitoring/servicing subsystem which supports standard functions selected from the group consisting of TCP/IP connection and SNMP agent.
77. The digital image capture and processing system of claim 14, wherein said system control and/or coordination subsystem supports standard functions selected from the group consisting of a plurality of different modes of system operation, each being programmably selected by a function parameter stored in said memory.
78. A method of modifying and/or extending the standard features and functions of a digital image capture and processing system, said method comprising the steps of:
(a) providing said digital image capture and processing system having a set of standard features and functions, and a computing platform including (i) memory for storing pieces of original product code written by the original designers of said digital image capture and processing system, and (ii) a microprocessor for running one or more applications by calling and executing pieces of said original product code in a particular sequence, so as to support said set of standard features and functions which characterize a standard behavior of said digital image capture and processing system, wherein said one or more pieces of original product code have a set of place holders into which third-party product code can be inserted or plugged by third parties, including value-added resellers (VARs), original equipment manufacturers (OEMs), and also end-users of said digital image capture and processing system; and
(b) plugging one or more pieces of third-party code into said set of place holders, so as to modify and/or extend the features and functions of said digital image capture and processing system, and thereby modify or extend the standard behavior of said digital image capture and processing system into a custom behavior for said digital image capture and processing system.
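Steps (a) and (b) above, together with claim 81's requirement that third-party code be written "according to specifications set by said original system designers", can be sketched as a spec-checked plug-in step: the original code publishes place holders and the parameter specification each plug-in must meet, and step (b) accepts only conforming code. Everything here (hook names, spec shape) is hypothetical.

```python
import inspect

# Step (a): original product code defines a place holder with standard behavior.
place_holders = {"on_decode": lambda symbol_data: symbol_data}


def plug_in(name, func, required_params=("symbol_data",)):
    """Step (b): accept third-party code only if it matches the published spec."""
    params = tuple(inspect.signature(func).parameters)
    if params != tuple(required_params):
        raise TypeError(f"plug-in for {name!r} must take parameters {required_params}")
    place_holders[name] = func


def third_party_on_decode(symbol_data):
    """Third-party code written to the designers' published specification."""
    return {"symbol": symbol_data, "source": "custom"}


plug_in("on_decode", third_party_on_decode)
result = place_holders["on_decode"]("0123456789012")
```

Checking conformance at plug-in time, rather than at call time, mirrors how the claims keep the custom behavior within manufacturer-set constraints.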
79. The method of claim 78, wherein said one or more pieces of original product code and said third- party product code are maintained in one or more libraries.
80. The method of claim 78, wherein said memory comprises a memory architecture having different kinds of memory, each having a different access speed and performance characteristics.
81. The method of claim 78, wherein step (b) further comprises an end-user or third-party, such as a value-added reseller (VAR) or original equipment manufacturer (OEM), writing said one or more pieces of third-party code according to specifications set by said original system designers, and said one or more pieces of custom code thereafter being plugged into said place holders, so as to extend the features and functions of said digital image capture and processing system, and modify the standard behavior of said digital image capture and processing system into said custom behavior for said digital image capture and processing system.
82. The method of claim 81, wherein plugging said one or more pieces of custom code into said place holders, modifies and/or extends the features and functions of said digital image capture and processing system, and modifies the standard behavior of said digital image capture and processing system into said custom behavior for said digital image capture and processing system, but without permanently modifying the standard features and functions of said digital image capture and processing system.
83. The method of claim 78, wherein step (a) further comprises integrating or embodying said digital image capture and processing system into a third-party product, and thereafter performing step (b).
84. The method of claim 78, wherein during step (a), said third-party product is selected from the group consisting of image-processing based bar code symbol reading systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 2D or 3D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, and personal identification systems.
85. The method of claim 78, wherein during step (a), said third-party product is selected from the group consisting of image-processing based bar code symbol reading systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 2D or 3D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, and personal identification systems.
86. The method of claim 78, wherein said digital image capture and processing system has the form factor of a digital imaging engine module that can be integrated into a third-party product selected from the group consisting of image-processing based bar code symbol reading systems, portable data terminals (PDTs), mobile phones, computer mice-type devices, personal computers, keyboards, consumer appliances, automobiles, ATMs, vending machines, reverse-vending machines, retail POS-based transaction systems, 2D or 3D digitizers, CAT scanning systems, automobile identification systems, package inspection systems, and personal identification systems.
87. The method of claim 78, wherein said digital image capture and processing system further comprises at least one printed circuit board that can be installed within a housing of a third-party product with said digital imaging engine module, and interfaced with said digital imaging engine module and with at least one component within the housing of said third-party product.
88. The method of claim 78, wherein said original product code and said third-party code each comprises executable binary code.
89. The method of claim 78, which further comprises after step (b), an end-user using said digital image capture and processing system to form and detect one or more digital images of an object with said digital camera subsystem.
90. The method of claim 89, wherein said object bears a code symbol, and said digital image processing subsystem processes said one or more digital images so as to read the code symbol and produce symbol character data representative of said read code symbol.
91. The method of claim 90, wherein said code symbol is a bar code symbol selected from the group consisting of a 1D bar code symbol, a 2D bar code symbol, and a data matrix type code symbol structure.
92. A digital image capture and processing system having a set of standard features and functions, and a set of custom features and functions that satisfy customized end-user application requirements, said digital image capture and processing system comprising:
a housing having a light transmission window;
a digital camera subsystem for projecting a field of view (FOV) through said light transmission window and upon an object to be imaged in said FOV, and detecting imaged light reflected off the object during illumination operations in an image capture mode in which one or more digital images of the object are formed and detected by said digital camera subsystem;
a digital image processing subsystem for processing said one or more digital images and producing raw or processed output data, or recognizing or acquiring information graphically represented therein, and producing output data representative of said recognized information;
an input/output subsystem for transmitting said output data to an external host system or other information receiving or responding device;
a system control system for controlling and/or coordinating the operation of said subsystems above; and
a computing platform for supporting the implementation of one or more of said subsystems above, and the features and functions of said digital image capture and processing system;
said computing platform including (i) memory for storing pieces of original product code written by the original designers of said digital image capture and processing system, and (ii) a microprocessor for running one or more applications by calling and executing pieces of original product code in a particular sequence, so as to support a set of standard features and functions which characterize a standard behavior of said digital image capture and processing system;
wherein said one or more pieces of original product code have a set of place holders into which third-party product code can be inserted or plugged by third parties, including value-added resellers (VARs), original equipment manufacturers (OEMs), and also end-users of said digital image capture and processing system; and
wherein one or more pieces of third-party code that have been plugged into said set of place holders, operate to modify and/or extend the features and functions of said digital image capture and processing system, and modify the standard behavior of said digital image capture and processing system into a custom behavior for said digital image capture and processing system.
93. A digital image capture and processing system that supports manufacturer-constrained system behavior modification and/or extension by an end-user or third-party through the development and installation/deployment of plug-in modules within the application layer of said system by a person other than the system designer, so as to allow this person to flexibly modify standard features and functionalities of the system, and thus satisfy customized end-user application requirements, but without requiring detailed knowledge about the hardware platform of the system, its communication with an outside environment, and user-related interfaces.
94. A digital image capture and processing system that supports manufacturer-constrained system behavior modification by the end-user through the development and installation/deployment of plug-in modules within the application layer of the system by a person other than the designer of the system, so as to allow that person to flexibly modify standard features and functionalities of the system, and thus satisfy customized end-user application requirements, but without requiring detailed knowledge about the hardware platform of the system, its communication with an outside environment, and user-related interfaces.
95. A digital image capture and processing system that supports manufacturer-constrained system behavior modification and/or extension by an end-user or third-party through the development and installation/deployment of plug-in modules within the application layer of the system by a person other than the system designer, so as to allow this person to flexibly modify standard features and functionalities of the system, and thus satisfy customized end-user application requirements, but without requiring detailed knowledge about the hardware platform of the system.
96. A digital image capture and processing system that supports manufacturer-constrained system behavior modification by an end-user or third-party through the development and installation/deployment of plug-in modules within an application layer of said system by a person other than the designer of said system, so as to allow flexible modification and/or extension of standard system features and functionalities, and thus satisfy customized end-user application requirements, but without need for knowledge of details of the underlying hardware, communications and user-related interfaces employed in said system.
97. A digital image capture and processing system that allows customers, VARs and third parties to modify and/or extend a set of standard features and functionalities of the system without needing to contact the original designer of said system and negotiate ways of integrating desired modifications and/or enhancements into said system.
98. A digital image capture and processing system that allows customers, VARs, OEMs and other third parties to independently design their own software modules according to the original product specifications, and plug such software modules into said system, thereby effectively changing the behavior of the system, without need for knowledge of details of the underlying hardware, communications and user-related interfaces employed in said system.
99. A method of enabling a customer of a digital image capture and processing system, or any third-party thereof, with a way of and means for modifying and/or extending the behavior of the system without interfering with its underlying hardware, communications and user-related interfaces.
100. A digital image capture and processing system that provides end-users as well as third-parties, with a way of and means for designing, developing, and installing in the system their own plug-in modules without need for knowledge of details of the underlying hardware, communications and user-related interfaces employed in said system.
101. A method of providing customers of a digital image capture and processing system, and third-parties thereof, with a way of and means for installing their own modules to enhance or alter the "standard" behavior of the device according to their own needs and independently from each other.
102. A digital image capture and processing system that supports designer/manufacturer-constrained system behavior modification, without requiring detailed knowledge about the hardware platform of the system, its communications with the outside environment, and user-related interfaces.
PCT/US2006/048148 2000-11-24 2006-12-18 Digital image capture and processng system permitting modification and/or extension of system features and functions WO2007075519A2 (en)

Priority Applications (23)

Application Number Priority Date Filing Date Title
EP06845674A EP1971952A4 (en) 2005-12-16 2006-12-18 Digital image capture and processng system permitting modification and/or extension of system features and functions
US11/880,087 US8042740B2 (en) 2000-11-24 2007-07-19 Method of reading bar code symbols on objects at a point-of-sale station by passing said objects through a complex of stationary coplanar illumination and imaging planes projected into a 3D imaging volume
US11/900,651 US7954719B2 (en) 2000-11-24 2007-09-12 Tunnel-type digital imaging-based self-checkout system for use in retail point-of-sale environments
US11/977,430 US7614560B2 (en) 2000-11-24 2007-10-24 Method of illuminating objects at a point of sale (POS) station by adaptively controlling the spectral composition of the wide-area illumination beam produced from an illumination subsystem within an automatic digital image capture and processing system
US11/977,422 US7731091B2 (en) 2000-11-24 2007-10-24 Digital image capturing and processing system employing automatic object detection and spectral-mixing based illumination techniques
US11/977,432 US7878407B2 (en) 2000-11-24 2007-10-24 POS-based digital image capturing and processing system employing automatic object motion detection and spectral-mixing based illumination techniques
US11/977,413 US7546952B2 (en) 2000-11-24 2007-10-24 Method of illuminating objects during digital image capture operations by mixing visible and invisible spectral illumination energy at point of sale (POS) environments
US11/978,535 US7571858B2 (en) 2000-11-24 2007-10-29 POS-based digital image capturing and processing system using automatic object detection, spectral-mixing based illumination and linear imaging techniques
US11/978,522 US7588188B2 (en) 2000-11-24 2007-10-29 Pos-based digital image capturing and processing system using automatic object detection, spectral-mixing based illumination and linear imaging techniques
US11/978,525 US7575170B2 (en) 2000-11-24 2007-10-29 POS-based digital image capturing and processing system using automatic object detection, spectral-mixing based illumination and linear imaging techniques
US11/978,521 US7661597B2 (en) 2000-11-24 2007-10-29 Coplanar laser illumination and imaging subsystem employing spectral-mixing and despeckling of laser illumination
US11/980,192 US7806336B2 (en) 2000-11-24 2007-10-30 Laser beam generation system employing a laser diode and high-frequency modulation circuitry mounted on a flexible circuit
US11/980,319 US8172141B2 (en) 2000-11-24 2007-10-30 Laser beam despeckling devices
US11/980,084 US7793841B2 (en) 2000-11-24 2007-10-30 Laser illumination beam generation system employing despeckling of the laser beam using high-frequency modulation of the laser diode current and optical multiplexing of the component laser beams
US11/980,329 US20080249884A1 (en) 2000-11-24 2007-10-30 POS-centric digital imaging system
US11/978,951 US7775436B2 (en) 2000-11-24 2007-10-30 Method of driving a plurality of visible and invisible LEDs so as to produce an illumination beam having a dynamically managed ratio of visible to invisible (IR) spectral energy/power during object illumination and imaging operations
US11/980,078 US7806335B2 (en) 2000-11-24 2007-10-30 Digital image capturing and processing system for automatically recognizing objects in a POS environment
US11/978,943 US7665665B2 (en) 2000-11-24 2007-10-30 Digital illumination and imaging subsystem employing despeckling mechanism employing high-frequency modulation of laser diode drive current and optical beam multiplexing techniques
US11/978,981 US7762465B2 (en) 2000-11-24 2007-10-30 Device for optically multiplexing a laser beam
US11/980,083 US7784695B2 (en) 2000-11-24 2007-10-30 Planar laser illumination module (PLIM) employing high-frequency modulation (HFM) of the laser drive currents and optical multplexing of the output laser beams
US11/980,317 US7770796B2 (en) 2000-11-24 2007-10-30 Device for producing a laser beam of reduced coherency using high-frequency modulation of the laser diode current and optical multiplexing of the output laser beam
US11/980,080 US7784698B2 (en) 2000-11-24 2007-10-30 Digital image capturing and processing system for automatically recognizing graphical intelligence graphically represented in digital images of objects
US12/283,439 US20090134221A1 (en) 2000-11-24 2008-09-11 Tunnel-type digital imaging-based system for use in automated self-checkout and cashier-assisted checkout operations in retail store environments

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US11/305,895 US7607581B2 (en) 2003-11-13 2005-12-16 Digital imaging-based code symbol reading system permitting modification of system features and functionalities
US11/305,895 2005-12-16
US11/408,268 US7464877B2 (en) 2003-11-13 2006-04-20 Digital imaging-based bar code symbol reading system employing image cropping pattern generator and automatic cropped image processor
US11/408,268 2006-04-20

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/489,259 Continuation-In-Part US7540424B2 (en) 2000-11-24 2006-07-19 Compact bar code symbol reading system employing a complex of coplanar illumination and imaging stations for omni-directional imaging of objects within a 3D imaging volume

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/640,814 Continuation-In-Part US7708205B2 (en) 2000-11-24 2006-12-18 Digital image capture and processing system employing multi-layer software-based system architecture permitting modification and/or extension of system features and functions by way of third party code plug-ins
US11/880,087 Continuation-In-Part US8042740B2 (en) 2000-11-24 2007-07-19 Method of reading bar code symbols on objects at a point-of-sale station by passing said objects through a complex of stationary coplanar illumination and imaging planes projected into a 3D imaging volume

Publications (2)

Publication Number Publication Date
WO2007075519A2 true WO2007075519A2 (en) 2007-07-05
WO2007075519A3 WO2007075519A3 (en) 2008-01-10

Family

ID=38218492

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/048148 WO2007075519A2 (en) 2000-11-24 2006-12-18 Digital image capture and processng system permitting modification and/or extension of system features and functions

Country Status (3)

Country Link
US (18) US7464877B2 (en)
EP (1) EP1971952A4 (en)
WO (1) WO2007075519A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8844822B2 (en) 2003-11-13 2014-09-30 Metrologic Instruments, Inc. Image capture and processing system supporting a multi-tier modular software architecture
US9720671B2 (en) 2008-06-17 2017-08-01 Microsoft Technology Licensing, Llc Installation of customized applications

Families Citing this family (183)

Publication number Priority date Publication date Assignee Title
US6924781B1 (en) 1998-09-11 2005-08-02 Visible Tech-Knowledgy, Inc. Smart electronic label employing electronic ink
US7464877B2 (en) * 2003-11-13 2008-12-16 Metrologic Instruments, Inc. Digital imaging-based bar code symbol reading system employing image cropping pattern generator and automatic cropped image processor
US20090134221A1 (en) * 2000-11-24 2009-05-28 Xiaoxun Zhu Tunnel-type digital imaging-based system for use in automated self-checkout and cashier-assisted checkout operations in retail store environments
US6760772B2 (en) * 2000-12-15 2004-07-06 Qualcomm, Inc. Generating and implementing a communication protocol and interface for high data rate signal transfer
EP1717728B1 (en) 2001-01-22 2010-09-01 Hand Held Products, Inc. Optical reader having partial frame operating mode
US7268924B2 (en) 2001-01-22 2007-09-11 Hand Held Products, Inc. Optical reader having reduced parameter determination delay
US7270273B2 (en) * 2001-01-22 2007-09-18 Hand Held Products, Inc. Optical reader having partial frame operating mode
US8812706B1 (en) * 2001-09-06 2014-08-19 Qualcomm Incorporated Method and apparatus for compensating for mismatched delays in signals of a mobile display interface (MDDI) system
US8596542B2 (en) * 2002-06-04 2013-12-03 Hand Held Products, Inc. Apparatus operative for capture of image data
US7637430B2 (en) 2003-05-12 2009-12-29 Hand Held Products, Inc. Picture taking optical reader
US20070241195A1 (en) * 2006-04-18 2007-10-18 Hand Held Products, Inc. Optical reading device with programmable LED control
ES2357234T3 (en) * 2003-06-02 2011-04-20 Qualcomm Incorporated GENERATION AND IMPLEMENTATION OF A PROTOCOL AND A SIGNAL INTERFACE FOR SPEEDS OF TRANSFER OF HIGH DATA.
EP2363991A1 (en) * 2003-08-13 2011-09-07 Qualcomm Incorporated A signal interface for higher data rates
RU2369033C2 (en) * 2003-09-10 2009-09-27 Квэлкомм Инкорпорейтед High-speed data transmission interface
JP2007509533A (en) * 2003-10-15 2007-04-12 クゥアルコム・インコーポレイテッド High speed data rate interface
CN101827074B (en) * 2003-10-29 2013-07-31 高通股份有限公司 High data rate interface
RU2341906C2 (en) * 2003-11-12 2008-12-20 Квэлкомм Инкорпорейтед High-speed data transfer interface with improved connection control
US7841533B2 (en) 2003-11-13 2010-11-30 Metrologic Instruments, Inc. Method of capturing and processing digital images of an object within the field of view (FOV) of a hand-supportable digitial image capture and processing system
CA2546971A1 (en) * 2003-11-25 2005-06-09 Qualcomm Incorporated High data rate interface with improved link synchronization
WO2005057881A1 (en) * 2003-12-08 2005-06-23 Qualcomm Incorporated High data rate interface with improved link synchronization
MXPA06010312A (en) * 2004-03-10 2007-01-19 Qualcomm Inc High data rate interface apparatus and method.
MXPA06010647A (en) * 2004-03-17 2007-01-17 Qualcomm Inc High data rate interface apparatus and method.
RU2006137364A (en) * 2004-03-24 2008-04-27 Квэлкомм Инкорпорейтед (US) DEVICE AND METHOD FOR HIGH-SPEED DATA TRANSFER INTERFACE
US8650304B2 (en) * 2004-06-04 2014-02-11 Qualcomm Incorporated Determining a pre skew and post skew calibration data rate in a mobile display digital interface (MDDI) communication system
KR100914420B1 (en) * 2004-06-04 2009-08-27 퀄컴 인코포레이티드 High data rate interface apparatus and method
US8570156B2 (en) 2010-09-01 2013-10-29 Quake Global, Inc. Pluggable small form-factor UHF RFID reader
US8873584B2 (en) * 2004-11-24 2014-10-28 Qualcomm Incorporated Digital data interface device
US8692838B2 (en) * 2004-11-24 2014-04-08 Qualcomm Incorporated Methods and systems for updating a buffer
US8723705B2 (en) 2004-11-24 2014-05-13 Qualcomm Incorporated Low output skew double data rate serial encoder
US20060161691A1 (en) * 2004-11-24 2006-07-20 Behnam Katibian Methods and systems for synchronous execution of commands across a communication link
US8539119B2 (en) * 2004-11-24 2013-09-17 Qualcomm Incorporated Methods and apparatus for exchanging messages having a digital data interface device message format
US8699330B2 (en) 2004-11-24 2014-04-15 Qualcomm Incorporated Systems and methods for digital data transmission rate control
US8667363B2 (en) 2004-11-24 2014-03-04 Qualcomm Incorporated Systems and methods for implementing cyclic redundancy checks
US7568628B2 (en) 2005-03-11 2009-08-04 Hand Held Products, Inc. Bar code reading device with global electronic shutter control
US7331524B2 (en) * 2005-05-31 2008-02-19 Symbol Technologies, Inc. Feedback mechanism for scanner devices
US7770799B2 (en) 2005-06-03 2010-08-10 Hand Held Products, Inc. Optical reader having reduced specular reflection read failures
US20070063048A1 (en) * 2005-09-14 2007-03-22 Havens William H Data reader apparatus having an adaptive lens
US8730069B2 (en) 2005-11-23 2014-05-20 Qualcomm Incorporated Double data rate serial encoder
US8692839B2 (en) * 2005-11-23 2014-04-08 Qualcomm Incorporated Methods and systems for updating a buffer
US20070175996A1 (en) * 2006-01-30 2007-08-02 Edward Barkan Imaging reader and method with tall field of view
US20110044544A1 (en) * 2006-04-24 2011-02-24 PixArt Imaging Incorporation, R.O.C. Method and system for recognizing objects in an image based on characteristics of the objects
US7740176B2 (en) * 2006-06-09 2010-06-22 Hand Held Products, Inc. Indicia reading apparatus having reduced trigger-to-read time
US7784696B2 (en) 2006-06-09 2010-08-31 Hand Held Products, Inc. Indicia reading apparatus having image sensing and processing circuit
US7599712B2 (en) * 2006-09-27 2009-10-06 Palm, Inc. Apparatus and methods for providing directional commands for a mobile computing device
US8594387B2 (en) * 2007-04-23 2013-11-26 Intel-Ge Care Innovations Llc Text capture and presentation device
US8909296B2 (en) * 2007-05-14 2014-12-09 Kopin Corporation Mobile wireless display software platform for controlling other systems and devices
US7918398B2 (en) * 2007-06-04 2011-04-05 Hand Held Products, Inc. Indicia reading terminal having multiple setting imaging lens
JPWO2008149709A1 (en) * 2007-06-04 2010-08-26 シャープ株式会社 Portable terminal, portable terminal control method, portable terminal control program, and computer-readable recording medium recording the same
WO2009042635A1 (en) * 2007-09-24 2009-04-02 Sound Innovations Inc. In-ear digital electronic noise cancelling and communication device
US7866557B2 (en) * 2007-09-27 2011-01-11 Symbol Technologies, Inc. Imaging-based bar code reader utilizing modified rolling shutter operation
US8131019B2 (en) * 2007-10-10 2012-03-06 Pitney Bowes Inc. Method and system for capturing images moving at high speed
KR100950465B1 (en) * 2007-12-21 2010-03-31 손승남 Camera control method for vehicle entrance control system
JP4914503B2 (en) * 2008-01-16 2012-04-11 日本電信電話株式会社 Surface plasmon resonance measuring apparatus, sample cell, and measuring method
US8325976B1 (en) * 2008-03-14 2012-12-04 Verint Systems Ltd. Systems and methods for adaptive bi-directional people counting
US9886231B2 (en) 2008-03-28 2018-02-06 Kopin Corporation Head worn wireless computer having high-resolution display suitable for use as a mobile internet device
US8947207B2 (en) 2008-04-29 2015-02-03 Quake Global, Inc. Method and apparatus for a deployable radio-frequency identification portal system
EP2302516B1 (en) * 2008-05-13 2012-07-18 dspace digital signal processing and control engineering GmbH Method to perform tasks for calculating a signal to be simulated in real time
US8035728B2 (en) 2008-06-27 2011-10-11 Aptina Imaging Corporation Method and apparatus providing rule-based auto exposure technique preserving scene dynamic range
US8336778B2 (en) * 2008-08-19 2012-12-25 The Code Corporation Graphical code readers that provide sequenced illumination for glare reduction
US8805110B2 (en) 2008-08-19 2014-08-12 Digimarc Corporation Methods and systems for content processing
US8814047B2 (en) * 2008-08-21 2014-08-26 Jadak, Llc Expedited image processing method
US8011584B2 (en) * 2008-12-12 2011-09-06 The Code Corporation Graphical code readers that are configured for glare reduction
US9639727B2 (en) * 2008-12-12 2017-05-02 The Code Corporation Graphical barcode readers that are configured for glare reduction
US8521217B2 (en) * 2009-06-10 2013-08-27 Digimarc Corporation Content sharing methods and systems
US20110135144A1 (en) * 2009-07-01 2011-06-09 Hand Held Products, Inc. Method and system for collecting voice and image data on a remote device and converting the combined data
US8289300B2 (en) * 2009-07-17 2012-10-16 Microsoft Corporation Ambient correction in rolling image capture system
US8373108B2 (en) * 2009-08-12 2013-02-12 Hand Held Products, Inc. Indicia reading terminal operative for processing of frames having plurality of frame featurizations
US8121618B2 (en) 2009-10-28 2012-02-21 Digimarc Corporation Intuitive computing methods and systems
US8424766B2 (en) 2009-11-04 2013-04-23 Hand Held Products, Inc. Support assembly for terminal
US8605209B2 (en) 2009-11-24 2013-12-10 Gregory Towle Becker Hurricane damage recording camera system
JP5130332B2 (en) 2009-12-11 2013-01-30 東芝テック株式会社 Scanner
US20110163165A1 (en) * 2010-01-07 2011-07-07 Metrologic Instruments, Inc. Terminal having illumination and focus control
US8434686B2 (en) 2010-01-11 2013-05-07 Cognex Corporation Swipe scanner employing a vision system
US8640958B2 (en) 2010-01-21 2014-02-04 Honeywell International, Inc. Indicia reading terminal including optical filter
CN101807243B (en) * 2010-02-09 2011-12-14 郭毅军 Automatic identification system for surgical cutter
US9373016B2 (en) * 2010-03-11 2016-06-21 Datalogic Ip Tech S.R.L. Image capturing device
US8282005B2 (en) * 2010-06-18 2012-10-09 Hand Held Products, Inc. Portable data terminal with integrated flashlight
US8573497B2 (en) 2010-06-30 2013-11-05 Datalogic ADC, Inc. Adaptive data reader and method of operating
US9122939B2 (en) 2010-09-16 2015-09-01 Datalogic ADC, Inc. System and method for reading optical codes on reflective surfaces while minimizing flicker perception of pulsed illumination
US9135484B2 (en) 2010-09-28 2015-09-15 Datalogic ADC, Inc. Data reader with light source arrangement for improved illumination
US8387881B2 (en) 2010-12-01 2013-03-05 Hand Held Products, Inc. Terminal with screen reading mode
GB2501404A (en) 2010-12-09 2013-10-23 Metrologic Instr Inc Indicia encoding system with integrated purchase and payment information
US8448863B2 (en) 2010-12-13 2013-05-28 Metrologic Instruments, Inc. Bar code symbol reading system supporting visual or/and audible display of product scan speed for throughput optimization in point of sale (POS) environments
US8408468B2 (en) 2010-12-13 2013-04-02 Metrologic Instruments, Inc. Method of and system for reading visible and/or invisible code symbols in a user-transparent manner using visible/invisible illumination source switching during data capture and processing operations
US8939374B2 (en) 2010-12-30 2015-01-27 Hand Held Products, Inc. Terminal having illumination and exposure control
WO2012103092A2 (en) 2011-01-24 2012-08-02 Datalogic ADC, Inc. Exception detection and handling in automated optical code reading systems
US9418270B2 (en) 2011-01-31 2016-08-16 Hand Held Products, Inc. Terminal with flicker-corrected aimer and alternating illumination
US8561903B2 (en) 2011-01-31 2013-10-22 Hand Held Products, Inc. System operative to adaptively select an image sensor for decodable indicia reading
US8678286B2 (en) 2011-01-31 2014-03-25 Honeywell Scanning & Mobility Method and apparatus for reading optical indicia using a plurality of data sources
US8789757B2 (en) 2011-02-02 2014-07-29 Metrologic Instruments, Inc. POS-based code symbol reading system with integrated scale base and system housing having an improved produce weight capturing surface design
US8408464B2 (en) 2011-02-03 2013-04-02 Metrologic Instruments, Inc. Auto-exposure method using continuous video frames under controlled illumination
DE102011010722A1 (en) * 2011-02-09 2012-08-09 Testo Ag Meter set and method for documentation of a measurement
US20120223141A1 (en) 2011-03-01 2012-09-06 Metrologic Instruments, Inc. Digital linear imaging system employing pixel processing techniques to composite single-column linear images on a 2d image detection array
US8537245B2 (en) 2011-03-04 2013-09-17 Hand Held Products, Inc. Imaging and decoding device with quantum dot imager
US8469272B2 (en) 2011-03-29 2013-06-25 Metrologic Instruments, Inc. Hybrid-type bioptical laser scanning and imaging system supporting digital-imaging based bar code symbol reading at the surface of a laser scanning window
US8231054B1 (en) 2011-05-12 2012-07-31 Kim Moon J Time-varying barcodes for information exchange
US9667823B2 (en) 2011-05-12 2017-05-30 Moon J. Kim Time-varying barcode in an active display
US8256673B1 (en) 2011-05-12 2012-09-04 Kim Moon J Time-varying barcode in an active display
US8794525B2 (en) 2011-09-28 2014-08-05 Metrologic Instruments, Inc. Method of and system for detecting produce weighing interferences in a POS-based checkout/scale system
US8561905B2 (en) 2011-06-15 2013-10-22 Metrologic Instruments, Inc. Hybrid-type bioptical laser scanning and digital imaging system supporting automatic object motion detection at the edges of a 3D scanning volume
US8640960B2 (en) 2011-06-27 2014-02-04 Honeywell International Inc. Optical filter for image and barcode scanning
US8636215B2 (en) 2011-06-27 2014-01-28 Hand Held Products, Inc. Decodable indicia reading terminal with optical filter
US8985459B2 (en) 2011-06-30 2015-03-24 Metrologic Instruments, Inc. Decodable indicia reading terminal with combined illumination
US8786730B2 (en) 2011-08-18 2014-07-22 Microsoft Corporation Image exposure using exclusion regions
US9491441B2 (en) 2011-08-30 2016-11-08 Microsoft Technology Licensing, Llc Method to extend laser depth map range
US8590789B2 (en) 2011-09-14 2013-11-26 Metrologic Instruments, Inc. Scanner with wake-up mode
US8479994B2 (en) * 2011-09-14 2013-07-09 Metrologic Instruments, Inc. Individualized scanner
US8950672B2 (en) * 2011-09-28 2015-02-10 Ncr Corporation Methods and apparatus for control of an imaging scanner
US8646692B2 (en) * 2011-09-30 2014-02-11 Hand Held Products, Inc. Devices and methods employing dual target auto exposure
US8608071B2 (en) 2011-10-17 2013-12-17 Honeywell Scanning And Mobility Optical indicia reading terminal with two image sensors
US8725833B2 (en) 2011-11-11 2014-05-13 Motorola Mobility Llc Comparison of selected item data to usage data for items associated with a user account
US20130129142A1 (en) * 2011-11-17 2013-05-23 Microsoft Corporation Automatic tag generation based on image content
US8485430B2 (en) 2011-12-06 2013-07-16 Honeywell International, Inc. Hand held bar code readers or mobile computers with cloud computing services
US8881983B2 (en) 2011-12-13 2014-11-11 Honeywell International Inc. Optical readers and methods employing polarization sensing of light from decodable indicia
US8628013B2 (en) 2011-12-13 2014-01-14 Honeywell International Inc. Apparatus comprising image sensor array and illumination control
US8857719B2 (en) 2012-01-15 2014-10-14 Symbol Technologies, Inc. Decoding barcodes displayed on cell phone
RU2481634C1 (en) * 2012-01-24 2013-05-10 ООО "Научно-производственный центр "ИНТЕЛКОМ" Method for reading operational labels with a scanner that creates an additional scattered light flux to expand the scanner reading zone
US9041518B2 (en) 2012-01-26 2015-05-26 Hand Held Products, Inc. Portable RFID reading terminal with visual indication of scan trace
US8740085B2 (en) 2012-02-10 2014-06-03 Honeywell International Inc. System having imaging assembly for use in output of image data
US9143936B2 (en) 2012-03-06 2015-09-22 Moon J. Kim Mobile device digital communication and authentication methods
EP2828791B1 (en) * 2012-03-23 2018-05-23 Optoelectronics Co., Ltd. Image reading device capable of producing illumination including a continuous, low-intensity level illumination component and one or more pulsed, high-intensity level illumination components
US8777108B2 (en) 2012-03-23 2014-07-15 Honeywell International, Inc. Cell phone reading mode using image timer
US9013275B2 (en) 2012-04-20 2015-04-21 Hand Held Products, Inc. Portable encoded information reading terminal configured to adjust transmit power level
US8881982B2 (en) 2012-04-20 2014-11-11 Honeywell Scanning & Mobility Portable encoded information reading terminal configured to acquire images
US8727225B2 (en) 2012-04-20 2014-05-20 Honeywell International Inc. System and method for calibration and mapping of real-time location data
US9443119B2 (en) 2012-04-20 2016-09-13 Hand Held Products, Inc. Portable encoded information reading terminal configured to locate groups of RFID tags
US9536219B2 (en) 2012-04-20 2017-01-03 Hand Held Products, Inc. System and method for calibration and mapping of real-time location data
US8976030B2 (en) 2012-04-24 2015-03-10 Metrologic Instruments, Inc. Point of sale (POS) based checkout system supporting a customer-transparent two-factor authentication process during product checkout operations
GB2501504B (en) * 2012-04-25 2015-07-22 Ziath Ltd Device for reading barcodes
US9558386B2 (en) 2012-05-15 2017-01-31 Honeywell International, Inc. Encoded information reading terminal configured to pre-process images
US9004359B2 (en) 2012-05-16 2015-04-14 Datalogic ADC, Inc. Optical scanner with top down reader
US9064254B2 (en) 2012-05-17 2015-06-23 Honeywell International Inc. Cloud-based system for reading of decodable indicia
USD708183S1 (en) 2012-06-08 2014-07-01 Datalogic ADC, Inc. Data reader for checkout station
US8978981B2 (en) 2012-06-27 2015-03-17 Honeywell International Inc. Imaging apparatus having imaging lens
US9092683B2 (en) 2012-07-10 2015-07-28 Honeywell International Inc. Cloud-based system for processing of decodable indicia
US9202095B2 (en) 2012-07-13 2015-12-01 Symbol Technologies, Llc Pistol grip adapter for mobile device
US9361540B2 (en) * 2012-08-15 2016-06-07 Augmented Reality Lab LLC Fast image processing for recognition objectives system
US9456483B2 (en) * 2012-10-15 2016-09-27 The United States Of America As Represented By The Secretary Of The Navy Field programmable multi-emitter
US9465967B2 (en) 2012-11-14 2016-10-11 Hand Held Products, Inc. Apparatus comprising light sensing assemblies with range assisted gain control
US9841492B2 (en) 2013-02-25 2017-12-12 Quake Global, Inc. Ceiling-mounted RFID-enabled tracking
EP2962254A1 (en) 2013-02-26 2016-01-06 Quake Global, Inc. Methods and apparatus for automatic identification wristband
US10018510B2 (en) 2013-04-22 2018-07-10 Excelitas Technologies Singapore Pte. Ltd. Motion and presence detector
US9377365B2 (en) * 2013-04-22 2016-06-28 Excelitas Technologies Singapore Pte. Ltd. Thermal sensor module with lens array
US8770485B1 (en) * 2013-06-28 2014-07-08 Marson Technology Co., Ltd. Actuation method of virtual laser barcode scanner
US9594939B2 (en) 2013-09-09 2017-03-14 Hand Held Products, Inc. Initial point establishment using an image of a portion of an object
USD726186S1 (en) * 2013-10-25 2015-04-07 Symbol Technologies, Inc. Scanner
WO2015077455A1 (en) * 2013-11-25 2015-05-28 Digimarc Corporation Methods and systems for contextually processing imagery
USD727905S1 (en) * 2014-04-17 2015-04-28 Faro Technologies, Inc. Laser scanning device
CN105320504B (en) * 2014-06-25 2018-08-17 成都普中软件有限公司 Visual software modeling method for constructing software views based on software meta-views
US9769392B1 (en) 2014-06-27 2017-09-19 Amazon Technologies, Inc. Imaging system for addressing specular reflection
EP3175368A4 (en) * 2014-07-29 2018-03-14 Hewlett-Packard Development Company, L.P. Default calibrated sensor module settings
US9396409B2 (en) 2014-09-29 2016-07-19 At&T Intellectual Property I, L.P. Object based image processing
US9639730B2 (en) 2014-10-09 2017-05-02 Datalogic IP Tech Srl Aiming system and method for machine-readable symbol readers
US9418272B2 (en) * 2014-12-10 2016-08-16 Oracle International Corporation Configurable barcode processing system
US9679178B2 (en) * 2014-12-26 2017-06-13 Hand Held Products, Inc. Scanning improvements for saturated signals using automatic and fixed gain control methods
EP3040904B1 (en) 2014-12-31 2021-04-21 Hand Held Products, Inc. Portable rfid reading terminal with visual indication of scan trace
WO2016192025A1 (en) 2015-06-01 2016-12-08 SZ DJI Technology Co., Ltd. Systems and methods for memory architecture
JPWO2017006370A1 (en) * 2015-07-07 2018-04-19 オリンパス株式会社 Digital holographic imaging device
US9594936B1 (en) 2015-11-04 2017-03-14 Datalogic Usa, Inc. System and method for improved reading of data from reflective surfaces of electronic devices
US9697393B2 (en) 2015-11-20 2017-07-04 Symbol Technologies, Llc Methods and systems for adjusting mobile-device operating parameters based on housing-support type
US10097819B2 (en) 2015-11-23 2018-10-09 Rohde & Schwarz Gmbh & Co. Kg Testing system, testing method, computer program product, and non-transitory computer readable data carrier
US10599631B2 (en) 2015-11-23 2020-03-24 Rohde & Schwarz Gmbh & Co. Kg Logging system and method for logging
US10244180B2 (en) 2016-03-29 2019-03-26 Symbol Technologies, Llc Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances
US9646188B1 (en) * 2016-06-02 2017-05-09 Symbol Technologies, Llc Imaging module and reader for, and method of, expeditiously setting imaging parameters of an imager based on the imaging parameters previously set for a default imager
AU2017290659B2 (en) * 2016-06-27 2022-07-21 Omri WARSHAVSKI Color imaging by discrete narrow-band synchronized illumination
US9977941B2 (en) * 2016-07-29 2018-05-22 Ncr Corporation Barcode scanner illumination
US10452881B2 (en) * 2016-09-15 2019-10-22 Datalogic IP Tech, S.r.l. Machine-readable symbol reader with distributed illumination and/or image capture
US11336584B2 (en) * 2016-12-07 2022-05-17 Fuji Corporation Communication control device that varies data partitions based on a status of connected nodes
US11013562B2 (en) * 2017-02-14 2021-05-25 Atracsys Sarl High-speed optical tracking with compression and/or CMOS windowing
JP6676573B2 (en) * 2017-03-29 2020-04-08 Ckd株式会社 Inspection device and winding device
US10817685B2 (en) * 2017-09-28 2020-10-27 Datalogic Ip Tech S.R.L. System and method for illuminating a target of a barcode reader
US10204253B1 (en) 2017-12-01 2019-02-12 Digimarc Corporation Diagnostic data reporting from point-of-sale scanner
US11592536B2 (en) 2018-01-10 2023-02-28 Sony Semiconductor Solutions Corporation Control of image capture
WO2020085331A1 (en) * 2018-10-23 2020-04-30 株式会社デンソーウェーブ Information-code reading device
US11223814B2 (en) * 2019-05-28 2022-01-11 Lumileds Llc Imaging optics for one-dimensional array detector
WO2021001184A1 (en) * 2019-07-01 2021-01-07 Signify Holding B.V. Automatic power-on restart system for wireless network devices
US20210012622A1 (en) * 2019-07-11 2021-01-14 Zebra Technologies Corporation Scanning Devices and Methods to Constrain Radio Frequency Identification (RFID) Signals Within a Physical Location
US10970507B1 (en) * 2019-12-06 2021-04-06 Zebra Technologies Corporation Disable scanner illumination and aimer based on pre-defined scanner position
CN113284052A (en) 2020-02-19 2021-08-20 阿里巴巴集团控股有限公司 Image processing method and apparatus
US11790197B2 (en) * 2021-10-11 2023-10-17 Zebra Technologies Corporation Miniature long range imaging engine with auto-focus, auto-zoom, and auto-illumination system
US20230109799A1 (en) * 2021-10-11 2023-04-13 Zebra Technologies Corporation Methods and apparatus for scanning swiped barcodes
CN114666515A (en) * 2022-03-29 2022-06-24 上海富瀚微电子股份有限公司 Real-time acquisition device and method for original image data
US11816528B2 (en) * 2022-03-31 2023-11-14 Zebra Technologies Corporation Slot scanner assembly with wakeup system

Family Cites Families (361)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4045813A (en) 1975-07-10 1977-08-30 General Aquadyne, Inc. Method of operating video cameras and lights underwater
US4007377A (en) * 1975-09-08 1977-02-08 The Singer Company Optical scanning system for universal product code
US4053233A (en) 1976-03-15 1977-10-11 Aerodyne Research, Inc. Retroreflectors
US4338514A (en) 1980-04-07 1982-07-06 Spin Physics, Inc. Apparatus for controlling exposure of a solid state image sensor array
US4291338A (en) 1980-04-29 1981-09-22 The United States Of America As Represented By The Secretary Of The Navy Automatic exposure control for pulsed active TV systems
US4317622A (en) 1980-06-16 1982-03-02 Eastman Kodak Company Exposure control apparatus for flash photography
JPS5795771A (en) 1980-12-05 1982-06-14 Fuji Photo Film Co Ltd Solid-state image pickup device
US4434360A (en) * 1981-07-29 1984-02-28 Texas Instruments Incorporated Optical sensing device for reading bar code or the like
JPS5876973A (en) 1981-10-30 1983-05-10 Nippon Denso Co Ltd Optical information reader
US4766300A (en) 1984-08-06 1988-08-23 Norand Corporation Instant portable bar code reader
US6234395B1 (en) * 1981-12-28 2001-05-22 Intermec Ip Corp. Instant portable bar code reader
US6158661A (en) 1981-12-28 2000-12-12 Intermec Ip Corp. Instant portable bar code reader
US5144119A (en) 1981-12-28 1992-09-01 Norand Corporation Instant portable bar code reader
US4894523A (en) * 1981-12-28 1990-01-16 Norand Corporation Instant portable bar code reader
US5288985A (en) * 1981-12-28 1994-02-22 Norand Corporation Instant portable bar code reader
US5992750A (en) 1981-12-28 1999-11-30 Intermec Ip Corp. Instant portable bar code reader
JPS58211277A (en) 1982-05-31 1983-12-08 Nippon Denso Co Ltd Optical information reader
US4467196A (en) * 1982-06-29 1984-08-21 International Business Machines Corporation Single fiber optic wand system
US4818847A (en) * 1982-07-29 1989-04-04 Nippondenso Co., Ltd. Apparatus for optically reading printed information
US4636624A (en) 1983-01-10 1987-01-13 Minolta Camera Kabushiki Kaisha Focus detecting device for use with cameras
US4535758A (en) 1983-10-07 1985-08-20 Welch Allyn Inc. Signal level control for video system
JPS60197063A (en) 1984-03-21 1985-10-05 Canon Inc LED array and its sectional lighting method
US4571058A (en) * 1984-11-08 1986-02-18 Xerox Corporation Flash illumination and optical imaging system
US4547139A (en) 1984-12-17 1985-10-15 The Goodyear Tire & Rubber Company Tire mold
US4703344A (en) 1985-03-30 1987-10-27 Omron Tateisi Electronics Co. Illumination system of the digital control type
US4632542A (en) 1985-05-02 1986-12-30 Polaroid Corporation Exposure control system having dual mode photodetector
US4613402A (en) 1985-07-01 1986-09-23 Eastman Kodak Company Method of making edge-aligned implants and electrodes therefor
US4652732A (en) 1985-09-17 1987-03-24 National Semiconductor Corporation Low-profile bar code scanner
US4835615A (en) * 1986-01-21 1989-05-30 Minolta Camera Kabushiki Kaisha Image sensor with improved response characteristics
US4805026A (en) * 1986-02-18 1989-02-14 Nec Corporation Method for driving a CCD area image sensor in a non-interlace scanning and a structure of the CCD area image sensor for driving in the same method
USD297432S (en) 1986-06-27 1988-08-30 Hand Held Products Electronic bar code reader
US5410141A (en) * 1989-06-07 1995-04-25 Norand Hand-held data capture system with interchangeable modules
KR910000753B1 (en) * 1986-08-29 1991-02-06 니뽄 덴끼 가부시끼가이샤 Flat display panel
JPS6386974A (en) 1986-09-30 1988-04-18 Nec Corp Charge transfer image pickup element and its driving method
EP0264956B1 (en) * 1986-10-24 1992-12-30 Sumitomo Electric Industries Limited An optical code reading device
US4819070A (en) 1987-04-10 1989-04-04 Texas Instruments Incorporated Image sensor array
US5272538A (en) 1987-11-04 1993-12-21 Canon Kabushiki Kaisha Exposure control device
US4805743A (en) * 1987-12-15 1989-02-21 Gunderson, Inc. Handbrake operating linkage for multi-unit rail cars
US5170205A (en) 1988-04-25 1992-12-08 Asahi Kogaku Kogyo Kabushiki Kaisha Eliminating camera-shake
DE3913595C3 (en) 1988-04-25 1996-12-19 Asahi Optical Co Ltd Exposure control device for a camera
US5025319A (en) 1988-07-12 1991-06-18 Fuji Photo Film Co., Ltd. Solid state image pickup device driving method utilizing an electronic shutter operation
US5841121A (en) 1988-08-31 1998-11-24 Norand Technology Corporation Hand-held optically readable character set reader having automatic focus control for operation over a range of distances
US6681994B1 (en) 1988-08-31 2004-01-27 Intermec Ip Corp. Method and apparatus for optically reading information
USD308865S (en) 1988-09-30 1990-06-26 Hand Held Products, Inc. Electronic bar code reader
USD304026S (en) 1988-09-30 1989-10-17 Hand Held Products, Inc. Battery pack for electronic bar code reader
US5621203A (en) * 1992-09-25 1997-04-15 Symbol Technologies Method and apparatus for reading two-dimensional bar code symbols with an elongated laser line
US5710417A (en) * 1988-10-21 1998-01-20 Symbol Technologies, Inc. Bar code reader for reading both one dimensional and two dimensional symbologies with programmable resolution
CA1329263C (en) * 1989-03-01 1994-05-03 Mark Krichever Bar code scanner
US5319181A (en) 1992-03-16 1994-06-07 Symbol Technologies, Inc. Method and apparatus for decoding two-dimensional bar code using CCD/CMD camera
US5635697A (en) 1989-03-01 1997-06-03 Symbol Technologies, Inc. Method and apparatus for decoding two-dimensional bar code
US5304786A (en) * 1990-01-05 1994-04-19 Symbol Technologies, Inc. High density two-dimensional bar code symbol
CA1334218C (en) 1989-03-01 1995-01-31 Jerome Swartz Hand-held laser scanning for reading two dimensional bar codes
JP2530910B2 (en) 1989-04-13 1996-09-04 株式会社テック Data processing device
US5270802A (en) 1989-04-14 1993-12-14 Hitachi, Ltd. White balance adjusting device for video camera
US6244512B1 (en) 1989-06-08 2001-06-12 Intermec Ip Corp. Hand-held data capture system with interchangeable modules
US5142684A (en) 1989-06-23 1992-08-25 Hand Held Products, Inc. Power conservation in microprocessor controlled devices
DE3923521C2 (en) 1989-07-15 1994-05-26 Kodak Ag Electronic camera
US4972224A (en) 1989-08-28 1990-11-20 Polaroid Corporation Exposure control system for a fixed aperture camera
US5034619A (en) 1989-09-21 1991-07-23 Welch Allyn, Inc. Optical reader with dual vertically oriented photoemitters
JP2976242B2 (en) 1989-09-23 1999-11-10 ヴィエルエスアイ ヴィジョン リミテッド Integrated circuit, camera using the integrated circuit, and method for detecting incident light incident on an image sensor manufactured using the integrated circuit technology
US5495097A (en) * 1993-09-14 1996-02-27 Symbol Technologies, Inc. Plurality of scan units with scan stitching
US5262871A (en) 1989-11-13 1993-11-16 Rutgers, The State University Multiple resolution image sensor
US5235198A (en) 1989-11-29 1993-08-10 Eastman Kodak Company Non-interlaced interline transfer CCD image sensing device with simplified electrode structure for each pixel
JPH03223972A (en) 1990-01-29 1991-10-02 Ezel Inc Camera illumination device
US4996413A (en) * 1990-02-27 1991-02-26 General Electric Company Apparatus and method for reading data from an image detector
US5258605A (en) 1990-03-13 1993-11-02 Symbol Technologies, Inc. Scan generators for bar code reader using linear array of lasers
JP2870946B2 (en) * 1990-03-13 1999-03-17 ブラザー工業株式会社 Optical scanning device
GB2244025B (en) * 1990-03-19 1994-03-30 Ricoh Kk Control system for image forming equipment
US5627359A (en) * 1991-09-17 1997-05-06 Metrologic Instruments, Inc. Laser code symbol scanner employing optical filtering system having narrow band-pass characteristics and spatially separated optical filter elements with laser light collection optics arranged along laser light return path disposed therebetween
US20010017320A1 (en) 1990-09-10 2001-08-30 Knowles Carl H. Projection laser scanner for scanning bar codes within a confined scanning volume
US5340973A (en) 1990-09-17 1994-08-23 Metrologic Instruments, Inc. Automatic laser scanning system and method of reading bar code symbols using same
US7077327B1 (en) 1990-09-17 2006-07-18 Metrologic Instruments, Inc. System for reading bar code symbols using bar code readers having RF signal transmission links with base stations
US5124537A (en) 1990-10-29 1992-06-23 Omniplanar, Inc. Omnidirectional bar code reader using virtual scan of video raster scan memory
US5111263A (en) * 1991-02-08 1992-05-05 Eastman Kodak Company Charge-coupled device (CCD) image sensor operable in either interlace or non-interlace mode
US5296689A (en) * 1992-02-28 1994-03-22 Spectra-Physics Scanning Systems, Inc. Aiming beam system for optical data reading device
US5233415A (en) * 1991-04-19 1993-08-03 Kaman Aerospace Corporation Imaging lidar employing transmitter referencing
IL98337A (en) * 1991-06-02 1995-01-24 Pinchas Schechner Production control by multiple branch bar-code readers
JP3097186B2 (en) 1991-06-04 2000-10-10 ソニー株式会社 Solid-state imaging device
CA2056272C (en) * 1991-06-14 2001-10-16 Patrick Salatto, Jr. Combined range laser scanner
US6266685B1 (en) 1991-07-11 2001-07-24 Intermec Ip Corp. Hand-held data collection system with stylus input
US5378883A (en) * 1991-07-19 1995-01-03 Omniplanar Inc. Omnidirectional wide range hand held bar code reader
US5235416A (en) 1991-07-30 1993-08-10 The Government Of The United States Of America As Represented By The Secretary Of The Department Of Health & Human Services System and method for performing simultaneous bilateral measurements on a subject in motion
US5221956A (en) * 1991-08-14 1993-06-22 Kustom Signals, Inc. Lidar device with combined optical sight
US5883375A (en) * 1991-09-17 1999-03-16 Metrologic Instruments, Inc. Bar code symbol scanner having fixed and hand-held modes
EP0536481A2 (en) 1991-10-09 1993-04-14 Photographic Sciences Corporation Bar code reading instrument and selectively orientable graphics display which facilitates the operation of the instrument
US5231293A (en) 1991-10-31 1993-07-27 Psc, Inc. Bar code reading instrument which prompts operator to scan bar codes properly
US5233169A (en) 1991-10-31 1993-08-03 Psc, Inc. Uniport interface for a bar code reading instrument
US5308962A (en) * 1991-11-01 1994-05-03 Welch Allyn, Inc. Reduced power scanner for reading indicia
US5286960A (en) * 1991-11-04 1994-02-15 Welch Allyn, Inc. Method of programmable digitization and bar code scanning apparatus employing same
US5253198A (en) 1991-12-20 1993-10-12 Syracuse University Three-dimensional optical memory
US5281800A (en) * 1991-12-23 1994-01-25 Hand Held Products, Inc. Method and apparatus for low power optical sensing and decoding of data
US5291008A (en) * 1992-01-10 1994-03-01 Welch Allyn, Inc. Optical assembly and apparatus employing same using an aspherical lens and an aperture stop
US5294783A (en) * 1992-01-10 1994-03-15 Welch Allyn, Inc. Analog reconstruction circuit and bar code reading apparatus employing same
EP0576662B1 (en) 1992-01-17 1998-06-17 Welch Allyn, Inc. Intimate source and detector and apparatus employing same
JP3013584B2 (en) 1992-02-14 2000-02-28 ソニー株式会社 Solid-state imaging device
US6347163B2 (en) * 1994-10-26 2002-02-12 Symbol Technologies, Inc. System for reading two-dimensional images using ambient and/or projected light
US5291009A (en) * 1992-02-27 1994-03-01 Roustaei Alexander R Optical scanning head
US5786582A (en) 1992-02-27 1998-07-28 Symbol Technologies, Inc. Optical scanner for reading and decoding one- and two-dimensional symbologies at variable depths of field
US6385352B1 (en) * 1994-10-26 2002-05-07 Symbol Technologies, Inc. System and method for reading and comparing two-dimensional images
US5484994A (en) * 1993-10-18 1996-01-16 Roustaei; Alexander Optical scanning head with improved resolution
US5354977A (en) 1992-02-27 1994-10-11 Alex Roustaei Optical scanning head
US5777314A (en) 1992-02-27 1998-07-07 Symbol Technologies, Inc. Optical scanner with fixed focus optics
US5349172A (en) 1992-02-27 1994-09-20 Alex Roustaei Optical scanning head
US5756981A (en) * 1992-02-27 1998-05-26 Symbol Technologies, Inc. Optical scanner for reading and decoding one- and two-dimensional symbologies at variable depths of field including memory efficient high speed image processing means and high accuracy image analysis means
US5319182A (en) 1992-03-04 1994-06-07 Welch Allyn, Inc. Integrated solid state light emitting and detecting array and apparatus employing said array
US5902988A (en) * 1992-03-12 1999-05-11 Norand Corporation Reader for decoding two-dimensional optically readable information
JP3233981B2 (en) 1992-05-26 2001-12-04 オリンパス光学工業株式会社 Symbol information reader
US5308960A (en) * 1992-05-26 1994-05-03 United Parcel Service Of America, Inc. Combined camera system
US5327171A (en) * 1992-05-26 1994-07-05 United Parcel Service Of America, Inc. Camera system optics
US5274228A (en) * 1992-06-01 1993-12-28 Eastman Kodak Company Linear light source/collector with integrating cylinder and light pipe means
US5309243A (en) * 1992-06-10 1994-05-03 Eastman Kodak Company Method and apparatus for extending the dynamic range of an electronic imaging system
US6189793B1 (en) 1992-06-12 2001-02-20 Metrologic Instruments, Inc. Automatic laser projection scanner with improved activation controlling mechanism
JP2788152B2 (en) * 1992-06-22 1998-08-20 松下電器産業株式会社 Barcode reader
USD346162S (en) * 1992-06-30 1994-04-19 Hand Held Products Electronic bar code reader
US5475207A (en) * 1992-07-14 1995-12-12 Spectra-Physics Scanning Systems, Inc. Multiple plane scanning system for data reading applications
US5410108A (en) 1992-08-31 1995-04-25 Spectra-Physics Scanning Systems, Inc. Combined scanner and scale
EP0620458A4 (en) * 1992-09-07 1995-02-01 Nippon Kogaku Kk Optical waveguide device and optical instrument using the same.
US6044231A (en) * 1993-01-28 2000-03-28 Nikon Corp. Camera with dual mode exposure controlled data imprinting
US5352884A (en) 1993-04-14 1994-10-04 General Electric Corporation Method and apparatus for providing offset for light detector
US5304787A (en) * 1993-06-01 1994-04-19 Metamedia Corporation Locating 2-D bar codes
US5393967A (en) 1993-07-21 1995-02-28 Sensis Corporation Method and apparatus for non-contact reading of a relief pattern
JP3144736B2 (en) * 1993-08-10 2001-03-12 富士通株式会社 Ambient light detection device and laser lighting control device for barcode reader using the same
US5430285A (en) 1993-08-20 1995-07-04 Welch Allyn, Inc. Illumination system for optical reader
US5623137A (en) * 1993-08-20 1997-04-22 Welch Allyn, Inc. Illumination apparatus for optical readers
US5398112A (en) * 1993-10-04 1995-03-14 Wyko Corporation Method for testing an optical window with a small wedge angle
US6006995A (en) * 1993-10-12 1999-12-28 Metrologic Instruments Inc. System for reading bar code symbol on containers having arbitrary surface geometry
US5489771A (en) * 1993-10-15 1996-02-06 University Of Virginia Patent Foundation LED light standard for photo- and videomicroscopy
US5420409A (en) * 1993-10-18 1995-05-30 Welch Allyn, Inc. Bar code scanner providing aural feedback
US5519496A (en) * 1994-01-07 1996-05-21 Applied Intelligent Systems, Inc. Illumination system and method for generating an image of an object
US5559907A (en) * 1994-02-17 1996-09-24 Lucent Technologies Inc. Method of controlling polarization properties of a photo-induced device in an optical waveguide and method of investigating structure of an optical waveguide
US5965863A (en) 1994-03-04 1999-10-12 Welch Allyn, Inc. Optical reader system comprising local host processor and optical reader
US5773806A (en) 1995-07-20 1998-06-30 Welch Allyn, Inc. Method and apparatus for capturing a decodable representation of a 2D bar code symbol using a hand-held reader having a 1D image sensor
US5463214A (en) 1994-03-04 1995-10-31 Welch Allyn, Inc. Apparatus for optimizing throughput in decoded-output scanners and method of using same
US7387253B1 (en) 1996-09-03 2008-06-17 Hand Held Products, Inc. Optical reader system comprising local host processor and optical reader
US5825006A (en) 1994-03-04 1998-10-20 Welch Allyn, Inc. Optical reader having improved autodiscrimination features
US5942741A (en) 1994-03-04 1999-08-24 Welch Allyn, Inc. Apparatus for optimizing throughput in decoded-output scanners and method of using same
US5900613A (en) * 1994-03-04 1999-05-04 Welch Allyn, Inc. Optical reader having improved reprogramming features
US5932862A (en) 1994-03-04 1999-08-03 Welch Allyn, Inc. Optical reader having improved scanning-decoding features
US5929418A (en) 1994-03-04 1999-07-27 Welch Allyn, Inc. Optical reader having improved menuing features
US5457309A (en) 1994-03-18 1995-10-10 Hand Held Products Predictive bar code decoding system and method
US5541419A (en) 1994-03-21 1996-07-30 Intermec Corporation Symbology reader with reduced specular reflection
US5513264A (en) 1994-04-05 1996-04-30 Metanetics Corporation Visually interactive encoding and decoding of dataforms
EP0679021B1 (en) 1994-04-19 2010-12-15 Eastman Kodak Company Automatic camera exposure control using variable exposure index CCD sensor
US5479515A (en) 1994-05-11 1995-12-26 Welch Allyn, Inc. One-dimensional bar code symbology and method of using same
US5837985A (en) 1994-05-14 1998-11-17 Welch Allyn, Inc. Optical imaging assembly having improved image sensor orientation
US5831674A (en) 1994-06-10 1998-11-03 Metanetics Corporation Oblique access to image data for reading bar codes
US5736724A (en) * 1994-06-10 1998-04-07 Metanetics Corporation Oblique access to image data for reading dataforms
US5917945A (en) * 1994-06-15 1999-06-29 Metanetics Corporation Recognizing dataforms in image areas
US5550366A (en) 1994-06-20 1996-08-27 Roustaei; Alexander Optical scanner with automatic activation
US6708883B2 (en) * 1994-06-30 2004-03-23 Symbol Technologies, Inc. Apparatus and method for reading indicia using charge coupled device and scanning laser beam technology
US5672858A (en) 1994-06-30 1997-09-30 Symbol Technologies, Inc. Apparatus and method for reading indicia using charge coupled device and scanning laser beam technology
CA2150747A1 (en) * 1994-06-30 1995-12-31 Yajun Li Multiple laser indicia reader optionally utilizing a charge coupled device (ccd) detector and operating method therefor
US5702059A (en) 1994-07-26 1997-12-30 Meta Holding Corp. Extended working range dataform reader including fuzzy logic image control circuitry
US5521366A (en) 1994-07-26 1996-05-28 Metanetics Corporation Dataform readers having controlled and overlapped exposure integration periods
US5572006A (en) 1994-07-26 1996-11-05 Metanetics Corporation Automatic exposure single frame imaging systems
US5811784A (en) 1995-06-26 1998-09-22 Telxon Corporation Extended working range dataform reader
US5815200A (en) 1994-07-26 1998-09-29 Metanetics Corporation Extended working range dataform reader with reduced power consumption
US5572007A (en) 1994-08-19 1996-11-05 Intermec Corporation Symbology reader with interchangeable window
JP3525353B2 (en) * 1994-09-28 2004-05-10 株式会社リコー Digital electronic still camera
US5793967A (en) 1994-10-18 1998-08-11 Hand Held Products, Inc. Data collection and RF communications system and method of developing applications for same
US5659761A (en) 1994-10-18 1997-08-19 Hand Held Products Data recognition apparatus and portable data reader having power management system
US5506929A (en) * 1994-10-19 1996-04-09 Clio Technologies, Inc. Light expanding system for producing a linear or planar light beam from a point-like light source
ATE165681T1 (en) 1994-10-25 1998-05-15 United Parcel Service Inc METHOD AND DEVICE FOR A PORTABLE CONTACTLESS IMAGE RECORDING DEVICE
PT788634E (en) 1994-10-25 2000-08-31 United Parcel Service Inc AUTOMATIC ELECTRONIC CAMERA FOR LABEL IMAGE COLLECTION
US5825010A (en) * 1994-11-21 1998-10-20 Symbol Technologies, Inc. Bar code scanner positioning
EP0722148A2 (en) * 1995-01-10 1996-07-17 Welch Allyn, Inc. Bar code reader
US5786586A (en) 1995-01-17 1998-07-28 Welch Allyn, Inc. Hand-held optical reader having a detachable lens-guide assembly
US6045047A (en) * 1995-01-17 2000-04-04 Welch Allyn Data Collection, Inc. Two-dimensional part reader having a focussing guide
EP0727760A3 (en) * 1995-02-17 1997-01-29 Ibm Produce size recognition system
US5797015A (en) 1995-04-18 1998-08-18 Pitney Bowes Inc. Method of customizing application software in inserter systems
US5780834A (en) * 1995-05-15 1998-07-14 Welch Allyn, Inc. Imaging and illumination optics assembly
US6060722A (en) * 1995-05-15 2000-05-09 Havens; William H. Optical reader having illumination assembly including improved aiming pattern generator
US5784102A (en) 1995-05-15 1998-07-21 Welch Allyn, Inc. Optical reader having improved interactive image sensing and control circuitry
US5739518A (en) * 1995-05-17 1998-04-14 Metanetics Corporation Autodiscrimination for dataform decoding and standardized recording
JPH08329180A (en) 1995-06-05 1996-12-13 Asahi Optical Co Ltd Data symbol reader
US5661291A (en) 1995-06-07 1997-08-26 Hand Held Products, Inc. Audio proof of delivery system and method
JPH096891A (en) 1995-06-21 1997-01-10 Asahi Optical Co Ltd Data symbol reader and data symbol read system
US6019286A (en) * 1995-06-26 2000-02-01 Metanetics Corporation Portable data collection device with dataform decoding and image capture capability
US5783811A (en) 1995-06-26 1998-07-21 Metanetics Corporation Portable data collection device with LED targeting and illumination assembly
US5912700A (en) * 1996-01-10 1999-06-15 Fox Sports Productions, Inc. System for enhancing the television presentation of an object at a sporting event
US6056199A (en) * 1995-09-25 2000-05-02 Intermec Ip Corporation Method and apparatus for storing and reading data
US5979763A (en) 1995-10-13 1999-11-09 Metanetics Corporation Sub-pixel dataform reader with dynamic noise margins
US5949054A (en) 1995-10-23 1999-09-07 Welch Allyn, Inc. Bar code reader for reading high to low contrast bar code symbols
US5805743A (en) 1995-10-30 1998-09-08 Minolta Co., Ltd. Optical deflector and scanning optical system provided with the optical deflector
WO1997019491A1 (en) 1995-11-20 1997-05-29 Philips Electronics N.V. An electrically conductive wire
US6254003B1 (en) 1995-12-18 2001-07-03 Welch Allyn Data Collection, Inc. Optical reader exposure control apparatus comprising illumination level detection circuitry
US5831254A (en) 1995-12-18 1998-11-03 Welch Allyn, Inc. Exposure control apparatus for use with optical readers
US6629641B2 (en) * 2000-06-07 2003-10-07 Metrologic Instruments, Inc. Method of and system for producing images of objects using planar laser illumination beams and image detection arrays
US5714745A (en) * 1995-12-20 1998-02-03 Metanetics Corporation Portable data collection device with color imaging assembly
US6109528A (en) 1995-12-22 2000-08-29 Intermec Ip Corp. Ergonomic hand-held data terminal and data collection system
US5786583A (en) 1996-02-16 1998-07-28 Intermec Corporation Method and apparatus for locating and decoding machine-readable symbols
US6473519B1 (en) 1996-02-21 2002-10-29 Hand Held Products, Inc. Check reader
US5717195A (en) * 1996-03-05 1998-02-10 Metanetics Corporation Imaging based slot dataform reader
USD505423S1 (en) 1996-03-18 2005-05-24 Hand Held Products, Inc. Finger saddle incorporated in cornerless housing
US5955720A (en) * 1996-03-21 1999-09-21 Symbol Technologies, Inc. Semi-retroreflective scanners
US5838495A (en) 1996-03-25 1998-11-17 Welch Allyn, Inc. Image sensor containment system
US5773810A (en) 1996-03-29 1998-06-30 Welch Allyn, Inc. Method for generating real time degree of focus signal for handheld imaging device
US5793033A (en) 1996-03-29 1998-08-11 Metanetics Corporation Portable data collection device with viewing assembly
US6330974B1 (en) 1996-03-29 2001-12-18 Intermec Ip Corp. High resolution laser imager for low contrast symbology
US5719384A (en) * 1996-05-10 1998-02-17 Metanetics Corporation Oblique access to image data for reading dataforms
US5914477A (en) 1996-06-26 1999-06-22 Ncr Corporation Line focus barcode scanner
US6367699B2 (en) * 1996-07-11 2002-04-09 Intermec Ip Corp. Method and apparatus for utilizing specular light to image low contrast symbols
US6064763A (en) * 1996-07-26 2000-05-16 Intermec Ip Corporation Time-efficient method of analyzing imaged input data to locate two-dimensional machine-readable symbols or other linear images therein
USD396033S (en) 1996-09-12 1998-07-14 Hand Held Products, Inc. Base unit for receiving an article
AT408287B (en) * 1996-10-01 2001-10-25 Sez Semiconduct Equip Zubehoer METHOD AND DEVICE FOR DRYING DISC-SHAPED SUBSTRATES OF SEMICONDUCTOR TECHNOLOGY
US6223988B1 (en) * 1996-10-16 2001-05-01 Omniplanar, Inc. Hand-held bar code reader with laser scanning and 2D image capture
US6177926B1 (en) * 1996-10-22 2001-01-23 Intermec Ip Corp. Hand-held computer having input screen and means for preventing inadvertent actuation of keys
US6015088A (en) * 1996-11-05 2000-01-18 Welch Allyn, Inc. Decoding of real time video imaging
US5892214A (en) 1996-11-20 1999-04-06 Ncr Corporation Low profile planar scanner
US5970245A (en) 1997-01-03 1999-10-19 Ncr Corporation Method for debugging shared procedures contained in dynamic link library files
US5838842A (en) * 1997-01-10 1998-11-17 The United States Of America As Represented By The Secretary Of The Army Self-imaging waveguide optical polarization or wavelength splitters
US6179208B1 (en) * 1997-01-31 2001-01-30 Metanetics Corporation Portable data collection device with variable focusing module for optic assembly
US5986705A (en) 1997-02-18 1999-11-16 Matsushita Electric Industrial Co., Ltd. Exposure control system controlling a solid state image sensing device
US5992744A (en) 1997-02-18 1999-11-30 Welch Allyn, Inc. Optical reader having multiple scanning assemblies with simultaneously decoded outputs
US6097839A (en) 1997-03-10 2000-08-01 Intermec Ip Corporation Method and apparatus for automatic discriminating and locating patterns such as finder patterns, or portions thereof, in machine-readable symbols
US6173893B1 (en) * 1997-04-16 2001-01-16 Intermec Corporation Fast finding algorithm for two-dimensional symbologies
US6223986B1 (en) 1997-04-17 2001-05-01 Psc Scanning, Inc. Aiming aid for optical data reading
EP0978084A1 (en) 1997-04-21 2000-02-09 Intermec Scanner Technology Center S.A. Optoelectronic device for image acquisition, in particular of bar codes
EP0980537B1 (en) 1997-05-05 2007-11-14 Symbol Technologies, Inc. Optical scanner and image reader for reading images and decoding optical information including one and two dimensional symbologies at variable depth of field
US5920061A (en) 1997-05-29 1999-07-06 Metanetics Corporation Portable data collection device including imaging assembly with modular high density dataform reader assembly
US6075882A (en) 1997-06-18 2000-06-13 Philip Morris Incorporated System and method for optically inspecting cigarettes by detecting the lengths of cigarette sections
US6062475A (en) * 1997-06-25 2000-05-16 Metanetics Corporation Portable data collection device including color imaging dataform reader assembly
FR2765363B1 (en) 1997-06-30 2000-02-18 Actikey METHOD AND SYSTEM FOR MONITORING THE USE OF SOFTWARE
US6188381B1 (en) * 1997-09-08 2001-02-13 Sarnoff Corporation Modular parallel-pipelined vision system for real-time video processing
US7028899B2 (en) * 1999-06-07 2006-04-18 Metrologic Instruments, Inc. Method of speckle-noise pattern reduction and apparatus therefor based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the PLIB towards the target
US6128414A (en) 1997-09-29 2000-10-03 Intermec Ip Corporation Non-linear image processing and automatic discriminating method and apparatus for images such as images of machine-readable symbols
US6000612A (en) 1997-10-10 1999-12-14 Metanetics Corporation Portable data collection device having optical character recognition
US6298175B1 (en) 1997-10-17 2001-10-02 Welch Allyn Data Collection, Inc. Object sensor system comprising controlled light source
US5949052A (en) 1997-10-17 1999-09-07 Welch Allyn, Inc. Object sensor system for stationary position optical reader
US6298176B2 (en) 1997-10-17 2001-10-02 Welch Allyn Data Collection, Inc. Symbol-controlled image data reading system
US6561428B2 (en) * 1997-10-17 2003-05-13 Hand Held Products, Inc. Imaging device having indicia-controlled image parsing mode
US5914476A (en) 1997-11-04 1999-06-22 Welch Allyn, Inc. Optical reader configured to accurately and rapidly read multiple symbols
US6669093B1 (en) 1997-12-19 2003-12-30 Telxon Corporation Hand-held dataform reader having multiple target area illumination sources for independent reading of superimposed dataforms
US5953534A (en) 1997-12-23 1999-09-14 University Of Washington Environment manipulation for executing modified executable and dynamically-loaded library files
US5969326A (en) 1998-01-14 1999-10-19 Intermec Ip Corp. Method and apparatus of autodiscriminating in symbol reader employing prioritized and updated table of symbologies
US6497368B1 (en) 1998-01-22 2002-12-24 Intermec Ip Corp. Portable data collection
US6123263A (en) 1998-01-29 2000-09-26 Meta Holdings Corporation Hand held dataform reader having strobing ultraviolet light illumination assembly for reading fluorescent dataforms
US6108100A (en) 1998-02-20 2000-08-22 Hewlett-Packard Company Apparatus and method for end-user performance upgrade
US6177957B1 (en) * 1998-02-26 2001-01-23 Flashpoint Technology, Inc. System and method for dynamically updating features in an electronic imaging device
US6119941A (en) 1998-05-04 2000-09-19 Intermec Ip Corp. Automated help instructions for automatically or adaptively configuring a hand-held device, such as a bar code reader or hand-held personal computer
US6685095B2 (en) * 1998-05-05 2004-02-03 Symagery Microsystems, Inc. Apparatus and method for decoding damaged optical codes
US6186404B1 (en) 1998-05-29 2001-02-13 Welch Allyn Data Collection, Inc. Security document voiding system
US6250551B1 (en) 1998-06-12 2001-06-26 Symbol Technologies, Inc. Autodiscrimination and line drawing techniques for code readers
US6340114B1 (en) * 1998-06-12 2002-01-22 Symbol Technologies, Inc. Imaging engine and method for code readers
US20040000592A1 (en) * 2002-02-20 2004-01-01 Welch Allyn, Inc. Adjustable illumination system for a barcode scanner
US20030209603A1 (en) 1998-07-08 2003-11-13 Welch Allyn Data Collection, Inc. Optical assembly for barcode scanner
US6659350B2 (en) 2000-11-01 2003-12-09 Hand Held Products Adjustable illumination system for a barcode scanner
US6275388B1 (en) 1998-07-08 2001-08-14 Welch Allyn Data Collection, Inc. Image sensor mounting system
USD459728S1 (en) 1998-07-08 2002-07-02 Hand Held Products, Inc. Optical reader device
US6164544A (en) * 1998-07-08 2000-12-26 Welch Allyn Data Collection, Inc. Adjustable illumination system for a barcode scanner
US6601768B2 (en) 2001-03-08 2003-08-05 Welch Allyn Data Collection, Inc. Imaging module for optical reader comprising refractive diffuser
US6607128B1 (en) 1998-07-08 2003-08-19 Welch Allyn Data Collection Inc. Optical assembly for barcode scanner
US6547139B1 (en) * 1998-07-10 2003-04-15 Welch Allyn Data Collection, Inc. Method and apparatus for extending operating range of bar code scanner
US6097856A (en) 1998-07-10 2000-08-01 Welch Allyn, Inc. Apparatus and method for reducing imaging errors in imaging systems having an extended depth of field
US6152371A (en) 1998-08-12 2000-11-28 Welch Allyn, Inc. Method and apparatus for decoding bar code symbols
US6045046A (en) 1998-08-27 2000-04-04 Ncr Corporation Full coverage barcode scanner
US6598797B2 (en) * 1998-09-11 2003-07-29 Jason J. Lee Focus and illumination analysis algorithm for imaging device
US6098887A (en) 1998-09-11 2000-08-08 Robotic Vision Systems, Inc. Optical focusing device and method
US6149063A (en) 1998-09-14 2000-11-21 Intermec Ip Corp. Method and apparatus for bar code association for wireless network
US6161760A (en) 1998-09-14 2000-12-19 Welch Allyn Data Collection, Inc. Multiple application multiterminal data collection network
US6336587B1 (en) * 1998-10-19 2002-01-08 Symbol Technologies, Inc. Optical code reader for producing video displays and measuring physical parameters of objects
US6264105B1 (en) 1998-11-05 2001-07-24 Welch Allyn Data Collection, Inc. Bar code reader configured to read fine print barcode symbols
US6575367B1 (en) 1998-11-05 2003-06-10 Welch Allyn Data Collection, Inc. Image data binarization methods enabling optical reader to read fine print indicia
US6109526A (en) 1998-11-17 2000-08-29 Intermec Ip Corp. Optical and passive electromagnetic reader for reading machine-readable symbols, such as bar codes, and reading wireless tags, such as radio frequency tags, and corresponding method
JP3592941B2 (en) * 1998-11-24 2004-11-24 株式会社東海理化電機製作所 Steering lock device
US6565003B1 (en) 1998-12-16 2003-05-20 Matsushita Electric Industrial Co., Ltd. Method for locating and reading a two-dimensional barcode
US20050274801A1 (en) * 1999-01-29 2005-12-15 Intermec Ip Corp. Method, apparatus and article for validating ADC devices, such as barcode, RFID and magnetic stripe readers
US6246642B1 (en) * 1999-04-13 2001-06-12 Hewlett-Packard Company Automated optical detection system and method
US6373579B1 (en) * 1999-05-26 2002-04-16 Hand Held Products, Inc. Portable measurement apparatus for determining the dimensions of an object and associated method
US6357659B1 (en) * 1999-06-03 2002-03-19 Psc Scanning, Inc. Hands free optical scanner trigger
JP2000349984A (en) * 1999-06-04 2000-12-15 Fujitsu Ltd Image reader and image processing unit
US6181760B1 (en) * 1999-07-20 2001-01-30 General Electric Company Electrochemical corrosion potential sensor with increased lifetime
US6352204B2 (en) * 1999-08-04 2002-03-05 Industrial Data Entry Automation Systems Incorporated Optical symbol scanner with low angle illumination
JP2003532944A (en) * 1999-10-04 2003-11-05 ウェルチ アリン データ コレクション インコーポレーテッド Imaging module for optical reader
US6832725B2 (en) 1999-10-04 2004-12-21 Hand Held Products, Inc. Optical reader comprising multiple color illumination
US6695209B1 (en) * 1999-10-04 2004-02-24 Psc Scanning, Inc. Triggerless optical reader with signal enhancement features
US6585159B1 (en) 1999-11-02 2003-07-01 Welch Allyn Data Collection, Inc. Indicia sensor system for optical reader
US6370003B1 (en) * 1999-11-30 2002-04-09 Welch Allyn Data Collection, Inc. Electrostatic charge resistant instrument system
US6831690B1 (en) 1999-12-07 2004-12-14 Symagery Microsystems, Inc. Electrical sensing apparatus and method utilizing an array of transducer elements
US6478223B1 (en) 2000-01-12 2002-11-12 Intermec Ip Corporation Machine-readable color symbology and method and apparatus for reading same with standard readers such as laser scanners
US6469289B1 (en) 2000-01-21 2002-10-22 Symagery Microsystems Inc. Ambient light detection technique for an imaging array
US6940998B2 (en) 2000-02-04 2005-09-06 Cernium, Inc. System for automated screening of security cameras
US7137555B2 (en) * 2000-02-28 2006-11-21 Psc Scanning, Inc. Multi-format bar code reader
US6912076B2 (en) 2000-03-17 2005-06-28 Accu-Sort Systems, Inc. Coplanar camera scanning system
US6628445B2 (en) 2000-03-17 2003-09-30 Accu-Sort Systems, Inc. Coplanar camera scanning system
US6489798B1 (en) 2000-03-30 2002-12-03 Symagery Microsystems Inc. Method and apparatus for testing image sensing circuit arrays
DE60135227D1 (en) * 2000-05-03 2008-09-18 Leonard Reiffel DOUBLE MODE DATA PRODUCT PICTURE
US6616046B1 (en) 2000-05-10 2003-09-09 Symbol Technologies, Inc. Techniques for miniaturizing bar code scanners including spiral springs and speckle noise reduction
US6899272B2 (en) * 2000-05-17 2005-05-31 Symbol Technologies, Inc. Bioptics bar code reader
US6637655B1 (en) * 2000-06-08 2003-10-28 Metrologic Instruments, Inc. Automatic range adjustment techniques for stand-mountable bar code scanners
US6345765B1 (en) * 2000-06-30 2002-02-12 Intermec Ip Corp. Spectral scanner employing light paths of multiple wavelengths for scanning objects, such as bar code symbols, and associated method
US6789157B1 (en) 2000-06-30 2004-09-07 Intel Corporation Plug-in equipped updateable firmware
US6689998B1 (en) * 2000-07-05 2004-02-10 Psc Scanning, Inc. Apparatus for optical distancing autofocus and imaging and method of using the same
USD442152S1 (en) * 2000-07-17 2001-05-15 Symagery Microsystems Inc. Multipurpose portable wireless video appliance
US7129961B1 (en) 2000-09-18 2006-10-31 Sony Corporation System and method for dynamic autocropping of images
US6775077B1 (en) * 2000-09-22 2004-08-10 Symbol Technologies, Inc. Micro reader scan engine with prism
US6947612B2 (en) 2000-09-29 2005-09-20 Hand Held Products, Inc. Methods and apparatus for image capture and decoding in a centralized processing unit
US7148923B2 (en) 2000-09-30 2006-12-12 Hand Held Products, Inc. Methods and apparatus for automatic exposure control
US7594609B2 (en) * 2003-11-13 2009-09-29 Metrologic Instruments, Inc. Automatic digital video image capture and processing system supporting image-processing based code symbol reading during a pass-through mode of system operation at a retail point of sale (POS) station
US7464877B2 (en) * 2003-11-13 2008-12-16 Metrologic Instruments, Inc. Digital imaging-based bar code symbol reading system employing image cropping pattern generator and automatic cropped image processor
US7034848B2 (en) * 2001-01-05 2006-04-25 Hewlett-Packard Development Company, L.P. System and method for automatically cropping graphical images
US6993169B2 (en) * 2001-01-11 2006-01-31 Trestle Corporation System and method for finding regions of interest for microscopic digital montage imaging
US6637658B2 (en) * 2001-01-22 2003-10-28 Welch Allyn, Inc. Optical reader having partial frame operating mode
US6390625B1 (en) * 2001-01-31 2002-05-21 Welch Allyn, Inc. Focusing mechanism
US7302462B2 (en) 2001-03-12 2007-11-27 Mercury Computer Systems, Inc. Framework and methods for dynamic execution of digital data processor resources
US6978038B2 (en) 2001-04-13 2005-12-20 The Code Corporation Systems and methods for pixel gain compensation in machine-readable graphical codes
USD467918S1 (en) 2001-04-24 2002-12-31 Hand Held Products, Inc. Data collection device
US6619547B2 (en) 2001-04-30 2003-09-16 The Code Corporation Image-based graphical code reader device with multi-functional optical element and converging laser targeting
US6899273B2 (en) 2001-05-02 2005-05-31 Hand Held Products, Inc. Optical reader comprising soft key including permanent graphic indicia
USD458265S1 (en) 2001-05-02 2002-06-04 Hand Held Products, Inc. Hand held optical reader
US6942151B2 (en) 2001-05-15 2005-09-13 Welch Allyn Data Collection, Inc. Optical reader having decoding and image capturing functionality
US7111787B2 (en) 2001-05-15 2006-09-26 Hand Held Products, Inc. Multimode image capturing and decoding optical reader
DE10126351A1 (en) * 2001-05-30 2002-12-12 Ccs Technology Inc Optical distribution device and fiber optic connection cable
US6685092B2 (en) * 2001-06-15 2004-02-03 Symbol Technologies, Inc. Molded imager optical package and miniaturized linear sensor-based code reading engines
US6766954B2 (en) * 2001-06-15 2004-07-27 Symbol Technologies, Inc. Omnidirectional linear sensor-based code reading engines
US6722569B2 (en) * 2001-07-13 2004-04-20 Welch Allyn Data Collection, Inc. Optical reader having a color imager
US7331523B2 (en) * 2001-07-13 2008-02-19 Hand Held Products, Inc. Adaptive optical image reader
US6834807B2 (en) 2001-07-13 2004-12-28 Hand Held Products, Inc. Optical reader having a color imager
US7225430B2 (en) 2001-07-26 2007-05-29 Landesk Software Limited Software code management method and apparatus
US6865742B1 (en) * 2001-08-16 2005-03-08 Cisco Technology, Inc. Run-time property-based linking of software modules
US6758403B1 (en) 2001-09-11 2004-07-06 Psc Scanning, Inc. System for editing data collection device message data
US6837431B2 (en) * 2002-04-09 2005-01-04 Symbol Technologies, Inc. Semiconductor device adapted for imaging bar code symbols
US7320075B2 (en) * 2001-11-20 2008-01-15 Safenet, Inc. Software protection method utilizing hidden application code in a protection dynamic link library object
US7069562B2 (en) 2001-12-12 2006-06-27 Sun Microsystems, Inc. Application programming interface for connecting a platform independent plug-in to a web browser
US7162102B2 (en) * 2001-12-19 2007-01-09 Eastman Kodak Company Method and system for compositing images to produce a cropped image
US7296748B2 (en) * 2002-01-11 2007-11-20 Metrologic Instruments, Inc. Bioptical laser scanning system providing 360° of omnidirectional bar code symbol scanning coverage at point of sale station
US7073178B2 (en) 2002-01-18 2006-07-04 Mobitv, Inc. Method and system of performing transactions using shared resources and different applications
US7055747B2 (en) 2002-06-11 2006-06-06 Hand Held Products, Inc. Long range optical reader
US6679848B2 (en) * 2002-03-07 2004-01-20 Koninklijke Philips Electronics N.V. Method for allowing plug-in architecture for digital echocardiography lab image processing applications
US6959865B2 (en) 2002-03-28 2005-11-01 Hand Held Products, Inc. Customizable optical reader
US7219843B2 (en) * 2002-06-04 2007-05-22 Hand Held Products, Inc. Optical reader having a plurality of imaging modules
US7086596B2 (en) 2003-01-09 2006-08-08 Hand Held Products, Inc. Decoder board for an optical reader utilizing a plurality of imaging formats
US6871993B2 (en) * 2002-07-01 2005-03-29 Accu-Sort Systems, Inc. Integrating LED illumination system for machine vision systems
US7240059B2 (en) 2002-11-14 2007-07-03 Seisint, Inc. System and method for configuring a parallel-processing database system
US7739693B2 (en) * 2002-11-25 2010-06-15 Sap Ag Generic application program interface for native drivers
US7245404B2 (en) 2002-12-17 2007-07-17 Hewlett-Packard Development Company, L.P. Dynamically programmable image capture appliance and system
US7025272B2 (en) * 2002-12-18 2006-04-11 Symbol Technologies, Inc. System and method for auto focusing an optical code reader
US7066388B2 (en) 2002-12-18 2006-06-27 Symbol Technologies, Inc. System and method for verifying RFID reads
US7195164B2 (en) * 2003-01-03 2007-03-27 Symbol Technologies, Inc. Optical code reading device having more than one imaging engine
US7097101B2 (en) * 2003-02-13 2006-08-29 Symbol Technologies, Inc. Interface for interfacing an imaging engine to an optical code reader
KR100544459B1 (en) 2003-02-21 2006-01-24 삼성전자주식회사 automatic maintenance executing apparatus and method of an office machine
US7090135B2 (en) * 2003-07-07 2006-08-15 Symbol Technologies, Inc. Imaging arrangement and barcode imager for imaging an optical code or target at a plurality of focal planes
US7222793B2 (en) * 2003-07-09 2007-05-29 Symbol Technologies, Inc. Arrangement and method of imaging one-dimensional and two-dimensional optical codes at a plurality of focal planes
US7044377B2 (en) * 2003-08-01 2006-05-16 Symbol Technologies Inc. Plug-and-play imaging and illumination engine for an optical code reader
US7021542B2 (en) * 2003-08-01 2006-04-04 Symbol Technologies, Inc. Imaging and illumination engine for an optical code reader
US7017816B2 (en) * 2003-09-30 2006-03-28 Hewlett-Packard Development Company, L.P. Extracting graphical bar codes from template-based documents
US7350201B2 (en) * 2003-10-23 2008-03-25 International Business Machines Corporation Software distribution application supporting operating system installations
EP1714231B1 (en) * 2004-01-23 2011-09-07 Intermec IP Corporation Autofocus barcode scanner and the like employing micro-fluidic lens
US7262783B2 (en) 2004-03-03 2007-08-28 Virtual Iris Studios, Inc. System for delivering and enabling interactivity with images
US7303126B2 (en) * 2004-03-18 2007-12-04 Symbol Technologies, Inc. System and method for sensing ambient light in an optical code reader
US7574709B2 (en) * 2004-04-30 2009-08-11 Microsoft Corporation VEX-virtual extension framework
US6974083B1 (en) * 2004-07-23 2005-12-13 Symbol Technologies, Inc. Point-of-transaction workstation for electro-optically reading one-dimensional indicia, including image capture of two-dimensional targets
US7097102B2 (en) * 2004-07-29 2006-08-29 Symbol Technologies, Inc. System and method for decoding optical codes read by an imager-based optical code reader
US7303131B2 (en) * 2004-07-30 2007-12-04 Symbol Technologies, Inc. Automatic focusing system for imaging-based bar code reader
US7083098B2 (en) * 2004-08-24 2006-08-01 Symbol Technologies, Inc. Motion detection in imaging reader
US7070099B2 (en) * 2004-09-30 2006-07-04 Symbol Technologies, Inc. Modular architecture for a data capture device
EP2420954B8 (en) 2004-12-01 2017-04-12 Datalogic USA, Inc. Data reader with automatic exposure adjustment and methods of operating a data reader
US7403707B2 (en) 2005-07-28 2008-07-22 Mitsubishi Electric Research Laboratories, Inc. Method for estimating camera settings adaptively
US7261238B1 (en) * 2006-05-26 2007-08-28 Symbol Technologies, Inc. Method of locating imaged bar codes for an imaging-based bar code reader
JP2008233726A (en) * 2007-03-23 2008-10-02 Konica Minolta Opto Inc Optical waveguide element, and optical module, and optical axis adjustment method thereof
US7667200B1 (en) * 2007-12-05 2010-02-23 Sandia Corporation Thermal microphotonic sensor and sensor array

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP1971952A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8844822B2 (en) 2003-11-13 2014-09-30 Metrologic Instruments, Inc. Image capture and processing system supporting a multi-tier modular software architecture
US9355288B2 (en) 2003-11-13 2016-05-31 Metrologic Instruments, Inc. Image capture and processing system supporting a multi-tier modular software architecture
US9785811B2 (en) 2003-11-13 2017-10-10 Metrologic Instruments, Inc. Image capture and processing system supporting a multi-tier modular software architecture
US9720671B2 (en) 2008-06-17 2017-08-01 Microsoft Technology Licensing, Llc Installation of customized applications

Also Published As

Publication number Publication date
US7543752B2 (en) 2009-06-09
US20070040035A1 (en) 2007-02-22
WO2007075519A3 (en) 2008-01-10
US20080048039A1 (en) 2008-02-28
US7845563B2 (en) 2010-12-07
US7559475B2 (en) 2009-07-14
US20100096461A1 (en) 2010-04-22
US7770798B2 (en) 2010-08-10
US20070295813A1 (en) 2007-12-27
US7854384B2 (en) 2010-12-21
US20070187509A1 (en) 2007-08-16
US20080041957A1 (en) 2008-02-21
US20080041960A1 (en) 2008-02-21
US20070181689A1 (en) 2007-08-09
US20070187510A1 (en) 2007-08-16
US20070199998A1 (en) 2007-08-30
US7546951B2 (en) 2009-06-16
US7950583B2 (en) 2011-05-31
US7735737B2 (en) 2010-06-15
US7494063B2 (en) 2009-02-24
US20070194124A1 (en) 2007-08-23
US7712666B2 (en) 2010-05-11
US7789309B2 (en) 2010-09-07
US7487917B2 (en) 2009-02-10
US20070228175A1 (en) 2007-10-04
US7464877B2 (en) 2008-12-16
US20070199995A1 (en) 2007-08-30
US7637432B2 (en) 2009-12-29
EP1971952A4 (en) 2011-09-21
US20070194122A1 (en) 2007-08-23
US7575167B2 (en) 2009-08-18
US20080061144A1 (en) 2008-03-13
US7540425B2 (en) 2009-06-02
US20090052807A1 (en) 2009-02-26
US20070199993A1 (en) 2007-08-30
US7484666B2 (en) 2009-02-03
EP1971952A2 (en) 2008-09-24
US7654461B2 (en) 2010-02-02
US20080048038A1 (en) 2008-02-28

Similar Documents

Publication Publication Date Title
US7708205B2 (en) Digital image capture and processing system employing multi-layer software-based system architecture permitting modification and/or extension of system features and functions by way of third party code plug-ins
US7484666B2 (en) Automatic digital-imaging based bar code symbol reading system supporting pass-through and presentation modes of system operation using automatic object direction detection and illumination control, and video image capture and processing techniques
US7607581B2 (en) Digital imaging-based code symbol reading system permitting modification of system features and functionalities

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2006845674

Country of ref document: EP