WO2000026640A1 - Electronics assembly apparatus with improved imaging system - Google Patents

Electronics assembly apparatus with improved imaging system

Info

Publication number
WO2000026640A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
sensor
image
machine
head
Prior art date
Application number
PCT/US1999/026186
Other languages
French (fr)
Inventor
Steven K. Case
Timothy A. Skunes
John P. Konicek
Original Assignee
Cyberoptics Corporation
Priority date
Filing date
Publication date
Application filed by Cyberoptics Corporation filed Critical Cyberoptics Corporation
Priority to JP2000579970A priority Critical patent/JP2002529907A/en
Priority to DE19982498T priority patent/DE19982498T1/en
Priority to KR1020007007461A priority patent/KR20010040321A/en
Priority to GB0014999A priority patent/GB2347741A/en
Publication of WO2000026640A1 publication Critical patent/WO2000026640A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/024Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of diode-array scanning
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/081Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • H05K13/0812Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines the monitoring devices being integrated in the mounting machine, e.g. for monitoring components, leads, component placement
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L2224/00Indexing scheme for arrangements for connecting or disconnecting semiconductor or solid-state bodies and methods related thereto as covered by H01L24/00
    • H01L2224/74Apparatus for manufacturing arrangements for connecting or disconnecting semiconductor or solid-state bodies and for methods related thereto
    • H01L2224/78Apparatus for connecting with wire connectors
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L24/00Arrangements for connecting or disconnecting semiconductor or solid-state bodies; Methods or apparatus related thereto
    • H01L24/74Apparatus for manufacturing arrangements for connecting or disconnecting semiconductor or solid-state bodies
    • H01L24/78Apparatus for connecting with wire connectors
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L2924/00Indexing scheme for arrangements or methods for connecting or disconnecting semiconductor or solid-state bodies as covered by H01L24/00
    • H01L2924/0001Technical content checked by a classifier
    • H01L2924/00014Technical content checked by a classifier the subject-matter covered by the group, the symbol of which is combined with the symbol of this group, being disclosed without further technical details
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L2924/00Indexing scheme for arrangements or methods for connecting or disconnecting semiconductor or solid-state bodies as covered by H01L24/00
    • H01L2924/10Details of semiconductor or other solid state devices to be connected
    • H01L2924/11Device type
    • H01L2924/12Passive devices, e.g. 2 terminal devices
    • H01L2924/1204Optical Diode
    • H01L2924/12041LED
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L2924/00Indexing scheme for arrangements or methods for connecting or disconnecting semiconductor or solid-state bodies as covered by H01L24/00
    • H01L2924/10Details of semiconductor or other solid state devices to be connected
    • H01L2924/11Device type
    • H01L2924/14Integrated circuits

Definitions

  • the present invention relates generally to the automated electronic assembly industry. More specifically, the present invention relates to automated electronic assembly machines such as wire bonders, screen printers, and pick and place machines having an improved imaging system.
  • Fig. 1 is a perspective view of prior art pick and place machine 2 having placement head 4 which picks up electronic components 8 of various sizes.
  • a standard machine vision camera 10 views components 8, and associated electronics (not shown) in pick and place machine 2 compute X, Y and θ orientation of component 8 relative to a work piece such as printed circuit board 12.
  • Area detector 18 is shown in dotted lines within camera 10 and is optically coupled to lens system 20.
  • Although pick and place machine 2 includes a single placement head 4, pick and place machines having additional placement heads, such as a split-gantry design, are well known. Head 4 is able to move component 8 in the X, Y, Z and θ axes by employing suitable motors and motor control electronics (not shown).
  • the machine vision system used by pick and place machine 2 is known as an off-head design because camera 10 is mounted in a fixed position relative to head 4.
  • head 4 releasably picks up a component, and transports the component to a position directly above camera 10.
  • Camera 10 then images the component to determine the current orientation of the component in the X, Y and θ directions such that the orientation can be corrected and the component correctly placed.
  • requiring head 4 to transport each component 8 to camera 10 for imaging significantly adds to total component placement time.
  • Fig. 2 is a perspective view of camera 34 having linear detector 30 and lens system 32.
  • Camera 34 performs a similar function to that of camera 10 shown in Fig. 1, at lower cost.
  • Pick and place machine 2 can employ camera 34 to receive an image of higher resolution and larger field of view than that of camera 10.
  • Camera 34 is stationary, just as camera 10, and thus is an off-head camera.
  • the field of view of any image generated by camera 34 is only limited by the travel of the associated head, the length of the linear detector, and the size of the data file generated by the sum of each line scan image (as will be described in greater detail later in the Specification). This is in contrast to the field of view provided by camera 10, which is fixed in size.
  • camera 34 is adapted to provide a video output as a component is scanned past camera 34.
  • Camera 34 essentially images a number of individual linear portions of the component as the component is passed relative to camera 34.
  • Fig. 3 is an elevation view of a portion of pick and place machine 40 having an on-head sensor 44 that includes area detector 42.
  • Sensor 44 includes lens assembly 46, mirror 48, and illuminator 50.
  • sensor 44 moves with multi-nozzle head 52 and moves relative to components 8 as indicated by arrow 41.
  • One of nozzles 54 releasably holds component 8.
  • Sensor 44 is moved under components 8 to sense their respective orientations such that the respective orientations can be adjusted prior to mounting the respective components onto a work piece.
  • Head 52 moves independently of sensor 44 thus allowing sensor 44 to be retracted prior to placing a component 8 on a work piece.
  • An on-head machine vision system such as sensor 44 allows component scanning to be effected while head 52 is transporting components 8 to their respective positions on the workpiece.
  • One on-head prior art line scan camera uses a light source positioned below the component, with two linear arrays located above the component, shining light through leads of the component at an angle. The two shadows are imaged by the linear arrays, but the system is unable to image the bottom of a component because it is backlit.
  • Another prior art line scan camera system is also backlit and has a camera positioned below the component. However, it is less than optimal because it lacks optics for focusing the shadow of the component onto the detector. Such a system may be useful for rough orientation of components, but is unable to inspect a fine pitch component (e.g., less than .3 mm pitch center to center between the leads).
  • the improved imaging system should be adapted to be integrated into the moving head and adaptable for use with a variety of video processing systems.
  • more complex components mandate the use of imaging systems with a large field of view and high resolution, including a computer controlled illumination system to accommodate a large variety of components, at the same or reduced cost compared to presently available systems.
  • a pick and place machine and its imaging system are provided.
  • the imaging system is movable with a head of the pick and place machine and includes a linear detector which is adapted to move proximate the component to thereby scan the component.
  • the imaging system provides a video output that is used to calculate an orientation of the component and to adjust the orientation for suitable mounting upon a workpiece such as a printed circuit board.
  • the imaging system is also useful in additional electronics assembly devices such as screen printers and wire bonders.
  • Methods of mounting a component to a workpiece are also provided.
  • a linear detector with imaging optics is passed proximate the component to image the component.
  • Component orientation is then computed and corrected such that the component can be suitably mounted upon the work piece.
  • Fig. 1 is a perspective view of a prior art pick and place machine.
  • Fig. 2 is a perspective view of a prior art linescan camera.
  • Fig. 3 is a cutaway elevation view of a prior art on-head area camera.
  • Fig. 4 is a top plan view of a pick and place machine.
  • Fig. 5 is an elevation view of a placement head in accordance with an embodiment of the present invention.
  • Fig. 6 is a side elevation view of a portion of a placement head in accordance with an embodiment of the present invention.
  • Fig. 7 is a rear elevation view of a portion of a placement head in accordance with an embodiment of the present invention.
  • Fig. 8 is a top plan view of a portion of a placement head in accordance with an embodiment of the present invention.
  • Figs. 9a and 9b are diagrammatic views illustrating nozzle spacing in a placement head.
  • Fig. 10 is a top plan view of a placement head in accordance with another embodiment of the present invention.
  • Fig. 11 is a flowchart of a method of picking and placing components in accordance with an embodiment of the present invention.
  • Fig. 12 is a timing diagram of component placement in accordance with an embodiment of the present invention.
  • Fig. 13 is a diagrammatic view of a portion of a pick and place machine in accordance with an embodiment of the present invention.
  • Fig. 14 is a system block diagram of a portion of a pick and place machine in accordance with an embodiment of the present invention.
  • Figs. 15a - 15j are diagrammatic views of a linear detector and methods of effecting photoelement exposure control and image adjustment.
  • Figs. 16a - 16c are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
  • Figs. 17a - 17c are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
  • Figs. 18a and 18b are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
  • Figs. 19a and 19b are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
  • Figs. 20a - 20d are diagrammatic views of illuminators in accordance with another embodiment of the present invention.
  • Fig. 21 is a diagrammatic view of a portion of a pick and place machine in accordance with another embodiment of the present invention.
  • Figs. 22a - 22q are images of components acquired in accordance with embodiments of the present invention.
  • Figs. 23a and 23b show a gradient index lens array imaging system.
  • Fig. 24 is a cutaway elevation view of a sensor in accordance with another embodiment of the present invention.
  • Fig. 25 is a chart of measurement times for various component types.
  • Figs. 26a - 26c are sensor velocity profiles in accordance with embodiments of the present invention.
  • Figs. 27a - 27h are diagrammatic views of window outputs in accordance with embodiments of the present invention.
  • Fig. 27i is a diagrammatic view of an embodiment of the present invention for viewing large components.
  • Fig. 28 is a perspective view of a prior art wire bonder.
  • Fig. 29 is a top plan view of a wire bonder in accordance with an embodiment of the present invention.
  • Fig. 30 is a perspective view of a prior art screen printer.
  • Fig. 31 is a diagrammatic view of a portion of a screen printer in accordance with an embodiment of the present invention.
  • Fig. 4 is a top plan view of pick and place machine 50 in accordance with an embodiment of the invention. Although much of the present invention will be described with respect to pick and place machine 50, other forms of pick and place machines, such as a split gantry design, are useful with embodiments of the present invention. Additionally, although embodiments of the present invention will be described with respect to pick and place machines, some embodiments of the present invention include an imaging system provided with a wire bonder or screen printer, as will be described with respect to Figs. 29 and 31. As shown in Fig. 4, machine 50 includes transport mechanism 52 that is adapted to transport a workpiece such as a printed circuit board. Transport mechanism 52 includes mounting section 54 and conveyor 56.
  • Transport mechanism 52 is disposed on base 58 such that a workpiece is carried to mounting section 54 by conveyor 56.
  • Component reservoirs 60 are disposed on either side of transport mechanism 52 and supply electronic components. Reservoirs 60 can be any suitable device adapted to provide electronic components, such as a tape feeder.
  • Pick and place machine 50 includes head 62 disposed above base 58.
  • Head 62 is movable between either of component reservoirs 60 and mounting section 54.
  • head supports 64 are movable on rails 66 thereby allowing head 62 to move in the Y direction over base 58. Movement of head 62 in the Y direction occurs when motor 70, in response to a motor actuation signal, rotates ball screw 72 which engages one of head supports 64 to thereby displace the support 64 in the Y direction.
  • Head 62 is also supported upon rail 68 to allow head movement in the X direction relative to base 58. Movement of head 62 in the X direction occurs when motor 74, in response to a motor actuation signal, rotates ball screw 76 which engages head 62 and displaces head 62 in the X direction.
  • head 62 includes body 78, nozzle mount 80, nozzles 82 and sensor 84.
  • Nozzle mount 80 is disposed within body 78 and mounts each of nozzles 82 within body 78.
  • Each of nozzles 82 is movable in the Z direction (up/down) and is rotatable about the Z axis by suitable actuation members, such as servo motors.
  • Sensor 84 is adapted to move in the X direction relative to nozzles 82 to acquire images of components held by nozzles 82.
  • Sensor 84 is coupled to image processor 86.
  • Image processor 86 receives video data from sensor 84 based upon images of components held by nozzles 82.
  • Image processor 86 is adapted through hardware, software, or a combination of both, to calculate respective component orientations of each of the components held by the respective nozzles 82. Image processor 86 then sends suitable orientation information to a controller (not shown) such that each of nozzles 82 is successively displaced to properly mount its respective component upon the workpiece.
  • pick and place machine 50 can include a singular nozzle to practice embodiments of the present invention.
  • Fig. 5 is an elevation view of head 62 in accordance with one embodiment of the present invention.
  • Head 62 includes motor 88 operably coupled to a ball screw (not shown) through belt 89.
  • the ball screw is operably coupled to sensor 84 such that energization of motor 88 causes sensor 84 to move in the X axis direction relative to nozzles 82.
  • Sensor 84 can be adapted to image components coupled to nozzles 82 while scanning in either X axis direction. When such bi-directional scanning is employed, it is useful to provide image processing software that corrects for the fact that the data from scans of opposite directions are essentially flipped around from one another. Additionally, in some bidirectional scanning embodiments, sensor 84 can store the various scanned lines in temporary memory and then send them to the image processing section in correct order.
  • Fig. 6 is an elevation view of head 87 in accordance with an embodiment of the invention.
  • Head 87 includes plate 90 to which nozzles 82 and linear stage 92 are mounted.
  • Sensor 84 is coupled to linear stage 92 via bracket 94 such that sensor 84 is moveable relative to nozzles 82 and thus component 96.
  • sensor 84 is moveable in the X axis direction relative to components 96.
  • Fig. 8 is a top plan view of head 87 in accordance with an embodiment of the present invention. For clarity, only four nozzles 82 are shown in Fig. 8; however, any appropriate number of nozzles, including one nozzle, can be used.
  • head 87 is movable in X and Y axis directions.
  • sensor 84 is movable in the X axis direction with respect to nozzles 82 via its coupling to linear stage 92.
  • Sensor 84 includes detector window 102 which allows a line of sight between a linear detector (not shown) disposed within sensor 84 and a portion of a component held by one of nozzles 82. The line of sight is preferably parallel to the axis of nozzles 82.
  • Figs. 9a and 9b illustrate preferred dimensions in relation to the number of nozzles 82 employed on head 87.
  • Fig. 9a indicates that for a four nozzle head, a scan length, Wx, of 200mm is preferred.
  • the maximum scannable length, Wx, preferably equals 200mm.
  • total sensor travel should be equal to or greater than the maximum scan length plus the sensor width, Ws, so that the sensor is clear of the components while they are being picked up and placed.
  • the number of nozzles employed by head 87 is related to the maximum anticipated size of components that will be placed on the workpiece.
  • scan length, Wx, is preferably the same as that of Fig. 9a.
  • the scan width, WY (which corresponds to one of the dimensions of the linear detector's field of view perpendicular to the scan direction), is preferably equal to about 26mm.
  • longer linear detectors can be used to increase the scan width to 35mm or more. However, as the size of the linear detector is increased, costs rise and sensor scan speed may be adversely affected.
  • Fig. 10 is a top plan view of placement head 104 for a pick and place machine in accordance with another embodiment of the present invention.
  • Head 104 bears many similarities to head 87, and like components are numbered similarly.
  • head 104 includes body 7 and one or more nozzles 82.
  • Sensor 106 is moveable relative to nozzles 82 since sensor 106 is coupled to motor 88 via ball screw 108.
  • Motor 88 also includes encoder 110 that provides a feedback signal indicative of rotational displacement of ball screw 108 and thus axial displacement of sensor 106 in the X direction.
  • sensor 106 includes a detector window 112 that is perpendicular to a longitudinal axis 114 of sensor 106.
  • Detector window 112 can be positioned anywhere on sensor 106. It is understood that the linear detector of the present invention may be placed virtually anywhere in the sensor housing, since the invention includes the use of mirrors and the like to fold the optical path. Thus, if sensor 106 is adapted to scan components in a single direction (for example, while moving to the right), then window 112 can be disposed proximate a leading edge of sensor 106 such that components are scanned more quickly. In embodiments where sensor 106 is adapted to scan components in either direction (left and right), window 112 is preferably centered upon sensor 106.
  • Fig. 11 is a flowchart of a method of picking and placing n components upon a workpiece in accordance with the present invention.
  • n components are picked up by a pick and place machine head, such as head 87.
  • blocks 122 and 124 are initiated.
  • a linescan camera begins moving relative to the components as indicated by block 122 and the head begins traveling to the approximate position or site on the workpiece where the first component will be mounted.
  • blocks 122 and 124 are executed substantially simultaneously.
  • counter P is initialized to equal 1.
  • Counter P is used to track which component coordinates are being computed, as will be described in greater detail with respect to the rest of Fig. 11.
  • blocks 126, 128, and 132 preferably begin execution.
  • blocks 126, 128 and 132 execute while the head is transporting components to the approximate placement site. Although such blocks are illustrated and described as executing at least partially in parallel, it is contemplated that such blocks can execute sequentially.
  • the linescan camera passes all n components and collects video data based upon the components.
  • the linescan video data is corrected for non-uniformities. Such non-uniformities may be due to changes in sensor scan speed that occur while scanning is performed. Such correction will be described in greater detail later in the specification.
  • At block 132, X, Y and θ offset adjustments for component cp are computed.
  • the computed offset adjustments are then used in block 134 to calculate final part placement coordinate endpoints for component cp.
  • counter P is incremented as indicated in block 136.
  • the machine checks to determine whether the incremented counter (P) exceeds the number of components (n) picked up in block 120, as indicated at block 138.
  • part cp is placed as indicated at block 137.
  • the machine checks to determine whether cp is the last component. If component cp is not the last component, control returns to block 124 and the head begins moving to the approximate placement site of the next component. However, if all n components have been placed, then control returns to block 120 and an additional n components are picked up and the method repeats. Preferably, the various steps of placing parts occur while component offset adjustments are calculated.
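  • As a rough illustration of the Fig. 11 flow just described, the following Python sketch overlaps scanning, offset computation, and placement with head travel. It is a minimal sketch only; every name on the hypothetical machine object (pick_components, start_linescan, compute_offsets, and so on) is an illustrative placeholder, not an API from this disclosure.

```python
def pick_and_place_cycle(machine, n):
    """Sketch of the Fig. 11 method: pick n components, then overlap
    scanning, offset computation, and placement while the head travels."""
    components = machine.pick_components(n)               # block 120
    machine.start_linescan()                              # block 122
    machine.move_head_to_site(components[0])              # block 124 (parallel with 122)
    for p, component in enumerate(components):            # counter P
        video = machine.collect_linescan_video(component)       # block 126
        video = machine.correct_nonuniformities(video)          # block 128
        dx, dy, dtheta = machine.compute_offsets(video)         # block 132
        endpoint = machine.final_coordinates(component, dx, dy, dtheta)  # block 134
        machine.place(component, endpoint)                # block 137
        if p + 1 < n:                                     # blocks 136 and 138
            machine.move_head_to_site(components[p + 1])  # return to block 124
```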
  • Fig. 12 is an example scan timing chart for a pick and place machine having four nozzles in accordance with an embodiment of the present invention.
  • the vertical lines in Fig. 12 indicate specific time intervals.
  • nozzle scanning requires three time intervals for completion.
  • nozzle scanning which begins at time t0 will finish at time t3.
  • images of the component held by nozzle #1 begin to be transferred at time t1.
  • At time t2, while the nozzle is still being scanned, and while the image is still being transferred, image processing begins.
  • Fig. 13 is a diagrammatic view of sensor 106 as it scans a portion of component 96 held by nozzle 82.
  • Sensor 106 is operably coupled to motor 88 via ballscrew 108.
  • Motor 88 is operably coupled to encoder 110 which provides an indication of rotary displacement of ballscrew 108 and thus axial displacement of sensor 106 along the X axis.
  • a linear glass scale type encoder could be substituted for encoder 110.
  • Linear detector 150 is preferably a charge coupled device (CCD) comprising a number of photoelements (pixels) arranged in a line. Preferably, the size of each pixel is approximately 14 microns square.
  • Detector 150 is preferably manufactured by Dalsa Inc., of Waterloo, Ontario, and is model no. IL-CC-2048, although other types of linear detectors may be used in the present invention.
  • Linear detector 150 is optically coupled to a portion of leads 154 through imaging optics 156 and detector window 158. Imaging optics 156 can include lens system 160 and partial mirror 162.
  • sensor 106 also includes one or more illuminators.
  • the embodiment shown in Fig. 13 includes darkfield illuminators 164, diffuse illuminators 166, and brightfield illuminator 168.
  • darkfield illumination is intended to mean illumination which impinges upon the component at a high angle of incidence.
  • Diffuse illumination is intended to mean illumination impinging upon the component at a lesser degree of incidence.
  • Brightfield illumination is intended to mean illumination which impinges upon the component at a substantially zero incidence angle.
  • brightfield illumination can also be considered specular or through-the-lens illumination.
  • Additional types of illumination such as backlighting, can also be provided in the pick and place machine by disposing an appropriate source behind component 96 relative to sensor 106, as will be described with respect to Figs. 20a - 20d, and Fig. 21.
  • In operation, sensor 106 is moved along the X-axis with respect to component 96. While in motion, sensor 106 acquires individual linear images of portions of component 96. By storing multiple linear images and correlating the individual images with sensor location information provided by encoder 110, an image of component 96 can be constructed.
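  • A minimal sketch of this image-assembly step, assuming each linear image arrives as a row of pixel values paired with the encoder-reported X position of sensor 106 (the helper name and array layout are assumptions for illustration):

```python
import numpy as np

def assemble_image(lines, x_positions):
    """Stack individual linear images into a 2D component image, ordered by
    the sensor X location (cf. encoder 110) at which each line was taken."""
    order = np.argsort(x_positions)  # the encoder ties each line to a position
    return np.vstack([lines[i] for i in order])
```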
  • Illumination emanating from any of darkfield illuminators 164, diffuse illuminators 166 or brightfield illuminator 168 is reflected by a portion of component 96 proximate detector window 158.
  • the reflected illumination is redirected by partial mirror 162 through lens system 160, and thereby focused upon linear detector 150.
  • Each individual pixel of linear detector 150 is a charge coupled device element which provides a representation of the sum of illumination falling upon the pixel during an integration period.
  • Lens system 160 can be any suitable optical device capable of focusing an object line upon linear detector 150.
  • lens system 160 can be a refractive lens system or a diffractive lens system.
  • a refractive lens system can preferably include a gradient index (GRIN) lens array, available from NSG America, Inc., of Somerset NJ, or a traditional refractive lens system.
  • a diffractive lens system can include a holographic lens array.
  • Sensor 106 is coupled to sensor controller 170 of host 172.
  • Sensor controller 170 can receive and store each individual image line in a frame buffer, and provide suitable signals to sensor 106 to control the intensity of any of illuminators 164, 166, and 168 as well as pixel exposure control (as will be explained later in the specification). Since host 172 is coupled to encoder 110, sensor controller 170 can provide illumination intensity signals to any of the illuminators based upon position of sensor 106 along the X-axis or based upon the scan speed of sensor 106 along the X-axis. Host 172 also includes motion controller 174 that is coupled to motor 88, nozzle motor 176 and a nozzle encoder (not shown) .
  • host 172 acquires an image of component 96 from linear detector 150 as sensor 106 is moved in the X direction relative to component 96.
  • Host 172 is adapted through suitable software, hardware, or both, to compute a current orientation of component 96 in X-axis, Y-axis, and θ directions. Based upon the computed orientation, host 172 causes motion controller 174 to issue suitable motion commands to motors 70, 74 (shown in Fig. 4) and nozzle motor 176 to cause nozzle 82 to deposit component 96 in a desired component position and orientation on the workpiece.
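  • The disclosure does not prescribe a particular orientation algorithm, so the following is only one plausible sketch: thresholding the assembled image and using first and second image moments to estimate the X, Y and θ of the held component.

```python
import numpy as np

def component_orientation(image, threshold=128):
    """Estimate (x, y, theta) from an assembled component image via image
    moments; an illustrative method, not the one mandated by the patent."""
    img = np.asarray(image)
    ys, xs = np.nonzero(img > threshold)     # pixels belonging to the component
    x_c, y_c = xs.mean(), ys.mean()          # centroid gives the X, Y offset
    mu20 = ((xs - x_c) ** 2).mean()
    mu02 = ((ys - y_c) ** 2).mean()
    mu11 = ((xs - x_c) * (ys - y_c)).mean()
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)  # principal-axis angle
    return x_c, y_c, theta
```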
  • Motion controller 174 is adapted to vary scan speed based upon any number of criteria which will be explained in greater detail later in the specification.
  • Fig. 14 is a system block diagram of another embodiment of the present invention.
  • Fig. 14 illustrates sensor head 700 mechanically coupled to a sensor motion system 702, which provides the unidirectional motor drive for head 700.
  • Sensor head 700 moves with a generic component head 708 in at least one direction so as to form an "on-head" system.
  • sensor head 700, sensor motion system 702, controller 706 and component head 708 form a closed control loop 709 (not shown).
  • the host processor sends a desired placement signal to the component head motor drive.
  • the component head motor drive starts to move the component head to the nominal placement location.
  • the combination of the sensor head and sensor motor drive scan the component and output partial images to allow the formatter 714 to form an assembled image of the component, and then video processor 728 processes the assembled image to compute an x, y, and θ orientation of the component.
  • the video processor sends this x, y, and θ orientation to the host processor in the pick and place machine, which computes a correction signal that is provided to the placement head to properly orient and place the component.
  • a host processor 712 within the controller 706 of a pick and place machine sends a desired placement location via bus 720 and bus 710 to component head 708.
  • Sensor head 700 scans across the component, collecting a plurality of images of the component each representative of a portion of the component. Typically, as many as 2000 partial images can be taken to view an entire 25 millimeter x 25 millimeter component. If the scan includes, for example, four components, 8000 partial images may be needed to compute the necessary orientation data.
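  • For scale (using the nominal 14 micron line spacing described later in the specification), the partial-image count is simply the scan distance divided by the line spacing: 25mm / 14 microns ≈ 1,800 lines per component, consistent with the "as many as 2000 partial images" figure above, and 4 × 2000 = 8000 partial images for a four-component scan.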
  • Video formatter 714 receives outputs from detector read-out block 716 in sensor head 700, via bus 718.
  • the function of formatter 714 is preferably carried out in a separate electronic chip other than processor 712, but both functions may be embodied in the same component.
  • Video formatter 714 assembles the partial images from detector read-out block 716 to form an assembled image. Additionally, formatter 714 optionally performs windowing of specific areas of the assembled images (e.g., corners), performs magnification of specific areas, and also may provide non-uniformity correction of the assembled images, where one of the dimensions of the image is disproportionately modified with respect to other dimensions of the assembled image, due to non-uniform spacing of partial images in time or space.
  • Non-uniformity corrections may be necessary in order to provide a 1:1 x:y representation of the scanned images to the video processor, since a 1:1 x:y representation of the scanned images is only available when the scan speed is set so that each scan line covers the width of a pixel (typically 14 microns).
  • Many video processors are incompatible with non 1:1 x:y representations of scanned images, but some are able to receive 8:1 x:y representation of scanned images, for example.
  • the present invention includes decimating the scanned image to make it compatible with the video processor.
  • the non-uniformity in the scanned images results from variable scan speeds and exposure periods, as detailed in the discussion of Figs. 15a - 15j.
  • the non-uniformity correction and decimation are performed by video formatter 714 in real time as each partial image is received and before each partial image is stored.
  • video formatter 714 is also adaptable to correct for pixel to pixel variations in gain and offset.
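  • As a sketch of the decimation idea (the ratio and orientation below are assumptions for illustration): the formatter can reduce the line count along the scan axis by an integer factor so that the image's x:y aspect matches what the downstream video processor accepts, for example an 8:1 representation.

```python
import numpy as np

def decimate_scan(assembled_image, ratio=8):
    """Keep every ratio-th scan line so the assembled image matches the x:y
    aspect the downstream video processor accepts (cf. formatter 714)."""
    return np.asarray(assembled_image)[::ratio, :]
```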
  • Internal bus 720 connects formatter 714, processor 712, operator interface and display 722, placement head motion control system 724 and video processor 728.
  • the host processor also provides various timing signals for proper operation of the controller 706. For instance, when the pick and place machine has more than one component head, the host processor includes suitable collision avoidance functionality to prevent collisions between the various component heads and also to prevent collisions between the sensor and nozzles.
  • Operator interface and display 722 allows the operator of the pick and place machine to program specific movements and associated timing of scans, as well as overall operation and diagnostics of the pick and place machine.
  • a video display of the assembled image is also displayed for the operator. Such display is especially useful for displaying the windowing, magnification and non-uniformity correction options mentioned above.
  • Placement head motion control system 724 includes a set of x, y, z, and θ motors for moving component head 708.
  • Video processor 728 can be a microprocessor such as an Intel Pentium® processor and is preferably included in every embodiment of the present invention.
  • Sensor head 700 includes detector 704.
  • the detector/illuminator control electronics 730 provides control signals for detector 704, depending on the type of operation desired.
  • An illuminator 732 physically resides in sensor head 700, and illuminator control electronics 730 also control the operation of illuminator 732 to provide illumination from one or a combination of brightfield, backlit, and darkfield illuminators (with individually addressable LEDs) for rounded objects, and a source for diffuse illumination.
  • control electronics 736 within sensor motion system 702 provide timing and position instructions to head 700.
  • a control loop is formed by control electronics 736, which send out instructions representative of the desired position for head 700, motor/encoders 734, and head 700.
  • the time constant of this control loop should be less than that of the other control loop 709, since the line scan sensor should scan faster than the time required to place the component.
  • Fig. 15a is a diagrammatic view of linear detector 150.
  • Linear detector 150 includes pixel reset drain 180, photoelements 182, and CCD readout shift register 184.
  • Each of photoelements 182 is preferably sized to be approximately 14 micrometers x 14 micrometers. However, any appropriately sized photoelements can be used.
  • photoelements 182 are exposed to an image and each photoelement 182 accumulates charge related to the amount of light that falls upon the individual photoelement during the integration period.
  • the integration period thus corresponds to the amount of time that the photoelement is exposed to the image. Modifying the integration period has a direct impact on photoelement exposure control. When sensor scan speed is constant, the integration period is generally centered about fixed intervals along the scan direction.
  • For each line scanned, the amount of charge accumulated from a given photoelement is transferred to CCD readout shift register 184. Individual photoelement values are then read from the CCD readout shift register sequentially. Additionally, although one CCD readout shift register tap is illustrated in Fig. 15a, any appropriate number of CCD readout shift register taps can be used to simultaneously read out data from different portions of the CCD readout shift register. Moreover, although Fig. 15a shows linear detector 150 having a singular CCD readout shift register 184, multiple CCD readout registers, or multiple taps, can be used. Multiple CCD readout shift register taps allow different portions of the CCD array to be read simultaneously. In another embodiment, other types of linear arrays of conventional design are acceptable, such as those which provide video data at a rate of 40MHz.
  • Fig. 15b shows varying exposure lengths, 3a - 3g, where the commencement of each exposure length repeats at fixed intervals. For example, exposure lengths 3a - 3g each start 5 tick marks after the start of the previous exposure length.
  • Fig. 15c shows varying exposure periods, 4a - 4g, where each exposure period corresponds to each of exposure lengths 3a - 3g, respectively. Note that while the widths of the exposure lengths 3a - 3g vary, the widths of the exposure periods, 4a - 4g, do not. This method is relatively easy to implement; however, it creates some image distortion because the center to center spacing of the exposure lengths is not constant.
  • Another way to compensate for scan speed variations is illustrated in Figs. 15d - 15g.
  • In Fig. 15d, successive exposure lengths 5a - 5g are shown.
  • Each exposure length 5a - 5g commences at fixed intervals, and each exposure length ends at fixed intervals.
  • Fig. 15e shows exposure periods 6a - 6g, which correspond to exposure lengths 5a - 5g, respectively.
  • Because the scan speed varies, however, exposure periods 6a - 6g commence and end at no particular fixed interval.
  • Although this method ensures that the center to center spacing between exposure lengths is constant (allowing for simpler video data processing), the effective gain of each line of video data (a function of the instantaneous velocity of the camera during the exposure time of each partial image) is not constant.
  • Image 7h comprises video lines 7a - 7g, which are the result of exposure lengths 5a - 5g in Fig. 15d and the corresponding exposure periods 6a - 6g in Fig. 15e.
  • the assembled scanned image 7h displayed in Fig. 15f can be improved dramatically by correcting the effective gain of each line of video data. This is done by normalizing all the pixel values for a partial image by the exposure period associated with such image. For example, if the nominal exposure period, T0, gives a full-scale output for a highly reflective surface, then each pixel in a partial image can be scaled by:
  • Pn,i = Pm,i × (T0 / Ta)
  • where Pn,i is the normalized pixel value for the ith pixel along such image, Pm,i is the measured pixel value for the ith pixel along the line, and Ta is the measured exposure period for such image.
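  • A minimal sketch of the scaling just given, with T0 the nominal exposure period and Ta the measured exposure period for the line:

```python
import numpy as np

def normalize_line(measured_pixels, t_nominal, t_measured):
    """Correct the effective gain of one line of video data by scaling each
    pixel by T0 / Ta, per the normalization described above."""
    return np.asarray(measured_pixels, dtype=float) * (t_nominal / t_measured)
```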
  • Encoder signals are received based upon sensor movement, and counted by encoder counter 181.
  • Count information is provided by counter 181 to CCD timing block 183 such that each exposure interval begins and ends on evenly spaced spatial intervals.
  • Exposure timer 185 contains a digital counter and an accurate time-base clock that rapidly increments the digital counter.
  • the count information from encoder counter 181 is used to reset and start the digital counter of exposure timer 185 at the beginning of an exposure, and to stop the digital counter at the end of the exposure.
  • the ending digital count (representing the exposure period) is then latched by a buffer in exposure timer 185.
  • the latched exposure time information is then provided to digital-to-analog (D/A) converter 189 such that converter 189 sends an analog signal to programmable gain amplifier 191.
  • Video data from detector 150 is applied as an input to amplifier 191, and amplifier 191 generates normalized, or corrected, levels of each pixel in the line of video data.
  • Fig. 15h illustrates another device that is useful for compensating for the type of image shown in Fig. 15f.
  • Encoder counter 181 and exposure timer 185 function as described with respect to Fig. 15g.
  • the exposure period in counts from exposure timer 185 is sent to illuminator control block 193 to adjust the illuminator intensity based upon the exposure period.
  • if the measured exposure period was short, the illuminator would be adjusted to a higher intensity level for the next exposure period.
  • if the measured exposure period was long, the illuminator would be adjusted to a lower intensity level for the next exposure period.
  • the exposure periods could also be measured over shorter exposure lengths just before the beginning of the next exposure length in order to obtain a better estimate of instantaneous scan speed.
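  • A sketch of that illuminator feedback; the inverse proportionality is an assumption for illustration, since the text states only that intensity is raised after short exposure periods and lowered after long ones.

```python
def next_illuminator_intensity(nominal_intensity, t_nominal, t_measured, i_max=1.0):
    """Raise intensity when the last exposure period was short (fast scan,
    less light gathered) and lower it when the period was long."""
    return min(i_max, nominal_intensity * (t_nominal / t_measured))
```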
  • Another method of compensating for variations in scan speed is described with respect to Figs. 15i and 15j.
  • This method uses a constant exposure period, and centers the exposure lengths at constant intervals.
  • the exposure period of a given partial image is measured and the starting position for the following exposure length is adjusted such that the exposure length is centered about the desired spatial position. For example, if the exposure period is Tf, then the total exposure length is given by: Vi × Tf
  • where Vi is the instantaneous scan speed. If Xc is the desired center of the exposure length, then the starting position of the exposure length, Xs, is given by: Xs = Xc - (Vi × Tf) / 2
  • the instantaneous scan speed, Vi, is calculated by measuring the time it takes the sensor to travel a short distance. If the sensor is accelerating rapidly, scan speed can be measured at many points prior to the desired exposure period. Instantaneous scan speed can then be estimated by fitting a curve to these points and extrapolating what the scan speed will be during the next exposure period.
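  • A sketch of the centering computation using the formulas above; the linear fit is one possible curve choice for extrapolating the instantaneous scan speed, and the function name and sample format are assumptions.

```python
import numpy as np

def exposure_start(x_center, t_exposure, sample_times, sample_positions):
    """Extrapolate instantaneous scan speed Vi from recent (time, position)
    samples, then start the exposure at Xs = Xc - (Vi * Tf) / 2 so the
    exposure length Vi * Tf is centered on the desired position Xc."""
    vi = np.polyfit(sample_times, sample_positions, 1)[0]  # fitted slope = speed
    return x_center - (vi * t_exposure) / 2.0
```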
  • TDI (time delay integration) detectors improve the light gathering abilities of a photoelement, and overcome the problem of underexposure.
  • TDI-based linear detectors are commonly used for the inspection of very high speed conveyor processes such as beverage bottling or the like.
  • the TDI architecture requires that the linear detector be precisely synchronized to the motion of the sensor, and such synchronization can be effected with encoder 110.
  • Fig. 16a is an elevation view of a portion of sensor 106.
  • Sensor 106 includes darkfield illuminators 164 and diffuse illuminators 166.
  • diffuse illuminators 166 each include a discrete array or bank of light emitting diodes 186 and mirrors 188.
  • Banks 186 can be surface mount, T-1, or monolithic array LEDs, and are disposed to direct illumination horizontally with respect to sensor 106.
  • Mirrors 188 are positioned to reflect the horizontal illumination emanating from banks 186 upon a portion of component 96 at a suitable angle of incidence for diffuse illumination.
  • a lens can be placed in front of each of banks 186 to increase the power density of the illuminators.
  • Such a lens can be a common cylinder lens, a Fresnel type lens, or a rod lens. Although the use of such lenses is described with respect to banks 186, such lenses can be used with any illuminators of the embodiments of the present invention.
  • Fig. 16b is a top plan view of the portion of sensor 106 illustrated in Fig. 16a.
  • diffuse illuminators 166 are disposed on opposite sides of detector window 158.
  • Each diffuse illuminator 166 includes a linear bank of light emitting diodes 186 and a mirror 188.
  • Darkfield illuminators 164 are disposed at intervals about detector window 158.
  • darkfield illuminators 164 are disposed in a ring about detector window 158.
  • other configurations can also be used.
  • Fig. 16c is a top plan view of a portion of sensor 106 in accordance with another embodiment.
  • sensor 106 includes detector window 158 and diffuse illuminators 166 precisely as provided in the embodiment shown in Fig. 16b.
  • individual darkfield illuminators 164 are replaced with darkfield light emitting diode banks 190.
  • the darkfield diode banks are disposed in a ring about detector window 158 as shown in Fig. 16c.
  • Diode banks 190 can
  • any of the illuminator embodiments shown herein can be practiced with singularly addressable illumination components (e.g., diffuse illuminators, LEDs, darkfield LEDs, etc.). It is understood that the singularly addressable elements can also be grouped into contiguous physical segments, such as quadrants or pie-shaped slices. Fig. 16b shows quadrants of addressable LEDs, Q1 - Q4, although it is understood that the enhanced lighting provisions need not be symmetric. In fact, when confronted with an asymmetrical object of interest, such as in a wire bonder, it is preferred to have an asymmetrical illumination arrangement. Additionally, the enhanced illumination provisions disclosed in this paragraph may be coupled with the use of color (e.g., blue or
  • Fig. 17a is a top plan view of sensor 192 in accordance with another embodiment of the invention.
  • Sensor 192 includes detector window 158 disposed approximately equidistant between edges 194 and 196.
  • Sensor 192 includes diffuse illuminators 198 each of which preferably includes a number of individual illuminator sources 200, such as light emitting diodes.
  • Each illuminator 198 also preferably includes a lenticular diffuser 202.
  • Discrete domed surface mount LED's can be used for sources 200. The dome of each LED acts as a lens to increase the power density of the LED. However, when such domed LED's are placed in close proximity to the object, non-uniform illumination can occur since the illumination does not have sufficient distance to spread out. In such cases, lenticular diffusers 202 are useful to spread out the illumination from sources 200.
  • each lenticular diffuser 202 is disposed to intercept illumination emanating from individual illuminators 200 and diffuse such illumination prior to illuminating component 96.
  • a lenticular diffuser is an array of cylindrical-like grooves which scatters light only in the direction perpendicular to the length of the grooves.
  • Fig. 17c is a diagrammatic view of lenticular diffuser 202 having a multiplicity of grooves 204. Although a lenticular diffuser is shown in Figs. 17a-17c, other suitable devices can be used that diffuse illumination from individual illuminators 200.
  • Figs. 18a and 18b are diagrammatic views of a portion of sensor 192 in accordance with another embodiment of the present invention.
  • Sensor 192 includes illuminators 210 disposed on opposite sides of detector window 158.
  • Each illuminator 210 includes a plurality of individual sources 212 disposed to direct light relatively horizontally in relation to sensor 192.
  • Illuminators 210 include faceted reflectors 214 disposed between individual illuminator sources 212 and detector window 158.
  • Each reflector 214 includes a reflective surface 216 that has a number of reflective portions disposed at different angles relative to one another.
  • some illumination emanating from individual illuminator sources 212 intercepts faceted reflector 214 at various angles, and faceted reflector 214 reflects the illumination from individual illuminator sources 212 to a portion of component 96 at varying degrees of incidence.
  • some illumination emanating from individual illuminator sources 212 proceeds directly to component 96 thus illuminating a portion of component 96 with dark field illumination.
  • Figs. 19a and 19b are diagrammatic views of illuminators 220 in accordance with another embodiment of the present invention.
  • Illuminators 220 are disposed on opposite sides of detector window 158.
  • Each illuminator 220 includes illumination source 222 and waveguide 224.
  • Illumination source 222 is preferably a conventional laser bar.
  • a laser bar consists of an array of many semiconductor laser diodes.
  • the laser bar can be operated in a constant power mode, or pulsed for each line of CCD data. Properly pulsing the laser bar at higher power for a period of time that is short compared to the CCD line rate results in the same amount of energy exposing the linear detector. However, pulsing the laser bar improves its electrical to optical conversion efficiency, and the heat generated by the laser bar is reduced.
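  • The equivalence follows from exposure energy E = P × t: with illustrative numbers not taken from the text, a laser bar pulsed at ten times its continuous power for one tenth of the CCD line period delivers the same energy to the detector, since (10P) × (T/10) = P × T.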
  • Illumination emanating from source 222 travels through waveguide 224 and falls upon internal reflector 226.
  • Reflector 226 reflects the illumination to surface 228.
  • Surface 228 is preferably roughened in order to diffuse illumination emanating from surface 228 before illuminating component 96. In a preferred embodiment, surface 228 is roughened by etching.
  • Figs. 20a-20d are diagrammatic views of a linescan sensor including a wing illuminator in accordance with another embodiment of the present invention.
  • Fig. 20a illustrates sensor 240 having illuminator 242 coupled thereto and extending from sensor 240. Arrow 241 indicates the direction of travel for the linescan sensor.
  • Illumination sources 246 are disposed to direct illumination upon diffuser/reflector 247, mounted to nozzle 82 above component 96.
  • Illumination sources 246 preferably comprise light emitting diodes. As can be appreciated, illumination emanating from sources 246 provides backlighting to component 96 while imaged by sensor 240.
  • Fig. 20b is a top plan view illustrating sensor 240 scanning multiple components 96.
  • Fig. 20c illustrates another wing illuminator in accordance with an embodiment of the present invention.
  • Illuminators 242 extend from sensor 240 to a position above component 96 and include LED banks 245.
  • LED banks 245 are adapted to direct illumination to reflectors 251 which direct the illumination through diffuser 253, thus backlighting component 96.
  • a shield 257 is also provided to prevent illumination from essentially leaking past diffuser 253 and falling directly upon sensor 240.
  • Fig. 20d illustrates another wing-lighting embodiment wherein sensor 240 includes illuminators 242, each of which includes a source 259 and lens 265 to backlight the chisel tips of component 96, which is illustrated as a connector.
  • Connectors are usually problematic because the pins are very shiny and come nearly to a point. These tips reflect most of the light away from the imaging system in standard frontlighting methods and hence are not visible in the image.
  • Sources 259 direct illumination through lenses 265 and onto the pin tips of component 96 where they are specularly reflected and collected by the imaging system of sensor 240 to generate an image of the pin tips.
  • the illumination illustrated in Fig. 20d is referred to as directional side lighting.
  • Fig. 21 is an elevation view of a portion of a pick and place machine in accordance with another embodiment of the present invention.
  • sensor 260 is positioned below component 96.
  • Sensor 260 includes a linear detector as described in the above embodiments and can include any and all of the illumination embodiments described thus far.
  • backlight illuminator 269 is operably coupled to nozzle 82 to provide backlight to component 96 while being imaged by sensor 260.
  • Backlight illuminator 269 includes individual illumination sources 271 and diffuser 273.
  • individual illumination sources 271 are LED's and diffuser 273 is any suitable optical device.
  • Backlight illuminator 269 is preferably provided on all nozzles 82 within the pick and place machine.
  • the pick and place machine can include any appropriate combination of backlit and non-backlit nozzles.
  • the enhanced illumination features discussed at Figs. 16a - 16c, particularly with respect to singularly addressable colored LEDs, are useful with the present embodiment.
  • Figs. 22a-22q are photographs of various components imaged in accordance with an embodiment of the present invention with varying intensities of different types of illumination. Different illumination types highlight or suppress certain component features. Such illumination is thus useful for component placement and inspection.
  • Fig. 22a is a flip-chip image acquired in accordance with the present invention having a field of view of 16mm x 16mm.
  • the image of Fig. 22a was acquired using brightfield illumination and adjusted to half its maximum intensity value (i.e., 0.50 units).
  • Fig. 22b is a flip-chip image acquired in accordance with an embodiment of the present invention.
  • Fig. 22b has a field of view of 16mm x 16mm similar to that of Fig. 22a. Illumination of the flip-chip in Fig. 22b differs from that of Fig. 22a in that darkfield illumination of .75 units and diffuse illumination of .5 units were directed upon the flip-chip during image acquisition.
  • Fig. 22c is an enlarged view of a portion of the flip-chip shown in Figs. 22a and 22b.
  • the field of view is approximately 2.7mm x 2.0mm.
  • Fig. 22c shows individual balls on the surface of the flip-chip, the image magnified as output from video formatter 714. Each ball has a diameter of approximately 150 micrometers.
  • the illumination used to acquire Fig. 22c consisted of .75 darkfield units in combination with .50 diffuse illumination units.
  • Fig. 22d is an image acquired of a quad flat pack (QFP).
  • the field of view of Fig. 22d is approximately 26mm x 26mm.
  • the illumination employed to acquire the image of Fig. 22d consisted of 1.0 brightfield units. Surface indicia on the bottom of the QFP can be seen clearly in Fig. 22d as well as individual leads extending from all four sides of the QFP.
  • Fig. 22e is another image acquired of the QFP in accordance with an embodiment of the present invention. Illumination used to acquire image 22e consisted of .2 brightfield units, .4 diffuse units and .7 darkfield units. Contrasting image 22e with 22d illustrates that illumination intensity adjustment can facilitate the imaging of the individual QFP leads.
  • Fig. 22f is an image of a portion of the QFP shown in Figs. 22d and 22e, which portion has been digitally magnified and windowed, as output from video formatter 714.
  • the field of view of Fig. 22f is approximately 7.25mm x 6.5mm.
  • the illumination used to acquire the image of Fig. 22f consisted of .2 brightfield units, .4 diffuse units, and .7 darkfield units.
  • the individual leads of the QFP shown in Fig. 22f have a width of approximately .30mm and have a lead pitch of .65mm.
  • Fig. 22g is an image of a micro ball grid array acquired in accordance with an embodiment of the present invention.
  • Image 22g has a field of view of approximately 9.5mm x 7.6mm.
  • Illumination used to acquire the image of Fig. 22g consisted of .4 brightfield units and .8 darkfield units.
  • Fig. 22h is an image of the same micro ball grid array except that the illumination used for Fig. 22h consisted solely of 1.0 brightfield units.
  • Fig. 22i is an image of a portion of the micro ball grid array shown in Figs. 22g and 22h, magnified and windowed as output from video formatter 714.
  • Fig. 22i has the same field of view as Figs. 22h and 22g.
  • the illumination used to acquire Fig. 22i consisted solely of 1.0 darkfield units.
  • Fig. 22j is an image of a semiconductor die-in-package. Although the component shown in Fig. 22j is generally not placed by a pick and place machine, the image of Fig. 22j is highly useful for a wire bonder which couples minute wires between pads on the semiconductor and the leads.
  • the image shown in Fig. 22j has a field of view of approximately 16mm x 16mm. Illumination consisted of .5 brightfield units.
  • Fig. 22k shows the same semiconductor die-in-package and has an identical field of view as that of Fig. 22j. Illumination consisted of .25 brightfield units, 1.0 darkfield unit, and 1.0 diffuse unit. Note that when diffuse lighting is used, such as in Fig. 22k, wire bonds between the semiconductor and the individual leads are visible.
  • Fig. 22l is an enlarged view of a portion of the semiconductor die-in-package.
  • the field of view of Fig. 22l is approximately 5.0mm x 7.0mm.
  • Fig. 22l is taken from a portion of the image of Fig. 22k, which portion has been digitally magnified and windowed as output from formatter 714.
  • Fig. 22m is an image of a screen printing stencil acquired in accordance with an embodiment of the present invention.
  • a stencil is generally not a component that is placed by a pick and placed machine, imaging such a stencil is highly useful for a screen printer device.
  • the image of Fig. 22m has a field of view of approximately 9mm x 13mm. Illumination consisted of .6 brightfield units. Note that the larger stencil openings are .65mm in pitch and the smaller stencil openings have a .30mm pitch.
  • Fig. 22m illustrates an undesirable screen printing condition in which the stencil is clogged. Specifically, some of the larger openings near the top of Fig. 22m are partially obstructed by solder paste.
  • Fig. 22n is an image acquired of the same screen printing stencil as that of Fig. 22m.
  • the field of view is identical to Fig. 22m and the main difference between the two figures is that of illumination.
  • 1.0 darkfield units and 1.0 diffuse units of illumination were employed during the acquisition of Fig. 22n.
  • Contrasting Figs. 22n and 22m reveals that darkfield illumination is highly useful in accentuating solder paste adhering to the bottom of the stencil.
  • the solder paste is shown in Fig. 22m proximate the large stencil openings.
  • Fig. 22o is an image of bare copper fiducial marks acquired in accordance with an embodiment of the present invention. Specifically, Fig. 22o has a field of view of approximately 7mm x 22mm. 1.0 diffuse illumination units and 1.0 darkfield illumination units were employed during the acquisition of Fig. 22o. The shape of each copper fiducial mark can be seen clearly in Fig. 22o. Such fiducial marks are often employed on circuit boards to provide reference location information. Imaging systems image such fiducial marks to calculate the relative position of the circuit board to facilitate board assembly functions such as screen printing, or component pick and placement.
  • Fig. 22p is an image of a solder paste test print acquired in accordance with an embodiment of the present invention.
  • the field of view of Fig. 22p is approximately 26mm x 26mm.
  • 1.0 diffuse illumination units and 1.0 darkfield illumination units were employed during the acquisition of Fig. 22p.
  • Fig. 22q is an enlarged view of the boxed portion of Fig. 22p.
  • the field of view of Fig. 22q is approximately 2.5mm x 2.5mm, and was magnified and windowed as output from formatter 714.
  • the illumination used during the acquisition of Fig. 22q is the same as that of Fig. 22p.
  • each solder paste print shown in Fig. 22q has a diameter of approximately .3mm.
  • although Figs. 22a-22q show images where each line of the image was acquired with the same illumination type and intensity as its neighbor, embodiments of the invention can also be practiced with interleaved images.
  • Interleaved images are created by changing illumination levels on a line by line basis. For example, every odd line can be exposed to darkfield illumination while every even line is exposed to diffuse illumination. This creates essentially two images which are parsed by the video formatter to generate separate windows.
  • the processor then uses the windows to generate placement offset information and to perform component inspection (which will be described later in the specification). A de-interleaving sketch follows below.
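  • by way of illustration only (this is a hedged sketch, not the parsing circuitry of video formatter 714), interleaved lines can be separated as follows, assuming even lines were exposed under diffuse illumination and odd lines under darkfield illumination:

```python
import numpy as np

def deinterleave(frame: np.ndarray):
    """Split an interleaved line-scan frame into two images.

    Assumes even rows were exposed under diffuse illumination and odd
    rows under darkfield illumination (the example convention above).
    frame is a 2D array with one row per scanned line.
    """
    diffuse = frame[0::2, :]    # even lines
    darkfield = frame[1::2, :]  # odd lines
    return diffuse, darkfield
```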
  • Interleaved images can also be used to increase the dynamic range of the linescan sensor. Such enhanced dynamic range is useful for imaging components that have reflectivity changes greater than the dynamic range of the linear detector. Without extending the dynamic range of the detector, some pixels of the image may be saturated or clipped, or some pixels may be completely dark if the reflectivity changes are too large.
  • a first interleaved image is acquired with illuminators set to the maximum possible illumination levels that still allow the desired illuminator type ratios (such as 2:1 darkfield:diffuse).
  • the second interleaved image is acquired with lower illumination levels, while still maintaining the chosen illuminator ratios.
  • the second image is acquired with all illuminator levels reduced by a factor of 16.
  • the video formatter takes the first interleaved image and essentially discards all saturated pixels. The video formatter then replaces the discarded pixels with corresponding pixels from the second interleaved image, which are properly scaled by the ratio of illumination levels between the first and second images (such as 16). A sketch of this fusion appears below.
  • This method can be adapted to use third, fourth, fifth, etc., images to further extend the dynamic range of the linescan sensor.
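  • as a hedged illustration only (not the actual circuitry of the video formatter), the saturated-pixel replacement described above can be sketched as follows, assuming 8-bit pixel data and the example illumination ratio of 16:

```python
import numpy as np

SATURATION_LEVEL = 255  # assumed 8-bit detector output

def fuse_interleaved(bright_img: np.ndarray, dim_img: np.ndarray,
                     illum_ratio: float = 16.0) -> np.ndarray:
    """Extend dynamic range by discarding saturated pixels from the
    brightly illuminated image and substituting the corresponding
    pixels from the dimly illuminated image, scaled by the ratio of
    illumination levels between the two acquisitions."""
    fused = bright_img.astype(np.float64)
    saturated = bright_img >= SATURATION_LEVEL
    fused[saturated] = dim_img[saturated].astype(np.float64) * illum_ratio
    return fused
```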
  • embodiments of the present invention are highly useful not only for a pick and place machine but also for wire bonders and screen printers. Moreover, different characteristics of individual features on the bottom of a given component can be resolved to greater or lesser degrees as desired. Thus, embodiments of the present invention are useful not only for the placing of components, but for the inspection of components as well. "Inspection" includes presence, absence, feature dimension (such as ball diameter), and lead tweeze.
  • Tombstoning is a condition in which the component is picked up by a surface other than that opposite the mounting surface.
  • one example of tombstoning is a chip capacitor picked up in such a way as to extend partially into the nozzle.
  • This condition is detected by measuring the length and width of the component and comparing such measurements to nominal dimensions.
  • the component dimensions will deviate slightly from nominal based upon manufacturing tolerances; however, the deviation is drastic when a component is tombstoned. A simple check is sketched below.
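  • a minimal sketch of such a check (the 20% tolerance and the function name are illustrative, not taken from this disclosure):

```python
def is_tombstoned(measured_length: float, measured_width: float,
                  nominal_length: float, nominal_width: float,
                  rel_tolerance: float = 0.20) -> bool:
    """Flag a component whose measured outline deviates drastically
    from its nominal dimensions, as a tombstoned part's outline does."""
    length_dev = abs(measured_length - nominal_length) / nominal_length
    width_dev = abs(measured_width - nominal_width) / nominal_width
    return length_dev > rel_tolerance or width_dev > rel_tolerance
```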
  • Figs. 23a and 23b are diagrammatic views of lens system 160 in accordance with an embodiment of the present invention.
  • the system shown in Figs. 23a and 23b includes a gradient index (GRIN) lens array.
  • a gradient index lens is an optical element within which the refractive index is a smooth, but not constant, function of position and, as a result, the ray paths are curved. This is also known as a graded index lens (a commonly used index model is sketched below, after this group of items).
  • the ray curving characteristic of GRIN lens 270 is shown in Fig. 23b where rays emanating from object line 261 enter GRIN lens 270 and begin to curve as indicated. Rays exiting GRIN lens 270 converge and are focused at 262 upon linear detector 150.
  • a GRIN lens array provides a large, compact field of view for imaging systems of embodiments of the invention.
  • although GRIN lens 270 is shown as an example of lens system 160, any suitable optical element capable of focusing object line 261 upon a linear detector can be used.
  • the compact nature of the present invention with the GRIN lens array allows the pick and place machine of the present invention to have a reduced nozzle "z" stroke.
  • a reduced nozzle "z" stroke is essential to rapid placement of components, since each time a component is placed, the nozzle must be lifted in order to clear the sensor for scanning and then lowered by approximately the same distance to place the component.
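  • for reference, a commonly used model for such a lens (an assumption for illustration; this disclosure does not specify the profile) is the parabolic radial index profile, under which meridional rays follow sinusoidal paths of pitch P:

$$ n(r) = n_0\left(1 - \frac{A}{2}\,r^2\right), \qquad P = \frac{2\pi}{\sqrt{A}} $$

where n_0 is the on-axis refractive index, r is the radial distance from the optical axis, and A is the gradient constant.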
  • Fig. 24 is an elevation view of sensor 278 in accordance with another embodiment of the present invention.
  • Sensor 278 bears many similarities to sensor 106 (shown in Fig. 13) and like components are numbered similarly.
  • Sensor 278 differs from sensor 106 because sensor 278 includes an additional linear detector 280 optically coupled to a portion of the component to be imaged via imaging optics 282.
  • imaging optics 282 include a set of lenses 284 and beam splitting mirror 286. Beam splitting mirror 286 reflects a portion of the component image through lenses 284, and this reflected portion is focused upon second linear detector 280.
  • a portion of the image also passes through beam splitting mirror 286 and is reflected by beam splitting mirror 156; this reflected image is then focused by imaging optics 156 upon first linear detector 150.
  • the arrangement illustrated in Fig. 24 provides a dual field of view optical system that provides high magnification and high resolution as well as a large field of view. Although the dual-field of view embodiment is useful for a variety of applications, it is particularly useful for wire bonder applications.
  • each of nozzles 82 within the pick and place machine can be adapted to pick and place a different type of electrical component.
  • such different component types include flip-chips, ball grid arrays (BGA's), micro ball grid arrays, quad flat packs (QFP's), connectors, pin grid arrays, dual inline packages, single inline packages, plastic leaded chip carriers (PLCC's), chip capacitors, and chip resistors.
  • each nozzle 82 can be independently adapted to pick and place a different type of component than other nozzles 82. Because different component types can require different image resolutions, embodiments of the present invention can preferably change image resolution by changing scan velocity based upon component type.
  • Fig. 25 is a chart of preferred image resolution for a variety of component types, and the associated scan time required to provide the desired resolution.
  • for the finest-pitch component types, an image resolution of 14 micrometers is used. Scanning 200mm at a 14 micrometer resolution takes approximately 1.6 seconds, with the first image becoming available at 300 milliseconds.
  • in order to image a micro ball grid array having 150 micrometer balls, or a quad flat pack having .3 to .4mm lead pitch, it is preferred to use an image resolution of approximately 28 micrometers. Image acquisition at the desired resolution takes approximately 800 milliseconds to scan a 200mm scan length, and the first image becomes available at 200 milliseconds.
  • a 0603 metric chip cap or quad flat pack having a lead pitch between .5 and .625mm is preferably imaged at a resolution of about 56 micrometers. Total scan time at the 56 micrometer image resolution is approximately 400 milliseconds to scan 200mm, with the first image becoming available after 100 milliseconds.
  • a 1005 metric chip cap, ball grid array having a 1.25mm lead pitch, or a plastic leaded chip carrier having a 1.25mm lead pitch is preferably imaged at a resolution of approximately 112 micrometers.
  • Total scan time for a scan length of 200mm is about 200 milliseconds, and the first image becomes available after approximately 50 milliseconds. Because various nozzles 82 may hold different types of electrical components, scan speed, and thus resolution, can be varied as the sensor traverses the entire scan length.
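  • the four data points above are mutually consistent: scan time x resolution is approximately 22.4 µm·s for a 200mm scan, i.e. a constant line rate of roughly 8,900 lines per second. A small illustrative sketch (the constant is derived from the figures above, not stated in this disclosure):

```python
LINE_RATE_HZ = 200_000 / 22.4  # ~8,929 lines/s, implied by the Fig. 25 data

def scan_time_seconds(resolution_um: float, scan_length_mm: float = 200.0) -> float:
    """Scan time = number of scan lines (scan length / line spacing)
    divided by the constant line rate."""
    lines = scan_length_mm * 1000.0 / resolution_um
    return lines / LINE_RATE_HZ

for res in (14, 28, 56, 112):
    print(f"{res} um -> {scan_time_seconds(res):.1f} s")  # 1.6, 0.8, 0.4, 0.2
```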
  • Figs. 26a-26c show various possible velocity profiles for the sensor.
  • Fig. 26a shows scan velocity increasing linearly during an initial period and leveling off for an extended period of time for a fine pitch quad flat pack component.
  • Fig. 26b shows a sample velocity profile for scanning a coarse pitch quad flat pack component.
  • scanning as shown in Fig. 26b provides higher sensor acceleration, and a higher scan velocity than that of Fig. 26a.
  • Fig. 26c is a sample velocity profile when multiple types of components are used. Sensor velocity increases initially just as in Fig. 26a and maintains the selected velocity while the appropriate component (such as a fine pitch quad flat pack component) is scanned.
  • sensor scan velocity is then increased, as shown in Fig. 26c, for a different type of component (such as a coarse pitch quad flat pack component).
  • sensor velocity can again be changed such as the decrease shown in Fig. 26c to scan a third component, such as a micro ball grid array.
  • Figs. 27a-27h illustrate ways in which subsets of contiguous video data, or windows, can be used to provide enhanced video data for specific types of applications.
  • the sensor and video formatter preferably include window circuitry that is adapted to provide one or more windows as the video output.
  • the scan width, WY, is shown on each of Figs. 27a - 27h.
  • Fig. 27a illustrates an embodiment where a high resolution window 300 is provided as the video output.
  • Window 300 encompasses component 96 and this particular embodiment is useful for small components. Thus, all features of the entire bottom surface of component 96 are imaged at high resolution and provided in window 300.
  • Fig. 27b illustrates an embodiment where two high resolution windows 302 are provided as the video output. Such configuration is particularly useful for identifying and precisely positioning corners of component 304 when component 304 has dimensions which exceed that of high resolution windows 302. As illustrated in Fig. 27c, additional high resolution windows can be provided to image all corners of component 304, and is particularly useful in precisely locating corners of a component.
  • Fig. 27d illustrates a large field of view window 306 encompassing all of component 304.
  • Such configuration is particularly useful for 100% presence inspection.
  • the pick and place machine can ensure that all balls of a ball grid array, for example, are present.
  • Fig. 27e shows an embodiment combining the windows of Figs. 27c and 27d.
  • four high resolution windows 302 precisely locate corners of component 304 while window 306 is used for component inspection.
  • the embodiment shown in this figure is useful for detecting balls.
  • Fig. 27f illustrates an embodiment where high resolution windows 302 image portions of component 310.
  • the four high resolution windows 302 are then combined by the sensor to provide one EIA or CCIR composite image 312 as the sensor video output.
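  • a hedged sketch of that combination (window size, layout, and names are illustrative; in the machine the composite is formed by the sensor's window circuitry):

```python
import numpy as np

def corner_composite(assembled: np.ndarray, win: int = 128) -> np.ndarray:
    """Tile the four corner windows of an assembled component image
    into a single 2x2 composite frame, mimicking the combined video
    output described above (the 128-pixel window is illustrative)."""
    top = np.hstack((assembled[:win, :win], assembled[:win, -win:]))
    bottom = np.hstack((assembled[-win:, :win], assembled[-win:, -win:]))
    return np.vstack((top, bottom))
```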
  • Fig. 27g illustrates an embodiment where multiple windows 314 are used to scan a long component such as a connector 316.
  • Fig. 27h illustrates an embodiment where windows 318 are oriented at an angle relative to the scan direction to enhance corner detection. Preferably, windows 318 are oriented at 45° from the scan direction.
  • additional techniques can be employed. For example, once the part is picked up by nozzle 82, nozzle 82 can rotate the part 45° such that at least two of its corners should be within the scan width.
  • the component can be rotated an additional 90° to place the other two corners within the scan width, after which a second scan can be performed, preferably in the opposite direction for efficiency. In this manner, all four corners can be precisely located and/or inspected. Further, for very large components, any number of scans can be performed at any appropriate angular increments. Additionally, in some embodiments where the pick-up uncertainty is smaller than the pitch of the leaded component, only the center portion of a large component is measured for component orientation. These techniques are especially useful for placing and/or inspecting quad flat packs (QFP's) and ball grid arrays having a size greater than 25mm.
  • Fig. 27i illustrates another embodiment of a line scan sensor 151 where large components can be measured.
  • Components larger than Ydet can be measured by offsetting nozzle 82 with respect to detector window 152.
  • Detector 150 is shown within detector window 152, establishing a line of sight to component 153.
  • parts having a dimension up to twice the Y2 dimension can be measured by scanning once, and then rotating the component 180 degrees and scanning again, preferably in the opposite direction.
  • Components smaller than twice the Y1 dimension can be measured in a single pass.
  • the offset nozzle method can be combined with the CCD readout register multi-tap method. Such combination allows small parts to be measured by using only one or two of the taps, while larger parts are measured using all taps. This facilitates high speed, high resolution scanning of small components while retaining the ability to measure large components.
  • Fig. 28 is a perspective view of a wire bonder in accordance with the prior art.
  • Bonder 320 includes a bonder head 322 that is adapted to dispense and connect individual wires between die pads 324 and lead pads 326. Wire loop 323 has been bonded between a pad 326 and one of pads 324.
  • Bonder 320 uses conventional imaging camera 328 to precisely locate the various pads in order to electrically couple them with wires.
  • Camera 328 includes illuminators 330, lensing system 332, mirror 334 and area detector 336. As is known, illuminators 330 illuminate the pads to be bonded and lens system 332 and mirror 334 cooperate to focus an image of the pads upon area detector 336.
  • Area detector 336 is coupled to additional electronics to process the image to thereby compute precise die pad and lead pad locations.
  • Fig. 29 is a top plan view of a wire bonder in accordance with an embodiment of the present invention.
  • Wire bonder 340 includes bonder head 322 and linescan camera 344 in accordance with an embodiment of the present invention.
  • Linescan camera 344 is preferably constructed in accordance with any of the various embodiments described above with respect to pick and place machines.
  • although detector window 346 of linescan camera 344 is disposed at an angle (of approximately 45°) relative to the scan direction X, other embodiments are possible where the detector window is perpendicular to scan direction X.
  • orienting the detector window at an angle relative to the scan direction facilitates scanning all four sides of the die.
  • Fig. 30 shows a prior art printed circuit board solder paste application system 350.
  • System 350 includes stencil 352 with stencil apertures 355 for applying solder paste through apertures 355 onto printed circuit board 354 to form corresponding solder bricks 357. After the solder paste is squeegeed onto board 354, the board is lowered, or stencil 352 is raised, and imaging system 356 is moved into place to inspect both stencil 352 and board 354 simultaneously.
  • Imaging system 356 includes mirror 358 for directing both images through a lens assembly 360 and onto CCD area detector 362.
  • System 350 suffers from the limitations that it is relatively thick, and that its field of view is limited by the size of area detector 362.
  • Fig. 31 shows a portion of a printed circuit board solder paste application system in accordance with an embodiment of the present invention.
  • Sensor 370 is adapted to move relative to stencil 352 and printed circuit board 354 in the direction of arrow 371.
  • Sensor 370 includes first linear detector 374 disposed within housing 372.
  • First linear detector 374 can be identical to detector 150 described above.
  • First linear detector 374 is adapted to view stencil 352 through first imaging optics 376.
  • Illuminators 378 are provided to illuminate stencil 352 or circuit board 354.
  • a second linear detector 380 is also disposed within housing 372, and is adapted to view circuit board 354 through second imaging optics 382.
  • the linescan camera of the above embodiments has essentially been duplicated such that both the stencil and the printed circuit board can be scanned simultaneously. Since sensor 370 is thin and compact, it is able to fit in smaller spaces between stencil 352 and board 354 than prior systems. Thus, a screen printer in accordance with embodiments of the invention does not require as much time for the stencil and board to separate, thus reducing cycle time and increasing system throughput.
  • a screen printer in accordance with an embodiment of the present invention provides a variety of inspections for the stencil, circuit board, or both.
  • the stencil can be inspected for clogged openings (see Fig. 22m) or solder paste adhered to the bottom of the stencil (see Fig. 22n) .
  • the circuit board can be inspected for proper solder pad registration (with respect to the circuit board) , bridging between solder pads, and irregularly shaped solder pads.

Abstract

A pick and place machine (50), and its imaging system (84, 106, 278) are provided. The imaging system is movable with a head (87) of the pick and place machine (50) and includes a linear detector (150) which is adapted to move proximate the component (96) to thereby scan the component (96). The imaging system (84, 106, 278) provides a video output (718) that is used to calculate an orientation of the component (96) and to adjust the orientation for suitable mounting upon a workpiece such as a printed circuit board. The imaging system (84, 106, 278) is also useful in additional electronics assembly devices such as screen printers (350) and wire bonders (340). Methods of mounting a component (96) to a workpiece are also provided. Generally, a linear detector (150) with imaging optics (156) is passed proximate the component (96) to image the component (96). Component orientation is then computed and corrected such that the component (96) can be suitably mounted upon the work piece.

Description

ELECTRONICS ASSEMBLY APPARATUS WITH
IMPROVED IMAGING SYSTEM
CROSS-REFERENCE TO RELATED APPLICATIONS This application claims the priority of earlier filed co-pending Provisional Applications: serial no. 60/107,188, filed November 5, 1998, entitled COMPACT SCANNING CAMERA; serial no. 60/107,505 filed November 6, 1998, entitled COMPACT SCANNING CAMERA; serial no. 60/131,996, filed April 30, 1999, entitled COMPACT LINE SCAN CAMERA WITH IMPROVED THROUGHPUT; serial no. 60/144,616, filed July 20, 1999, entitled SINGLE PATH LINESCAN CAMERA FOR SENSING HEIGHT THROUGH DEFOCUSING; and serial no. 60/144,614, filed July 20, 1999, entitled STEREO VISION LINESCAN CAMERA WITH COPLANARITY AND RELATED APPLICATIONS THEREOF.
BACKGROUND OF THE INVENTION The present invention relates generally to the automated electronic assembly industry. More specifically, the present invention relates to automated electronic assembly machines such as wire bonders, screen printers, and pick and place machines having an improved imaging system.
Prior art vision systems used in pick and place machines for measuring component alignment and inspection prior to placement have several limitations. Such vision systems have area array detectors and provide moderate resolution (e.g., 640x480). They are typically large in size, thereby limiting the speed at which the vision system can be moved. Although enhanced resolution vision systems are available, they require modification of existing hardware, generally are of larger size, and are costly. Fig. 1 is a perspective view of prior art pick and place machine 2 having placement head 4 which picks up electronic components 8 of various sizes. A standard machine vision camera 10 views components 8, and associated electronics (not shown) in pick and place machine 2 compute X, Y and θ orientation of component 8 relative to a work piece such as printed circuit board 12. Area detector 18 is shown in dotted lines within camera 10 and is optically coupled to lens system 20. Although pick and place machine 2 includes a single placement head 4, pick and place machines having additional placement heads, such as a split-gantry design, are well known. Head 4 is able to move component 8 in the X, Y, Z and θ axes by employing suitable motors and motor control electronics (not shown).
The machine vision system used by pick and place machine 2 is known as an off-head design because camera 10 is mounted in a fixed position relative to head 4. In operation, head 4 releasably picks up a component, and transports the component to a position directly above camera 10. Camera 10 then images the component to determine the current orientation of the component in the X, Y and θ directions such that the orientation can be corrected and the component correctly placed. As can be appreciated, requiring head 4 to transport each component 8 to camera 10 for imaging significantly adds to total component placement time.
Fig. 2 is a perspective view of camera 34 having linear detector 30 and lens system 32. Camera 34 performs a similar function to that of camera 10 shown in Fig. 1, at lower cost. Pick and place machine 2 can employ camera 34 to receive an image of higher resolution and larger field of view than that of camera 10. Camera 34 is stationary, just as camera 10, and thus is an off-head camera. The field of view of any image generated by camera 34 is only limited by the travel of the associated head, the length of the linear detector, and the size of the data file generated by the sum of each line scan image (as will be described in greater detail later in the Specification). This is in contrast to the field of view provided by camera 10, which is fixed in size. Thus, camera 34 is adapted to provide a video output as a component is scanned past camera 34. Camera 34 essentially images a number of individual linear portions of the component as the component is passed relative to camera 34. Although such design provides a larger field of view and lower cost, it still suffers from the limitation of increased total component placement time due to the fact that individual components must be transported to camera 34 for imaging.
Fig. 3 is an elevation view of a portion of pick and place machine 40 having an on-head sensor 44 that includes area detector 42. Sensor 44 includes lens assembly 46, mirror 48, and illuminator 50. As an on-head system, sensor 44 moves with multi-nozzle head 52 and moves relative to components 8 as indicated by arrow 41. One of nozzles 54 releasably holds component 8. Sensor 44 is moved under components 8 to sense their respective orientations such that the respective orientations can be adjusted prior to mounting the respective components onto a work piece. Head 52 moves independently of sensor 44, thus allowing sensor 44 to be retracted prior to placing a component 8 on a work piece. An on-head machine vision system such as sensor 44 allows component scanning to be effected while head 52 is transporting components 8 to their respective positions on the workpiece.
One on-head prior art line scan camera uses a light source positioned below the component, with two linear arrays located above the component, shining light through leads of the component at an angle. The two shadows are imaged by the linear arrays, but the system is unable to image the bottom of a component because it is backlit.
Another prior art line scan camera system is also backlit and has a camera positioned below the component. However, it is less than optimal because it lacks optics for focusing the shadow of the component onto the detector. Such a system may be useful for rough orientation of components, but is unable to inspect a fine pitch component (e.g., less than .3 mm pitch center to center between the leads).
Furthermore, systems such as this which are exclusively backlit are unable to view an entire class of components: those with features like balls, columns and grids on their underside.
As consumer demand for low cost electronic components increases, the electronic assembly industry demands higher, more cost-effective throughput. Pick and place machines, wire bonders and screen printers are required to process workpieces much more rapidly and more accurately than ever before.
There is a need to provide a compact, lightweight imaging system suitable for wire bonders, screen printers and pick and place machines that can quickly and accurately image a large variety and number of components. Preferably, the improved imaging system should be adapted to be integrated into the moving head and adaptable for use with a variety of video processing systems. Additionally, more complex components mandate the use of imaging systems with a large field of view and high resolution, including a computer controlled illumination system to accommodate a large variety of components, at the same or reduced cost compared to presently available systems.
SUMMARY OF THE INVENTION
A pick and place machine, and its imaging system are provided. The imaging system is movable with a head of the pick and place machine and includes a linear detector which is adapted to move proximate the component to thereby scan the component. The imaging system provides a video output that is used to calculate an orientation of the component and to adjust the orientation for suitable mounting upon a workpiece such as a printed circuit board. The imaging system is also useful in additional electronics assembly devices such as screen printers and wire bonders.
Methods of mounting a component to a workpiece are also provided. Generally, a linear detector with imaging optics is passed proximate the component to image the component. Component orientation is then computed and corrected such that the component can be suitably mounted upon the work piece.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a perspective view of a prior art pick and place machine.
Fig. 2 is a perspective view of a prior art linescan camera.
Fig. 3 is a cutaway elevation view of a prior art on-head area camera.
Fig. 4 is a top plan view of a pick and place machine.
Fig. 5 is an elevation view of a placement head in accordance with an embodiment of the present invention.
Fig. 6 is a side elevation view of a portion of a placement head in accordance with an embodiment of the present invention.
Fig. 7 is a rear elevation view of a portion of a placement head in accordance with an embodiment of the present invention.
Fig. 8 is a top plan view of a portion of a placement head in accordance with an embodiment of the present invention.
Figs. 9a and 9b are diagrammatic views illustrating nozzle spacing in a placement head.
Fig. 10 is a top plan view of a placement head in accordance with another embodiment of the present invention.
Fig. 11 is a flowchart of a method of picking and placing components in accordance with an embodiment of the present invention.
Fig. 12 is a timing diagram of component placement in accordance with an embodiment of the present invention.
Fig. 13 is a diagrammatic view of a portion of a pick and place machine in accordance with an embodiment of the present invention.
Fig. 14 is a system block diagram of a portion of a pick and place machine in accordance with an embodiment of the present invention.
Figs. 15a - 15j are diagrammatic views of a linear detector and methods of effecting photoelement exposure control and image adjustment.
Figs. 16a - 16c are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
Figs. 17a - 17c are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
Figs. 18a and 18b are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
Figs. 19a and 19b are diagrammatic views of an illuminator in accordance with an embodiment of the present invention.
Figs. 20a - 20d are diagrammatic views of illuminators in accordance with another embodiment of the present invention.
Fig. 21 is a diagrammatic view of a portion of a pick and place machine in accordance with another embodiment of the present invention.
Figs. 22a - 22q are images of components acquired in accordance with embodiments of the present invention.
Figs. 23a and 23b show a gradient index lens array imaging system.
Fig. 24 is a cutaway elevation view of a sensor in accordance with another embodiment of the present invention.
Fig. 25 is a chart of measurement times for various component types.
Figs. 26a - 26c are sensor velocity profiles in accordance with embodiments of the present invention.
Figs. 27a - 27h are diagrammatic views of window outputs in accordance with embodiments of the present invention.
Fig. 27i is a diagrammatic view of an embodiment of the present invention for viewing large components .
Fig. 28 is a perspective view of a prior art wire bonder.
Fig. 29 is- a top plan view of a wire bonder in accordance with an embodiment of the present invention.
Fig. 30 is a perspective view of a prior art screen printer.
Fig. 31 is a diagrammatic view of a portion of a screen printer in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Fig. 4 is a top plan view of pick and place machine 50 in accordance with an embodiment of the invention. Although much of the present invention will be described with respect to pick and place machine 50, other forms of pick and place machines, such as a split gantry design, are useful with embodiments of the present invention. Additionally, although embodiments of the present invention will be described with respect to pick and place machines, some embodiments of the present invention include an imaging system provided with a wire bonder or screen printer, as will be described with respect to Figs. 29 and 31. As shown in Fig. 4, machine 50 includes transport mechanism 52 that is adapted to transport a workpiece such as a printed circuit board. Transport mechanism 52 includes mounting section 54 and conveyor 56. Transport mechanism 52 is disposed on base 58 such that a workpiece is carried to mounting section 54 by conveyor 56. Component reservoirs 60 are disposed on either side of transport mechanism 52 and supply electronic components. Reservoirs 60 can be any suitable device adapted to provide electronic components, such as a tape feeder.
Pick and place machine 50 includes head 62 disposed above base 58. Head 62 is movable between either of component reservoirs 60 and mounting section 54. As can be seen, head supports 64 are movable on rails 66, thereby allowing head 62 to move in the Y direction over base 58. Movement of head 62 in the Y direction occurs when motor 70, in response to a motor actuation signal, rotates ball screw 72 which engages one of head supports 64 to thereby displace the support 64 in the Y direction.
Head 62 is also supported upon rail 68 to allow head movement in the X direction relative to base 58. Movement of head 62 in the X direction occurs when motor 74, in response to a motor actuation signal, rotates ball screw 76 which engages head 62 and displaces head 62 in the X direction.
As can also be seen, head 62 includes body 78, nozzle mount 80, nozzles 82 and sensor 84. Nozzle mount 80 is disposed within body 78 and mounts each of nozzles 82 within body 78. Each of nozzles 82 is movable in the Z direction (up/down) and is rotatable about the Z axis by suitable actuation members, such as servo motors. Sensor 84 is adapted to move in the X direction relative to nozzles 82 to acquire images of components held by nozzles 82. Sensor 84 is coupled to image processor 86. Image processor 86 receives video data from sensor 84 based upon images of components held by nozzles 82. Image processor 86 is adapted through hardware, software, or a combination of both, to calculate respective component orientations of each of the components held by the respective nozzles 82. Image processor 86 then sends suitable orientation information to a controller (not shown) such that each of nozzles 82 is successively displaced to properly mount its respective component upon the workpiece. Although Fig. 4 shows a number of nozzles 82, it is expressly contemplated that pick and place machine 50 can include a singular nozzle to practice embodiments of the present invention.
Fig. 5 is an elevation view of head 62 in accordance with one embodiment of the present invention. Head 62 includes motor 88 operably coupled to a ball screw (not shown) through belt 89. The ball screw is operably coupled to sensor 84 such that energization of motor 88 causes sensor 84 to move in the X axis direction relative to nozzles 82. Sensor 84 can be adapted to image components coupled to nozzles 82 while scanning in either X axis direction. When such bi-directional scanning is employed, it is useful to provide image processing software that corrects for the fact that the data from scans of opposite directions are essentially flipped around from one another. Additionally, in some bidirectional scanning embodiments, sensor 84 can store the various scanned lines in temporary memory and then send them to the image processing section in correct order.
Fig. 6 is an elevation view of head 87 in accordance with an embodiment of the invention. Head 87 includes plate 90 to which nozzles 82 and linear stage 92 are mounted. Sensor 84 is coupled to linear stage 92 via bracket 94 such that sensor 84 is moveable relative to nozzles 82 and thus component 96.
As can be seen in Fig. 7 at arrow 100, sensor 84 is moveable in the X axis direction relative to components 96.
Fig. 8 is a top plan view of head 87 in accordance with an embodiment of the present invention. For clarity, only four nozzles 82 are shown in Fig. 8; however, any appropriate number of nozzles, including one nozzle, can be used. As indicated by arrows 97 and 98, head 87 is movable in X and Y axis directions. As indicated by arrow 100, sensor 84 is movable in the X axis direction with respect to nozzles 82 via its coupling to linear stage 92. Sensor 84 includes detector window 102 which allows a line of sight between a linear detector (not shown) disposed within sensor 84 and a portion of a component held by one of nozzles 82. The line of sight is preferably parallel to the axis of nozzles 82.
Figs. 9a and 9b illustrate preferred dimensions in relation to the number of nozzles 82 employed on head 87. Fig. 9a indicates that for a four nozzle head, a scan length, WX, of 200mm is preferred. Thus, the maximum scannable length, WX, preferably equals 200mm. Note, total sensor travel should be equal to or greater than the maximum scan length plus the sensor width, WS, so that the sensor is clear of the components while they are being picked up and placed. The number of nozzles employed by head 87 is related to the maximum anticipated size of components that will be placed on the workpiece. Thus, if the anticipated components are larger, such as microprocessors or application specific integrated circuits (ASICs), then fewer nozzles are used. Conversely, if the components are relatively small, such as surface mount capacitors and resistors, then a larger number of nozzles are employed. In Fig. 9b, scan length, WX, is preferably the same as that of Fig. 9a. In both Figs. 9a and 9b, the scan width, WY, (which corresponds to one of the dimensions of the linear detector's field of view perpendicular to the scan direction) is preferably equal to about 26mm. Optionally, longer linear detectors can be used to increase the scan width to 35mm or more. However, as the size of the linear detector is increased, costs rise and sensor scan speed may be adversely affected.
Fig. 10 is a top plan view of placement head 104 for a pick and place machine in accordance with another embodiment of the present invention. Head 104 bears many similarities to head 87, and like components are numbered similarly. As can be seen, head 104 includes body 78 and one or more nozzles 82. Sensor 106 is moveable relative to nozzles 82 since sensor 106 is coupled to motor 88 via ball screw 108. Motor 88 also includes encoder 110 that provides a feedback signal indicative of rotational displacement of ball screw 108 and thus axial displacement of sensor 106 in the X direction. In contrast to sensor 84 shown in Fig. 8, sensor 106 includes a detector window 112 that is perpendicular to a longitudinal axis 114 of sensor 106. Detector window 112, and thus its line of sight, can be positioned anywhere on sensor 106. It is understood that the linear detector of the present invention may be placed virtually anywhere in the sensor housing, since the invention includes the use of mirrors and the like to fold the optical path. Thus, if sensor 106 is adapted to scan components in a single direction (for example, while moving to the right), then window 112 can be disposed proximate a leading edge of sensor 106 such that components are scanned more quickly. In embodiments where sensor 106 is adapted to scan components in either direction (left and right), window 112 is preferably centered upon sensor 106.
Fig. 11 is a flowchart of a method of picking and placing n components upon a workpiece in accordance with the present invention. At block 120, n components are picked up by a pick and place machine head, such as head 87. Subsequently, blocks 122 and 124 are initiated. Thus, a linescan camera begins moving relative to the components as indicated by block 122 and the head begins traveling to the approximate position or site on the workpiece where the first component will be mounted. Preferably, blocks 122 and 124 are executed substantially simultaneously.
At block 130, counter P is initialized to equal 1. Counter P is used to track which component coordinates are being computed, as will be described in greater detail with respect to the rest of Fig. 11. After block 130, blocks 126, 128, and 132 preferably begin execution. Preferably, blocks 126, 128 and 132 execute while the head is transporting components to the approximate placement site. Although such blocks are illustrated and described as executing at least partially in parallel, it is contemplated that such blocks can execute sequentially.
At block 126, the linescan camera passes all n components and collects video data based upon the components .
At block 128, the linescan video data is corrected for non-uniformities. Such non-uniformities may be due to changes in sensor scan speed that occur while scanning is performed. Such correction will be described in greater detail later in the specification. At block 132, X, Y and θ offset adjustments for component cp are computed. The computed offset adjustments are then used in block 134 to calculate final part placement coordinate endpoints for component cp. After component offset adjustments have been computed, counter P is incremented as indicated in block 136. The machine then checks to determine whether the incremented counter (P) exceeds the number of components (n) picked up in block 120, as indicated at block 138. If the incremented counter exceeds the number of components, then control passes to block 140 and offset calculations are ceased. However, if the incremented counter does not exceed the number of components, control returns to block 132 and offset adjustments for component cp are computed. The loop continues with block 132 providing computed offset adjustments to block 134 until offset adjustments have been computed for all n components.
After block 134 receives the placement coordinates, part cp is placed as indicated at block 137. At block 139, the machine checks to determine whether cp is the last component. If component cp is not the last component, control returns to block 124 and the head begins moving to the approximate placement site of the next component. However, if all n components have been placed, then control returns to block 120 and an additional n components are picked up and the method repeats. Preferably, the various steps of placing parts occur while component offset adjustments are calculated.
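The control flow of Fig. 11 can be summarized in the following sketch. The head, camera, and processor objects are hypothetical stand-ins for the machine's subsystems, and steps that run concurrently in the machine are shown sequentially for clarity:

```python
def pick_and_place(components, head, camera, processor):
    """Sketch of the Fig. 11 flow (hypothetical subsystem objects)."""
    while components:
        batch = components[:head.nozzle_count]  # block 120: pick up n components
        components = components[len(batch):]
        head.pick(batch)
        camera.begin_scan()                     # block 122: camera starts moving
        head.move_toward_site(batch[0])         # block 124: head starts traveling
        video = camera.collect(batch)           # block 126: collect video data
        video = processor.correct_nonuniformity(video)  # block 128
        for part in batch:                      # blocks 130-138: loop over parts
            dx, dy, dtheta = processor.offsets(video, part)      # block 132
            endpoint = processor.endpoint(part, dx, dy, dtheta)  # block 134
            head.place(part, endpoint)          # block 137: place the part
```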
Throughout this document, sequential operations for practicing the method of the present invention are disclosed. It is understood that for any successive sequential operations, the first operation need only be commenced before the second operation is started. For example, in the flow chart of Fig. 11, once block 120 is commenced, the operation of picking the component up need not be fully completed before the operation of block 124 is started.
Fig. 12 is an example scan timing chart for a pick and place machine having four nozzles in accordance with an embodiment of the present invention. The vertical lines in Fig. 12 indicate specific time intervals. As can be seen, at time t0 nozzle # 1 is scanned. For the example illustrated in Fig. 12, nozzle scanning requires three time intervals for completion. Thus, nozzle scanning which begins at time t0 will finish at time t3. As can be seen, while nozzle # 1 is scanned, images of the component held by nozzle # 1 begin to be transferred at time t1. At t2, while the nozzle is still being scanned, and while the image is still being transferred, image processing begins. At time t3, scanning of nozzle # 1 has completed and scanning of nozzle # 2 begins even while images of the component held by nozzle # 1 are still being transferred and processed. During a subsequent interval, the sensor clears nozzle # 1, thereby allowing component # 1 to be placed during the following interval. As can be seen, component # 1 is placed even while images of component # 2 are transferred and processed. Thus, those skilled in the art will appreciate that the various steps of scanning, transferring, processing, and placing can overlap to some extent, temporally. Although the description of Fig. 12 indicates that the video data windows are processed in sequential order, such notation is provided for clarity since in some instances it is advantageous to process video windows in an order that enhances assembly. Such processing order can be based upon image collection order, placement order, processing time, and travel time between subsequent sites. Thus, it is expressly contemplated that component images can be processed in an order that differs from the order in which the components were picked up by the head.
Fig. 13 is a diagrammatic view of sensor 106 as it scans a portion of component 96 held by nozzle 82. Sensor 106 is operably coupled to motor 88 via ballscrew 108. Motor 88 is operably coupled to encoder 110 which provides an indication of rotary displacement of ballscrew 108 and thus axial displacement of sensor 106 along the X axis. A linear glass scale type encoder could be substituted for encoder 110.
Sensor 106 includes linear detector 150 coupled to sensor electronics 152. Linear detector 150 is preferably a charge coupled device (CCD) comprising a number of photoelements (pixels) arranged in a line. Preferably, the size of each pixel is approximately 14 microns square. Detector 150 is preferably manufactured by Dalsa Inc., of Waterloo, Ontario, and is model no. IL-CC-2048, although other types of linear detectors may be used in the present invention. Linear detector 150 is optically coupled to a portion of leads 154 through imaging optics 156 and detector window 158. Imaging optics 156 can include lens system 160 and partial mirror 162.
Preferably, sensor 106 also includes one or more illuminators. The embodiment shown in Fig. 13 includes darkfield illuminators 164, diffuse illuminators 166, and brightfield illuminator 168. As used herein, darkfield illumination is intended to mean illumination which impinges upon the component at a high angle of incidence. Diffuse illumination, as used herein, is intended to mean illumination impinging upon the component at a lesser angle of incidence. Brightfield illumination, as used herein, is intended to mean illumination which impinges upon the component at a substantially zero incidence angle.
Thus, brightfield illumination can also be considered specular or through-the-lens illumination. Additional types of illumination, such as backlighting, can also be provided in the pick and place machine by disposing an appropriate source behind component 96 relative to sensor 106, as will be described with respect to Figs. 20a - 20d, and Fig. 21.
In operation, sensor 106 is moved along the X-axis with respect to component 96. While in motion, sensor 106 acquires individual linear images of portions of component 96. By storing multiple linear images and correlating the individual images with sensor location information provided by encoder 110, an image of component 96 can be constructed.
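A minimal sketch of that construction (illustrative only; it assumes each stored line is tagged with the encoder position at which it was acquired):

```python
import numpy as np

def assemble_image(lines, positions_um):
    """Stack individual linear images into a 2D component image,
    ordered by the sensor position reported by encoder 110 for each
    line. lines: sequence of equal-length 1D arrays; positions_um:
    the matching encoder positions in micrometers."""
    order = np.argsort(positions_um)
    return np.stack([lines[i] for i in order], axis=0)
```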
Illumination emanating from any of darkfield illuminators 164, diffuse illuminators 166 or brightfield illuminator 168 is reflected by a portion of component 96 proximate detector window 158. The reflected illumination is redirected by partial mirror 162 through lens system 160, and thereby focused upon linear detector 150. Each individual pixel of linear detector 150 is a charge coupled device element which provides a representation of the sum of illumination falling upon the pixel during an integration period.
Lens system 160 can be any suitable optical device capable of focusing an object line upon linear detector 150. Thus, lens system 160 can be a refractive lens system or a diffractive lens system. Such a refractive lens system can preferably include a gradient index (GRIN) lens array, available from NSG America, Inc., of Somerset NJ, or a traditional refractive lens system. A diffractive lens system can include a holographic lens array.
Sensor 106 is coupled to sensor controller 170 of host 172. Sensor controller 170 can receive and store each individual image line in a frame buffer, and provide suitable signals to sensor 106 to control the intensity of any of illuminators 164, 166, and 168 as well as pixel exposure control (as will be explained later in the specification). Since host 172 is coupled to encoder 110, sensor controller 170 can provide illumination intensity signals to any of the illuminators based upon position of sensor 106 along the X-axis or based upon the scan speed of sensor 106 along the X-axis. Host 172 also includes motion controller 174 that is coupled to motor 88, nozzle motor 176 and a nozzle encoder (not shown). Thus, host 172 acquires an image of component 96 from linear detector 150 as sensor 106 is moved in the X direction relative to component 96. Host 172 is adapted through suitable software, hardware, or both, to compute a current orientation of component 96 in X-axis, Y-axis, and θ directions. Based upon the computed orientation, host 172 causes motion controller 174 to issue suitable motion commands to motors 70, 74 (shown in Fig. 4) and nozzle motor 176 to cause nozzle 82 to deposit component 96 in a desired component position and orientation on the workpiece.
Motion controller 174 is adapted to vary scan speed based upon any number of criteria which will be explained in greater detail later in the specification.
Fig. 14 is a system block diagram of another embodiment of the present invention. Fig. 14 illustrates sensor head 700 mechanically coupled to a sensor motion system 702, which provides the unidirectional motor drive for head 700. Sensor head 700 moves with a generic component head 708 in at least one direction so as to form an "on-head" system.
On a system-level basis, sensor head 700, sensor motion system 702, controller 706 and component head 708 form a closed control loop 709 (not shown). In control loop 709, the host processor sends a desired placement signal to the component head motor drive. The component head motor drive starts to move the component head to the nominal placement location. Then the combination of the sensor head and sensor motor drive scans the component and outputs partial images that allow formatter 714 to form an assembled image of the component; video processor 728 then processes the assembled image to compute an x, y, and θ orientation of the component. The video processor sends this x, y, and θ orientation to the host processor in the pick and place machine, which computes a correction signal that is provided to the placement head to properly orient and place the component.
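The arithmetic that closes this loop is simple; as a sketch, assuming poses are (x, y, θ) tuples:

```python
def placement_correction(nominal_pose, measured_pose):
    """Correction computed by the host processor: the difference
    between the nominal placement pose and the measured component
    pose, applied by the placement head before the part is set down."""
    return tuple(n - m for n, m in zip(nominal_pose, measured_pose))
```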
A host processor 712, within the controller 706 of a pick and place machine, sends a desired placement location via bus 720 and bus 710 to component head 708. Sensor head 700 scans across the component, collecting a plurality of images of the component each representative of a portion of the component. Typically, as many as 2000 partial images can be taken to view an entire 25 millimeter x 25 millimeter component. If the scan includes, for example, four components, 8000 partial images may be needed to compute the necessary orientation data.
Video formatter 714 receives outputs from detector read-out block 716 in sensor head 700, via bus 718. The function of formatter 714 is preferably carried out in a separate electronic chip other than processor 712, but both functions may be embodied in the same component. Video formatter 714 assembles the partial images from detector read-out block 716 to form an assembled image. Additionally, formatter 714 optionally performs windowing of specific areas of the assembled images, (e.g., corners), performs magnification of specific areas and also may provide non-uniformity correction of the assembled images, where one of the dimensions of the image is disproportionately modified with respect to other dimensions of the assembled image, due to non-uniform spacing of partial images in time or space.
Non-uniformity corrections may be necessary in order to provide a 1:1 x:y representation of the scanned images to the video processor, since a 1:1 x:y representation of the scanned images is only available when the scan speed is set to cover the width of the pixel (typically 14 microns). Many video processors are incompatible with non 1:1 x:y representations of scanned images, but some are able to receive 8:1 x:y representations of scanned images, for example. When it is necessary to re-format the scanned video, the present invention includes decimating the scanned image to make it compatible with the video processor. In some cases, the non-uniformity in the scanned images results from variable scan speeds and exposure periods, as detailed in the discussion for Figs. 15a - 15g. Preferably, the non-uniformity correction and decimation are performed by video formatter 714 in real time as each partial image is received and before each partial image is stored. Finally, video formatter 714 is also adaptable to correct for pixel to pixel variations in gain and offset.
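As an illustrative sketch of this correction (not formatter 714's actual implementation), unevenly spaced scan lines can be interpolated onto a uniform grid one pixel width apart to restore a 1:1 x:y representation:

```python
import numpy as np

def resample_uniform(lines: np.ndarray, positions_um: np.ndarray,
                     pixel_um: float = 14.0) -> np.ndarray:
    """Interpolate unevenly spaced scan lines onto a uniform grid.

    lines: 2D array, one row per partial image; positions_um: each
    row's position along the scan, assumed strictly increasing."""
    grid = np.arange(positions_um.min(), positions_um.max(), pixel_um)
    out = np.empty((grid.size, lines.shape[1]), dtype=np.float64)
    for col in range(lines.shape[1]):
        out[:, col] = np.interp(grid, positions_um, lines[:, col])
    return out
```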
Internal bus 720 connects formatter 714, processor 712, operator interface and display 722, placement head motion control system 724 and video processor 728. In addition to providing a desired placement location for the component to placement head motion system 724, the host processor also provides various timing signals for proper operation of the controller 706. For instance, when the pick and place machine has more than one component head, the host processor includes suitable collision avoidance functionality to prevent collisions between the various component heads and also to prevent collisions between sensor and nozzles. Operator interface and display 722 allows the operator of the pick and place machine to program specific movements and associated timing of scans, as well as overall operation and diagnostics of the pick and place machine. A video display of the assembled image is also displayed for the operator. Such display is especially useful for displaying the windowing, magnification and non-uniformity correction options mentioned above. Placement head motion control system 724 includes a set of x, y, z, and θ motors for moving component head
708 in the x, y, z, and θ directions, as well as control electronics 726 for timing such movements and re-formatting the electrical digital signals from host processor 712 into analog signals generally required to drive the x, y, z, and θ motors. A bank of x, y, z, and θ encoders encodes the position of component head 708 and provides these signals to host processor 712. The encoder signals, together with the x, y, and θ position of the component provided by video processor 728, provide a control loop for x, y, and θ positioning of the component. Video processor 728 can be a microprocessor such as an Intel Pentium® processor and is preferably included in every embodiment of the present invention.
Sensor head 700 includes detector 704. The detector/illuminator control electronics 730 provides control signals for detector 704, depending on the type of operation desired.
An illuminator 732 physically resides in sensor head 700, and illuminator control electronics 730 also control the operation of illuminator 732 to provide illumination from one or a combination of brightfield, backlit, darkfield illuminators (with individually addressable LED's) for rounded objects, and a source for diffuse illumination.
Finally, control electronics 736 within sensor motion system 702 provide timing and position instructions to head 700. A control loop is formed by control electronics 736, which sends out instructions representative of the desired position for head 700, motor/encoders 734, and head 700. The time constant of this control loop, however, should be less than that of the other control loop 709, since the line scan sensor should complete its scan in less time than is required to place the component.
Fig. 15a is a diagrammatic view of linear detector 150. Linear detector 150 includes pixel reset drain 180, photoelements 182, and CCD readout shift register 184. Each of photoelements 182 is preferably sized to be approximately 14 micrometers x 14 micrometers. However, any appropriately sized photoelements can be used. During scanning, photoelements 182 are exposed to an image and each photoelement 182 accumulates charge related to the amount of light that falls upon the individual photoelement during the integration period. The integration period thus corresponds to the amount of time that the photoelement is exposed to the image. Modifying the integration period has a direct impact on photoelement exposure control. When sensor scan speed is constant, the integration period is generally centered about fixed intervals along the scan direction.
For each line scanned, the amount of charge accumulated in a given photoelement is transferred to CCD readout shift register 184. Individual photoelement values are then read sequentially from CCD readout shift register 184. Additionally, although one CCD readout shift register tap is illustrated in Fig. 15a, any appropriate number of CCD readout shift register taps can be used to simultaneously read out data from different portions of the CCD readout shift register. Moreover, although Fig. 15a shows linear detector 150 having a single CCD readout shift register 184, multiple CCD readout shift registers, or multiple taps, can be used. Multiple CCD readout shift register taps allow different portions of the CCD array to be read simultaneously. In another embodiment, other types of linear arrays of conventional design are acceptable, such as those which provide video data at a rate of 40 MHz.
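As a rough sketch of why multiple taps shorten readout time (a hypothetical model, not part of the disclosure), the following fragment splits one line of photoelement values into contiguous segments, one per tap, that would be clocked out in parallel:

```python
def read_line_multitap(shift_register, n_taps):
    """Split one line into contiguous segments, one per tap; the
    segments are clocked out in parallel, so line readout time falls
    by roughly a factor of n_taps."""
    n = len(shift_register)
    seg = -(-n // n_taps)  # ceiling division
    return [shift_register[i * seg:(i + 1) * seg] for i in range(n_taps)]

# A 1024-pixel line read through 4 taps yields four 256-pixel segments.
segments = read_line_multitap(list(range(1024)), 4)
assert [len(s) for s in segments] == [256, 256, 256, 256]
```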
When sensor scan speed varies during a scan, such speed variations can introduce image distortions, or non-uniformities, unless they are compensated. These non-uniformities arise from imperfect motion control or even intentional acceleration of the camera and the component with respect to each other. Such compensation will be described with respect to Figs. 15b - 15j.
In the pair of Figs. 15b and 15c, the velocity of the camera with respect to the component is not constant. Fig. 15b shows varying exposure lengths, 3a - 3g, where the commencement of each exposure length repeats at fixed intervals. For example, exposure lengths 3a - 3g each start 5 tick marks after the start of the previous exposure length. Fig. 15c shows varying exposure periods, 4a - 4g, where each exposure period corresponds to each of exposure lengths 3a - 3g, respectively. Note that while the widths of the exposure lengths 3a - 3g vary, the widths of the exposure periods, 4a - 4g, do not. This method is relatively easy to implement; however, it creates some image distortion because the center-to-center spacing of the exposure lengths is not constant.
Another way to compensate for scan speed variations is illustrated in Figs. 15d - 15g. In Fig. 15d, successive exposure lengths 5a - 5g are shown. Each exposure length 5a - 5g commences at fixed intervals, and each exposure length ends at fixed intervals. Fig. 15e shows exposure periods 6a - 6g, which correspond to exposure lengths 5a - 5g, respectively. Exposure periods 6a - 6g commence and end at no particular fixed interval, however, because the scan speed varies. Although this method ensures that the center-to-center spacing between exposure lengths is constant (allowing for simpler video data processing), the effective gain of each line of video data (a function of the instantaneous velocity of the camera during the exposure time of each partial image) is not constant. This results in variations in average intensity between successive lines of video data 7a - 7g (assuming that the target which is being viewed is of uniform reflectivity), as illustrated by assembled scanned image 7h in Fig. 15f. Image 7h comprises video lines 7a - 7g, which are the result of exposure lengths 5a - 5g in Fig. 15d and the corresponding exposure periods 6a - 6g in Fig. 15e.
The assembled scanned image 7h displayed in Fig. 15f can be improved dramatically by correcting the effective gain of each line of video data. This is done by normalizing all the pixel values for a partial image by the exposure period associated with that image. For example, if the nominal exposure period, T0, gives a full-scale output for a highly reflective surface, then each pixel in a partial image can be scaled by
Pn,i = Pm,i * (T0 / Tm)
where Pn,i is the normalized pixel value for the ith pixel along the line, Pm,i is the measured pixel value for the ith pixel along the line, and Tm is the measured exposure period for such line.
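A minimal numerical sketch of this normalization, applying the scaling above to one line of video data (the names are illustrative only):

```python
import numpy as np

def normalize_line(measured_pixels, t_measured, t_nominal):
    """Apply Pn,i = Pm,i * (T0 / Tm) to one line of video data so that
    lines captured during faster or slower motion have equal gain."""
    return np.asarray(measured_pixels, dtype=float) * (t_nominal / t_measured)

# A line exposed 20% longer than nominal is scaled back down so its
# average intensity matches its neighbours.
line = np.array([96.0, 150.0, 240.0])
print(normalize_line(line, t_measured=1.2e-4, t_nominal=1.0e-4))
# -> [ 80. 125. 200.]
```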
A preferred device for correcting the effective gain is illustrated in Fig. 15g. Encoder signals are received based upon sensor movement, and counted by encoder counter 181. Count information is provided by counter 181 to CCD timing block 183 such that each exposure interval begins and ends on evenly spaced spatial intervals. Exposure timer 185 contains a digital counter and an accurate time-base clock that rapidly increments the digital counter. The count information from encoder counter 181 is used to reset and start the digital counter of exposure timer 185 at the beginning of an exposure, and to stop the digital counter at the end of the exposure. The ending digital count (representing the exposure period) is then latched by a buffer in exposure timer 185. The latched exposure time information is then provided to digital-to-analog (D/A) converter 189 such that converter 189 sends an analog signal to programmable gain amplifier 191. Video data from detector 150 is applied as an input to amplifier 191, and amplifier 191 generates normalized, or corrected, levels for each pixel in the line of video data.
Fig. 15h illustrates another device that is useful for compensating for the type of image shown in Fig. 15f. Encoder counter 181 and exposure timer 185 function as described with respect to Fig. 15g. However, the exposure period in counts from exposure timer 185 is sent to illuminator control block 193 to adjust the illuminator intensity based upon the exposure period. For short exposure periods, the illuminator would be adjusted to higher intensity levels for the next exposure period. Conversely, for long exposure periods, the illuminator would be adjusted to a lower intensity level for the next exposure period. The exposure periods could also be measured over shorter exposure lengths just before the beginning of the next exposure length in order to obtain a better estimate of instantaneous scan speed.
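A sketch of this intensity feedback, under the assumption that intensity is scaled inversely with the last measured exposure period and clamped to the illuminator's range (the 0 to 1 level range and the names are assumptions, not from the disclosure):

```python
def next_illuminator_level(current_level, t_exposure, t_nominal,
                           level_min=0.0, level_max=1.0):
    """Short exposure periods (fast scan) get a higher intensity for
    the next exposure period; long exposure periods get a lower one."""
    level = current_level * (t_nominal / t_exposure)
    return min(level_max, max(level_min, level))

# The last exposure was 20% shorter than nominal, so the level rises.
print(next_illuminator_level(0.5, t_exposure=8e-5, t_nominal=1e-4))  # 0.625
```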
Another method of compensating for variations in scan speed is described with respect to Figs. 15i and 15j. This method uses a constant exposure period, and centers the exposure lengths at constant intervals. The exposure period of a given partial image is measured and the starting position for the following exposure length is adjusted such that the exposure length is centered about the desired spatial position. For example, if the exposure period is Tf, then the total exposure length is given by:
XL = Vi * Tf
where Vi is the instantaneous scan speed. If Xc is the desired center of the exposure length, then the starting position of the exposure length, Xs, is given by:
Xs = Xc - (XL / 2)
The instantaneous scan speed, Vi, is calculated by measuring the time it takes the sensor to travel a short distance. If the sensor is accelerating rapidly, scan speed can be measured at many points prior to the desired exposure period. Instantaneous scan speed can then be estimated by fitting a curve to these points and extrapolating what the scan speed will be during the next exposure period.
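A minimal sketch of this centering calculation, using a straight-line fit to recent encoder samples as a simple stand-in for the curve fit described above (names are illustrative only):

```python
import numpy as np

def exposure_start(x_center, t_exposure, sample_times, sample_positions):
    """Estimate the instantaneous scan speed Vi from recent encoder
    samples, then start the exposure so that the exposure length
    XL = Vi * Tf is centered on Xc: Xs = Xc - XL / 2."""
    v_i = np.polyfit(sample_times, sample_positions, 1)[0]  # slope = speed
    return x_center - (v_i * t_exposure) / 2.0

# Sensor accelerating: positions (mm) sampled at 1 ms intervals.
t = np.array([0.000, 0.001, 0.002, 0.003])
x = np.array([0.00, 0.10, 0.21, 0.33])
# Extrapolated speed ~110 mm/s, so a 100 us exposure centered at
# 0.50 mm starts at roughly 0.4945 mm.
print(exposure_start(x_center=0.50, t_exposure=1e-4,
                     sample_times=t, sample_positions=x))
```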
As the readout rate of the linear detector is increased, the amount of time that each photoelement (pixel) is exposed to light (the integration period) decreases. Thus, increasing the readout rate eventually leads to a situation where the photoelements are inadequately exposed using common illumination methods. In such case, another type of CCD linear detector, known as a Time Delay and Integrate (TDI) detector, can be used. TDI detectors improve the light gathering abilities of a photoelement, and overcome the problem of underexposure. TDI-based linear detectors are commonly used for the inspection of very high speed conveyor processes such as beverage bottling or the like. The TDI architecture requires that the linear detector be precisely synchronized to the motion of the sensor, and such synchronization can be effected with encoder 110.
Fig. 16a is an elevation view of a portion of sensor 106. Sensor 106 includes darkfield illuminators 164 and diffuse illuminators 166. In the embodiment shown in Fig. 16a, diffuse illuminators 166 each include a discrete array or bank of light emitting diodes 186 and mirrors 188. Banks 186 can be surface mount, T-1, or monolithic array LEDs, and are disposed to direct illumination horizontally with respect to sensor 106. Mirrors 188 are positioned to reflect the horizontal illumination emanating from banks 186 upon a portion of component 96 at a suitable angle of incidence for diffuse illumination. Additionally, a lens can be placed in front of each of banks 186 to increase the power density of the illuminators. Such lens can be a common cylinder lens, a Fresnel type lens, or a rod lens. Although the use of such lenses is described with respect to banks 186, such lenses can be used for any illuminators of the embodiments of the present invention.
Fig. 16b is a top plan view of the portion of sensor 106 illustrated in Fig. 16a. As can be seen, diffuse illuminators 166 are disposed on opposite sides of detector window 158. Each diffuse illuminator 166 includes a linear bank of light emitting diodes 186 and a mirror 188. Darkfield illuminators 164 are disposed at intervals about detector window 158. Preferably, darkfield illuminators 164 are disposed in a ring about detector window 158. However, other configurations can also be used.
Fig. 16c is a top plan view of a portion of sensor 106 in accordance with another embodiment. As can be seen, sensor 106 includes detector window 158 and diffuse illuminators 166 precisely as provided in the embodiment shown in Fig. 16b. However, in the embodiment shown in Fig. 16c, individual darkfield illuminators 164 are replaced with darkfield light emitting diode banks 190. Preferably, the darkfield diode banks are disposed in a ring about detector window 158 as shown in Fig. 16c. Diode banks 190 can include lenses as described in connection with diffuse illuminators 166 with respect to Fig. 16a.
Any of the illuminator embodiments shown herein can be practiced with singularly addressable illumination components (e.g., diffuse illuminators, LEDs, darkfield LEDs, etc.). It is understood that the singularly addressable elements can also be grouped into contiguous physical segments, such as quadrants or pie-shaped slices. Fig. 16b shows quadrants of addressable LEDs, Q1 - Q4, although it is understood that the enhanced lighting provisions need not be symmetric. In fact, when confronted with an asymmetrical object of interest, such as in a wire bonder, it is preferred to have an asymmetrical illumination arrangement. Additionally, the enhanced illumination provisions disclosed in this paragraph may be coupled with the use of color (e.g., blue or red LEDs) which may be addressed as provided for above.
Fig. 17a is a top plan view of sensor 192 in accordance with another embodiment of the invention. Sensor 192 includes detector window 158 disposed approximately equidistant between edges 194 and 196. Sensor 192 includes diffuse illuminators 198, each of which preferably includes a number of individual illuminator sources 200, such as light emitting diodes. Each illuminator 198 also preferably includes a lenticular diffuser 202. Discrete domed surface mount LED's can be used for sources 200. The dome of each LED acts as a lens to increase the power density of the LED. However, when such domed LED's are placed in close proximity to the object, non-uniform illumination can occur since the illumination does not have sufficient distance to spread out. In such case, lenticular diffusers 202 are useful to spread out the illumination from sources 200.
As can be seen in Fig. 17b, each lenticular diffuser 202 is disposed to intercept illumination emanating from individual illuminators 200 and diffuse such illumination prior to illuminating component 96.
A lenticular diffuser is an array of cylindrical-like grooves which scatters light only in the direction perpendicular to the length of the grooves. Fig. 17c is a diagrammatic view of lenticular diffuser 202 having a multiplicity of grooves 204. Although a lenticular diffuser is shown in Figs. 17a-17c, other suitable devices can be used that diffuse illumination from individual illuminators 200.
Figs. 18a and 18b are diagrammatic views of a portion of sensor 192 in accordance with another embodiment of the present invention. Sensor 192 includes illuminators 210 disposed on opposite sides of detector window 158. Each illuminator 210 includes a plurality of individual sources 212 disposed to direct light relatively horizontally in relation to sensor 192. Illuminators 210 include faceted reflectors 214 disposed between individual illuminator sources 212 and detector window 158. Each reflector 214 includes a reflective surface 216 that has a number of reflective portions disposed at different angles relative to one another. Thus, as can be seen in Fig. 18b, some illumination emanating from individual illuminator sources 212 intercepts faceted reflector 214 at various angles, and faceted reflector 214 reflects the illumination from individual illuminator sources 212 to a portion of component 96 at varying angles of incidence. As can further be seen, some illumination emanating from individual illuminator sources 212 proceeds directly to component 96, thus illuminating a portion of component 96 with darkfield illumination.
Figs. 19a and 19b are diagrammatic views of illuminators 220 in accordance with another embodiment of the present invention. Illuminators 220 are disposed on opposite sides of detector window 158. Each illuminator 220 includes illumination source 222 and waveguide 224.
Illumination source 222 is preferably a conventional laser bar. A laser bar consists of an array of many semiconductor laser diodes. The laser bar can be operated in a constant power mode, or pulsed for each line of CCD data. Properly pulsing the laser bar at higher power for a period of time that is short compared to the CCD line rate results in the same amount of energy exposing the linear detector. However, pulsing the laser bar improves its electrical to optical conversion efficiency, and the heat generated by the laser bar is reduced.
Illumination emanating from source 222 (shown diagrammatically as rays R) travels through waveguide 224 and falls upon internal reflector 226. Reflector 226 reflects the illumination to surface 228. Surface 228 is preferably roughened in order to diffuse illumination emanating from surface 228 before illuminating component 96. In a preferred embodiment, surface 228 is roughened by etching.
Figs. 20a-20d are diagrammatic views of a linescan sensor including a wing illuminator in accordance with another embodiment of the present invention. Fig. 20a illustrates sensor 240 having illuminator 242 coupled thereto and extending from sensor 240. Arrow 241 indicates the direction of travel for the linescan sensor. Illumination sources 246 are disposed to direct illumination upon diffuser/reflector 247, mounted to nozzle 82 above component 96. Illumination sources 246 preferably comprise light emitting diodes. As can be appreciated, illumination emanating from sources 246 provides backlighting to component 96 while imaged by sensor 240. Fig. 20b is a top plan view illustrating sensor 240 scanning multiple components 96.
Fig. 20c illustrates another wing illuminator in accordance with an embodiment of the present invention. Illuminators 242 extend from sensor 240 to a position above component 96 and include LED banks 245. LED banks 245 are adapted to direct illumination to reflectors 251 which direct the illumination through diffuser 253, thus backlighting component 96. A shield 257 is also provided to prevent illumination from essentially leaking past diffuser 253 and falling directly upon sensor 240.
Fig. 20d illustrates another wing-lighting embodiment wherein sensor 240 includes illuminators 242, each of which includes a source 259 and lens 265 to backlight the chisel tips of component 96, which is illustrated as a connector. Connectors are usually problematic because the pins are very shiny and come nearly to a point. These tips reflect most of the light away from the imaging system in standard frontlighting methods and hence are not visible in the image. Sources 259 direct illumination through lenses 265 and onto the pin tips of component 96, where it is specularly reflected and collected by the imaging system of sensor 240 to generate an image of the pin tips. The illumination illustrated in Fig. 20d is referred to as directional side lighting.
Fig. 21 is an elevation view of a portion of a pick and place machine in accordance with another embodiment of the present invention. As can be seen, sensor 260 is positioned below component 96. Sensor 260 includes a linear detector as described in the above embodiments and can include any and all of the illumination embodiments described thus far. Additionally, backlight illuminator 269 is operably coupled to nozzle 82 to provide backlight to component 96 while it is imaged by sensor 260. Backlight illuminator 269 includes individual illumination sources 271 and diffuser 273. Preferably, individual illumination sources 271 are LED's and diffuser 273 is any suitable optical device. Backlight illuminator 269 is preferably provided on all nozzles 82 within the pick and place machine. However, it is expressly contemplated that the pick and place machine can include any appropriate combination of backlit and non-backlit nozzles. Furthermore, it is understood that the enhanced illumination features discussed at Fig. 16d, particularly with respect to singularly addressable colored LEDs, are useful with the present embodiment.
Figs. 22a-22q are photographs of various components imaged in accordance with an embodiment of the present invention with varying intensities of different types of illumination. Different illumination types highlight or suppress certain component features. Such illumination is thus useful for component placement and inspection.
Fig. 22a is a flip-chip image acquired in accordance with the present invention having a field of view of 16mm x 16mm. The image of Fig. 22a was acquired using brightfield illumination adjusted to half its maximum intensity value (i.e., 0.50 units).
Fig. 22b is a flip-chip image acquired in accordance with an embodiment of the present invention. Fig. 22b has a field of view of 16mm x 16mm similar to that of Fig. 22a. Illumination of the flip-chip in Fig. 22b differs from that of Fig. 22a in that darkfield illumination of .75 units and diffuse illumination of .5 units was directed upon the flip-chip during image acquisition.
Fig. 22c is an enlarged view of a portion of the flip-chip shown in Figs. 22a and 22b. In Fig. 22c, the field of view is approximately 2.7mm x 2.0mm. Fig. 22c shows individual balls on the surface of the flip-chip, such image magnified as output from video formatter 714. Each ball has a diameter of approximately 150 micrometers. The illumination used to acquire Fig. 22c consisted of .75 darkfield units in combination with .50 diffuse illumination units.
Fig. 22d is an image acquired of a quad flat pack (QFP). The field of view of Fig. 22d is approximately 26mm x 26mm. The illumination employed to acquire the image of Fig. 22d consisted of 1.0 brightfield units. Surface indicia on the bottom of the QFP can be seen clearly in Fig. 22d, as well as individual leads extending from all four sides of the QFP.
Fig. 22e is another image acquired of the QFP in accordance with an embodiment of the present invention. Illumination used to acquire image 22e consisted of .2 brightfield units, .4 diffuse units and .7 darkfield units. Contrasting image 22e with image 22d illustrates that illumination intensity adjustment can facilitate the imaging of the individual QFP leads.
Fig. 22f is an image of a portion of the QFP shown in Figs. 22d and 22e, which portion has been digitally magnified and windowed, as output from video formatter 714. The field of view of Fig. 22f is approximately 7.25mm x 6.5mm. The illumination used to acquire the image of Fig. 22f consisted of .2 brightfield units, .4 diffuse units, and .7 darkfield units. The individual leads of the QFP shown in Fig. 22f have a width of approximately .30mm and have a lead pitch of .65mm.
Fig. 22g is an image of a micro ball grid array acquired in accordance with an embodiment of the present invention. Image 22g has a field of view of approximately 9.5mm x 7.6mm. Illumination used to acquire the image of Fig. 22g consisted of .4 brightfield units and .8 darkfield units.
Fig. 22h is an image of the same micro ball grid array except that the illumination used for Fig. 22h consisted solely of 1.0 brightfield units.
Fig. 22i is an image of a portion of the micro ball grid array shown in Figs. 22g and 22h, magnified and windowed as output from video formatter 714. Fig. 22i has the same field of view as Figs. 22g and 22h. The illumination used to acquire Fig. 22i consisted solely of 1.0 darkfield units. By contrasting Figs. 22g and 22h with each other, one can appreciate the versatility of providing different types of illumination upon components in order to resolve certain types of component characteristics.
Fig. 22j is an image of a semiconductor die-in-package. Although the component shown in Fig. 22j is generally not placed by a pick and place machine, the image of Fig. 22j is highly useful for a wire bonder, which couples minute wires between pads on the semiconductor and the leads. The image shown in Fig. 22j has a field of view of approximately 16mm x 16mm. Illumination consisted of .5 brightfield units.
Fig. 22k shows the same semiconductor die-in-package and has a field of view identical to that of Fig. 22j. Illumination consisted of .25 brightfield units, 1.0 darkfield unit, and 1.0 diffuse unit. Note that when diffuse lighting is used, such as in Fig. 22k, wire bonds between the semiconductor and the individual leads are visible.
Fig. 22l is an enlarged view of a portion of the semiconductor die-in-package. The field of view of Fig. 22l is approximately 5.0mm x 7.0mm. Fig. 22l is taken from a portion of the image of Fig. 22k, which portion has been digitally magnified and windowed as output from formatter 714.
Fig. 22m is an image of a screen printing stencil acquired in accordance with an embodiment of the present invention. Although a stencil is generally not a component that is placed by a pick and place machine, imaging such a stencil is highly useful for a screen printer device. The image of Fig. 22m has a field of view of approximately 9mm x 13mm. Illumination consisted of .6 brightfield units. Note that the larger stencil openings are .65mm in pitch and the smaller stencil openings have a .30mm pitch. Fig. 22m illustrates an undesirable screen printing condition in which the stencil is clogged. Specifically, some of the larger openings near the top of Fig. 22m are partially obstructed by solder paste.
Fig. 22n is an image acquired of the same screen printing stencil as that of Fig. 22m. The field of view is identical to Fig. 22m, and the main difference between the two figures is that of illumination. Specifically, 1.0 darkfield units and 1.0 diffuse units of illumination were employed during the acquisition of Fig. 22n. Contrasting Figs. 22n and 22m reveals that darkfield illumination is highly useful in accentuating solder paste adhering to the bottom of the stencil. The solder paste is shown in Fig. 22m proximate the large stencil openings.
Fig. 22o is an image of bare copper fiducial marks acquired in accordance with an embodiment of the present invention. Specifically, Fig. 22o has a field of view of approximately 7mm x 22mm. 1.0 diffuse illumination units and 1.0 darkfield illumination units were employed during the acquisition of Fig. 22o. The shape of each copper fiducial mark can be seen clearly in Fig. 22o. Such fiducial marks are often employed on circuit boards to provide reference location information. Imaging systems image such fiducial marks to essentially calculate the relative position of the circuit board to facilitate board assembly functions such as screen printing, or component pick and placement.
Fig. 22p is an image of a solder paste test print acquired in accordance with an embodiment of the present invention. The field of view of Fig. 22p is approximately 26mm x 26mm. 1.0 diffuse illumination units and 1.0 darkfield illumination units were employed during the acquisition of Fig. 22p. Fig. 22q is an enlarged view of the boxed portion of Fig. 22p. The field of view of Fig. 22q is approximately 2.5mm x 2.5mm, and the image was magnified and windowed as output from formatter 714. The illumination used during the acquisition of Fig. 22q is the same as that of Fig. 22p. Note that each solder paste print shown in Fig. 22q has a diameter of approximately .3mm.
Although Figs. 22a-22q show images where each line of the image was acquired with the same illumination type and intensity as its neighbor, embodiments of the invention can be practiced with interleaved images. Interleaved images are created by changing illumination levels on a line by line basis. For example, every odd line can be exposed to darkfield illumination while every even line is exposed to diffuse illumination. This creates essentially two images which are parsed by the video formatter to generate separate windows. An image co-processor then uses the windows to generate placement offset information, and to perform component inspection (which will be described later in the specification).
Interleaved images can also be used to increase the dynamic range of the linescan sensor. Such enhanced dynamic range is useful for imaging components that have reflectivity changes greater than the dynamic range of the linear detector. Without extending the dynamic range of the detector, some pixels of the image may be saturated or clipped, or some pixels may be completely dark if the reflectivity changes are too large. For dynamic range enhancement, a first interleaved image is acquired with illuminators set to the maximum possible illumination levels that still allow the desired illuminator type ratios (such as 2:1 darkfield:diffuse). The second interleaved image is acquired with lower illumination levels, while still maintaining the chosen illuminator ratios. Preferably, the second image is acquired with all illuminator levels reduced by a factor of 16. The video formatter takes the first interleaved image and essentially discards all saturated pixels. Then the video formatter replaces the discarded pixels with corresponding pixels from the second interleaved image, which are then properly scaled by the ratio of illumination levels between the first and second images (such as 16). This method can be adapted to use third, fourth, fifth, etc. images to further extend the dynamic range of the linescan sensor.
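A minimal sketch of the merge step just described, assuming 8-bit video data and the preferred 16:1 illumination ratio (the names are illustrative only):

```python
import numpy as np

def fuse_interleaved(bright_img, dim_img, illum_ratio=16.0, full_scale=255):
    """Extend dynamic range: keep pixels of the brightly lit image, but
    replace saturated ones with the corresponding pixels of the dimly
    lit image, scaled up by the ratio of illumination levels."""
    bright = np.asarray(bright_img, dtype=float)
    dim = np.asarray(dim_img, dtype=float)
    out = bright.copy()
    sat = bright >= full_scale
    out[sat] = dim[sat] * illum_ratio
    return out

# Two pixels: the first is valid; the second saturates and is recovered
# from the image taken at 1/16th the illumination level.
print(fuse_interleaved([120, 255], [30, 40]))  # -> [120. 640.]
```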
As can be appreciated, the various types of illumination and image acquisition provided by embodiments of the present invention are highly useful not only for a pick and place machine but also for wire bonders and screen printers. Moreover, different characteristics of individual features on the bottom of a given component can be resolved to greater or lesser degrees as desired. Thus, embodiments of the present invention are useful not only for the placing of components, but for the inspection of components as well. "Inspection" includes presence, absence, feature dimension (such as ball diameter), and lead tweeze.
Dimensional inspection of the component itself is useful for identifying an undesirable condition known as tombstoning. Tombstoning is a condition in which the component is picked up by a surface other than that opposite the mounting surface. One example of tombstoning is when a chip capacitor is picked up in such a way as to extend partially into the nozzle. Such a condition is undesirable because the pick and place machine cannot correct the orientation in order to mount the tombstoned component. This condition is detected by measuring the length and width of the component and comparing such measurements to nominal dimensions. Generally, the component dimensions will deviate slightly from nominal based upon manufacturing tolerances; however, the deviation is very drastic when a component is tombstoned.
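A minimal sketch of such a dimensional check; the 15% tolerance threshold is an assumption for illustration, not a value from the disclosure:

```python
def is_tombstoned(meas_length, meas_width, nom_length, nom_width,
                  tolerance=0.15):
    """Compare the measured footprint to nominal; a tombstoned chip
    presents an end face to the sensor, so at least one dimension
    deviates far beyond ordinary manufacturing tolerance."""
    return (abs(meas_length - nom_length) / nom_length > tolerance or
            abs(meas_width - nom_width) / nom_width > tolerance)

# A 1.0 x 0.5 mm chip cap standing on end images as roughly 0.5 x 0.5 mm.
print(is_tombstoned(0.5, 0.5, 1.0, 0.5))  # -> True
```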
Figs. 23a and 23b are diagrammatic views of lens system 160 in accordance with an embodiment of the present invention. The system shown in Figs. 23a and 23b includes a gradient index (GRIN) lens array. A gradient index lens is an optical element within which the refractive index is a smooth, but not constant, function of position and, as a result, the ray paths are curved. This is also known as a graded index lens. The ray curving characteristic of GRIN lens 270 is shown in Fig. 23b, where rays emanating from object line 261 enter GRIN lens 270 and begin to curve as indicated. Rays exiting GRIN lens 270 converge and are focused at 262 upon linear detector 150. A GRIN lens array provides a large, compact field of view for imaging systems of embodiments of the invention. Although GRIN lens 270 is shown as an example of lens array 160, any suitable optical element capable of focusing object line 261 upon a linear detector can be used. The compact nature of the GRIN lens array allows the pick and place machine of the present invention to have a reduced nozzle "z" stroke. A reduced nozzle "z" stroke is essential to rapid placement of components, since each time a component is placed, the nozzle must be lifted in order to clear the sensor for scanning and then lowered by approximately the same distance to place the component.
Fig. 24 is an elevation view of sensor 278 in accordance with another embodiment of the present invention. Sensor 278 bears many similarities to sensor 106 (shown in Fig. 13), and like components are numbered similarly. Sensor 278 differs from sensor 106 because sensor 278 includes an additional linear detector 280 optically coupled to a portion of the component to be imaged via imaging optics 282. As shown in Fig. 24, imaging optics 282 includes a set of lenses 284 and beam splitting mirror 286. Beam splitting mirror 286 reflects a portion of the component image through lenses 284, which reflected portion is focused upon second linear detector 280. A portion of the image also passes through beam splitting mirror 286 and is reflected by beam splitting mirror 156, which reflected image is then focused by the imaging optics upon first linear detector 150. The arrangement illustrated in Fig. 24 provides a dual field of view optical system that provides high magnification and high resolution as well as a large field of view. Although the dual field of view embodiment is useful for a variety of applications, it is particularly useful for wire bonder applications.
As discussed above, each of nozzles 82 within the pick and place machine can be adapted to pick and place a different type of electrical component. Examples of such different component types include flip-chips, ball grid arrays (BGA's), micro ball grid arrays, quad flat packs (QFP's), connectors, pin grid arrays, dual inline packages, single inline packages, plastic leaded chip carriers (PLCC's), chip capacitors, and chip resistors. Moreover, each nozzle 82 can be independently adapted to pick and place a different type of component than other nozzles 82. Because different component types can require different image resolutions, embodiments of the present invention preferably change image resolution by changing scan velocity based upon component type.
Fig. 25 is a chart of preferred image resolution for a variety of component types, and the associated scan time required to provide the desired resolution.
In order to image a flip-chip having 75 micrometer balls, it is preferred that an image resolution of 14 micrometers be used. Scanning 200mm at a 14 micrometer resolution takes approximately 1.6 seconds, with the first image becoming available at 300 milliseconds.
In order to image a micro ball grid array having 150 micrometer balls, or a quad flat pack having .3 to .4mm lead pitch, it is preferred to use an image resolution of approximately 28 micrometers. Image acquisition at the desired resolution takes approximately 800 milliseconds for a 200mm scan length, and the first image becomes available at 200 milliseconds.
A 0603 metric chip cap, or a quad flat pack having a lead pitch between .5 and .625mm, is preferably imaged at a resolution of about 56 micrometers. Total scan time at the 56 micrometer image resolution is approximately 400 milliseconds for a 200mm scan, with the first image becoming available after 100 milliseconds.
A 1005 metric chip cap, a ball grid array having a 1.25mm lead pitch, or a plastic leaded chip carrier having a 1.25mm lead pitch is preferably imaged at a resolution of approximately 112 micrometers. Total scan time for a scan length of 200mm is about 200 milliseconds, and the first image becomes available after approximately 50 milliseconds. Because various nozzles 82 may hold different types of electrical components, scan speed, and thus resolution, can be varied as the sensor traverses the entire scan length.
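The times above are mutually consistent with a fixed line rate of roughly 8,900 lines per second (scan time equals the number of lines divided by the line rate); the following sketch reproduces the chart under that inferred, not stated, assumption:

```python
def scan_time_s(scan_length_mm, resolution_um, line_rate_hz=8929.0):
    """Scan time for one pass: lines to acquire divided by line rate.
    The ~8.9 kHz line rate is inferred from the chart, not stated."""
    n_lines = scan_length_mm * 1000.0 / resolution_um
    return n_lines / line_rate_hz

for res in (14, 28, 56, 112):
    print(f"{res:>3} um -> {scan_time_s(200, res):.2f} s")
# 14 um -> 1.60 s, 28 um -> 0.80 s, 56 um -> 0.40 s, 112 um -> 0.20 s
```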
Figs. 26a-26c show various possible velocity profiles for the sensor. Fig. 26a shows scan velocity increasing linearly during an initial period and leveling off for an extended period of time for a fine pitch quad flat pack component. Fig. 26b shows a sample velocity profile for scanning a coarse pitch quad flat pack component. In comparison to Fig. 26a, scanning as shown in Fig. 26b provides higher sensor acceleration and a higher scan velocity than that of Fig. 26a. Thus, image resolution is traded for faster scan time and thus increased throughput. Fig. 26c is a sample velocity profile when multiple types of components are used. Sensor velocity increases initially just as in Fig. 26a and maintains the selected velocity while the appropriate component (such as a fine pitch quad flat pack component) is scanned. As the sensor finishes scanning the component, sensor scan velocity is increased as shown for a different type of component (such as a coarse pitch quad flat pack component). After the second component is scanned, sensor velocity can again be changed, such as the decrease shown in Fig. 26c, to scan a third component, such as a micro ball grid array.
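Since the sensor advances one resolution step per line, scan velocity is simply resolution times line rate, and a mixed pass like Fig. 26c can be planned as a sequence of per-component velocities. A sketch, reusing the line rate inferred above (an assumption, with illustrative names):

```python
LINE_RATE_HZ = 8929.0  # assumed, inferred from the scan-time chart above

def scan_velocity_mm_s(resolution_um):
    """Velocity at which the sensor advances one resolution step per line."""
    return resolution_um / 1000.0 * LINE_RATE_HZ

# Per-segment plan for the mixed pass of Fig. 26c: fine-pitch QFP,
# coarse-pitch QFP, then a micro ball grid array.
plan = [("fine-pitch QFP", scan_velocity_mm_s(28)),
        ("coarse-pitch QFP", scan_velocity_mm_s(56)),
        ("micro BGA", scan_velocity_mm_s(28))]
for name, v in plan:
    print(f"{name}: {v:.0f} mm/s")  # 250, 500, 250 mm/s
```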
Figs. 27a-27h illustrate ways in which subsets of contiguous video data, or windows, can be used to provide enhanced video data for specific types of applications. The sensor and video formatter preferably include window circuitry that is adapted to provide one or more windows as the video output. The scan width, Wy, is shown on each of Figs. 27a - 27h.
Fig. 27a illustrates an embodiment where a high resolution window 300 is provided as the video output. Window 300 encompasses component 96 and this particular embodiment is useful for small components. Thus, all features of the entire bottom surface of component 96 are imaged at high resolution and provided in window 300.
Fig. 27b illustrates an embodiment where two high resolution windows 302 are provided as the video output. Such configuration is particularly useful for identifying and precisely positioning corners of component 304 when component 304 has dimensions which exceed those of high resolution windows 302. As illustrated in Fig. 27c, additional high resolution windows can be provided to image all corners of component 304, which is particularly useful in precisely locating corners of a component.
Fig. 27d illustrates a large field of view window 306 encompassing all of component 304. Such configuration is particularly useful for 100% presence inspection. Thus, when window 306 is used, the pick and place machine can ensure that all balls of a ball grid array, for example, are present.
Fig. 27e shows an embodiment combining the windows of Figs. 27c and 27d. Thus, four high resolution windows 302 precisely locate corners of component 304 while window 306 is used for component inspection. The embodiment shown in this figure is useful for detecting balls.
Fig. 27f illustrates an embodiment where high resolution windows 302 image portions of component 310. The four high resolution windows 302 are then combined by the sensor to provide one EIA or CCIR composite image 312 as the sensor video output.
Fig. 27g illustrates an embodiment where multiple windows 314 are used to scan a long component such as a connector 316. Fig. 27h illustrates an embodiment where windows 318 are oriented at an angle relative to the scan direction to enhance corner detection. Preferably, windows 318 are oriented at 45° from the scan direction. To further enhance corner detection of large components (having a diagonal or diameter greater than the scan width, Wy), additional techniques can be employed. For example, once the part is picked up by nozzle 82, nozzle 82 can rotate the part 45° such that at least two of its corners should be within the scan width. For additional accuracy, once the component is scanned after the first 45° rotation, the component can be rotated an additional 90° to place the other two corners within the scan width, after which a second scan can be performed, preferably in the opposite direction for efficiency. In this manner, all four corners can be precisely located and/or inspected. Further, for very large components, any number of scans can be performed at any appropriate angular increments. Additionally, in some embodiments where the pick-up uncertainty is smaller than the pitch of the leaded component, only the center portion of a large component is measured for component orientation. These techniques are especially useful for placing and/or inspecting quad flat packs (QFP's) and ball grid arrays having a size greater than 25mm.
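A geometric sketch of the rotation technique (illustrative only, assuming the nozzle axis is centered in the scanned band): given the part's dimensions and the scan width, it checks which corners fall inside the band after a rotation about the part's center.

```python
import math

def corners_in_band(width_mm, height_mm, theta_deg, scan_width_mm):
    """Return True for each corner that lies inside the scan band after
    the part is rotated by theta about its center."""
    th = math.radians(theta_deg)
    hw, hh = width_mm / 2.0, height_mm / 2.0
    corners = ((hw, hh), (-hw, hh), (-hw, -hh), (hw, -hh))
    return [abs(x * math.sin(th) + y * math.cos(th)) <= scan_width_mm / 2.0
            for x, y in corners]

# A 30 mm square in a 25 mm band: no corner is visible unrotated; after
# a 45 degree rotation two opposite corners fall inside the band, and a
# further 90 degrees brings in the other two.
print(corners_in_band(30, 30, 0, 25))    # [False, False, False, False]
print(corners_in_band(30, 30, 45, 25))   # [False, True, False, True]
print(corners_in_band(30, 30, 135, 25))  # [True, False, True, False]
```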
Fig. 27i illustrates another embodiment of a line scan sensor 151 with which large components can be measured. Components larger than Ydet can be measured by offsetting nozzle 82 with respect to detector window 152. Detector 150 is shown within detector window 152, establishing a line of sight of a component 153. In Fig. 27i, parts having a dimension up to twice the Y2 dimension can be measured by scanning once, and then rotating the component 180 degrees and scanning again, preferably in the opposite direction. Components smaller than twice the Y1 dimension can be measured in a single pass. The offset nozzle method can be combined with the CCD readout register multi-tap method. Such combination allows small parts to be measured by using only one or two of the taps, while larger parts are measured using all taps. This facilitates high speed, high resolution scanning of small components while retaining the ability to measure large components.
Although embodiments of the invention, thus far, have been described with respect to a pick and place machine, various embodiments have applicability to other electronic assembly devices such as wire bonders and screen printers.
Fig. 28 is a perspective view of a wire bonder in accordance with the prior art. Bonder 320 includes a bonder head 322 that is adapted to dispense and connect individual wires between die pads 324 and lead pads 326. Wire loop 323 has been bonded between a pad 326 and one of pads 324. Bonder 320 uses conventional imaging camera 328 to precisely locate the various pads in order to electrically couple them with wires. Camera 328 includes illuminators 330, lensing system 332, mirror 334 and area detector 336. As is known, illuminators 330 illuminate the pads to be bonded, and lens system 332 and mirror 334 cooperate to focus an image of the pads upon area detector 336. Area detector 336 is coupled to additional electronics to process the image to thereby compute precise die pad and lead pad locations.
Fig. 29 is a top plan view of a wire bonder in accordance with an embodiment of the present invention. Wire bonder 340 includes bonder head 322 and linescan camera 344 in accordance with an embodiment of the present invention. Linescan camera 344 is preferably constructed in accordance with any of the various embodiments described above with respect to pick and place machines. Although detector window 346 of linescan camera 344 is disposed at an angle (of approximately 45°) relative to the scan direction X, other embodiments are possible where the detector window is perpendicular to scan direction X.
However, orienting the detector window at an angle relative to the scan direction facilitates scanning all four sides of the die.
Fig. 30 shows a prior art printed circuit board solder paste application system 350. System 350 includes stencil 352 with stencil apertures 355 for applying solder paste through apertures 355 onto printed circuit board 354 to form corresponding solder bricks 357. After the solder paste is squeegeed onto board 354, the board is lowered, or stencil 352 is raised, and imaging system 356 is moved into place to inspect both stencil 352 and board 354 simultaneously.
Imaging system 356 includes mirror 358 for directing both images through a lens assembly 360 and onto CCD area detector 362. System 350 suffers from the limitations that it is relatively thick, and that its field of view is limited by the size of area detector 362.
Fig. 31 shows a portion of a printed circuit board solder paste application system in accordance with an embodiment of the present invention. Sensor 370 is adapted to move relative to stencil 352 and printed circuit board 354 in the direction of arrow 371. Sensor 370 includes first linear detector 374 disposed within housing 372. First linear detector 374 can be identical to detector 150 described above. First linear detector 374 is adapted to view stencil 352 through first imaging optics 376. Illuminators 378 are provided to illuminate stencil 352 or circuit board 354. A second linear detector 380 is also disposed within housing 372, and is adapted to view circuit board 354 through second imaging optics 382. Thus, in the embodiment shown in Fig. 31, the linescan camera of the above embodiments has essentially been duplicated such that both the stencil and the printed circuit board can be scanned simultaneously. Since sensor 370 is thin and compact, it is able to fit in smaller spaces between stencil 352 and board 354 than prior systems. Thus, a screen printer in accordance with embodiments of the invention does not require as much time for the stencil and board to separate, thus reducing cycle time and increasing system throughput.
A screen printer in accordance with an embodiment of the present invention provides a variety of inspections for the stencil, circuit board, or both. For example, the stencil can be inspected for clogged openings (see Fig. 22m) or solder paste adhered to the bottom of the stencil (see Fig. 22n). The circuit board can be inspected for proper solder pad registration (with respect to the circuit board), bridging between solder pads, and irregularly shaped solder pads.
Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention. For example, the functions of disclosed blocks of circuitry or optics herein may be embodied in a combined block of circuitry or optics so as to maintain the function of the disclosed embodiment.

Claims

WHAT IS CLAIMED IS:
1. A pick and place machine, comprising: a head; at least one nozzle coupled to the head for releasably holding a component, the nozzle having a nozzle axis; an illuminator configured to illuminate at least a portion of the component; a sensor adapted to move relative to the component, the sensor including imaging optics and a linear detector, the imaging optics configured to form a series of partial images of the component on the linear detector as the sensor moves relative to the component, the sensor having a line of sight substantially parallel to the nozzle axis; and a controller configured to compute a current orientation of the component as a function of the series of partial images, the controller further configured to compare a desired orientation of the component to the current orientation and provide a correction instruction to adjust the current orientation of the component.
2. The machine of claim 1, wherein the controller further comprises a video processor adapted to provide the correction instruction to a motion controller, and wherein the head places the component on a workpiece based at least in part upon the correction instruction.
3. The machine of claim 1, wherein the at least one nozzle comprises a plurality of nozzles each adapted to releasably hold a respective component.
4. The machine of claim 3, wherein component placement order is different than an order in which the respective components are picked up by the head.
5. The machine of claim 1, wherein the line of sight is disposed proximate a leading edge of the sensor.
6. The machine of claim 1, wherein the illuminator is adapted to illuminate the component with at least one type of illumination selected from the group consisting of brightfield, darkfield, backlight and diffuse.
7. The machine of claim 1, wherein the sensor is adapted to image the component at a first preselected resolution in a direction perpendicular to a scan direction.
8. The machine of claim 7, wherein the sensor is adapted to image the component at a second preselected resolution in a direction parallel to the scan direction.
9. The machine of claim 8, wherein the first resolution is different than the second resolution.
10. The machine of claim 1, wherein the illuminator comprises a plurality of closely packed light emitting diodes.
11. The machine of claim 10, wherein the diodes are disposed in a ring.
12. The machine of claim 1, and further comprising a mirror disposed relative to the diodes to direct illumination to an area of interest on the component .
13. A sensor for providing an output signal representative of a series of partial images of a component, the component mounted to a nozzle in a pick and place machine, the nozzle having a nozzle axis, the sensor comprising: a sensor housing adapted to move relative to the component; a linear detector adapted to view the component from a line of sight substantially parallel to the nozzle axis; imaging optics positioned between the component and the linear detector, the optics configured to form an image of the component on the detector; and wherein the linear detector provides an output representative of a series of partial images of at least a portion of the component as the optical sensor moves relative to the component.
14. The sensor of claim 13, wherein the illuminator is adapted to illuminate the component with at least one type of illumination selected from the group consisting of brightfield, darkfield, backlight and diffuse.
15. The sensor of claim 13, wherein the line of sight is disposed proximate a leading edge of the sensor.
PCT/US1999/026186 1998-11-05 1999-11-04 Electronics assembly apparatus with improved imaging system WO2000026640A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2000579970A JP2002529907A (en) 1998-11-05 1999-11-04 Electronic circuit assembly device with improved image forming system
DE19982498T DE19982498T1 (en) 1998-11-05 1999-11-04 Electronics mounting fixture with improved imaging system
KR1020007007461A KR20010040321A (en) 1998-11-05 1999-11-04 Electronics assembly apparatus with improved imaging system
GB0014999A GB2347741A (en) 1998-11-05 1999-11-04 Electronics assembly apparatus with improved imaging system

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US10718898P 1998-11-05 1998-11-05
US10750598P 1998-11-06 1998-11-06
US13199699P 1999-04-30 1999-04-30
US14461699P 1999-07-20 1999-07-20
US14461499P 1999-07-20 1999-07-20
US60/131,996 1999-07-20
US60/107,505 1999-07-20
US60/107,188 1999-07-20
US60/144,614 1999-07-20
US60/144,616 1999-07-20

Publications (1)

Publication Number Publication Date
WO2000026640A1 true WO2000026640A1 (en) 2000-05-11

Family

ID=27537179

Family Applications (3)

Application Number Title Priority Date Filing Date
PCT/US1999/026186 WO2000026640A1 (en) 1998-11-05 1999-11-04 Electronics assembly apparatus with improved imaging system
PCT/US1999/026162 WO2000026850A1 (en) 1998-11-05 1999-11-05 Electronics assembly apparatus with stereo vision linescan sensor
PCT/US1999/026076 WO2000028278A1 (en) 1998-11-05 1999-11-05 Electronics assembly apparatus with height sensing sensor

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/US1999/026162 WO2000026850A1 (en) 1998-11-05 1999-11-05 Electronics assembly apparatus with stereo vision linescan sensor
PCT/US1999/026076 WO2000028278A1 (en) 1998-11-05 1999-11-05 Electronics assembly apparatus with height sensing sensor

Country Status (6)

Country Link
US (2) US6608320B1 (en)
JP (3) JP2002529907A (en)
KR (3) KR20010040321A (en)
DE (3) DE19982498T1 (en)
GB (3) GB2347741A (en)
WO (3) WO2000026640A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005048675A2 (en) * 2003-11-07 2005-05-26 Cyberoptics Corporation Pick and place machine with improved setup and operation procedure
WO2010112613A1 (en) * 2009-04-03 2010-10-07 Singulus Technologies Ag Method and device for aligning substrates
EP2533622A1 (en) * 2011-06-09 2012-12-12 Yamaha Hatsudoki Kabushiki Kaisha (Yamaha Motor Co., Ltd.) Component imaging method, component imaging device, and component mounting device having component imaging device
US8388204B2 (en) 2009-09-22 2013-03-05 Cyberoptics Corporation High speed, high resolution, three dimensional solar cell inspection system
DE102009016288B4 (en) * 2009-01-02 2013-11-21 Singulus Technologies Ag Method and device for aligning substrates
US8670031B2 (en) 2009-09-22 2014-03-11 Cyberoptics Corporation High speed optical inspection system with camera array and compact, integrated illuminator
US8681211B2 (en) 2009-09-22 2014-03-25 Cyberoptics Corporation High speed optical inspection system with adaptive focusing
US8872912B2 (en) 2009-09-22 2014-10-28 Cyberoptics Corporation High speed distributed optical sensor inspection system
US8894259B2 (en) 2009-09-22 2014-11-25 Cyberoptics Corporation Dark field illuminator with large working area
JP2016006422A (en) * 2009-03-24 2016-01-14 オルボテック・リミテッド Multimode imaging
US11915563B2 (en) 2019-07-25 2024-02-27 Philip Morris Products S.A. Vending apparatus for aerosol generating articles

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7423734B1 (en) * 2000-04-25 2008-09-09 Ilmar Luik Combined video camera and toolholder with triangulation sensing
US7813559B2 (en) * 2001-11-13 2010-10-12 Cyberoptics Corporation Image analysis for pick and place machines with in situ component placement inspection
KR100540442B1 (en) * 2002-10-01 2006-01-16 가부시키가이샤 도쿄 웰드 Illuminating method and illuminating apparatus
CH696615A5 (en) * 2003-09-22 2007-08-15 Esec Trading Sa A method for adjustment of the bonding head of a die bonder.
US7559134B2 (en) * 2003-11-04 2009-07-14 Cyberoptics Corporation Pick and place machine with improved component placement inspection
US7706595B2 (en) 2003-11-07 2010-04-27 Cyberoptics Corporation Pick and place machine with workpiece motion inspection
DE102004052884B4 (en) * 2004-11-02 2010-05-20 Siemens Electronics Assembly Systems Gmbh & Co. Kg Illumination arrangement and optical measuring system for detecting objects
US7359068B2 (en) * 2004-11-12 2008-04-15 Rvsi Inspection Llc Laser triangulation method for measurement of highly reflective solder balls
DE102005054921A1 (en) * 2004-12-13 2006-06-22 Assembléon N.V. Surface mount technology system with non-contact interface for a component mounting machine and having two optical transceivers connected via a contact free connection
WO2007015561A1 (en) * 2005-08-02 2007-02-08 Matsushita Electric Industrial Co., Ltd. Electronic component mounter and mounting method
JP2007042766A (en) * 2005-08-02 2007-02-15 Matsushita Electric Ind Co Ltd Mounting device and mounting method of electronic component
GB0616410D0 (en) * 2006-08-21 2006-10-18 Anaglyph Ltd Visual aid for manufacturing assembly/component placement
JP4939391B2 (en) * 2007-03-28 2012-05-23 ヤマハ発動機株式会社 Mounting machine
JP5239561B2 (en) * 2008-07-03 2013-07-17 オムロン株式会社 Substrate appearance inspection method and substrate appearance inspection apparatus
SG173068A1 (en) * 2009-02-06 2011-08-29 Agency Science Tech & Res Methods for examining a bonding structure of a substrate and bonding structure inspection devices
US20100295935A1 (en) * 2009-05-06 2010-11-25 Case Steven K On-head component alignment using multiple area array image detectors
JP5780712B2 (en) * 2009-05-29 2015-09-16 富士機械製造株式会社 Imaging system and electronic circuit component mounting machine
JP2011080932A (en) * 2009-10-09 2011-04-21 Fujitsu Ltd Surface inspection device and method
WO2013069100A1 (en) * 2011-11-08 2013-05-16 株式会社メガトレード Apparatus for inspecting printed board
WO2013099981A1 (en) * 2011-12-27 2013-07-04 シーシーエス株式会社 Linear light irradiation device
JP5863547B2 (en) 2012-04-20 2016-02-16 ヤマハ発動機株式会社 Printed circuit board inspection equipment
US10083496B2 (en) 2012-05-22 2018-09-25 Cognex Corporation Machine vision systems and methods with predictive motion control
US9200890B2 (en) 2012-05-22 2015-12-01 Cognex Corporation Machine vision systems and methods with predictive motion control
TWI493201B (en) * 2012-11-09 2015-07-21 Ind Tech Res Inst Method and system for pins detection and insertion of electrical component
US11176635B2 (en) 2013-01-25 2021-11-16 Cyberoptics Corporation Automatic programming of solder paste inspection system
DE102013215430B4 (en) * 2013-08-06 2016-07-14 Lufthansa Technik Ag processing device
US9743527B2 (en) 2013-08-09 2017-08-22 CyberOptics Corporaiton Stencil programming and inspection using solder paste inspection system
EP3057390B1 (en) * 2013-10-09 2020-02-19 FUJI Corporation Loading position optimization program
US9704232B2 (en) 2014-03-18 2017-07-11 Arizona Board of Regents of behalf of Arizona State University Stereo vision measurement system and method
CN104236609A (en) * 2014-10-17 2014-12-24 珠海格力电工有限公司 Enameled wire detection head assembly
JP6485064B2 (en) * 2015-01-21 2019-03-20 株式会社ジェイテクト Sphere position measurement method
MY184276A (en) * 2015-02-16 2021-03-30 Exis Tech Sdn Bhd Device and method for conveying and flipping a component
US10575451B2 (en) * 2015-06-16 2020-02-25 Fuji Corporation Insertion component positioning inspection method and insertion component mounting method, and insertion component positioning inspection device and insertion component mounting device
WO2016203638A1 (en) * 2015-06-19 2016-12-22 ヤマハ発動機株式会社 Component mounting device, and component mounting method
JP6660774B2 (en) * 2016-03-08 2020-03-11 オリンパス株式会社 Height data processing device, surface shape measuring device, height data correction method, and program
US11877401B2 (en) 2017-05-31 2024-01-16 Fuji Corporation Work machine, and calculation method
JP2020527854A (en) 2017-07-12 2020-09-10 マイクロニック アクティエボラーグ Methods and systems for determining component lighting settings
CN110999566B (en) * 2017-08-09 2021-01-12 株式会社富士 Component mounting machine
US11367703B2 (en) * 2017-10-26 2022-06-21 Shinkawa Ltd. Bonding apparatus
US10842026B2 (en) 2018-02-12 2020-11-17 Xerox Corporation System for forming electrical circuits on non-planar objects

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4772125A (en) * 1985-06-19 1988-09-20 Hitachi, Ltd. Apparatus and method for inspecting soldered portions
US5208463A (en) * 1990-08-24 1993-05-04 Hitachi, Ltd. Method and apparatus for detecting deformations of leads of semiconductor device
US5278634A (en) * 1991-02-22 1994-01-11 Cyberoptics Corporation High precision component alignment sensor system
US5461480A (en) * 1993-06-14 1995-10-24 Yamaha Hatsudoki Kabushiki Kaisha Parts recognizing device for mounting machine
US5559727A (en) * 1994-02-24 1996-09-24 Quad Systems Corporation Apparatus and method for determining the position of a component prior to placement
US5619328A (en) * 1993-12-27 1997-04-08 Yamaha Hatsudoki Kabushiki Kaisha Component mounter and recognition method
US5999640A (en) * 1996-03-15 1999-12-07 Matsushita Electric Industrial Co. Ltd. Electronic part mounting apparatus with reflector which optimizes available light

Family Cites Families (101)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5148591A (en) 1981-05-11 1992-09-22 Sensor Adaptive Machines, Inc. Vision target based assembly
US4473842A (en) 1981-07-06 1984-09-25 Tokyo Shibaura Denki Kabushiki Kaisha Apparatus and method for examining printed circuit board provided with electronic parts
JPS58111705A (en) 1981-12-25 1983-07-02 Mitsutoyo Mfg Co Ltd Optical measuring device
US4578810A (en) 1983-08-08 1986-03-25 Itek Corporation System for printed circuit board defect detection
DE3474750D1 (en) 1983-11-05 1988-11-24 Zevatech Ag Method and device for positioning elements on a work piece
JPS60217470 (en) 1984-04-13 1985-10-31 Hitachi Ltd System for estimating the solid shape of an object from a picked-up image
JP2537770B2 (en) 1984-08-31 1996-09-25 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method
JPS61152100 (en) 1984-12-26 1986-07-10 TDK Corporation Apparatus and method for mounting electronic component
JPH0781846B2 (en) 1985-01-09 1995-09-06 Toshiba Corporation Pattern edge measuring method and device
US4876728A (en) 1985-06-04 1989-10-24 Adept Technology, Inc. Vision system for distinguishing touching parts
US4727471A (en) 1985-08-29 1988-02-23 The Board Of Governors For Higher Education, State Of Rhode Island And Providence Miniature lightweight digital camera for robotic vision system applications
JPS62114289 (en) 1985-11-14 1987-05-26 Matsushita Electric Industrial Co., Ltd. Mounting of electronic parts and apparatus for the same
US4782273A (en) 1986-08-08 1988-11-01 Control Data Corporation Automatic part location and mechanical testing of part insertion
US4811410A (en) 1986-12-08 1989-03-07 American Telephone And Telegraph Company Linescan inspection system for circuit boards
US4738025A (en) 1986-12-23 1988-04-19 Northern Telecom Limited Automated apparatus and method for positioning multicontact component
US4875778A (en) 1987-02-08 1989-10-24 Luebbe Richard J Lead inspection system for surface-mounted circuit packages
JPH0737892B2 (en) 1988-01-12 1995-04-26 Dainippon Screen Mfg. Co., Ltd. Pattern defect inspection method
US4969108A (en) 1988-04-08 1990-11-06 Cincinnati Milacron Inc. Vision seam tracking method and apparatus for a manipulator
JPH0218900A (en) 1988-07-06 1990-01-23 Hitachi Ltd Ion damp
DE3823836A1 (en) 1988-07-14 1990-01-18 Fraunhofer Ges Forschung Method for measuring the placement of components, and device for carrying out the method
JP2688361B2 (en) 1988-08-02 1997-12-10 Masami Yamakawa Photoelectric sensor
US5084962A (en) 1988-08-24 1992-02-04 Tdk Corporation Apparatus for and method of automatically mounting electronic component on printed circuit board
US5030008A (en) 1988-10-11 1991-07-09 Kla Instruments, Corporation Method and apparatus for the automated analysis of three-dimensional objects
US4920429A (en) 1989-01-24 1990-04-24 International Business Machines Exposure compensation for a line scan camera
JP2832992B2 (en) 1989-04-17 1998-12-09 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method
JP2822448B2 (en) 1989-05-22 1998-11-11 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method
JPH0824232B2 (en) 1989-05-29 1996-03-06 Rohm Co., Ltd. Chip part front/back judgment device
US5342460A (en) 1989-06-13 1994-08-30 Matsushita Electric Industrial Co., Ltd. Outer lead bonding apparatus
JP2805854B2 (en) 1989-06-28 1998-09-30 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method
JP2803221B2 (en) 1989-09-19 1998-09-24 Matsushita Electric Industrial Co., Ltd. IC mounting apparatus and method
JP2847801B2 (en) 1989-09-26 1999-01-20 Matsushita Electric Industrial Co., Ltd. Electronic component mounting device
JPH03117898A (en) 1989-09-29 1991-05-20 Mitsubishi Electric Corp Control device
JP2773307B2 (en) 1989-10-17 1998-07-09 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method
US4980971A (en) 1989-12-14 1991-01-01 At&T Bell Laboratories Method and apparatus for chip placement
JPH03203399A (en) 1989-12-29 1991-09-05 Matsushita Electric Ind Co Ltd Parts mounting device
JP2876046B2 (en) 1990-03-15 1999-03-31 Yamagata Casio Co., Ltd. Component mounting work equipment
JP2811899B2 (en) 1990-04-05 1998-10-15 Matsushita Electric Industrial Co., Ltd. Electronic component mounting equipment
JP2858349B2 (en) 1990-04-11 1999-02-17 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method and device
US4959898A (en) 1990-05-22 1990-10-02 Emhart Industries, Inc. Surface mount machine with lead coplanarity verifier
JP2844489B2 (en) 1990-06-20 1999-01-06 Matsushita Electric Industrial Co., Ltd. Electronic component mounting equipment
JP2870142B2 (en) 1990-07-17 1999-03-10 NEC Corporation Coplanarity measuring method and apparatus
US5096353A (en) 1990-07-27 1992-03-17 Motorola, Inc. Vision system for a robotic station
US5249349A (en) 1991-01-24 1993-10-05 Matsushita Electric Works, Ltd. Parts mounting device
JP2517178B2 (en) 1991-03-04 1996-07-24 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method
JPH04343178A (en) 1991-05-20 1992-11-30 Sony Corp Image processor
JP2554437Y2 (en) 1991-05-30 1997-11-17 Nikon Corporation Camera display device
JP3104300B2 (en) 1991-06-05 2000-10-30 Ishikawajima-Harima Heavy Industries Co., Ltd. Gas-liquid separation device
US5195234A (en) 1991-08-19 1993-03-23 Motorola, Inc. Method and apparatus for visual alignment of parts
JP2969401B2 (en) 1991-10-29 1999-11-02 Shinkawa Ltd. Bonding wire inspection device
US5237622A (en) 1991-12-04 1993-08-17 Micron Technology, Inc. Semiconductor pick-and-place machine automatic calibration apparatus
DE4304276A1 (en) 1992-02-17 1993-08-19 Galram Technology Ind Ltd Forming high resolution image of planar or three-dimensional object - combining sharp image data provided by detector matrix for successive scanning of object via optical imaging system.
JP2769947B2 (en) 1992-05-15 1998-06-25 Tsubakimoto Chain Co. Manipulator position/posture control method
JP3114034B2 (en) 1992-06-05 2000-12-04 Yamaha Hatsudoki Kabushiki Kaisha Component mounting method and component mounting device
TW223184B (en) 1992-06-18 1994-05-01 Matsushita Electron Co Ltd
US5309522A (en) 1992-06-30 1994-05-03 Environmental Research Institute Of Michigan Stereoscopic determination of terrain elevation
EP0578136B1 (en) 1992-07-01 1995-11-22 Yamaha Hatsudoki Kabushiki Kaisha Method for mounting components and an apparatus therefor
JP2554424B2 (en) 1992-08-04 1996-11-13 Yamaha Hatsudoki Kabushiki Kaisha Parts mounting device
DE69300853T3 (en) 1992-07-01 1999-05-12 Yamaha Motor Co Ltd Method of assembling components and device therefor
JP3289197B2 (en) 1992-08-31 2002-06-04 Kyocera Corporation Transmission power amplifier
US5878484A (en) 1992-10-08 1999-03-09 Tdk Corporation Chip-type circuit element mounting apparatus
JP2554431B2 (en) 1992-11-05 1996-11-13 Yamaha Hatsudoki Kabushiki Kaisha Mounting device component suction state detection device
JP3110898B2 (en) 1992-11-17 2000-11-20 Toshiba Corporation Inverter device
JP3261770B2 (en) 1992-11-19 2002-03-04 Matsushita Electric Industrial Co., Ltd. Component mounting device
DE4332236A1 (en) 1992-11-26 1995-03-23 F E S Used Electronics Elektro Automatic removal system
JPH0715171A (en) 1993-06-28 1995-01-17 Matsushita Electric Ind Co Ltd Component mounting device
US5403140A (en) 1993-10-13 1995-04-04 Storage Technology Corporation Dynamic sweeping mechanism for a line scan camera
JPH07115296A (en) 1993-10-15 1995-05-02 Sanyo Electric Co Ltd Controller for component mounting machine
US5434629A (en) 1993-12-20 1995-07-18 Focus Automation Systems Inc. Real-time line scan processor
JPH07193397A (en) 1993-12-27 1995-07-28 Yamaha Motor Co Ltd Suction point correction device of mounting device
JP3090567B2 (en) 1993-12-29 2000-09-25 Yamaha Hatsudoki Kabushiki Kaisha Component recognition method and device for mounting machine
JPH07212096A (en) 1994-01-21 1995-08-11 Yamaha Motor Co Ltd Component recognition apparatus for mounting machine
US5555090A (en) 1994-10-24 1996-09-10 Adaptive Optics Associates System for dimensioning objects
US6118538A (en) 1995-01-13 2000-09-12 Cyberoptics Corporation Method and apparatus for electronic component lead measurement using light based sensors on a component placement machine
JP2937785B2 (en) 1995-02-02 1999-08-23 Yamaha Hatsudoki Kabushiki Kaisha Component state detection device for mounting machine
JP3117898B2 (en) 1995-05-29 2000-12-18 Sharp Corporation Induction heating cooker
US5661561A (en) 1995-06-02 1997-08-26 Accu-Sort Systems, Inc. Dimensioning system
JP3402876B2 (en) 1995-10-04 2003-05-06 Yamaha Hatsudoki Kabushiki Kaisha Surface mounting machine
KR0152879B1 (en) 1995-10-10 1998-12-15 Lee Hee-jong Chip recognition method and apparatus for surface mounting device
US5671527A (en) 1995-11-01 1997-09-30 Fuji Machine Mfg. Co., Ltd. Electronic-component mounting system
SG52900A1 (en) 1996-01-08 1998-09-28 Matsushita Electric Ind Co Ltd Mounting apparatus of electronic components and mounting methods of the same
US5787577A (en) 1996-08-16 1998-08-04 Motorola, Inc. Method for adjusting an electronic part template
US5832107A (en) 1996-09-19 1998-11-03 Optical Gaging Products, Inc. Optical system for stereoscopically measuring feature heights based on lateral image offsets
JP3265198B2 (en) 1996-09-20 2002-03-11 Matsushita Graphic Communication Systems, Inc. Structured document creation device, structured document creation method, communication device, and communication method
US5768759A (en) 1996-11-19 1998-06-23 Zevatech, Inc. Method and apparatus for reflective in-flight component registration
JP4067602B2 (en) 1996-12-09 2008-03-26 Fujitsu Limited Height inspection method and height inspection apparatus for implementing the method
US5777746A (en) 1996-12-31 1998-07-07 Pitney Bowes Inc. Apparatus and method for dimensional weighing utilizing a mirror and/or prism
JP3769089B2 (en) 1997-01-20 2006-04-19 Yamaha Hatsudoki Kabushiki Kaisha Component recognition device for mounting machine
JP3030499B2 (en) 1997-02-19 2000-04-10 Otsuka Chemical Co., Ltd. Curing agent for epoxy resin
JPH11188914A (en) 1997-12-25 1999-07-13 Hitachi Cable Ltd Light emitting diode array
US6018865A (en) 1998-01-20 2000-02-01 Mcms, Inc. Method for calibrating the Z origin position
US6031242A (en) 1998-01-23 2000-02-29 Zevatech, Inc. Semiconductor die in-flight registration and orientation method and apparatus
JP3744179B2 (en) 1998-02-19 2006-02-08 Matsushita Electric Industrial Co., Ltd. Electronic component mounting method
JP4303345B2 (en) 1998-03-12 2009-07-29 Juki Corporation Surface mount component mounting machine
US6160348A (en) 1998-05-18 2000-12-12 Hyundai Electronics America, Inc. DC plasma display panel and methods for making same
US5999206A (en) 1998-05-22 1999-12-07 Futaba Denshi Kogyo Kabushiki Kaisha Device for expanding light-amount correction dynamic range
DE19826555A1 (en) 1998-06-15 1999-12-16 Martin Umwelt & Energietech Method of placing components on circuit boards
US6243164B1 (en) 1998-07-13 2001-06-05 Electro Scientific Industries Method and system for determining lead coplanarity
KR100635954B1 (en) 1998-08-04 2006-10-19 CyberOptics Corporation Enhanced sensor
JP4260280B2 (en) 1999-04-13 2009-04-30 Yamaha Hatsudoki Kabushiki Kaisha Component recognition system for surface mounters
JP4213292B2 (en) 1999-04-27 2009-01-21 Yamaha Hatsudoki Kabushiki Kaisha Component recognition system for surface mounters
US6291816B1 (en) 1999-06-08 2001-09-18 Robotic Vision Systems, Inc. System and method for measuring object features with coordinated two and three dimensional imaging

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4772125A (en) * 1985-06-19 1988-09-20 Hitachi, Ltd. Apparatus and method for inspecting soldered portions
US5208463A (en) * 1990-08-24 1993-05-04 Hitachi, Ltd. Method and apparatus for detecting deformations of leads of semiconductor device
US5278634A (en) * 1991-02-22 1994-01-11 Cyberoptics Corporation High precision component alignment sensor system
US5461480A (en) * 1993-06-14 1995-10-24 Yamaha Hatsudoki Kabushiki Kaisha Parts recognizing device for mounting machine
US5619328A (en) * 1993-12-27 1997-04-08 Yamaha Hatsudoki Kabushiki Kaisha Component mounter and recognition method
US5559727A (en) * 1994-02-24 1996-09-24 Quad Systems Corporation Apparatus and method for determining the position of a component prior to placement
US5999640A (en) * 1996-03-15 1999-12-07 Matsushita Electric Industrial Co. Ltd. Electronic part mounting apparatus with reflector which optimizes available light

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005048675A3 (en) * 2003-11-07 2005-11-03 Cyberoptics Corp Pick and place machine with improved setup and operation procedure
WO2005048675A2 (en) * 2003-11-07 2005-05-26 Cyberoptics Corporation Pick and place machine with improved setup and operation procedure
DE102009016288B4 (en) * 2009-01-02 2013-11-21 Singulus Technologies Ag Method and device for aligning substrates
JP2016006422A (en) * 2009-03-24 2016-01-14 オルボテック・リミテッド Multimode imaging
WO2010112613A1 (en) * 2009-04-03 2010-10-07 Singulus Technologies Ag Method and device for aligning substrates
US8894259B2 (en) 2009-09-22 2014-11-25 Cyberoptics Corporation Dark field illuminator with large working area
US8388204B2 (en) 2009-09-22 2013-03-05 Cyberoptics Corporation High speed, high resolution, three dimensional solar cell inspection system
US8670031B2 (en) 2009-09-22 2014-03-11 Cyberoptics Corporation High speed optical inspection system with camera array and compact, integrated illuminator
US8681211B2 (en) 2009-09-22 2014-03-25 Cyberoptics Corporation High speed optical inspection system with adaptive focusing
US8872912B2 (en) 2009-09-22 2014-10-28 Cyberoptics Corporation High speed distributed optical sensor inspection system
EP2533622A1 (en) * 2011-06-09 2012-12-12 Yamaha Hatsudoki Kabushiki Kaisha (Yamaha Motor Co., Ltd.) Component imaging method, component imaging device, and component mounting device having component imaging device
US8654412B2 (en) 2011-06-09 2014-02-18 Yamaha Hatsudoki Kabushiki Kaisha Component imaging method, component imaging device, and component mounting device having component imaging device
US11915563B2 (en) 2019-07-25 2024-02-27 Philip Morris Products S.A. Vending apparatus for aerosol generating articles

Also Published As

Publication number Publication date
DE19982450T1 (en) 2001-05-17
WO2000026850A1 (en) 2000-05-11
GB2346693A (en) 2000-08-16
DE19982498T1 (en) 2001-02-22
KR100615912B1 (en) 2006-08-28
GB0014172D0 (en) 2000-08-02
KR20010033888A (en) 2001-04-25
GB2347741A (en) 2000-09-13
JP2002529711A (en) 2002-09-10
DE19982497T1 (en) 2001-02-01
WO2000028278A9 (en) 2002-08-22
US6608320B1 (en) 2003-08-19
KR20010040321A (en) 2001-05-15
JP2002529722A (en) 2002-09-10
GB0014999D0 (en) 2000-08-09
KR20010033900A (en) 2001-04-25
WO2000028278A1 (en) 2000-05-18
US6610991B1 (en) 2003-08-26
JP2002529907A (en) 2002-09-10
GB0015002D0 (en) 2000-08-09
GB2346970A (en) 2000-08-23

Similar Documents

Publication Publication Date Title
WO2000026640A1 (en) Electronics assembly apparatus with improved imaging system
US5559727A (en) Apparatus and method for determining the position of a component prior to placement
US5621530A (en) Apparatus and method for verifying the coplanarity of a ball grid array
US6144452A (en) Electronic component mounting apparatus
US5617209A (en) Method and system for triangulation-based, 3-D imaging utilizing an angled scanning beam of radiant energy
KR0185694B1 (en) A high precision component alignment sensor system
US6671397B1 (en) Measurement system having a camera with a lens and a separate sensor
US6141040A (en) Measurement and inspection of leads on integrated circuit packages
US6055055A (en) Cross optical axis inspection system for integrated circuits
EP1003212A2 (en) Method of and apparatus for bonding light-emitting element
US6118538A (en) Method and apparatus for electronic component lead measurement using light based sensors on a component placement machine
JP4315536B2 (en) Electronic component mounting method and apparatus
US5101442A (en) Three-dimensional imaging technique using sharp gradient of illumination
US20110175997A1 (en) High speed optical inspection system with multiple illumination imagery
KR102515369B1 (en) 3D measurement device
US6031242A (en) Semiconductor die in-flight registration and orientation method and apparatus
WO2009094489A1 (en) High speed optical inspection system with multiple illumination imagery
US6242756B1 (en) Cross optical axis inspection system for integrated circuits
US4875779A (en) Lead inspection system for surface-mounted circuit packages
US20040099710A1 (en) Optical ball height measurement of ball grid arrays
WO2011056976A1 (en) High speed optical inspection system with adaptive focusing
EP1014438A2 (en) A measurement system
JPH02187607A (en) Method of teaching reference data for mounted printed board inspection device
IES980649A2 (en) Solder paste measurement

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): DE GB JP KR

WWE Wipo information: entry into national phase

Ref document number: GB0014999.7

Country of ref document: GB

ENP Entry into the national phase

Ref document number: 2000 579970

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 1020007007461

Country of ref document: KR

RET De translation (de og part 6b)

Ref document number: 19982498

Country of ref document: DE

Date of ref document: 20010222

WWE Wipo information: entry into national phase

Ref document number: 19982498

Country of ref document: DE

WWP Wipo information: published in national office

Ref document number: 1020007007461

Country of ref document: KR

WWW Wipo information: withdrawn in national office

Ref document number: 1020007007461

Country of ref document: KR