EP1076221A2 - A robot with gauging system for determining three-dimensional measurement data - Google Patents

Info

Publication number
EP1076221A2
Authority
EP
European Patent Office
Prior art keywords
contact sensor
position data
robot
sensor
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP00117126A
Other languages
German (de)
French (fr)
Other versions
EP1076221A3 (en)
EP1076221B1 (en)
Inventor
William D. Long
Charles C. Kingston
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perceptron Inc
Original Assignee
Perceptron Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perceptron Inc filed Critical Perceptron Inc
Publication of EP1076221A2
Publication of EP1076221A3
Application granted
Publication of EP1076221B1
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Abstract

A robot-based gauging system (10) is provided for determining three-dimensional measurement data of the surface of an object (12). The robot-based gauging system includes a robot (24) having at least one movable member (22) and a robot controller for controlling the movement of the movable member (22) in relation to the surface of an object (12). A non-contact sensor (20) is coupled to the movable member (22) of the robot (24) for capturing image data representative of the surface of the object (12). A position reporting device reports position data representative of the position of the non-contact sensor. A synch signal generator for generating and transmitting a synch signal is electrically connected to each of the non-contact sensor and the position reporting device, such that the non-contact sensor reports image data in response to the synch signal and the position reporting device reports position data in response to the synch signal.

Description

    Background and Summary of the Invention
  • The present invention relates generally to non-contact gauging systems and, more particularly, to a robot-based gauging system and method for determining three-dimensional measurement data of an object.
  • Demand for higher quality has pressed manufacturers of mass produced articles, such as automotive vehicles, to employ automated manufacturing techniques that were unheard of when assembly line manufacturing was first conceived. Today, robotic equipment is used to assemble, weld, finish, gauge and test manufactured articles with a much higher degree of quality and precision than has been heretofore possible. Computer-aided manufacturing techniques allow designers to graphically conceptualize and design a new product on a computer workstation and the automated manufacturing process ensures that the design is faithfully carried out precisely according to specification. Machine vision is a key part of today's manufacturing environment. Machine vision systems are used with robotics and computer-aided design systems to ensure quality is achieved at the lowest practical cost.
  • In a typical manufacturing environment, there may be a plurality of different non-contact sensors, such as optical sensors, positioned at various predetermined locations within the workpiece manufacturing, gauging or testing station. In order to obtain three-dimensional measurement data using a two-dimensional non-contact sensor, the sensor may be moved in relation to the workpiece. Therefore, it is important to know the exact location and orientation of the moving item (either the sensor or the workpiece) each time the sensor acquires image data of the workpiece. This tends to be fairly straightforward for accurate motion devices (e.g., a CMM), since the reported position reflects the actual position of the device. The motion controller of an accurate motion device may include various compensation factors that are applied to the motion control to ensure the commanded position matches the actual position of the device. However, the high cost of these types of motion devices generally prohibits their use in a typical manufacturing environment.
  • As an alternative to these more expensive motion devices, it is possible to use a conventional industrial robot as the motion device in the gauging system. By affixing the non-contact sensor to the industrial robot, the robot can be used to move the sensor in relation to the workpiece. Industrial robots are well suited to perform complex motion sequences which are customized for the workpiece. In addition, many standard industrial tools are available which allow the programming of the motion sequences to be performed off-line and subsequently downloaded to the robot controller.
  • The problem with a robot-based gauging system is that the conventional industrial robot is not an inherently accurate motion device. Robot controllers generally utilize ideal kinematic models when instructing robot motion. The physical characteristics of the robot arm vary slightly from the ideal model, and thus the actual movement of the robot arm varies slightly from the commanded motion sequence. Furthermore, these physical characteristics are likely to change as the robot is subject to wear and thermal changes. Most robot controllers are not equipped with any additional means for translating and/or correcting the reported position into an actual position of the robot arm. As a result, conventional industrial robots have not heretofore been used in non-contact gauging systems and other highly accurate manufacturing workstation applications.
  • To this end, the robot-based gauging system of the present invention accurately determines three-dimensional measurement data for a workpiece through the use of a synchronized scanning process without the need for an expensive motion device, such as a CMM.
  • The gauging system includes a robot having at least one movable member and a robot controller for controlling the movement of the movable member in relation to the surface of an object. A non-contact sensor is coupled to the movable member of the robot for capturing image data representative of the surface of the object. A position reporting device is used to report position data representative of the position of the non-contact sensor. In addition, a synch signal generator is used to generate and transmit a synch signal to each of the non-contact sensor and the position reporting device, wherein the non-contact sensor reports image data in response to the synch signal and the position reporting device reports position data in response to the synch signal.
  • The gauging system further includes a vision system adapted to retrieve image data from the non-contact sensor and position data from the position reporting device. As the non-contact sensor scans the surface of the object, there is a time latency between when the sensor strobes its laser to when processed image data is available to the vision system. In other words, the image data reported by the sensor lags in time behind the position data reported by the position reporting device. Therefore, the vision system must be able to synchronize the image data with the position data. A memory storage space is used to store position data so that the vision system can synchronize the image data with the position data. In this way, the robot-based gauging system of the present invention is able to quickly and accurately determine three-dimensional measurement data for the object.
  • For a more complete understanding of the invention, its objects and advantages, reference may be had to the following specification and to the accompanying drawings.
  • Brief Description of the Drawings
  • Figure 1 is a perspective view of a robot-based gauging station incorporated into an automotive door assembly line in accordance with the present invention;
  • Figure 2 illustrates the primary components of a robot-based gauging system in accordance with the present invention;
  • Figure 3 is a block diagram showing a first preferred embodiment of the robot-based gauging system of the present invention;
  • Figure 4 illustrates the memory configuration used to synchronize the image data with the position data in the robot-based gauging system of the present invention; and
  • Figure 5 is a block diagram showing a second preferred embodiment of a robot-based gauging system in accordance with the present invention.
  • Description of the Preferred Embodiment
  • An exemplary robot-based gauging system of the type commonly employed in vehicle assembly lines or other automotive applications is shown in Figure 1. A robot-based gauging system 10 is often used for ensuring that each vehicle body component is assembled within predefined tolerances. For example, the robot-based gauging system 10 may measure the door of a vehicle body 12. Although the invention is not limited to automotive applications, an exemplary use for the robot-based gauging system 10 would be in an automotive assembly plant.
  • A single non-contact sensor 20 is mounted on a robot arm 22 of a robot 24. The sensor 20 in the present embodiment is a contour sensor which is only capable of two-dimensional (2D) measurements. During operation, the sensor 20 emits a planar structured light pattern. When the workpiece is illuminated, a characteristic reflected light pattern is produced that can be detected by the sensor 20. The contour sensor can thereby measure, through triangulation, a two-dimensional contour within the plane of the projected light on the reflecting workpiece. For further details regarding a suitable structured light sensor, reference may be had to U.S. Patent No. 4,645,348 to Dewar et al., assigned to the assignee of the present invention. (An illustrative plane-intersection sketch of such a triangulating measurement is given after this description.)
  • Communication cables 28 connect the sensor 20 and the controller of the robot 24 to a machine vision computer 30 which includes a CRT (cathode ray tube) display 32. A printer 34 is optionally provided with a typical machine vision computer.
  • In order to generate a three-dimensional representation of the workpiece, the robot arm 22 is movable to a plurality of positions for measuring the periphery of a workpiece (e.g., a door). In contrast to the static approach that requires the sensor to stop and settle for each image acquisition, the sensor 20 can continuously acquire data as the robot arm 22 traverses the surface of a workpiece. In other words, three-dimensional measurement data for the workpiece may be acquired by "scanning" or continuously moving the two-dimensional measurement sensor in relation to the workpiece. In addition, the gauging system 10 is also able to report the three-dimensional measurement data with respect to a predetermined reference frame or coordinate system associated, for example, with the vehicle body 12 to be measured, or with respect to an external reference frame associated with the gauging station.
  • Figure 2 illustrates the basic components associated with the robot-based gauging system 10. A non-contact sensor 20 is affixed to the end of a robot arm 22 which in turn is connected to a robot controller 38. In operation, the robot controller 38 is operative to control the movement of the robot arm 22 in relation to the surface of an object to be gauged by the system, and the non-contact sensor 20 is operative to capture image data representative of a portion of the surface of the object. A position reporting device 26 is used to report position data representative of the position of the non-contact sensor 20. The non-contact sensor 20 and the position reporting device 26 are each in turn electrically connected to a vision system 40 residing on the machine vision computer 30.
  • As will be further described below, the position of the sensor 20 can be reported either directly or indirectly by the position reporting device 26. In the direct approach, the actual position of the sensor 20 is reported (e.g., by a photogrammetry system) to the vision system 40, whereas in the indirect approach, the position of the sensor 20 is deduced from the position of the robot arm 22 which is reported (e.g., by the robot controller) to the vision system 40. Lastly, the vision system 40 synchronizes the image data with the position data, thereby determining three-dimensional measurement data for the workpiece.
  • More specifically, the vision system 40 includes a synchronization module 42 and a timing signal generator 44. To synchronize the image data received from the sensor 20 with the position data received from the position reporting device 26, the vision system 40 uses a timing signal or a hardware latch signal which is generated by the timing signal generator 44. The timing signal is then simultaneously transmitted to the sensor 20 and the position reporting device 26. In response to the timing signal, the sensor 20 records image data for the workpiece and the position reporting device 26 records current position data for the non-contact sensor 20. Image data and position data can then be requested by and delivered to the vision system 40. Thus, the vision system 40 acquires the measurement data needed to construct a contour line representative of the surface of the workpiece (a sketch of how a synchronized contour and sensor position can be combined into three-dimensional points follows this description). It should be noted that the vision system does not rely on the sensor motion being at a constant velocity in relation to the workpiece. On the contrary, it is envisioned that the velocity of the sensor may vary as it scans the workpiece.
  • A first preferred embodiment of a gauging system which utilizes the indirect approach to acquire position data is shown in Figure 3. The indirect approach deduces the position of the sensor from the position of the robot arm 22. To do so, the position data is reported by the robot controller 38 to the vision system 40. However, as previously described, the actual position of the robot arm varies from the commanded position data as reported by the robot controller 38. Thus, the vision system 40 further incorporates a kinematic correction module 50 which applies a real time kinematic correction factor to the reported position data received from the robot controller 38. In a preferred embodiment, the DynaCal Robot Cell Calibration System developed by Dynalog, Inc. of Bloomfield Hills, Michigan may be adapted to serve as the kinematic correction module 50.
  • One skilled in the art will readily recognize that the kinematic correction module 50 incorporates a procedure that determines physical deviations between the actual robot and its corresponding ideal model. These deviations are stored as robot-specific parameters. In operation, the kinematic correction module 50 receives the reported position data from the robot controller 38. Since the robot controller 38 is unaware of the robot's physical deviations, the reported position data is based on the ideal model. The kinematic correction module 50 translates the reported position data into actual position data by using the robot-specific parameters. Next, the actual position data of the robot arm is transformed to position data for the sensor. The sensor position data is then provided to the synchronization module 42. (A simplified sketch of this correction and tool-transformation step follows this description.)
  • The non-contact sensor 20 is preferably a Tricam non-contact sensor which is manufactured by Perceptron, Inc. of Plymouth, Michigan. While the following description is provided with reference to the Tricam non-contact sensor, it is readily understood that the explanation is applicable to other non-contact sensors. In the case of the Tricam sensor, the timing signal generator 44 is a software module that resides on a circuit board which can easily be incorporated into the machine vision computer 30. Moreover, the timing signal generator 44 is associated with the sensor 20, such that the hardware latch signal is generated internal to the sensor 20. In operation, the sensor generates a hardware latch signal each time it acquires image data. The sensor 20 is further configured to simultaneously transmit the hardware latch signal to the robot controller 38. It is also envisioned that the Tricam non-contact sensor may support an external input that allows for image acquisition to be initiated by a hardware latch signal which is generated external to the sensor 20. Thus, the timing signal generator 44 may be associated with some external computing device (e.g., the robot controller).
  • In order to "scan" the sensor in relation to the workpiece, image data must be captured at a rate which allows for the continuous motion of the sensor. In a continuous operation mode, the Tricam non-contact sensor is limited by the frame rate of its camera which operates at a frequency of 30 Hz. However, there is a time latency between when the sensor strobes its laser to when processed image data is available to the vision system 40. As a result, the image data reported by the sensor 20 lags in time behind the position data reported by the robot controller 38 by two frame times (e.g., 2/30 second). Accordingly, the image data received from the sensor 20 requires synchronization with the position data received from the robot controller 38.
  • The present invention synchronizes the image data with the position data as shown in Figure 4. A memory space internal to the vision system is used to store and process position data received from the robot controller 38. The memory space is further defined into an image memory space for storing image data and at least three positional memory spaces for storing position data. A first timing signal causes the vision system to retrieve position data(t0) from the robot controller which is in turn stored in a first memory space 62. In response to a second timing signal, the vision system copies the position data(t0) in the first memory space 62 to a second memory space 64 and then retrieves the next available position data(t1) into the first memory space 62. Similarly, a third timing signal causes the position data(t0) in the second memory space 64 to be copied to a third memory space 66, the position data(t1) in the first memory space 62 to be copied to the second memory space 64, and new position data(t2) to be retrieved into the first memory space 62. In other words, the vision system 40 is able to store the three most recent sets of position data received from the robot controller 38 (a small buffer sketch illustrating this arrangement follows this description).
  • At this point, image data(t0) corresponding to the initial position data(t0) is available from the sensor to the vision system. In response to the third timing signal, this image data is read into the image memory space 68 of the vision system. By linking this image data with the position data stored in the third memory space 66, the vision system is able to accurately construct measurement data for the workpiece. For each additional timing signal, this synchronization process is repeated by the vision system.
  • As previously discussed, the operation of the Tricam non-contact sensor varies slightly from the description provided in relation to Figure 4. In the case of the Tricam sensor, a timing signal tells the vision system to retrieve the current image data from the sensor and to retrieve the current position data from the robot controller. The first instance of this timing signal does not occur until after the first set of image data has been processed and is available to the vision system. Since there are two frame times between the laser strobe and this first timing signal, the available position data no longer matches the available image data. In other words, the position data which is retrieved in response to the first timing signal corresponds to the image data retrieved in response to the third timing signal. As a result, the first two sets of image data are discarded, and synchronization begins with the third set of image data.
  • Rather than determining the position of the sensor from the information provided by the robot controller, it is also envisioned that the position of the sensor is directly reported to the vision system 40 by an independent measurement system as shown in Figure 5. While the following description is provided with reference to a photogrammetry system 80, it is readily understood that other measurement devices may be suitably used in conjunction with the present invention.
  • In this alternative embodiment of the present invention, one or more photogrammetric cameras are positioned at convenient vantage points in the gauging station. Each photogrammetric camera operates under conventional photogrammetry techniques. That is, the photogrammetric camera emits a burst of quasi-monochromatic light and detects the reflection of this light. For example, retroreflective markers may be placed on the surface of the non-contact sensor 20. When the markers are illuminated by infrared light from a camera, light is reflected by the markers, as bright light spots, back to a motion capture unit within the camera. The light spots are analyzed, such that their positions and sizes are calculated by the camera, thereby determining the position of the markers. It is envisioned that the photogrammetric camera may operate using other conventional operational techniques, including but not limited to tracking of luminous targets. (A sketch of how such marker measurements can be turned into a sensor pose follows this description.)
  • In any event, as long as the sensor is within the field of view of the photogrammetric cameras, the photogrammetry system will provide an accurate determination of the position of the sensor. In this way, the actual position of the sensor is reported to the vision system 40. It should be noted that since the photogrammetry system 80 reports the actual position of the sensor, there is no need for the kinematic correction module in this embodiment. This alternative embodiment of the robot-based gauging system 10 can otherwise operate as previously discussed in relation to Figures 1-4.
  • From the foregoing, it will be appreciated that the present invention provides a significant advance in the art of robot-based gauging systems. The invention accurately determines three-dimensional measurement data for a workpiece through the use of a synchronized scanning process and without the need for expensive sensor positioning devices, such as CMMs.
  • While the invention has been described in its presently preferred form, it will be understood that the invention is capable of modification without departing from the spirit of the invention as set forth in the appended claims.
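
The triangulating contour measurement described above can be illustrated with a minimal sketch. This is not Perceptron's implementation; it assumes a calibrated pinhole camera (intrinsic matrix K) and a known laser-plane equation in the sensor frame, and simply intersects back-projected pixel rays with that plane. All numbers are hypothetical.

    import numpy as np

    def triangulate_contour(pixels, K, plane_n, plane_d):
        # pixels  : (N, 2) image coordinates sampled along the laser stripe
        # K       : 3x3 camera intrinsic matrix (assumed known from calibration)
        # plane_n : unit normal of the laser plane, expressed in the sensor frame
        # plane_d : plane offset, so a point X on the plane satisfies n.X + d = 0
        uv1 = np.hstack([pixels, np.ones((len(pixels), 1))])
        rays = (np.linalg.inv(K) @ uv1.T).T        # back-project pixels to ray directions
        t = -plane_d / (rays @ plane_n)            # ray/plane intersection parameter
        return rays * t[:, None]                   # contour points in the sensor frame

    # Hypothetical values, purely for illustration:
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    stripe = np.array([[300.0, 200.0], [340.0, 250.0]])
    contour = triangulate_contour(stripe, K, np.array([0.0, -0.6, 0.8]), -0.4)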
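
Once a contour and the sensor position captured by the same synch signal are paired, the two-dimensional measurements can be expressed in the chosen reference frame. The sketch below assumes the position data takes the form of a 4x4 homogeneous transform per timing signal; this representation is an illustrative choice, not the patent's data format.

    import numpy as np

    def contour_to_reference_frame(contour_pts, sensor_pose):
        # contour_pts : (N, 3) points reported by the contour sensor in its own frame
        # sensor_pose : 4x4 transform of the sensor frame relative to the reference
        #               frame (vehicle body or gauging station) at the synch instant
        homog = np.hstack([contour_pts, np.ones((len(contour_pts), 1))])
        return (sensor_pose @ homog.T).T[:, :3]

    def accumulate_scan(synchronized_pairs):
        # synchronized_pairs : iterable of (sensor_pose, contour_pts) tuples, one per
        # timing signal; stacking them yields the three-dimensional measurement data.
        return np.vstack([contour_to_reference_frame(c, T) for T, c in synchronized_pairs])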
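
The kinematic correction module can be sketched, under heavy simplification, as one rigid correction transform applied to the controller's ideal-model pose, followed by a fixed flange-to-sensor (tool) transform. A real calibration package such as DynaCal identifies many individual deviation parameters (link lengths, joint offsets, compliance), so this only shows the shape of the computation, not its content.

    import numpy as np

    def corrected_sensor_pose(T_reported, T_correction, T_flange_to_sensor):
        # T_reported         : 4x4 flange pose reported by the robot controller,
        #                      based on its ideal kinematic model
        # T_correction       : 4x4 transform standing in for the robot-specific
        #                      deviation parameters found during calibration
        # T_flange_to_sensor : fixed 4x4 mounting transform from flange to sensor
        T_actual_flange = T_correction @ T_reported       # ideal pose -> actual arm pose
        return T_actual_flange @ T_flange_to_sensor       # actual arm pose -> sensor pose

In practice the deviation is pose-dependent across the workspace, which is why a full kinematic model is used rather than a single correction transform; the single-transform form here is only for readability.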
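
The three-position-buffer arrangement of Figure 4 behaves like a short shift register: each timing signal pushes the newest position in, and an image set that arrives two frames late is paired with the oldest stored position. The class below is a hedged sketch of that bookkeeping; the two-frame lag, the 30 Hz framing, and the discarding of the first two image sets are taken from the description, while the data types and method names are assumptions.

    from collections import deque

    FRAME_PERIOD_S = 1.0 / 30.0     # Tricam camera frame rate of 30 Hz
    IMAGE_LAG_FRAMES = 2            # image data lags position data by two frame times

    class PositionImageSynchronizer:
        """Pairs each image set with the position captured when its laser strobed."""

        def __init__(self):
            # Three positional spaces, analogous to memory spaces 62, 64 and 66.
            self.positions = deque(maxlen=IMAGE_LAG_FRAMES + 1)
            self.pairs = []          # synchronized (position, image) records

        def on_timing_signal(self, position_now, image_now=None):
            # Shift register: newest position enters, older ones move toward the end.
            self.positions.appendleft(position_now)
            if image_now is not None and len(self.positions) == self.positions.maxlen:
                # The image delivered now was strobed two signals ago, so it belongs
                # with the oldest stored position.
                self.pairs.append((self.positions[-1], image_now))
            # Until the pipeline is full there is nothing to pair; in the Tricam
            # variant described above the first two image sets are simply discarded.

    # Illustrative sequence of timing signals:
    # sync = PositionImageSynchronizer()
    # sync.on_timing_signal(pose_t0)                 # signal 1: no image yet
    # sync.on_timing_signal(pose_t1)                 # signal 2: still no image
    # sync.on_timing_signal(pose_t2, image_t0)       # signal 3: image(t0) pairs with pose_t0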
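
For the direct-reporting embodiment, the photogrammetry system must convert measured marker locations into a sensor pose. One standard way to do that, sketched below, is rigid registration (a Kabsch/SVD fit) between the markers' known positions on the sensor housing and their measured positions in the station frame; whether a given commercial photogrammetry system works exactly this way internally is an assumption.

    import numpy as np

    def sensor_pose_from_markers(markers_on_sensor, markers_measured):
        # markers_on_sensor : (N, 3) marker coordinates in the sensor frame,
        #                     known from a one-time sensor/marker calibration
        # markers_measured  : (N, 3) the same markers as located by the
        #                     photogrammetric cameras in the station frame
        # Returns the 4x4 pose of the sensor in the station frame
        # (requires N >= 3 non-collinear markers).
        cs = markers_on_sensor.mean(axis=0)
        cm = markers_measured.mean(axis=0)
        H = (markers_on_sensor - cs).T @ (markers_measured - cm)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = cm - R @ cs
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, t
        return T

Because the pose comes from direct measurement of the sensor itself, no correction of the robot's kinematic model is needed, consistent with the remark above that the kinematic correction module is unnecessary in this embodiment.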

Claims (11)

  1. An apparatus for determining three-dimensional measurement data for the surface of an object (12), comprising:
    a movable member (22);
    a controller (38) for causing said movable member (22) to move in relation to the surface of the object (12);
    a non-contact sensor (20) disposed on said movable member (22) for collecting image data representative of a portion of the surface of the object (12);
    a position reporting device (26; 80) for reporting position data representative of the position of said non-contact sensor (20);
    characterized by:
    a timing signal generator (44) for periodically generating and transmitting a synch signal, said position reporting device (26; 80) electrically connected to said timing signal generator (44) and reporting position data in response to each of said synch signals, and said non-contact sensor (20) being electrically connected to said timing signal generator (44) and collecting image data in response to each of said synch signals; and
    a data processor (40) adapted to retrieve image data from said non-contact sensor (20) and position data from said position reporting device (26; 80), and being operable to synchronize the image data with the position data, thereby generating a three-dimensional representation of the object (12).
  2. The apparatus of claim 1, characterized in that said movable member (22) scans the surface of the object (12), such that the image data reported by said non-contact sensor (20) lags in time behind the position data reported by said position reporting device (26; 80).
  3. The apparatus of claim 1 or 2, characterized by a memory storage space (62, 64, 66) for storing position data from said position reporting device (26; 80), said data processor (40) connected to said memory storage space (62, 64, 66) for synchronizing the image data with the corresponding position data.
  4. The apparatus of any of claims 1 to 3, characterized in that said non-contact sensor (20) is a two-dimensional structured light sensor (20).
  5. The apparatus of any of claims 1 to 4, characterized in that said non-contact sensor (20) emits structured light in a predefined planar configuration and has an optical receiver for receiving reflected light emitted by said non-contact sensor (20), the reflected light being indicative of the surface of the object (12).
  6. The apparatus of any of claims 1 to 5, characterized by a kinematic correction module (50) for applying a real-time correction to the position data received from the position reporting device (26) based on a kinematic model of the movable member (22).
  7. The apparatus of any of claims 1 to 5, characterized in that said position reporting device (80) is further defined as a photogrammetric measurement system (80).
  8. A robot-based gauging system for determining three-dimensional measurement data of an object with respect to a reference frame, said gauging system comprising an apparatus of any of claims 1 to 7, characterized in that
    said non-contact sensor (20) is mounted to a movable member (22) of said robot for movement of the sensor (20) relative to the object (12),
    said controller is configured as a robot controller (38) for controlling the movement of said movable member (22);
    said timing signal generator (44) is electrically connected to said robot controller (38) for periodically generating and transmitting a synch signal, said robot controller (38) reporting a set of said position data in response to each of said synch signals, thereby generating a plurality of position data sets;
    said non-contact sensor (20) is responsive to each of said synch signals for capturing a set of image data representative of a portion of said object (12), thereby generating a corresponding plurality of image data sets representative of a plurality of portions of the object (12); and
    said data processor is configured as a vision system (40) adapted to retrieve said plurality of image data sets from said non-contact sensor (20) and to synchronize each image data set with a corresponding position data set, thereby generating a three-dimensional representation of the object (12).
  9. The gauging system of claim 8, characterized by a coordinate transformation system for transforming each position data set to a sensor reference frame, where the sensor reference frame is associated with said non-contact sensor (20).
  10. The gauging system of claim 8 or 9, characterized in that said robot controller (38) is configured for reporting position data representative of the position of said movable member (22).
  11. The gauging system of claim 8 or 9, characterized in that a photogrammetric measurement system (80) is provided for reporting position data representative of the position of said non-contact sensor (20) by having a calibration field of observation and being positionable at a vantage point such that said non-contact sensor (20) is within the calibration field, said photogrammetric measurement system (80) being electrically connected to said timing signal generator (44) and reporting a set of position data in response to each of said synch signals, thereby generating a plurality of position data sets.
EP00117126A 1999-08-12 2000-08-10 A robot with gauging system for determining three-dimensional measurement data Expired - Lifetime EP1076221B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/372,871 US6166811A (en) 1999-08-12 1999-08-12 Robot-based gauging system for determining three-dimensional measurement data
US372871 1999-08-12

Publications (3)

Publication Number Publication Date
EP1076221A2 true EP1076221A2 (en) 2001-02-14
EP1076221A3 EP1076221A3 (en) 2003-03-26
EP1076221B1 EP1076221B1 (en) 2010-04-14

Family

ID=23469960

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00117126A Expired - Lifetime EP1076221B1 (en) 1999-08-12 2000-08-10 A robot with gauging system for determining three-dimensional measurement data

Country Status (5)

Country Link
US (1) US6166811A (en)
EP (1) EP1076221B1 (en)
CA (1) CA2315508C (en)
DE (1) DE60044170D1 (en)
ES (1) ES2340462T3 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10335472A1 (en) * 2003-08-02 2005-02-17 E. Zoller GmbH & Co. KG Einstell- und Messgeräte Measurement and adjustment device has a positioning device for moving an exchangeable measurement head with a camera in three dimensions, said positioning device having a pivoting articulated arm
FR2873204A1 (en) * 2004-07-19 2006-01-20 Plastic Omnium Cie MEASURING APPARATUS FOR THE CONTROL OF PAINTED BODY PARTS WITH AN ANTI-DETERIORATION DEVICE
FR2873205A1 (en) * 2004-07-19 2006-01-20 Plastic Omnium Cie MOTOR VEHICLE BODY PARTS PAINTING METHOD AND POST
EP1632775A3 (en) * 2004-07-19 2006-05-03 Compagnie Plastic Omnium A station for inspecting the painting of motor vehicle parts
DE10159574B4 (en) * 2001-10-15 2008-07-24 Tropf, Hermann, Dr.-Ing. Apparatus and method for correcting the movement of gripping and machining tools
DE10354078B4 (en) * 2003-11-19 2008-09-04 Daimler Ag Clamping device for workpieces for three-dimensional optical surface measurement
DE102004024378B4 (en) * 2004-05-17 2009-05-20 Kuka Roboter Gmbh Method for robot-assisted measurement of objects

Families Citing this family (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6705526B1 (en) 1995-12-18 2004-03-16 Metrologic Instruments, Inc. Automated method of and system for dimensioning objects transported through a work environment using contour tracing, vertice detection, corner point detection, and corner point reduction methods on two-dimensional range data maps captured by an amplitude modulated laser scanning beam
US20020014533A1 (en) 1995-12-18 2002-02-07 Xiaxun Zhu Automated object dimensioning system employing contour tracing, vertice detection, and forner point detection and reduction methods on 2-d range data maps
US6662083B2 (en) * 2000-10-31 2003-12-09 Progressive Tool & Industries Co. Multiple robotic workstation with multiple fixtures
US6567162B2 (en) 2001-05-31 2003-05-20 The Regents Of The University Of Michigan Reconfigurable apparatus and method for inspection during a manufacturing process
US7344082B2 (en) * 2002-01-02 2008-03-18 Metrologic Instruments, Inc. Automated method of and system for dimensioning objects over a conveyor belt structure by applying contouring tracing, vertice detection, corner point detection, and corner point reduction methods to two-dimensional range data maps of the space above the conveyor belt captured by an amplitude modulated laser scanning beam
FR2846413B1 (en) * 2002-10-24 2005-03-11 Luc Vergnaud DEVICE FOR NON-CONTACT SCANNING OF A 3-DIMENSION OBJECT
US7024032B2 (en) * 2002-10-31 2006-04-04 Perceptron, Inc. Method for assessing fit and alignment of a manufactured part
US20030137673A1 (en) * 2002-12-13 2003-07-24 Cox Cary B. Systems, and methods of use, employing distorted patterns to ascertain the shape of a surface, for road or runway profiling, or as input to control pro-active suspension systems
DE10311247B8 (en) * 2003-03-14 2008-05-08 Inos Automationssoftware Gmbh Portable device for detecting a position and dimensions of an object
WO2004096502A1 (en) * 2003-04-28 2004-11-11 Stephen James Crampton Cmm arm with exoskeleton
EP1524494A1 (en) * 2003-10-17 2005-04-20 inos Automationssoftware GmbH Method for calibrating a camera-laser-unit in respect to a calibration-object
US7693325B2 (en) 2004-01-14 2010-04-06 Hexagon Metrology, Inc. Transprojection of geometry data
DE102004046752B4 (en) * 2004-09-24 2010-10-21 GOM, Gesellschaft für optische Meßtechnik mit beschränkter Haftung Method for the three-dimensional detection of measurement objects
NZ581496A (en) 2005-03-16 2010-02-26 Lucasfilm Entertainment Compan Three-dimensional motion capture using orientation of virtual structures
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
EP2092269B1 (en) * 2006-11-20 2019-05-01 Hexagon Technology Center GmbH Coordinate measurement machine with improved joint
US8130225B2 (en) 2007-01-16 2012-03-06 Lucasfilm Entertainment Company Ltd. Using animation libraries for object identification
US8542236B2 (en) * 2007-01-16 2013-09-24 Lucasfilm Entertainment Company Ltd. Generating animation libraries
US8199152B2 (en) 2007-01-16 2012-06-12 Lucasfilm Entertainment Company Ltd. Combining multiple session content for animation libraries
US8126260B2 (en) * 2007-05-29 2012-02-28 Cognex Corporation System and method for locating a three-dimensional object using machine vision
US7743660B2 (en) * 2007-06-15 2010-06-29 The Boeing Company System and method for automated inspection of large-scale part
US8224121B2 (en) * 2007-06-15 2012-07-17 The Boeing Company System and method for assembling substantially distortion-free images
US8144153B1 (en) 2007-11-20 2012-03-27 Lucasfilm Entertainment Company Ltd. Model production for animation libraries
US7779548B2 (en) 2008-03-28 2010-08-24 Hexagon Metrology, Inc. Coordinate measuring machine with rotatable grip
US8122610B2 (en) * 2008-03-28 2012-02-28 Hexagon Metrology, Inc. Systems and methods for improved coordination acquisition member comprising calibration information
US20090326712A1 (en) * 2008-06-26 2009-12-31 Utica Enterprises, Inc. Calibration for vehicle body assembly
US8923602B2 (en) * 2008-07-22 2014-12-30 Comau, Inc. Automated guidance and recognition system and method of the same
DE102008063680A1 (en) * 2008-10-10 2010-04-15 Abb Ag Method for teaching (teaching) an industrial robot and a correspondingly equipped industrial robot
US7908757B2 (en) 2008-10-16 2011-03-22 Hexagon Metrology, Inc. Articulating measuring arm with laser scanner
US9734419B1 (en) 2008-12-30 2017-08-15 Cognex Corporation System and method for validating camera calibration in a vision system
US9142024B2 (en) * 2008-12-31 2015-09-22 Lucasfilm Entertainment Company Ltd. Visual and physical motion sensing for three-dimensional motion capture
DE102009015920B4 (en) 2009-03-25 2014-11-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
US9533418B2 (en) 2009-05-29 2017-01-03 Cognex Corporation Methods and apparatus for practical 3D vision system
WO2011008503A2 (en) 2009-06-30 2011-01-20 Hexagon Metrology Ab Coordinate measurement machine with vibration detection
US8099877B2 (en) 2009-11-06 2012-01-24 Hexagon Metrology Ab Enhanced position detection for a CMM
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
DE102009057101A1 (en) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Device for optically scanning and measuring an environment
US20110213247A1 (en) * 2010-01-08 2011-09-01 Hexagon Metrology, Inc. Articulated arm with imaging device
US8630314B2 (en) * 2010-01-11 2014-01-14 Faro Technologies, Inc. Method and apparatus for synchronizing measurements taken by multiple metrology devices
US8832954B2 (en) 2010-01-20 2014-09-16 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US8677643B2 (en) 2010-01-20 2014-03-25 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
JP5615382B2 (en) * 2010-01-20 2014-10-29 ファロ テクノロジーズ インコーポレーテッド Portable articulated arm coordinate measuring machine using multibus arm technology
GB2489370B (en) 2010-01-20 2014-05-14 Faro Tech Inc Coordinate measuring machine having an illuminated probe end and method of operation
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
CN102597895A (en) * 2010-01-20 2012-07-18 法罗技术股份有限公司 Portable articulated arm coordinate measuring machine and integrated environmental recorder
US8898919B2 (en) 2010-01-20 2014-12-02 Faro Technologies, Inc. Coordinate measurement machine with distance meter used to establish frame of reference
US8875409B2 (en) 2010-01-20 2014-11-04 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8615893B2 (en) 2010-01-20 2013-12-31 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine having integrated software controls
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
USD643319S1 (en) 2010-03-29 2011-08-16 Hexagon Metrology Ab Portable coordinate measurement machine
DE102010018979A1 (en) * 2010-05-03 2011-11-03 Steinbichler Optotechnik Gmbh Method and device for determining the 3D coordinates of an object
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US9393694B2 (en) 2010-05-14 2016-07-19 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
US8127458B1 (en) 2010-08-31 2012-03-06 Hexagon Metrology, Inc. Mounting apparatus for articulated arm laser scanner
WO2012033892A1 (en) 2010-09-08 2012-03-15 Faro Technologies, Inc. A laser scanner or laser tracker having a projector
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
US9124873B2 (en) 2010-12-08 2015-09-01 Cognex Corporation System and method for finding correspondence between cameras in a three-dimensional vision system
USD668656S1 (en) * 2011-01-24 2012-10-09 Datalogic ADC, Inc. Tunnel scanner
US8948447B2 (en) 2011-07-12 2015-02-03 Lucasfilm Entertainment Company, Ltd. Scale independent tracking pattern
US9508176B2 (en) 2011-11-18 2016-11-29 Lucasfilm Entertainment Company Ltd. Path and speed based character control
US8763267B2 (en) 2012-01-20 2014-07-01 Hexagon Technology Center Gmbh Locking counterbalance for a CMM
DE102012100609A1 (en) 2012-01-25 2013-07-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9069355B2 (en) 2012-06-08 2015-06-30 Hexagon Technology Center Gmbh System and method for a wireless feature pack
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
EP2705935A1 (en) 2012-09-11 2014-03-12 Hexagon Technology Center GmbH Coordinate measuring machine
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9250214B2 (en) 2013-03-12 2016-02-02 Hexagon Metrology, Inc. CMM with flaw detection system
WO2014182707A2 (en) * 2013-05-06 2014-11-13 Iphoton Solutions, Llc Volume reconstruction of an object using a 3d sensor and robotic coordinates
JP6217343B2 (en) * 2013-11-26 2017-10-25 セントラル硝子株式会社 Curved plate shape inspection device
US9163921B2 (en) 2013-12-18 2015-10-20 Hexagon Metrology, Inc. Ultra-portable articulated arm coordinate measurement machine
US9594250B2 (en) 2013-12-18 2017-03-14 Hexagon Metrology, Inc. Ultra-portable coordinate measurement machine
US9759540B2 (en) 2014-06-11 2017-09-12 Hexagon Metrology, Inc. Articulating CMM probe
US10656617B2 (en) 2014-07-16 2020-05-19 Faro Technologies, Inc. Measurement device for machining center
WO2016044658A1 (en) 2014-09-19 2016-03-24 Hexagon Metrology, Inc. Multi-mode portable coordinate measuring machine
ES2654335T3 (en) 2014-10-23 2018-02-13 Comau S.P.A. System to monitor and control an industrial installation
CN105043315A (en) * 2015-04-30 2015-11-11 天津菲利科电子技术有限公司 Detection system for detecting bucket-arm angle and bucket-arm relative position of non-mechanical bucket wheel machine
DE102015122844A1 (en) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack
US10059478B2 (en) 2016-06-02 2018-08-28 Becton Dickinson Rowa Germany Gmbh Method and device for dividing a blister strip
US10635758B2 (en) 2016-07-15 2020-04-28 Fastbrick Ip Pty Ltd Brick/block laying machine incorporated in a vehicle
ES2899585T3 (en) 2016-07-15 2022-03-14 Fastbrick Ip Pty Ltd Material transport boom
US11441899B2 (en) 2017-07-05 2022-09-13 Fastbrick Ip Pty Ltd Real time position and orientation tracker
US11656357B2 (en) 2017-08-17 2023-05-23 Fastbrick Ip Pty Ltd Laser tracker with improved roll angle measurement
US20210016438A1 (en) 2017-08-17 2021-01-21 Fastbrick Ip Pty Ltd Interaction system configuration
AU2018348785A1 (en) 2017-10-11 2020-05-07 Fastbrick Ip Pty Ltd Machine for conveying objects and multi-bay carousel for use therewith
JP7073785B2 (en) * 2018-03-05 2022-05-24 オムロン株式会社 Image inspection equipment, image inspection method and image inspection program
IT201800005091A1 (en) 2018-05-04 2019-11-04 "Method for monitoring the operating status of a processing station, corresponding monitoring system and computer program product"
CN114851246A (en) * 2022-04-19 2022-08-05 深圳市大族机器人有限公司 Robot performance testing system and method

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2746573C3 (en) * 1976-10-19 1981-12-17 Matsushita Electric Industrial Co., Ltd., Kadoma, Osaka Magnetic tape recording and reproducing apparatus for color television signals
US4254433A (en) * 1979-06-25 1981-03-03 General Motors Corporation Visual motion tracking system
US4841460A (en) * 1987-09-08 1989-06-20 Perceptron, Inc. Method and apparatus for calibrating a non-contact gauging sensor with respect to an external coordinate system
US5142160A (en) * 1990-08-31 1992-08-25 Frank Storbeck Method and device for determining the volume of an object on a flat background
AT396181B (en) * 1990-12-10 1993-06-25 Sprecher Energie Oesterreich DEVICE FOR DETECTING THE DIMENSIONS OF A MOVING OBJECT
IS1666B (en) * 1991-02-19 1997-11-14 Marel Hf Method and apparatus for determining the volume, shape and weight of fish or other parts
US5319443A (en) * 1991-03-07 1994-06-07 Fanuc Ltd Detected position correcting method
US5400638A (en) * 1992-01-14 1995-03-28 Korea Institute Of Science And Technology Calibration system for compensation of arm length variation of an industrial robot due to peripheral temperature change
US5517311A (en) * 1993-05-25 1996-05-14 Toyota Jidosha Kabushiki Kaisha Optical 3D measuring apparatus used for measuring chamber volume of a cylinder head and chamber volume correcting method for a cylinder head of an engine
US5561526A (en) * 1994-05-26 1996-10-01 Lockheed Missiles & Space Company, Inc. Three-dimensional measurement device and system
US5719678A (en) * 1994-07-26 1998-02-17 Intermec Corporation Volumetric measurement of a parcel using a CCD line scanner and height sensor
CN100524015C (en) * 1995-06-22 2009-08-05 3Dv系统有限公司 Method and apparatus for generating range subject distance image
US5699161A (en) * 1995-07-26 1997-12-16 Psc, Inc. Method and apparatus for measuring dimensions of objects on a conveyor
US5822486A (en) * 1995-11-02 1998-10-13 General Scanning, Inc. Scanned remote imaging method and system and method of determining optimum design characteristics of a filter for use therein
US5793900A (en) * 1995-12-29 1998-08-11 Stanford University Generating categorical depth maps using passive defocus sensing
US6044170A (en) * 1996-03-21 2000-03-28 Real-Time Geometry Corporation System and method for rapid shape digitizing and adaptive mesh generation
DE69632468T2 (en) * 1996-12-31 2005-05-12 Datalogic S.P.A., Lippo Di Calderara Di Reno Method for volume measurement of an object by means of a laser scanner and a CCD image sensor
WO1999012082A1 (en) * 1997-09-04 1999-03-11 Dynalog, Inc. Method for calibration of a robot inspection system
US5910845A (en) * 1997-12-02 1999-06-08 Brown; Thomas Mattingly Peripheral viewing optical scanner for three dimensional surface measurement

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4575304A (en) 1982-04-07 1986-03-11 Hitachi, Ltd. Robot system for recognizing three dimensional shapes
WO1997005449A1 (en) 1995-07-26 1997-02-13 Crampton Stephen J Scanning apparatus and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"proceedings of the International Conference on Robotics and Automation", vol. 2, 31 March 1987, IEEE COMP. SOC. PRESS, article "Modeling and Calibration of a Structured Light Scanner for 3-D Robot Vision", pages: 807 - 815

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10159574B4 (en) * 2001-10-15 2008-07-24 Tropf, Hermann, Dr.-Ing. Apparatus and method for correcting the movement of gripping and machining tools
DE10164944B4 (en) * 2001-10-15 2013-03-28 Hermann, Dr.-Ing. Tropf Apparatus and method for correcting the movement of gripping and machining tools
DE10159574B9 (en) * 2001-10-15 2009-04-30 Tropf, Hermann, Dr.-Ing. Apparatus and method for correcting the movement of gripping and machining tools
DE10335472A1 (en) * 2003-08-02 2005-02-17 E. Zoller GmbH & Co. KG Einstell- und Messgeräte Measurement and adjustment device has a positioning device for moving an exchangeable measurement head with a camera in three dimensions, said positioning device having a pivoting articulated arm
DE10354078B4 (en) * 2003-11-19 2008-09-04 Daimler Ag Clamping device for workpieces for three-dimensional optical surface measurement
DE102004024378B4 (en) * 2004-05-17 2009-05-20 Kuka Roboter Gmbh Method for robot-assisted measurement of objects
WO2006016065A1 (en) * 2004-07-19 2006-02-16 Compagnie Plastic Omnium Method and station for controlling paint work of motor vehicle bodywork parts
US7319516B2 (en) 2004-07-19 2008-01-15 Compagnie Plastic Omnium Measurement instrument for inspecting painted bodywork parts, the instrument being provided with an anti-damage device
US7277164B2 (en) 2004-07-19 2007-10-02 Compagnie Plastic Omnium Process and station for inspecting the painting of motor vehicle bodywork parts
EP1632775A3 (en) * 2004-07-19 2006-05-03 Compagnie Plastic Omnium A station for inspecting the painting of motor vehicle parts
WO2006016071A1 (en) * 2004-07-19 2006-02-16 Compagnie Plastic Omnium Measuring apparatus for controlling painted bodywork parts, provided with an anti-deterioration device
FR2873205A1 (en) * 2004-07-19 2006-01-20 Plastic Omnium Cie METHOD AND STATION FOR PAINTING MOTOR VEHICLE BODY PARTS
FR2873204A1 (en) * 2004-07-19 2006-01-20 Plastic Omnium Cie MEASURING APPARATUS FOR THE CONTROL OF PAINTED BODY PARTS WITH AN ANTI-DETERIORATION DEVICE

Also Published As

Publication number Publication date
CA2315508A1 (en) 2001-02-12
ES2340462T3 (en) 2010-06-04
DE60044170D1 (en) 2010-05-27
CA2315508C (en) 2008-10-14
US6166811A (en) 2000-12-26
EP1076221A3 (en) 2003-03-26
EP1076221B1 (en) 2010-04-14

Similar Documents

Publication Publication Date Title
US6166811A (en) Robot-based gauging system for determining three-dimensional measurement data
JP2511246B2 (en) Robot control method
US6285959B1 (en) Method and apparatus for calibrating a non-contact gauging sensor with respect to an external coordinate system
US6460004B2 (en) Method and apparatus for calibrating a non-contact gauging sensor with respect to an external coordinate system
US5748505A (en) Method and apparatus for calibrating a noncontact gauging sensor with respect to an external coordinate system
US6134507A (en) Method and apparatus for calibrating a non-contact gauging sensor with respect to an external coordinate system
CN101363717B (en) Method and measuring system for contactless coordinate measurement of the surface of an object
US5805287A (en) Method and system for geometry measurements
US20030048459A1 (en) Measurement system and method
US20090046895A1 (en) Method and measurement system for contactless coordinate measurement on an object surface
JP2001515236A (en) Method for calibrating a robot inspection system
JP2003505682A (en) A system for scanning the geometry of large objects
EP1447644A1 (en) Measurement of spatial coordinates
EP1091186A2 (en) Method and apparatus for calibrating a non-contact gauging sensor with respect to an external coordinate system
US5363185A (en) Method and apparatus for identifying three-dimensional coordinates and orientation to a robot
JPH11183134A (en) Method for measuring interval and displacement amount between facing parts
CN110017769A (en) Part detection method and system based on industrial robot
US5160977A (en) Position detection device
US11276198B2 (en) Apparatus for determining dimensional and geometric properties of a measurement object
WO2006114216A1 (en) Method and device for scanning an object using robot manipulated non-contact scannering means and separate position and orientation detection means
CN116294993A (en) Coordinate measuring system
Kyle et al. Robot calibration by optical methods
WO2001001068A1 (en) Measurement apparatus for measuring the position and orientation of a first part to be worked, inspected or moved
CN114964185B (en) Spatial data measuring device and measuring method thereof
Owens Robotrak: calibration on a shoestring

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

RIC1 Information provided on ipc code assigned before grant

Ipc: 7G 01B 7/00 A

Ipc: 7G 01B 11/25 B

17P Request for examination filed

Effective date: 20030823

AKX Designation fees paid

Designated state(s): BE DE ES FR GB IT

17Q First examination report despatched

Effective date: 20080311

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): BE DE ES FR GB IT

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 60044170

Country of ref document: DE

Date of ref document: 20100527

Kind code of ref document: P

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2340462

Country of ref document: ES

Kind code of ref document: T3

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20100826

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20100825

Year of fee payment: 11

Ref country code: FR

Payment date: 20100831

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20100825

Year of fee payment: 11

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: BE

Payment date: 20100825

Year of fee payment: 11

26N No opposition filed

Effective date: 20110117

BERE Be: lapsed

Owner name: PERCEPTRON, INC.

Effective date: 20110831

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20110810

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20120430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110810

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110810

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110831

REG Reference to a national code

Ref country code: ES

Ref legal event code: FD2A

Effective date: 20130605

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110811

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20190828

Year of fee payment: 20

REG Reference to a national code

Ref country code: DE

Ref legal event code: R071

Ref document number: 60044170

Country of ref document: DE