US20050065649A1 - Manipulation of objects - Google Patents

Manipulation of objects

Info

Publication number
US20050065649A1
Authority
US
United States
Prior art keywords
objects
vehicle
sensing
controller
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/822,133
Inventor
Daniel Rosenfeld
Joel Kollin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New York University NYU
Original Assignee
New York University NYU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New York University NYU filed Critical New York University NYU
Priority to US10/822,133 priority Critical patent/US20050065649A1/en
Assigned to NEW YORK UNIVERSITY reassignment NEW YORK UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KOLLIN, JOEL S.
Assigned to NEW YORK UNIVERSITY reassignment NEW YORK UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PERLIN, KENNETH
Assigned to NEW YORK UNIVERSITY reassignment NEW YORK UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ROSENFELD, DANIEL A.
Publication of US20050065649A1 publication Critical patent/US20050065649A1/en
Priority to US12/586,595 priority patent/US8725292B2/en
Priority to US14/273,151 priority patent/US9760093B2/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/02 Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04 Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles

Definitions

  • the present invention is related to the 2D tracking of N objects, where N is an integer greater than or equal to 2. More specifically, the present invention is related to the 2D tracking of N objects, where N is an integer greater than or equal to 2, with a centralized controlling and 2D locating controller.
  • the present invention is a practical, flexible and low cost planar manipulator display that can simultaneously move dozens of physical objects upon a surface under computer control and sense movement of those objects by users, as shown in FIGS. 1 and 3 .
  • (i) each of many physical objects can be moved quickly, accurately and independently upon a surface, (ii) the positions of the objects can be accurately sensed, (iii) the mechanism scales gracefully to surfaces of large area, and (iv) the cost per movable object does not exceed several dollars, thereby enabling widespread application in laboratories, classrooms, and eventually the home.
  • Such a device could be used in conjunction with a projection display, which projects information upon the surface, and applications in which users' directives are recognized via vision-based gesture recognition and voice recognition. Work between two or more co-located collaborators can be aided by such an information tool, by making use of shared proprioception (body-space awareness).
  • Applications could include military scenario simulation, studies of the flow and interaction of people in social or emergency evacuation situations, automotive traffic testing and evaluation, interactive algorithms for arrangement of furniture or architecture, and interactive science education, one example being a physically responsive kit of optical components that can form themselves into functional arrangements under user control.
  • the Actuated Workbench at MIT is a table consisting of a packed array of electromagnets. By varying the relative strengths of these magnets, ferromagnetic objects can be manipulated by being handed along between neighboring magnets [Gian Pangaro, Dan Maynes-Aminzade, Hiroshi Ishii The Actuated Workbench: Computer-Controlled Actuation in Tabletop Tangible Interfaces UIST 2002 Symposium on User Interface Software and Technology Paris, France, Oct. 27-30, 2002, incorporated by reference herein].
  • the advantage of the actuated workbench is its lack of moving parts, and the fact that every location on the surface always contains an actuator.
  • the Virtual Vehicle is a tabletop packed with a checkerboard array of computer-controlled motor-actuated protruding rollers that alternate between two orthogonal directions (rollers at even squares are perpendicular to those at odd squares). Each protruding roller is independently drivable; objects are translated or rotated by varying the rotation of subsets of rollers [J. Luntz, W. Messner, and H. Choset Virtual Vehicle: Parcel Manipulation and Dynamics with a Distributed Actuator Array Proceedings of the SPIE International Symposium on Intelligent Systems and Advanced Manufacturing, Sensors and Controls for Advanced Manufacturing, Vol. SPIE 3201, 1997, incorporated by reference herein].
  • the Courier Robot project at CMU [A. Quaid and A. Rizzi Robust and Efficient Motion Planning for a Planar Robot Using Hybrid Control IEEE International Conference on Robotics and Automation 2000, Vol. 4, April, 2000, pp. 4021-4026; R. Hollis and A. Quaid An Architecture for Agile Assembly American Society of Precision Engineering 10th Annual Mtg, October, 1995, both of which are incorporated by reference herein] consists of a two directional planar (Sawyer) motor in which the stator is an entire tabletop and the rotor is a self-contained vehicle running a sophisticated closed-loop control that rides atop a very thin air gap. This approach allows extremely rapid and finely controlled movement.
  • the present invention pertains to a system for manipulation of objects.
  • the system comprises N objects, where N is greater than or equal to 2 and is an integer; and means for controlling and 2D locating of the N objects.
  • the present invention pertains to a method for manipulating objects.
  • the method comprises the steps of receiving information from N objects, where N is greater than or equal to 2 and is an integer, at a centrally controlling and 2D locating controller; determining 2D locations by the controller of the N objects; and transmitting from the controller directions to the N objects for the N objects to move.
  • the transmitting step includes the step of transmitting from the controller kinematic parameters to the N objects.
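As a concrete sketch of these claimed steps, the loop below receives position reports from the objects, determines their 2D locations, and returns the movement directions to transmit back. The class and field names are hypothetical; the claim specifies the steps, not an API.

```python
class Controller:
    """Toy centralized controller. The locate/direction_for interfaces
    are illustrative assumptions, not taken from the patent."""
    def locate(self, report):
        # determine the 2D location carried in an object's report
        return report["position"]

    def direction_for(self, location, goal=(0.0, 0.0)):
        # direction (kinematic parameters) steering the object to a goal
        gx, gy = goal
        x, y = location
        return (gx - x, gy - y)

def control_cycle(controller, reports):
    """One cycle: receive reports from the N objects, determine their
    2D locations, and compute the directions to transmit back."""
    locations = [controller.locate(r) for r in reports]
    return [controller.direction_for(loc) for loc in locations]

# Two objects report their positions; each gets a direction toward the origin.
directions = control_cycle(Controller(), [{"position": (1.0, 2.0)},
                                          {"position": (-3.0, 0.5)}])
```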
  • the present invention pertains to an apparatus for tracking.
  • the apparatus comprises N objects, where N is greater than or equal to 2 and is an integer, each object having an emitter which emits light; and means for 2D sensing of the N objects over time from the light emitted by each emitter.
  • the apparatus preferably includes a planar element on which the N objects are disposed, and wherein the sensing means includes at least 2 1-D sensors that sense the light emitted from the edge of the planar element on which the objects are disposed.
  • the present invention pertains to a method for tracking.
  • the method comprises the steps of emitting light from N objects, where N is greater than or equal to 2 and is an integer; and sensing 2D locations of the N objects over time from the emitted light from the N objects.
  • the sensing step includes the step of sensing 2D locations of the N objects over time from the emitted light from the N objects through an edge of a planar element on which the N objects are disposed.
  • FIG. 1 is a computer generated drawing showing the system of the present invention in use.
  • FIG. 2 is a block diagram of the system architecture.
  • FIG. 3 is a computer generated drawing of the system in use.
  • FIGS. 4a, 4b and 4c show the relationship between the sensing and communications timing of the system.
  • FIG. 5 is a schematic representation of a position sensing detector.
  • FIGS. 6a, 6b, 6c, 6d, 6e, 6f and 6g are schematic representations of components of the vehicle of the system.
  • FIGS. 7a, 7b, 7c and 7d are schematic representations of a table controller.
  • FIGS. 8a, 8b, 8c and 8d are schematic representations of the vehicle mechanical design.
  • FIG. 9 is a computer generated drawing of an edge view image.
  • FIG. 10 is a schematic representation of an edge viewing diagram.
  • FIG. 11 is a schematic representation of a sealed variant of the system.
  • FIG. 12 is a schematic representation of a holonomic vehicle.
  • FIG. 13 is a one-dimensional view of the position sensing detector.
  • Referring now to FIGS. 2 and 10, there is shown a system for manipulation of objects.
  • the system comprises N objects, where N is greater than or equal to 2 and is an integer; and means for controlling and 2D locating of the N objects.
  • the controlling means 14 includes indicators disposed on the object 12 .
  • the controlling means 14 preferably includes sensing means 20 for locating the objects 12 .
  • the position indicators include emitters 18 which indicate a position of an object 12 .
  • the objects 12 are preferably vehicles 26 .
  • the controlling means 14 includes a vehicle controller 32 disposed with each vehicle 26 .
  • the vehicle controller 32 of each vehicle 26 preferably includes an MCU 34 .
  • the sensing means 20 includes sensors 24 .
  • the emitters 18 preferably include LEDs 30 .
  • the present invention pertains to a method for manipulating objects.
  • the method comprises the steps of receiving information from N objects, where N is greater than or equal to 2 and is an integer, at a centrally controlling and 2D locating controller; determining 2D locations by the controller of the N objects; and transmitting from the controller directions to the N objects for the N objects to move.
  • the transmitting step includes the step of transmitting from the controller kinematic parameters to the N objects.
  • the present invention pertains to an apparatus for tracking.
  • the apparatus comprises N objects, where N is greater than or equal to 2 and is an integer, each object having an emitter which emits light; and means for 2D sensing of the N objects over time from the light emitted by each emitter.
  • the apparatus preferably includes a planar element on which the N objects are disposed, and wherein the sensing means includes at least 2 1-D sensors that sense the light emitted from the edge of the planar element on which the objects are disposed.
  • the present invention pertains to a method for tracking.
  • the method comprises the steps of emitting light from N objects, where N is greater than or equal to 2 and is an integer; and sensing 2D locations of the N objects over time from the emitted light from the N objects.
  • the sensing step includes the step of sensing 2D locations of the N objects over time from the emitted light from the N objects through an edge of a planar element on which the N objects are disposed.
  • the basic approach is to move each object upon the surface by a small telemanipulated wheeled vehicle.
  • Each vehicle is inexpensive, unobtrusive and independently controllable.
  • Each vehicle is designed to be extremely simple and inexpensive.
  • a central processor telemanipulates all vehicles. This process is done within successive update cycles; the duration of each cycle is on the order of five or ten milliseconds.
  • each vehicle's current location and orientation are detected by a central processor; the central processor then specifies, for each vehicle, a velocity for each of the vehicle's drive wheels.
  • Each cycle is temporally divided into successive time slices; each time slice is assigned to a single vehicle, during which all communication between that vehicle and the central processor takes place.
  • Individual vehicles possess no model of their own position. Rather, a vehicle transmits its position and orientation to the central processor by successively flashing LEDs mounted upon the vehicle's chassis. The central processor computes the position and orientation of the vehicle from the measured positions of these LEDs.
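The pose computation can be sketched as follows, assuming the two LEDs sit fore and aft on the vehicle's chassis (the patent does not fix the LED layout, so this geometry is an assumption):

```python
import math

def pose_from_leds(front, rear):
    """Vehicle pose from the measured 2D positions of its two locator
    LEDs, assumed mounted fore and aft on the chassis."""
    cx = (front[0] + rear[0]) / 2.0   # chassis center
    cy = (front[1] + rear[1]) / 2.0
    heading = math.atan2(front[1] - rear[1], front[0] - rear[0])
    return cx, cy, heading

# Front LED directly "above" the rear LED on the table: heading = pi/2.
x, y, theta = pose_from_leds(front=(0.5, 0.6), rear=(0.5, 0.4))
```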
  • the surface is a table, and each vehicle forms a mobile coaster upon which objects may be placed.
  • the display mechanism is quiet and rugged, and its physical footprint is less obtrusive. Power limitations are removed via an improved means for distributing electrical current to each vehicle, which also enables the display to function as a vertical wall surface, if desired.
  • a bulky tracker is replaced by a form of position tracker which operates entirely within the table surface itself.
  • a sealed design places vehicles between top and bottom surfaces, with objects put upon thin ‘coasters’ on the top surface. This design eliminates the problem of vehicle recharging, reduces noise, and allows for a portable, self-contained device.
  • the architecture includes four major components: vehicles, including their mechanical and electrical subsystems, a table controller, position sensing opto-electronics, and a standard PC.
  • the position sensing subsystem includes a lens, a position sensing device (PSD) and an amplifier.
  • a proportional, integral, derivative (PID) control system running on the table microcontroller (MCU) compares these values with a desired vehicle trajectory sent from the host PC.
  • the output from the control system is a set of speed commands for the two geared DC motors on the vehicle. Commands are sent via a 115,200 bps infrared communication link based on the IRDA physical layer.
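A minimal discrete PID update of the kind described, with illustrative gains and timestep (the patent does not publish the actual controller constants):

```python
class PID:
    """Minimal discrete PID controller; the gains and timestep used
    below are illustrative, not the patent's actual constants."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per drive wheel, stepped once per update cycle (~5 ms).
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.005)
cmd = pid.update(error=0.3)  # speed error between trajectory and measurement
```

The output of each such controller becomes the speed command sent to one wheel over the infrared link.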
  • a communications protocol implemented between the host PC and the table controller allows a PC application to specify vehicle trajectories and position.
  • Software components with hard real-time requirements are implemented in the table MCU, whereas application code is implemented on the PC, benefitting from that platform's greater resources and superior development tools.
  • Both the table controller and the vehicle are designed around ATMEL AVRTM series 8-bit microcontrollers.
  • Vehicle position sensing and communication between the vehicle and table controller are interleaved within a system update cycle, as shown in FIGS. 4a-4c.
  • every vehicle has its position sensed and receives new motor commands.
  • Each vehicle is assigned a separate ID [0, N-1], and this determines the intervals within the cycles (‘frames’) when it must pulse its locator LEDs and listen for motor commands.
  • Tupdate is around 4.5 milliseconds.
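Under the simplifying assumption of equal, back-to-back per-vehicle frames, a vehicle's time slice follows directly from its ID (the patent's schedule also interleaves sensing and communication sub-frames, which this sketch ignores):

```python
def slot_window(vehicle_id, t_update):
    """Start and end (seconds into the cycle) of the time slice
    assigned to a vehicle, assuming equal back-to-back frames."""
    start = vehicle_id * t_update
    return start, start + t_update

# With Tupdate ~= 4.5 ms, vehicle 2's slice spans 9.0-13.5 ms into the cycle.
start, end = slot_window(vehicle_id=2, t_update=4.5e-3)
```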
  • the position sensing subsystem includes a lens, a position sensing detector (PSD)—also known as a lateral-effect photodiode—and a custom-built amplifier, as shown in FIG. 5 .
  • the lens, situated roughly 80 cm below the 1 m square table, images the LEDs' outputs onto the surface of the PSD.
  • Low-level currents developed at the PSD are amplified and filtered to remove noise and interfering signals, and then read by the table controller via an analog to digital converter (ADC).
  • the photoconductive mode of the PSD is used.
  • Each of the four PSD terminals is connected to a transimpedance amplifier stage with a gain of 2.2M V/A implemented with a low bias current FET Op Amp.
  • a subsequent stage provides an additional 15× voltage gain.
  • An optical filter over the PSD removes light outside the infrared region.
  • a precision ADC on the table controller converts the output voltages of the PSD amplifier into digital values read by a microcontroller on the table controller board.
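The two amplifier stages imply an overall transimpedance of 2.2M V/A × 15 = 33 MV/A. A toy calculation of the voltage presented to the ADC (an ideal, noiseless model of the chain):

```python
R_T = 2.2e6   # first-stage transimpedance gain, volts per ampere
A_V = 15      # second-stage voltage gain

def adc_input_voltage(psd_current_a):
    """Voltage presented to the ADC for a given PSD terminal current
    (ideal, noiseless model of the amplifier chain)."""
    return psd_current_a * R_T * A_V

# A 100 nA terminal photocurrent maps to 3.3 V at the ADC input.
v = adc_input_voltage(100e-9)
```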
  • Non-linearities in the PSD's output as a function of position are unwarped by performing a 2D interpolation through a table generated from calibration data. The unwarped values are then used to compute X and Y positions for each diode and, in turn, an orientation for the vehicle.
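One plausible form of the unwarping step is bilinear interpolation through the calibration grid; the table layout below is an assumption, since the patent does not specify it:

```python
def bilinear(table, x, y):
    """Look up a correction in a regular calibration grid by bilinear
    interpolation. table[j][i] holds the calibrated value at integer
    grid point (i, j); x and y are given in grid units."""
    i, j = int(x), int(y)
    fx, fy = x - i, y - j
    v00, v10 = table[j][i],     table[j][i + 1]
    v01, v11 = table[j + 1][i], table[j + 1][i + 1]
    return (v00 * (1 - fx) * (1 - fy) + v10 * fx * (1 - fy)
            + v01 * (1 - fx) * fy + v11 * fx * fy)

# Interpolating halfway between four calibration points averages them.
grid = [[0.0, 1.0], [2.0, 3.0]]
val = bilinear(grid, 0.5, 0.5)
```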
  • the functions required of the vehicle's electronics are minimal—primarily reception of motor commands, varying motor voltage on the basis of those commands, and synchronizing with the system update cycle. No sensing and only minimal computation are performed locally, as shown in FIGS. 6a-6g.
  • An IRDA endec IC converts the IRDA format signal from the transceiver into an asynchronous serial bit stream.
  • the MCU's internal UART recovers bytes from the stream for interpretation as motor commands.
  • Two H-bridge driver circuits enable bi-directional control of the vehicle's DC motors.
  • PWM control signals from the MCU set the average voltage seen by the motor by varying the duty cycle in proportion to the value of the received motor command.
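A sketch of that command-to-duty mapping; the signed one-byte command encoding is an assumption, as the patent does not give the command format:

```python
def duty_cycle(command, max_command=127):
    """Map a signed motor command to an H-bridge direction and a PWM
    duty fraction proportional to the command's magnitude."""
    direction = 1 if command >= 0 else -1
    duty = min(abs(command), max_command) / max_command
    return direction, duty

# A command of -64 drives the motor in reverse at roughly half duty.
d, duty = duty_cycle(-64)
```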
  • Two LED control circuits enable the infrared LEDs to be driven at high current (approximately 300 mA) by low-current MCU output pins.
  • Both the motor and LED circuits are powered directly by a 3.6V (nominal) NiMH rechargeable battery.
  • a 3.0V low-dropout regulator powers the rest of the vehicle's electronics and helps to isolate the sensitive IRDA transceiver from power supply noise generated by the motor and large LED current pulses.
  • the primary components of the table controller are an Atmel ATMega128 MCU, a Maxim MAX1270 12-bit ADC, and an IRDA transceiver and endec, as shown in FIGS. 7a-7d.
  • the ADC is connected to the four output channels (x1, x2, y1, y2) of the PSD amplifier.
  • the MCU directs the ADC to sample each of these channels once per LED locator pulse frame.
  • a serial protocol (SPI) is used for communication between the MCU and the ADC IC.
  • Motor commands from the MCU's control system are encoded into IRDA format by the endec IC and transmitted by the transceiver.
  • the table controller communicates with the PC via an RS232 serial link.
  • the link is implemented with the MCU's second UART, connected through a DS232 voltage level converter to the PC serial port.
  • the vehicle for the active table is built on a circular platform, with two driven wheels connected to small DC gear motors, as shown in FIGS. 8a-8d.
  • Power is supplied by three 700 mAh AAA NiMH cells, which can power the vehicle for two to ten hours, depending on motor use.
  • the printed circuit board which contains the vehicle's circuitry also acts as a chassis, providing a rigid frame onto which mechanical components are attached.
  • the vehicle is covered by a capped cylindrical shell, onto which the models used in a particular application are placed.
  • an object to be tracked travels across a surface made of acrylic plastic sheet which is doped with a fluorescent dye.
  • the object directs light from an attached LED, whose wavelength is in the absorption band of the dye, into the plastic sheet.
  • the absorbed light is reemitted at a longer wavelength, creating a narrow light source inside the plastic beneath the LED. Since the plastic is nearly transparent to light at its emission peak, little of the emitted light is absorbed by the material.
  • the sheet acts as a light pipe, directing most of the emitted light to its edges, where it appears as a tapered vertical band. This effect is visible in FIG. 9.
  • the image was made by illuminating a sheet of orange-emitting fluorescent acrylic with a blue LED held against the sheet's surface, and then viewing the sheet from its edge.
  • This band is imaged through a wide angle lens onto the surface of one-dimensional position sensing detectors (PSDs) placed at two cut corners of the sheet.
  • the edge-emitted light pattern has the symmetric intensity pattern required to properly locate its center. Measurements of received optical power indicate that moderately weak signals will be present at the PSD, which will therefore require a high-gain, low-noise amplifier. Careful matching of fluorescent dye, LED, and PSD characteristics can be expected to improve signal strength.
  • An open variant of the design is a direct adaptation of the edge viewing method, where objects attach directly to mounting bases on vehicles.
  • This variant in its simplest form would include rechargeable batteries in vehicles, perhaps with a recharging ‘station’ to which a vehicle could go when its voltage was low.
  • Inductively coupled power delivery across the table surface can also be used. This makes it possible to eliminate, or at least reduce the size of, the on-board battery.
  • the sealed version, as shown in FIG. 11 places vehicles between top and bottom surfaces. Objects are put upon thin coasters on the top surface. These coasters are magnetically coupled to vehicles inside the table and travel on small rollers to minimize friction.
  • Vehicle power is supplied on fine metal meshes inside the top and bottom surfaces which are at different DC potentials.
  • Flexible metal contact brushes electrically connect the vehicle to the meshes, while reducing sensitivity to variations in surface height.
  • High transparency steel mesh is used between the vehicle and the position sensing surface. http://www.twpinc.com/high_trans.html, incorporated by reference herein.
  • the sealed variant, while more complex, could provide several important advantages over the unsealed version. Delivering power directly to each vehicle eliminates the per-vehicle cost of rechargeable cells and the problem of recharging. It also eliminates trade-offs between vehicle performance, mass, and battery life; vehicle performance becomes limited only by the power density of available motors.
  • this architecture allows a “client/server” arrangement, in which there is a pool of available vehicles within the surface. These vehicles can be programmed to work cooperatively to move large or articulated objects. This decoupling also eliminates potential vehicle damage as users manipulate objects.
  • the methods available to sense user control of objects depend on the physical design of the table. In configurations where the object is mechanically coupled to the vehicle, it is possible to detect when users pick up and move objects by monitoring the error between the commanded and measured motion of the vehicles. In the simplest case, when a vehicle which has been commanded to stop is nonetheless in motion, it can be assumed that the user is moving the vehicle.
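The simplest case described above reduces to a one-line test; the speed units and threshold value are assumed tuning parameters:

```python
def user_is_moving(commanded_speed, measured_speed, threshold=0.01):
    """Flag user manipulation: the vehicle was commanded to stop but is
    measurably in motion (threshold in m/s is an assumed parameter)."""
    return commanded_speed == 0 and abs(measured_speed) > threshold

# A stop command with 5 cm/s of measured motion implies a user's hand.
moving = user_is_moving(commanded_speed=0.0, measured_speed=0.05)
```

A fuller implementation would compare the whole commanded trajectory against the measured one, but the stopped-vehicle case captures the idea.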
  • Direct tracking of coaster positions could be implemented by adding locator LEDs to coasters and putting the sensing surface (e.g. fluorescent acrylic sheet) between the vehicles and the coaster.
  • Coasters would need to carry batteries, but due to the very low duty cycle of the LED locator signal and consequently low average current, run time of approximately ten hours (continuous operation) should be possible between recharges for appropriately sized NiMH cells.
  • a second alternative for the sealed configuration is to use computer vision techniques to track objects and coasters. This could be developed in conjunction with the gesture tracking subsystem described elsewhere in this document.
  • the most direct approach is to make the length of the frames in which LEDs are pulsed shorter than the length of the communications frames. For example, changing the pulse time to ten microseconds—easily achievable with available ADCs—would allow 84 vehicles to be sensed and controlled at 100 Hz.
  • the communications rate could be increased to as much as 4 Mbps (with some cost impact), leading to a theoretical communications frame width as small as 2.8 microseconds. Though it might be difficult to synchronize vehicles and the table controller well enough to achieve quite this rate, ten microseconds per frame should be quite achievable.
  • the current scheme interleaves communication and sensing intervals to avoid optical interference between the two sub-systems, which both use infrared light. If the position sensing sub-system were to use light outside the infrared band, it would be possible to use optical filters to separate the two types of signals, and thus enable communication and position sensing to be overlapped. (In fact, the “2-Dimensional Position Sensing by Edge Viewing” method described elsewhere could use visible light.) Combining all three of these methods would allow tracking of 500 vehicles at 100 Hz—a large safety margin beyond any physically practical number of vehicles.
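A back-of-the-envelope capacity check under a simplified budget (each vehicle needs one sensing frame per locator LED plus one communications frame; this is not the patent's exact schedule):

```python
def max_vehicles(cycle_s, sense_frame_s, comm_frame_s, leds_per_vehicle=2):
    """Upper bound on vehicles serviced per update cycle, assuming each
    vehicle needs one sensing frame per locator LED plus one
    communications frame, all scheduled back to back."""
    per_vehicle = leds_per_vehicle * sense_frame_s + comm_frame_s
    return int(cycle_s / per_vehicle)

# 100 Hz cycle (10 ms), 10 us sensing frames, 10 us communications frames.
n = max_vehicles(cycle_s=0.01, sense_frame_s=10e-6, comm_frame_s=10e-6)
```

Overlapping sensing with communication, as proposed above, removes the sensing frames from this budget entirely.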
  • the table controller is designed around a more powerful, 32-bit MCU such as the ARM940T. This will provide the computational resources required to run the larger number of vehicles envisioned.
  • holonomic motion refers to the ability of the vehicle to control orientation independently from direction.
  • holonomy greatly simplifies motion control and path planning.
  • Planar manipulator displays alone will provide a compelling medium for many applications.
  • the system's functionality as an output device can be reinforced, by making the tabletop surface itself a graphical display device, e.g. by projecting video onto the table from above.
  • Dynamic table graphics should provide a strong sense of context to the presentation provided by physical objects. Adding this, along with other obvious cues such as audio, should more effectively “complete” the simulation for the user.
  • given the table's strengths as a direct-interaction input device, it is appropriate to consider how the table would be integrated with other non-contact forms of human input. For instance, the system could take into account what the user is doing with his/her hand when it is not in contact with the coaster-objects. This is gesture recognition, and the most appropriately applicable form of this technology would be a passive system, e.g. computer-vision-based—an area which has a fair amount of mature research [Segen, J. “Gest: A learning computer vision system that recognizes gestures,” Machine Learning IV, pp. 621-634, Morgan Kauffman, 1994, edited by Michalski et al.; Segen, J. and S. Kumar.
  • integrating a computer vision system may also address the coaster-tracking problem that arises when the table is implemented in its sealed variant, where coasters can possibly be decoupled from vehicles.
  • Another natural companion input mode is voice recognition. It could be useful for the Table when used in concert with direct interaction and gesture recognition.
  • miniature military figures can be strategically positioned for attack or defense. Personnel can be made to hide behind buildings, out of the line of sight of enemy combatants. Exhaustion or other disability can be simulated by limiting the maximum speed of travel.
  • the system can be used for applications involving groups or crowds of people.
  • One application is emergency evacuation planning.
  • Another is simulation and examination of how people react in social groups, including social clustering and dominance behaviors.
  • Emergency evacuation scenarios can be played out, with direct physical visualization of potential areas of congestion.
  • Another application is the study of traffic flow. This can involve study of strategies for avoiding congestion, of interaction between vehicles and pedestrians, and of the effects of variation in city-planning policy, such as sidewalk/crosswalk widths. Simulations of steering and parking strategies can be used to design optimal parking areas or to evaluate the effects of introducing oversized vehicles to an area. Physical simulation can be used to compare strategies for dealing with unsafe or DWI drivers.
  • the current PC programming interface for the planar manipulator display is relatively simple, providing access to vehicle position and orientation, and allowing path waypoints to be commanded.
  • the programming interface can be extended by implementing path planning using established techniques [STOUT, Bryan. The Basics of Path Planning. Game Programming Gems, pp. 254-263. Hingham, USA: Charles River Media, 2000; RABIN, Steve. Speed Optimizations. Game Programming Gems, pp. 272-287. Hingham, USA: Charles River Media, 2000; STERREN, William van der. Tactical Path-Finding. Game Programming Gems 3, pp. 294-306. Hingham, USA: Charles River Media, 2002, all of which are incorporated by reference herein], and by providing support for user-input detection.
  • a Position Sensing Detector is a type of photodiode whose output represents the position of incident light striking its surface.
  • a PSD consists of two photo-sensitive layers (the P- and N-type layers) separated by a transparent layer.
  • two electrodes positioned at opposite ends of the P-layer detect the photocurrent created by photoelectric conversion of light striking the layer.
  • the current at each electrode is inversely proportional to its distance from the incident position of the light. If X1 and X2 represent the currents at the two electrodes, and x is the position of the incident light, then their relationship is described by (1) below.
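The equation labeled (1) is not reproduced in this text. For a PSD of active length L centered at x = 0, the standard lateral-effect relation (a reconstruction, not the patent's verbatim equation) is:

```latex
x \;=\; \frac{L}{2}\,\frac{X_2 - X_1}{X_1 + X_2} \tag{1}
```

Only the ratio of the two electrode currents enters, so the result is independent of the total photocurrent.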
  • PSD-derived positions depend only on the location of the centroid of the light spot, and are independent of the brightness of the spot or its sharpness. This allows a simple and inexpensive optical design to be implemented. This feature enables the “2-Dimensional Position Sensing by Edge Viewing” method, described herein, which depends on accurately locating the center of a diffuse light pattern.
  • PSDs are capable of very high-speed operation, with limits dictated primarily by the rise time of the element—often less than one microsecond. With properly designed interface electronics, they can achieve positional resolutions of one part in ten thousand.
  • a vehicle transmits its position and orientation to the central processor by successively flashing two LEDs mounted upon the vehicle's chassis. Light from the LEDs is imaged onto the surface of a two-dimensional lateral-effect photodiode, which through associated analog circuitry, produces voltages which depend on the location of the imaged light on the photodiode's surface.
  • These output voltages are sampled through an analog to digital converter in synchrony with the pattern of flashes from vehicle LEDs, and enable computation of X and Y positions for each LED.
  • the second technique is to construct the sensing surface from a clear material, but with geometry designed to maximize total internal reflection inside the surface—i.e. to use it as a ‘light pipe’.
  • the vehicle emitter would be modified to emit light only in the range of angles which would be ‘captured’ internally by the sheet (i.e. rather than traveling straight through the surface or reflecting off the surface.)
  • the tracking of objects described herein can be used whether the objects move under their own power or are moved by something or someone else.

Abstract

A system for manipulation of objects. The system includes N objects, where N is greater than or equal to 2 and is an integer; and a mechanism for controlling and 2D locating of the N objects. A method for manipulating objects. The method includes the steps of receiving information from N objects, where N is greater than or equal to 2 and is an integer, at a centrally controlling and 2D locating controller; determining 2D locations by the controller of the N objects; and transmitting from the controller directions to the N objects for the N objects to move. An apparatus for tracking. The apparatus includes N objects, where N is greater than or equal to 2 and is an integer, each object having an emitter which emits light; and a mechanism for 2D sensing of the N objects over time from the light emitted by each emitter. The present invention pertains to a method for tracking. The method includes the steps of emitting light from N objects, where N is greater than or equal to 2 and is an integer; and sensing 2D locations of the N objects over time from the emitted light from the N objects.

Description

    FIELD OF THE INVENTION
  • The present invention is related to the 2D tracking of N objects, where N is an integer greater than or equal to 2. More specifically, the present invention is related to the 2D tracking of N objects, where N is an integer greater than or equal to 2, with a centralized controlling and 2D locating controller.
  • BACKGROUND OF THE INVENTION
  • Our human brains are particularly good at solving problems when we are able to make use of our physical and proprioceptive intuition, yet current computer interfaces make little use of these abilities. The hypothesis underlying the present invention is that interaction mediated by active computer-controlled objects will improve understanding and collaboration in many types of simulations for which screen-based interaction is not optimal. Current methods to effect such a capability are either expensive or limited in important ways.
  • The present invention is a practical, flexible and low cost planar manipulator display that can simultaneously move dozens of physical objects upon a surface under computer control and sense movement of those objects by users, as shown in FIGS. 1 and 3. Specifically, (i) each of many physical objects can be moved quickly, accurately and independently upon a surface, (ii) the positions of the objects can be accurately sensed, (iii) the mechanism scales gracefully to surfaces of large area, and (iv) the cost per movable object does not exceed several dollars, thereby enabling widespread application in laboratories, classrooms, and eventually the home.
  • Such a device could be used in conjunction with a projection display, which projects information upon the surface, and applications in which users' directives are recognized via vision-based gesture recognition and voice recognition. Work between two or more co-located collaborators can be aided by such an information tool, by making use of shared proprioception (body-space awareness).
  • Applications could include military scenario simulation, studies of the flow and interaction of people in social or emergency evacuation situations, automotive traffic testing and evaluation, interactive algorithms for arrangement of furniture or architecture, and interactive science education, one example being a physically responsive kit of optical components that can form themselves into functional arrangements under user control.
  • There is ample precedent to show that a rethinking of the physical interface to the computer can lead to a profound change in the use of computers in society. This is logical: Having evolved as physical creatures, our reasoning skills are tightly coupled to our perceptual skills. For example, as computer output has shifted from low fidelity text displays to high resolution full color displays, there has been a corresponding shift not only in the way we interact with computers, but in our very uses of computers. A striking example of this has been the recent great increase in computer use by the general populace, and the rapid and widespread adoption of the World Wide Web that occurred as soon as hyperlinked images were implemented in browsers.
  • The benefits of passive (non-actuated) physical objects in a user interface have been demonstrated by many researchers, including [Robert J. K. Jacob, Hiroshi Ishii, Gian Pangaro, and James Patten, A Tangible Interface for Organizing Information Using a Grid CHI 2002 Conference on Human Factors in Computing Systems Minneapolis, Minn. 20-25 April 2002; James Patten, Hiroshi Ishii and Gian Pangaro: Sensetable: A Wireless Object Tracking Platform for Tangible User Interfaces CHI 2001 Conference on Human Factors in Computing Systems Seattle, Wash., USA Mar. 31-Apr. 5, 2001; Hiroshi Ishii, Brygg Ullmer: Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. CHI 1997: 234-241; G. Fitzmaurice, H. Ishii, and W. Buxton, “Bricks: Laying the Foundations for Graspable User Interfaces,” Proceedings of CHI '95 (1995), pp. 442-449], all of which are incorporated by reference herein.
  • Several other research groups have made enabling technologies for planar manipulation of physical objects. Each of these systems has a particular limitation or deficiency which precludes implementation at reasonable cost for simultaneous planar manipulation of many objects.
  • The only approach that has been demonstrated to work on simultaneous planar transport of multiple objects is the Universal Planar Manipulator by Dan Reznik at Berkeley [D. Reznik, “The Universal Planar Manipulator”, Ph.D. Thesis, UC Berkeley, EECS, October 2000; D. Reznik and J. Canny, “Universal Part Manipulation in the Plane with a Single Horizontally-Vibrating Plate”, 3rd International Workshop on Algorithmic Foundations of Robotics (WAFR), Houston, Tex., March, 1998, both of which are incorporated by reference herein.] This system creates small vibratory movement of the surface, exploiting the non-linearity of friction to “shake” objects along a table surface. By time-slicing the vibration patterns, each object can be addressed individually and made to follow an independent trajectory. The major advantages of this approach are that it works with any object, and that it is relatively inexpensive.
  • The deficiency of this approach is that maximum speed of object movement decreases linearly with the number of objects. Since the frequencies used require approximately 10 milliseconds to address one object, only a small number of objects can be simultaneously moved at interactively useful speeds. Also, the presence of an unavoidable continual vibration of the entire table precludes the use of this system where such vibration would be considered objectionable. The system is also inherently limited to be operable only on horizontal surfaces.
  • The Actuated Workbench at MIT is a table consisting of a packed array of electromagnets. By varying the relative strengths of these magnets, ferromagnetic objects can be manipulated by being handed along between neighboring magnets [Gian Pangaro, Dan Maynes-Aminzade, Hiroshi Ishii The Actuated Workbench: Computer-Controlled Actuation in Tabletop Tangible Interfaces UIST 2002 Symposium on User Interface Software and Technology Paris, France, Oct. 27-30, 2002, incorporated by reference herein]. The advantage of the actuated workbench is its lack of moving parts, and the fact that every location on the surface always contains an actuator.
  • Its deficiencies include a relatively weak effective force (only movement of carefully chosen lightweight objects has been demonstrated) and large weight of the packed electromagnets, which increases linearly with surface area.
  • The Virtual Vehicle is a tabletop packed with a checkerboard array of computer-controlled motor-actuated protruding rollers that alternate between two orthogonal directions (rollers at even squares are perpendicular to those at odd squares). Each protruding roller is independently drivable; objects are translated or rotated by varying the rotation of subsets of rollers [J. Luntz, W. Messner, and H. Choset Virtual Vehicle: Parcel Manipulation and Dynamics with a Distributed Actuator Array Proceedings of the SPIE International Symposium on Intelligent Systems and Advanced Manufacturing, Sensors and Controls for Advanced Manufacturing, Vol. SPIE 3201, 1997, incorporated by reference herein].
  • This mechanism shares with the Universal Planar Manipulator the property that unprepared objects may be transported—objects need not be ferromagnetic.
  • Disadvantages are that the mechanical complexity and cost are relatively high, increasing linearly with unit area. Only a small demonstration unit has been made; it is not clear that it would be practical to scale this device up to cover a large surface.
  • The Courier Robot project at CMU [A. Quaid and A. Rizzi Robust and Efficient Motion Planning for a Planar Robot Using Hybrid Control IEEE International Conference on Robotics and Automation 2000, Vol. 4, April, 2000, pp. 4021-4026; R. Hollis and A. Quaid An Architecture for Agile Assembly American Society of Precision Engineering 10th Annual Mtg, October, 1995, both of which are incorporated by reference herein] consists of a two directional planar (Sawyer) motor in which the stator is an entire tabletop and the rotor is a self-contained vehicle running a sophisticated closed-loop control that rides atop a very thin air gap. This approach allows extremely rapid and finely controlled movement. However, courier robots are really designed for the speed and precision required for precision assembly in miniature table-top factories. The high cost per vehicle, and per unit area of tabletop, as well as the power cabling needed for each vehicle, preclude their use in the user interface context which is the focus of this proposal.
  • SUMMARY OF THE INVENTION
  • The present invention pertains to a system for manipulation of objects. The system comprises N objects, where N is greater than or equal to 2 and is an integer; and means for controlling and 2D locating of the N objects.
  • The present invention pertains to a method for manipulating objects. The method comprises the steps of receiving information from N objects, where N is greater than or equal to 2 and is an integer, at a centrally controlling and 2D locating controller; determining 2D locations by the controller of the N objects; and transmitting from the controller directions to the N objects for the N objects to move. Preferably, the transmitting step includes the step of transmitting from the controller kinematic parameters to the N objects.
  • The present invention pertains to an apparatus for tracking. The apparatus comprises N objects, where N is greater than or equal to 2 and is an integer, each object having an emitter which emits light; and means for 2D sensing of the N objects over time from the light emitted by each emitter. The apparatus preferably includes a planar element on which the N objects are disposed, and wherein the sensing means includes at least 2 1-D sensors that sense the light emitted from the edge of the planar element on which the objects are disposed.
  • The present invention pertains to a method for tracking. The method comprises the steps of emitting light from N objects, where N is greater than or equal to 2 and is an integer; and sensing 2D locations of the N objects over time from the emitted light from the N objects. Preferably, the sensing step includes the step of sensing 2D locations of the N objects over time from the emitted light from the N objects through an edge of a planar element on which the N objects are disposed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the accompanying drawings, the preferred embodiment of the invention and preferred methods of practicing the invention are illustrated in which:
  • FIG. 1 is a computer generated drawing showing the system of the present invention in use.
  • FIG. 2 is a block diagram of the system architecture.
  • FIG. 3 is a computer generated drawing of the system in use.
  • FIGS. 4 a, 4 b and 4 c show the relationship between the sensing and communications timing of the system.
  • FIG. 5 is a schematic representation of a position sensing detector.
  • FIGS. 6 a, 6 b, 6 c, 6 d, 6 e, 6 f and 6 g are schematic representations of components of the vehicle of the system.
  • FIGS. 7 a, 7 b, 7 c and 7 d are schematic representations of a table controller.
  • FIGS. 8 a, 8 b, 8 c and 8 d are schematic representations of the vehicle mechanical design.
  • FIG. 9 is a computer generated drawing of an edge view image.
  • FIG. 10 is a schematic representation of an edge viewing diagram.
  • FIG. 11 is a schematic representation of a sealed variant of the system.
  • FIG. 12 is a schematic representation of a holonomic vehicle.
  • FIG. 13 is a one-dimensional view of the position sensing detector.
  • DETAILED DESCRIPTION
  • Referring now to the drawings wherein like reference numerals refer to similar or identical parts throughout the several views, and more specifically to FIGS. 2 and 10 thereof, there is shown a system for manipulation of objects. The system comprises N objects, where N is greater than or equal to 2 and is an integer; and means for controlling and 2D locating of the N objects.
  • Preferably, the controlling means 14 includes indicators disposed on the object 12. The controlling means 14 preferably includes sensing means 20 for locating the objects 12. Preferably, the position indicators include emitters 18 which indicate a position of an object 12. The objects 12 are preferably vehicles 26.
  • Preferably, the controlling means 14 includes a vehicle controller 32 disposed with each vehicle 26. The vehicle controller 32 of each vehicle 26 preferably includes an MCU 34. Preferably, the sensing means 20 includes sensors 24. The emitters 18 preferably include LEDs 30.
  • The present invention pertains to a method for manipulating objects. The method comprises the steps of receiving information from N objects, where N is greater than or equal to 2 and is an integer, at a centrally controlling and 2D locating controller; determining 2D locations by the controller of the N objects; and transmitting from the controller directions to the N objects for the N objects to move. Preferably, the transmitting step includes the step of transmitting from the controller kinematic parameters to the N objects.
  • The present invention pertains to an apparatus for tracking. The apparatus comprises N objects, where N is greater than or equal to 2 and is an integer, each object having an emitter which emits light; and means for 2D sensing of the N objects over time from the light emitted by each emitter. The apparatus preferably includes a planar element on which the N objects are disposed, and wherein the sensing means includes at least 2 1-D sensors that sense the light emitted from the edge of the planar element on which the objects are disposed.
  • The present invention pertains to a method for tracking. The method comprises the steps of emitting light from N objects, where N is greater than or equal to 2 and is an integer; and sensing 2D locations of the N objects over time from the emitted light from the N objects. Preferably, the sensing step includes the step of sensing 2D locations of the N objects over time from the emitted light from the N objects through an edge of a planar element on which the N objects are disposed.
  • In the operation of the invention, the basic approach is to move each object upon the surface by a small telemanipulated wheeled vehicle. Each vehicle is inexpensive, unobtrusive and independently controllable. Each vehicle is designed to be extremely simple and inexpensive.
  • A central processor telemanipulates all vehicles. This process is done within successive update cycles; the duration of each cycle is on the order of five or ten milliseconds. Within each update cycle, each vehicle's current location and orientation are detected by a central processor; the central processor then specifies, for each vehicle, a velocity for each of the vehicle's drive wheels.
  • Each cycle is temporally divided into successive time slices; each time slice is assigned to a single vehicle, during which all communication between that vehicle and the central processor takes place. Individual vehicles possess no model of their own position. Rather, a vehicle transmits its position and orientation to the central processor by successively flashing LEDs mounted upon the vehicle's chassis. The central processor computes the position and orientation of the vehicle from the measured positions of these LEDs.
  • In one preferred embodiment, the surface is a table, and each vehicle forms a mobile coaster upon which objects may be placed. The display mechanism is quiet and rugged, and its physical footprint is less obtrusive. Power limitations are removed via an improved means for distributing electrical current to each vehicle, enabling the display to function as a vertical wall surface, if desired.
  • A bulky tracker is replaced by a form of position tracker which operates entirely within the table surface itself.
  • A sealed design places vehicles between top and bottom surfaces, with objects put upon thin ‘coasters’ on the top surface. This design eliminates the problem of vehicle recharging, reduces noise, and allows for a portable, self-contained device.
  • These improvements are discussed below.
  • More specifically, as shown in FIG. 2, the architecture includes four major components: vehicles, including their mechanical and electrical subsystems, a table controller, position sensing opto-electronics, and a standard PC.
  • The position sensing subsystem includes a lens, a position sensing device (PSD) and an amplifier. The positions of LEDs on the vehicles, imaged onto the PSD, generate corresponding voltages, which are read by the table controller via an analog to digital converter (ADC).
  • A proportional, integral, derivative (PID) control system running on the table microcontroller (MCU) compares these values with a desired vehicle trajectory sent from the host PC. The output from the control system is a set of speed commands for the two geared DC motors on the vehicle. Commands are sent via a 115,200 bps infrared communication link based on the IRDA physical layer.
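  • The control loop described above can be sketched as a conventional per-wheel PID update; the gains, sample period, and units below are illustrative assumptions rather than values from the patent:

```python
class PID:
    """Minimal PID controller of the kind the table MCU runs per wheel.

    The gains (kp, ki, kd) and sample period dt are illustrative; the
    patent does not specify them.
    """
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """One control-cycle update; the return value would be
        quantized into a speed command for one geared DC motor."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

In the system described, one such controller per drive wheel would compare the trajectory sent from the host PC with the sensed vehicle position, with the output sent over the infrared link.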
  • A communications protocol implemented between the host PC and the table controller allows a PC application to specify vehicle trajectories and position. Software components with hard real-time requirements are implemented in the table MCU, whereas application code is implemented on the PC, benefitting from that platform's greater resources and superior development tools.
  • Both the table controller and the vehicle are designed around ATMEL AVR™ series 8-bit microcontrollers.
  • Vehicle position sensing and communication between the vehicle and table controller are interleaved within a system update cycle, as shown in FIGS. 4 a-4 c. In each of these cycles, every vehicle has its position sensed and receives new motor commands. Each vehicle is assigned a separate ID [0, N-1], and this determines the intervals within the cycles (‘frames’) when it must pulse its locator LEDs and listen for motor commands. When a vehicle is placed on the table, it first synchronizes with the update cycle by waiting for a unique ‘sync byte’. A blank frame prior to the sync frame ensures that the UART on the vehicle can find the start bit of the sync frame reliably.
  • The period of the update cycle is determined by the maximum number of vehicles (N) in any configuration and the length of each frame (Tframe):
    Tupdate = Tframe * (3N + 2)
  • With current values for Tframe (95.5 microseconds) and N (15), Tupdate is around 4.5 milliseconds.
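  • The cycle period can be checked directly from the formula above; a minimal sketch (reading 3N as two locator-pulse frames plus one command frame per vehicle, and the +2 as the sync and blank frames, is an inference from the surrounding text):

```python
def update_period(t_frame, n_vehicles):
    """Update-cycle period per the formula Tupdate = Tframe * (3N + 2).

    The 3N term is read here as three frames per vehicle (two locator
    LED pulse frames plus one motor-command frame) and the +2 as the
    sync and blank frames; this breakdown is an inference, not stated
    outright in the patent.
    """
    return t_frame * (3 * n_vehicles + 2)

# With Tframe = 95.5 microseconds and N = 15 vehicles:
period = update_period(95.5e-6, 15)  # about 4.5 milliseconds
```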
  • Methods for increasing the maximum number of sensed vehicles are described below.
  • The position sensing subsystem includes a lens, a position sensing detector (PSD)—also known as a lateral-effect photodiode—and a custom-built amplifier, as shown in FIG. 5. Two infrared LEDs, located underneath the vehicle, are separately pulsed as described above. The lens, situated roughly 80 cm below the 1 m square table, images the LEDs' outputs onto the surface of the PSD. Low-level currents developed at the PSD are amplified and filtered to remove noise and interfering signals, and then read by the table controller via an analog to digital converter (ADC).
  • The photoconductive mode of the PSD is used. Each of the four PSD terminals is connected to a transimpedance amplifier stage with a gain of 2.2M V/A, implemented with a low bias current FET op amp. A subsequent stage provides an additional 15× voltage gain. Low-pass filtering at each stage, together with a fifth-order Bessel switched-capacitor filter IC, rolls off the amplifier frequency response steeply after the first few harmonics of the locator pulses. An optical filter over the PSD removes light outside the infrared region.
  • A precision ADC on the table controller converts the output voltages of the PSD amplifier into digital values read by a microcontroller on the table controller board. Non-linearities in the PSD's output as a function of position are unwarped by performing a 2D interpolation through a table generated from calibration data. The unwarped values are then used to compute X and Y positions for each diode and, in turn, an orientation for the vehicle.
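  • Once unwarped X and Y positions are available for both locator LEDs, the vehicle pose follows from simple geometry. The sketch below assumes the two LEDs straddle the vehicle center along its heading axis, which the patent does not state explicitly:

```python
import math

def vehicle_pose(front_led, rear_led):
    """Compute vehicle position and orientation from the unwarped 2-D
    positions of its two locator LEDs. Assumes (not stated in the
    patent) that the LEDs straddle the vehicle center along its
    heading axis."""
    fx, fy = front_led
    rx, ry = rear_led
    x = (fx + rx) / 2.0          # midpoint is the vehicle center
    y = (fy + ry) / 2.0
    heading = math.atan2(fy - ry, fx - rx)  # orientation in radians
    return x, y, heading
```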
  • The functions required of the vehicle's electronics are minimal—primarily reception of motor commands, varying motor voltage on the basis of those commands, and synchronizing with the system update cycle. No sensing and only minimal computation are performed locally, as shown in FIGS. 6 a-6 g.
  • An IRDA endec IC converts the IRDA format signal from the transceiver into an asynchronous serial bit stream. The MCU's internal UART recovers bytes from the stream for interpretation as motor commands.
  • Two H-bridge driver circuits enable bi-directional control of the vehicle's DC motors. PWM control signals from the MCU set the average voltage seen by the motor by varying the duty cycle in proportion to the value of the received motor command. Two LED control circuits enable the infrared LEDs to be driven at high current (approximately 300 mA) by low-current MCU output pins.
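  • The command-to-duty-cycle mapping can be sketched as follows; the 8-bit PWM resolution and signed command range are assumptions, since the patent says only that the duty cycle is proportional to the received motor command:

```python
def h_bridge_outputs(command, pwm_top=255):
    """Map a signed motor command to an H-bridge direction flag and a
    PWM duty cycle in [0.0, 1.0]. The signed command range and 8-bit
    PWM top value are assumptions; the patent states only that duty
    cycle is proportional to the received command."""
    forward = command >= 0
    duty = min(abs(command), pwm_top) / pwm_top
    return forward, duty
```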
  • Both the motor and LED circuits are powered directly by a 3.6V (nominal) NiMH rechargeable battery. A 3.0V low-dropout regulator powers the rest of the vehicle's electronics and helps to isolate the sensitive IRDA transceiver from power supply noise generated by the motor and large LED current pulses.
  • The primary components of the table controller are an Atmel ATMega128 MCU, a Maxim MAX1270 12-bit ADC, and an IRDA transceiver and endec, as shown in FIGS. 7 a-7 d.
  • The ADC is connected to the four output channels (x1, x2, y1, y2) of the PSD amplifier. The MCU directs the ADC to sample each of these channels once per LED locator pulse frame. A serial protocol (SPI) is used for communication between the MCU and the ADC IC.
  • Motor commands from the MCU's control system are encoded into IRDA format by the endec IC and transmitted by the transceiver.
  • The table controller communicates to the PC via a RS232 serial link. The link is implemented with the MCU's second UART, connected through a DS232 voltage level converter to the PC serial port.
  • The vehicle for the active table is built on a circular platform, with two driven wheels connected to small DC gear motors, as shown in FIGS. 8 a-8 d. Power is supplied by three 700 mAh AAA NiMH cells, which can power the vehicle for two to ten hours, depending on motor use. The printed circuit board which contains the vehicle's circuitry also acts as a chassis, providing a rigid frame onto which mechanical components are attached.
  • Turning is achieved by ‘differential steering’, whereby the vehicle's rotational rate is dictated by the difference in the velocities of the wheels. Spring-loaded Teflon casters stabilize the vehicle vertically, while slipping sideways to allow planar motion.
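  • The differential-steering relationship can be written down directly; the wheel-separation parameter is assumed, as the patent gives no vehicle dimensions:

```python
def differential_velocity(v_left, v_right, track_width):
    """Forward speed and rotational rate of a differentially steered
    vehicle from its two wheel velocities. track_width is the wheel
    separation, an assumed parameter not given in the patent."""
    v = (v_left + v_right) / 2.0               # speed of the center
    omega = (v_right - v_left) / track_width   # rotational rate
    return v, omega
```

Inverting these two equations yields the per-wheel speed commands needed to follow a desired linear and rotational velocity.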
  • The vehicle is covered by a capped cylindrical shell, onto which the models used in a particular application are placed.
  • In the position sensing method, an object to be tracked travels across a surface made of acrylic plastic sheet which is doped with a fluorescent dye. The object directs light from an attached LED, whose wavelength is in the absorption band of the dye, into the plastic sheet. The absorbed light is reemitted at a longer wavelength, creating a narrow light source inside the plastic beneath the LED. Since the plastic is nearly transparent to light at its emission peak, little of the emitted light is absorbed by the material.
  • The sheet acts as a light pipe, directing most of the emitted light to its edges, where it appears as a tapered vertical band. This effect is visible in FIG. 9. (The image was made by illuminating a sheet of orange-emitting fluorescent acrylic with a blue LED held against the sheet's surface, and then viewing the sheet from its edge.) This band is imaged through a wide angle lens onto the surface of one-dimensional position sensing detectors (PSDs) placed at two cut corners of the sheet. The position of the centroid of the band imaged onto each PSD is computed from the currents measured at the PSD's terminals. FIG. 10 illustrates the overall configuration.
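  • Each corner PSD, with its wide-angle lens, effectively reports a bearing angle toward the emitting LED, so a 2D position can be recovered by intersecting the two bearings. This geometric step is implied rather than spelled out in the text; the sketch below assumes the centroid-to-bearing calibration has already been applied:

```python
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Locate a point on the sheet from bearing angles measured at two
    corner sensors. Each 1-D PSD centroid is assumed to map (through
    lens calibration, already applied here) to a bearing theta from
    its corner position p. Solves p1 + t1*d1 = p2 + t2*d2."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```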
  • The edge-emitted light pattern has the symmetric intensity pattern required to properly locate its center. Measurements of received optical power indicate that moderately weak signals will be present at the PSD, and will therefore require a high-gain, low-noise amplifier. Careful matching of fluorescent dye, LED, and PSD characteristics can be expected to improve signal strength.
  • The primary benefit of this method is that it requires no additional depth beyond the thickness of the table surface, eliminating the need for a bulky tracker. It enables the construction of a Planar Manipulator Display as a self-contained ‘slab’ which can be placed on top of any suitable desk or table. Additionally, the sensing elements used are both simple and manufactured in high volume for other applications, and consequently are very inexpensive.
  • An open variant of the design is a direct adaptation of the edge viewing method, where objects attach directly to mounting bases on vehicles. This variant in its simplest form would include rechargeable batteries in vehicles, perhaps with a recharging ‘station’ to which a vehicle could go when its voltage was low. Inductively coupled power delivery across the table surface is also used. This makes it possible to eliminate, or at least reduce the size of, the on-board battery.
  • The sealed version, as shown in FIG. 11, of the slab design places vehicles between top and bottom surfaces. Objects are put upon thin coasters on the top surface. These coasters are magnetically coupled to vehicles inside the table and travel on small rollers to minimize friction.
  • Vehicle power is supplied on fine metal meshes inside the top and bottom surfaces which are at different DC potentials. Flexible metal contact brushes electrically connect the vehicle to the meshes, while reducing sensitivity to variations in surface height. High-transparency steel mesh is used between the vehicle and the position sensing surface (http://www.twpinc.com/high_trans.html, incorporated by reference herein).
  • The sealed variant, while more complex, could provide several important advantages over the unsealed version. Delivering power directly to each vehicle eliminates the per-vehicle cost of rechargeable cells and the problem of recharging. It also eliminates trade-offs between vehicle performance, mass, and battery life. Vehicle performance becomes limited only by the power density of available motors.
  • By decoupling vehicles and the object platforms (‘coasters’), this architecture allows a “client/server” arrangement, in which there is a pool of available vehicles within the surface. These vehicles can be programmed to work cooperatively to move large or articulated objects. This decoupling also eliminates potential vehicle damage as users manipulate objects.
  • Finally, a sealed design would reduce noise and allow more self-contained, portable devices, which could simply be picked up and moved as one unit.
  • The primary challenge that arises from the sealed approach is that it creates the need to track coasters separately from vehicles. Potential solutions for this problem are addressed in the next section.
  • The methods available to sense user control of objects depend on the physical design of the table. In configurations where the object is mechanically coupled to the vehicle, it is possible to detect when users pick up and move objects by monitoring the error between the commanded and measured motion of the vehicles. In the simplest case, when a vehicle which has been commanded to stop is nonetheless in motion, it can be assumed that the user is moving the vehicle.
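  • The simplest case described above reduces to a threshold test on the mismatch between commanded and measured motion; the threshold value below is an assumed noise margin:

```python
def user_is_moving(commanded_speed, measured_speed, threshold=0.01):
    """Infer user manipulation from the command/measurement mismatch,
    as in the simplest case described: a vehicle commanded to stop
    that is nonetheless in motion is assumed to be moved by the user.
    The threshold is an assumed noise margin, not a patent value."""
    return abs(measured_speed - commanded_speed) > threshold
```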
  • Where the object and vehicle can become uncoupled, alternate methods are necessary. Direct tracking of coaster positions could be implemented by adding locator LEDs to coasters and putting the sensing surface (e.g. fluorescent acrylic sheet) between the vehicles and the coaster. Coasters would need to carry batteries, but due to the very low duty cycle of the LED locator signal and consequently low average current, run time of approximately ten hours (continuous operation) should be possible between recharges for appropriately sized NiMH cells.
  • A second alternative for the sealed configuration is to use computer vision techniques to track objects and coasters. This could be developed in conjunction with the gesture tracking subsystem described elsewhere in this document.
  • There are several possible approaches to maintaining high update rates with larger numbers of vehicles, up to 100 vehicles.
  • The most direct approach is to make the length of the frames in which LEDs are pulsed shorter than the length of the communications frames. For example, changing the pulse time to ten microseconds—easily achievable with available ADCs—would allow 84 vehicles to be sensed and controlled at 100 Hz.
  • In addition, the communications rate could be increased to as much as 4 Mbps (with some cost impact), leading to a theoretical communications frame width as small as 2.8 microseconds. Though it might be difficult to synchronize vehicles and the table controller well enough to achieve quite this rate, ten microseconds per frame should be quite achievable.
  • The current scheme interleaves communication and sensing intervals to avoid optical interference between the two sub-systems, which both use infrared light. If the position sensing sub-system were to use light outside the infrared band, it would be possible to use optical filters to separate the two types of signals, and thus enable communication and position sensing to be overlapped. (In fact, the “2-Dimensional Position Sensing by Edge Viewing” method described elsewhere could use visible light.) Combining all three of these methods would allow tracking of 500 vehicles at 100 Hz—a large safety margin beyond any physically practical number of vehicles.
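  • The 84-vehicle figure can be reproduced under one reading of the frame layout: two shortened 10-microsecond sense frames plus one 95.5-microsecond communication frame per vehicle, and two overhead frames, within a 10-millisecond cycle for 100 Hz operation. This per-frame breakdown is an inference, not stated outright in the patent:

```python
def max_vehicles(cycle_s=0.01, sense_frame_s=10e-6,
                 comm_frame_s=95.5e-6, overhead_frames=2):
    """Vehicles that fit in one update cycle, assuming (one plausible
    reading of the text) two shortened sense frames plus one
    communication frame per vehicle, and two overhead (sync/blank)
    frames at the communication frame length."""
    per_vehicle = 2 * sense_frame_s + comm_frame_s
    budget = cycle_s - overhead_frames * comm_frame_s
    return int(budget // per_vehicle)
```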
  • The table controller is designed around a more powerful, 32 bit MCU such as the ARM 940T. This will provide the computational resources required to run the larger number of vehicles envisioned.
  • Vehicles capable of holonomic motion can be used, as shown in FIG. 12. In this context, holonomic motion refers to the ability of the vehicle to control orientation independently from direction.
  • This is advantageous for two reasons. First, holonomy greatly simplifies motion control and path planning. Second, it allows direction to be changed much more quickly than is possible with differentially steered vehicles, removing limitations placed on the kinds of motion that can be effected.
  • For example, in a billiard ball simulation, an elastic collision should change the direction of a ball instantaneously. A differentially steered vehicle would have to rotate in place at the collision point before heading in a new direction—requiring, in effect, the simulation to be stopped momentarily—but a holonomic vehicle could proceed in the new direction immediately (of course subject to limits imposed by its inertia).
  • The cost associated with these benefits is a small increase in vehicle complexity. A well-known approach to implementing a holonomic vehicle involves the use of three ‘omni-wheels’ (and associated motors), oriented at 120° intervals, as shown in FIG. 12. [G. Reshko, M. Mason, Rapid Prototyping of Small Robots, Carnegie Mellon University Technical report, 2000, incorporated by reference herein.]
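  As an illustration of why holonomy simplifies motion control, the inverse kinematics of a three-omni-wheel layout reduce to one line per wheel. The sketch below is illustrative only; the wheel-to-center distance r and the sign conventions are assumptions, not taken from the referenced design:

```python
import math

def omniwheel_speeds(vx, vy, omega, r=0.05):
    """Surface speed of each of three omni-wheels (120 degrees apart)
    needed to give the vehicle body velocity (vx, vy) plus angular
    velocity omega.  r is the wheel-to-center distance (assumed).
    Wheel i sits at placement angle 120*i degrees; its rolling
    direction is the tangent to the circle of radius r at that point."""
    speeds = []
    for i in range(3):
        a = 2.0 * math.pi * i / 3.0        # wheel placement angle
        # Project the body velocity onto the wheel's rolling direction,
        # then add the contribution of body rotation.
        speeds.append(-math.sin(a) * vx + math.cos(a) * vy + r * omega)
    return speeds
```

  Pure rotation commands all three wheels equally, and for pure translation the three wheel speeds always sum to zero; either motion (or any mix) can be commanded instantaneously, which is what lets a holonomic vehicle change direction without first rotating in place.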
  • Planar manipulator displays alone will provide a compelling medium for many applications.
  • The system's functionality as an output device can be reinforced by making the tabletop surface itself a graphical display device, e.g. by projecting video onto the table from above. Dynamic table graphics should provide a strong sense of context to the presentation provided by physical objects. Adding this, along with other obvious cues such as audio, should more effectively “complete” the simulation for the user.
  • Given the table's strengths as a direct-interaction input device, it is appropriate to consider how the table could be integrated with non-contact forms of human input; for instance, tracking what the user is doing with his or her hand when it is not in contact with the coaster-objects. This is gesture recognition, and the most appropriately applicable form of this technology would be a passive system, e.g. computer-vision-based, an area which has a fair amount of mature research [Segen, J. “Gest: A learning computer vision system that recognizes gestures,” Machine Learning IV, pp. 621-634, Morgan Kauffman, 1994, edited by Michalski et al.; Segen, J. and S. Kumar. “Gesture VR: gesture interface to spatial reality,” SIGGRAPH Conference abstracts and applications, page 130, 1998. Digital Pavilions; Michael Stark, Markus Kohler, and P. G. Zyklop. “Video Based Gesture Recognition for Human Computer Interaction,” International Workshop on Modeling—Virtual Worlds—Distributed Graphics, 27-28 November 1995, all of which are incorporated by reference herein]. One can now envision a scenario for the system where a user might point to an object on the Table and move it to an opposite corner by merely motioning with his or her finger. This suggests a broad range of applications for users with disabilities.
  • Additionally, integrating a computer vision system may also address the coaster-tracking problem that arises when the table is implemented in its sealed variant, where coasters can possibly be decoupled from vehicles.
  • Another natural companion input mode is voice recognition. It could be useful for the Table when used in concert with direct interaction and gesture recognition.
  • The following example applications all require simultaneous movement of multiple physical objects upon a plane, under control of an interactive simulation algorithm. Some of them also can benefit from the presence of a front projection down onto the surface.
  • Military Simulation
  • In this scenario, miniature military figures can be strategically positioned for attack or defense. Personnel can be made to hide behind buildings, out of the line of sight of enemy combatants. Exhaustion or other disability can be simulated by limiting a figure's maximum speed of travel.
  • By combining with front-projection onto the surface, it is possible to show possible paths of attack or escape, areas of visibility by the enemy, and time-varying geographic features such as fog cover.
  • People Flow
  • The system can be used for applications involving groups or crowds of people. One application is emergency evacuation planning. Another is simulation and examination of how people react in social groups, including social clustering and dominance behaviors.
  • Emergency evacuation scenarios can be played out, with direct physical visualization of potential areas of congestion.
  • Vehicle Traffic
  • Another application is the study of traffic flow. This can involve study of strategies for avoiding congestion, of interaction between vehicles and pedestrians, and of the effects of variations in city-planning policy, such as sidewalk/crosswalk widths. Simulations of steering and parking strategies can be used to design optimal parking areas or to examine the effects of introducing oversized vehicles to an area. Physical simulation can be used to compare strategies for dealing with unsafe or DWI drivers.
  • Furniture/Architecture Arranging
  • It is possible to look at algorithms for arranging furniture for optimal people flow through an interior. In one scenario, as the user moves a table, chairs rearrange themselves under algorithmic control. Things that can be examined include effects of walking routes and simulations of where people tend to congregate in a room.
  • When used in conjunction with projection onto the surface, it is possible to examine wind-flow around buildings, dispersal patterns of air contaminants, or how the strength of broadcast radio/microwave signals varies with different arrangements of buildings.
  • Interactive Optics Education Kit
  • In this scenario, when the user moves any one optical component, the other optical components shift to maintain optical paths in a simulation. A projection shows the variation in the simulated optical path as the other components are physically moved into place. In general, this approach is well matched to design and implementation of hands-on museum exhibits, encouraging an active “learn by doing” approach to K-12 children's science education.
  • The current PC programming interface for the planar manipulator display is relatively simple, providing access to vehicle position and orientation, and allowing path waypoints to be commanded. The programming interface can be extended by implementing path planning using established techniques [STOUT, Bryan. The Basics of Path Planning. Game Programming Gems, pp. 254-263. Hingham, USA: Charles River Media, 2000; RABIN, Steve. Speed Optimizations. Game Programming Gems, pp. 272-287. Hingham, USA: Charles River Media, 2000; STERREN, William van der. Tactical Path-Finding. Game Programming Gems 3, pp. 294-306. Hingham, USA: Charles River Media, 2002, all of which are incorporated by reference herein], and by providing support for user-input detection.
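  A minimal sketch of what such a client-side interface might look like; every name here is hypothetical, invented for illustration, and the actual API is not reproduced in this document:

```python
class Vehicle:
    """Hypothetical client-side handle for one vehicle on the table:
    exposes the last reported pose and lets path waypoints be commanded,
    mirroring the capabilities described above."""

    def __init__(self, vehicle_id):
        self.vehicle_id = vehicle_id
        self.x, self.y, self.theta = 0.0, 0.0, 0.0   # last reported pose
        self.waypoints = []                          # commanded path

    def update_pose(self, x, y, theta):
        # Called as position/orientation reports arrive from the
        # table controller.
        self.x, self.y, self.theta = x, y, theta

    def command_path(self, waypoints):
        # Queue a list of (x, y) waypoints for the vehicle to follow.
        self.waypoints = list(waypoints)

    def next_waypoint(self):
        return self.waypoints[0] if self.waypoints else None
```

  A path-planning extension, as suggested above, would sit between the application and `command_path`, expanding a goal position into a collision-free waypoint list.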
  • A Position Sensing Detector is a type of photodiode whose output represents the position of incident light striking its surface. In general, a PSD consists of two photo-sensitive layers (the P- and N-type layers) separated by a transparent layer.
  • One-Dimensional PSD
  • In the 1D case, as shown in FIG. 13, two electrodes positioned at opposite ends of the P-layer detect the photocurrent created by photoelectric conversion of light striking the layer. The current at each electrode is inversely proportional to its distance from the incident position of the light. If X1 and X2 represent the current at each electrode, and x is the position of the incident light, then their relationship is described by (1) below.
  • Two-Dimensional PSD
  • Several types of 2D PSD are available and are classified by the locations of their electrodes. The duo-lateral type uses two additional electrodes positioned at the edges of the N-layer (at 90° from those on the P-layer relative to the center of the PSD), thus enabling the spot to be located along a second axis. If Y1 and Y2 represent the current at each electrode, and y is the position of the light on the N-layer, then (2) describes their relationship:

    (X2−X1)/(X1+X2) = 2x/L  (1)

    (Y2−Y1)/(Y1+Y2) = 2y/L  (2)
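  Inverting (1) and (2) gives the spot position directly from the four electrode currents. A minimal sketch, taking L as the side length of the active area (so positions range over ±L/2 about the device center); the default L is an arbitrary placeholder:

```python
def psd_position(X1, X2, Y1, Y2, L=1.0):
    """Light-spot centroid on a duo-lateral 2D PSD, computed by
    inverting equations (1) and (2).  X1, X2 are the P-layer electrode
    currents; Y1, Y2 the N-layer currents; L is the side length of the
    active area, with the origin at the device center."""
    x = (L / 2.0) * (X2 - X1) / (X1 + X2)
    y = (L / 2.0) * (Y2 - Y1) / (Y1 + Y2)
    return x, y
```

  Because each axis is a ratio of a current difference to a current sum, the result is insensitive to the overall brightness of the spot, which is the property exploited in the "PSD Features" discussion below.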
    PSD Features
  • PSD-derived positions depend only on the location of the centroid of the light spot, and are independent of the brightness of the spot or its sharpness. This allows a simple and inexpensive optical design to be implemented. This feature enables the “2-Dimensional Position Sensing by Edge Viewing” method, described herein, which depends on accurately locating the center of a diffuse light pattern. In addition, PSDs are capable of very high-speed operation, with limits dictated primarily by the rise time of the element—often less than one microsecond. With properly designed interface electronics, they can achieve positional resolutions of one part in ten thousand.
  • Individual vehicles possess no model of their own position. Rather, a vehicle transmits its position and orientation to the central processor by successively flashing two LEDs mounted upon the vehicle's chassis. Light from the LEDs is imaged onto the surface of a two-dimensional lateral-effect photodiode, which, through associated analog circuitry, produces voltages that depend on the location of the imaged light on the photodiode's surface.
  • These output voltages are sampled through an analog-to-digital converter in synchrony with the pattern of flashes from vehicle LEDs, and enable computation of X and Y positions for each LED.
  • Software in the central processor performs further computation as follows: The line connecting both the positions of the LEDs is translated to the origin of the coordinate system. The angle of this line with respect to the coordinate axis is then computed by:
    theta=arctan((y2−y1)/(x2−x1)),
    where (x1, y1) and (x2, y2) are the coordinates of the first and second LEDs, respectively, and arctan is the inverse tangent function.
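  In code, the two-argument arctangent math.atan2 is preferable to a plain arctan of the ratio, because it resolves the full ±180° range from the signs of the two differences (the plain arctan is ambiguous by 180° and fails when x2 = x1). A sketch; taking the midpoint of the two LEDs as the reported vehicle position is an assumption added for illustration:

```python
import math

def vehicle_pose(x1, y1, x2, y2):
    """Pose from the sensed positions of the two chassis LEDs:
    the LED-to-LED direction gives the orientation, and (assumed
    here) the midpoint gives the vehicle position."""
    theta = math.atan2(y2 - y1, x2 - x1)   # quadrant-correct arctan
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    return cx, cy, theta
```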
  • There are two techniques, other than fluorescence, for solving the problem of getting light to the edge of the sensing surface.
  • The first of these is to embed reflective particles (glitter) into the otherwise transparent sensing surface. In this technique, some of the light directed downwards by the vehicle-mounted emitters is reflected sideways by these particles and arrives at the edge where it can be sensed. (Of course, some of the light is scattered by other particles before it reaches the edges.)
  • The second technique is to construct the sensing surface from a clear material, but with geometry designed to maximize total internal reflection inside the surface—i.e. to use it as a ‘light pipe’. In this case, the vehicle emitter would be modified to emit light only in the range of angles which would be ‘captured’ internally by the sheet (i.e. rather than traveling straight through the surface or reflecting off the surface.)
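  The 'captured' range of angles in the light-pipe approach is set by the critical angle for total internal reflection. A sketch using Snell's law, assuming an acrylic sheet with refractive index n ≈ 1.49 (the sheet material is not specified in this document):

```python
import math

def critical_angle_deg(n):
    """Critical angle (measured from the surface normal) for total
    internal reflection at a material/air interface with refractive
    index n.  Rays striking the inner surface beyond this angle stay
    trapped inside the sheet and propagate toward the edges."""
    return math.degrees(math.asin(1.0 / n))
```

  For acrylic this works out to roughly 42°, so the vehicle emitter would need to launch light into the sheet at angles steeper than this relative to the surface normal.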
  • It should be noted that the tracking of objects described herein can be used whether the objects are moved under their own power, or if something or someone moves the objects.
  • Although the invention has been described in detail in the foregoing embodiments for the purpose of illustration, it is to be understood that such detail is solely for that purpose and that variations can be made therein by those skilled in the art without departing from the spirit and scope of the invention except as it may be described by the following claims.

Claims (15)

1. A system for manipulation of objects comprising:
N objects, where N is greater than or equal to 2 and is an integer; and
means for controlling and 2D locating of the N objects.
2. A system as described in claim 1 wherein the controlling means includes indicators disposed on the object.
3. A system as described in claim 2 wherein the controlling means includes sensing means for locating the objects.
4. A system as described in claim 3 wherein position indicators include emitters which indicate a position of an object.
5. A system as described in claim 4 wherein the objects are vehicles.
6. A system as described in claim 5 wherein the controlling means includes a vehicle controller disposed with each vehicle.
7. A system as described in claim 6 wherein the vehicle controller of each vehicle includes an MCU.
8. A system as described in claim 7 wherein the sensing means includes sensors.
9. A system as described in claim 8 wherein the emitters include LEDs.
10. A method for manipulating objects comprising the steps of:
receiving information from N objects, where N is greater than or equal to 2 and is an integer, at a centrally controlling and 2D locating controller;
determining 2D locations by the controller of the N objects; and
transmitting from the controller directions to the N objects for the N objects to move.
11. A method as described in claim 10 wherein the transmitting step includes the step of transmitting from the controller kinematic parameters to the N objects.
12. An apparatus for tracking comprising:
N objects, where N is greater than or equal to 2 and is an integer, each object having an emitter which emits light; and
means for 2D sensing of the N objects over time from the light emitted by each emitter.
13. An apparatus as described in claim 12 including a planar element on which the N objects are disposed, and wherein the sensing means includes at least 2 1-D sensors that sense the light emitted from the edge of the planar element on which the objects are disposed.
14. A method for tracking comprising the steps of:
emitting light from N objects, where N is greater than or equal to 2 and is an integer; and
sensing 2D locations of the N objects over time from the emitted light from the N objects.
15. A method as described in claim 14 wherein the sensing step includes the step of sensing 2D locations of the N objects over time from the emitted light from the N objects through an edge of a planar element on which the N objects are disposed.


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US46349603P 2003-04-17 2003-04-17
US10/822,133 US20050065649A1 (en) 2003-04-17 2004-04-09 Manipulation of objects

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/586,595 Continuation US8725292B2 (en) 2003-04-17 2009-09-24 Manipulation of objects

Publications (1)

Publication Number Publication Date
US20050065649A1 true US20050065649A1 (en) 2005-03-24

Family

ID=33310786

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/822,133 Abandoned US20050065649A1 (en) 2003-04-17 2004-04-09 Manipulation of objects
US12/586,595 Active 2025-04-13 US8725292B2 (en) 2003-04-17 2009-09-24 Manipulation of objects
US14/273,151 Active US9760093B2 (en) 2003-04-17 2014-05-08 Manipulation of objects


Country Status (2)

Country Link
US (3) US20050065649A1 (en)
WO (1) WO2004095170A2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100042258A1 (en) * 2003-04-17 2010-02-18 Kenneth Perlin Manipulation of objects
US20120078417A1 (en) * 2010-09-28 2012-03-29 International Business Machines Corporation Detecting Energy and Environmental Leaks In Indoor Environments Using a Mobile Robot
US20130253815A1 (en) * 2012-03-23 2013-09-26 Institut Francais Des Sciences Et Technologies Des Transports, De L'amenagement System of determining information about a path or a road vehicle
US20150221230A1 (en) * 2014-02-03 2015-08-06 Immersive Technologies, Pty. Ltd. Simulation Training System
US20170232611A1 (en) * 2016-01-14 2017-08-17 Purdue Research Foundation Educational systems comprising programmable controllers and methods of teaching therewith
US20180165816A1 (en) * 2016-07-27 2018-06-14 Seikowave, Inc. Method and Apparatus for Motion Tracking of an Object and Acquisition of Three-Dimensional Data Over Large Areas
US10133370B2 (en) 2015-03-27 2018-11-20 Tampereen Yliopisto Haptic stylus
US20200042032A1 (en) * 2018-08-02 2020-02-06 Texas Instruments Incorporated High Speed FlexLED Digital Interface

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7930454B2 (en) * 2005-04-08 2011-04-19 Achim Rausenberger Thin-client terminal and client/server-system having such a terminal
DE102008063452A1 (en) * 2008-12-17 2010-07-15 Siemens Aktiengesellschaft Method for improving the simulation of object streams by means of brake classes
US9623319B2 (en) 2012-10-10 2017-04-18 Kenneth C. Miller Games played with robots
US9795868B2 (en) 2012-10-10 2017-10-24 Kenneth C. Miller Games played with robots
US8998652B2 (en) * 2012-12-18 2015-04-07 Pascal Martineau Interactive pin array device
CN103760920B (en) * 2014-01-23 2017-01-18 宏泰集团(厦门)有限公司 Intelligent sound field control system
USD795936S1 (en) 2015-08-24 2017-08-29 Kenneth C. Miller Robot
US11543931B2 (en) * 2021-01-27 2023-01-03 Ford Global Technologies, Llc Systems and methods for interacting with a tabletop model using a mobile device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5252991A (en) * 1991-12-17 1993-10-12 Hewlett-Packard Company Media edge sensor utilizing a laser beam scanner
US5719762A (en) * 1995-11-06 1998-02-17 The United States Of America As Represented By The Secretary Of The Navy Method of controlling a vehicle to make a combination of arbitrary translational and rotational motions
US6687571B1 (en) * 2001-04-24 2004-02-03 Sandia Corporation Cooperating mobile robots
US20040030449A1 (en) * 2002-04-22 2004-02-12 Neal Solomon Methods and apparatus for multi robotic system involving coordination of weaponized unmanned underwater vehicles
US20040148058A1 (en) * 2001-04-02 2004-07-29 Svein Johannessen Industrial robot comprising a portable operating unit which a movable key device for identification of the robot
US6950788B2 (en) * 2000-09-27 2005-09-27 Ardeshir Faghri Computer-implemented system and method for simulating motor vehicle and bicycle traffic
US7082351B2 (en) * 2001-11-20 2006-07-25 Sharp Kabushiki Kaisha Group robot system, and sensing robot and base station used therefor

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4119900A (en) * 1973-12-21 1978-10-10 Ito Patent-Ag Method and system for the automatic orientation and control of a robot
US4398720A (en) * 1981-01-05 1983-08-16 California R & D Center Robot computer chess game
US4796198A (en) * 1986-10-17 1989-01-03 The United States Of America As Represented By The United States Department Of Energy Method for laser-based two-dimensional navigation system in a structured environment
US4987540A (en) * 1989-05-30 1991-01-22 Whs Robotics, Inc. Automatic guided vehicle system having communication and traffic controller with unguided paths
US5629594A (en) * 1992-12-02 1997-05-13 Cybernet Systems Corporation Force feedback system
US5257787A (en) * 1993-01-28 1993-11-02 Miccio Joseph A Chess-like game
CA2167304C (en) * 1993-07-16 1998-04-21 Louis B. Rosenberg Multi degree of freedom human-computer interface with tracking and forcefeedback
US6292713B1 (en) * 1999-05-20 2001-09-18 Compaq Computer Corporation Robotic telepresence system
US6213110B1 (en) * 1999-12-16 2001-04-10 Odyssey Paintball Products, Inc. Rapid feed paintball loader
US7117067B2 (en) * 2002-04-16 2006-10-03 Irobot Corporation System and methods for adaptive control of robotic devices
US6981700B2 (en) * 2002-10-03 2006-01-03 Syed Omar A Strategic board game
JP4007947B2 (en) * 2002-12-20 2007-11-14 シャープ株式会社 Group robot system, sensing robot included in group robot system, base station included in group robot system, and control robot included in group robot system
WO2004095170A2 (en) * 2003-04-17 2004-11-04 New York University Manipulation of objects
DE102006013065A1 (en) * 2006-03-22 2007-09-27 GM Global Technology Operations, Inc., Detroit Control method for electronically controlled damping system in Cabriolet motor vehicle
US20090267741A1 (en) * 2008-04-25 2009-10-29 Eric Chun-Yip Li RFID Floor Tags for Machine Localization and Delivery of Visual Information
US8384789B2 (en) * 2008-07-23 2013-02-26 Pixart Imaging Inc. Sensor array module with wide angle, and image calibration method, operation method and application for the same





Legal Events

Date Code Title Description
AS Assignment

Owner name: NEW YORK UNIVERSITY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PERLIN, KENNETH;REEL/FRAME:015382/0811

Effective date: 20041102

Owner name: NEW YORK UNIVERSITY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KOLLIN, JOEL S.;REEL/FRAME:015382/0944

Effective date: 20041102

Owner name: NEW YORK UNIVERSITY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROSENFELD, DANIEL A.;REEL/FRAME:015382/0881

Effective date: 20041102

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION