WO2014125272A1 - Touch sensing systems - Google Patents

Touch sensing systems

Info

Publication number
WO2014125272A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch sensing
sensing system
data
pair
Application number
PCT/GB2014/050406
Other languages
French (fr)
Inventor
Euan Christopher Smith
Adrian James Cable
Peter William Tudor Mash
Gareth John Mccaughan
Paul Richard Routley
Original Assignee
Light Blue Optics Ltd
Application filed by Light Blue Optics Ltd
Publication of WO2014125272A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F 3/0423 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen using sweeping light beams, e.g. using rotating or vibrating mirror
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving

Definitions

  • This invention relates to touch sensing systems and methods which, in embodiments, can be used to provide a virtual touch sensing surface just above a display screen, whiteboard or the like.
  • Figure 1 shows an example touch sensitive image projection device 100 comprising an image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102.
  • a proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device.
  • the image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop; boundaries of the light forming the displayed image 150 are indicated by lines 150a, b.
  • the touch sensing system 250, 258, 260 comprises an infrared laser illumination system 250 configured to project a sheet of infrared light 256 just above the surface of the displayed image 150 (for example ~1 mm above, although in principle the displayed image could be distant from the touch sensing surface).
  • the laser illumination system 250 may comprise an IR LED or laser 252, collimated then expanded in one direction by light sheet optics 254 such as a cylindrical lens.
  • a CMOS imaging sensor (touch camera) 260 is provided with an IR-pass lens 258 and captures light scattered by touching the displayed image 150, with an object such as a finger, through the sheet of infrared light 256 (the boundaries of the CMOS imaging sensor field of view are indicated by lines 257, 257a, b).
  • the touch camera 260 provides an output to touch detect signal processing circuitry as described further later. These techniques may be employed with any type of image projection system.
  • Figure 2 shows plan and side views of an example interactive whiteboard touch sensitive image display device 400 incorporating such a system.
  • IR fan sources 402, 404, 406 each providing a respective light fan 402a, 404a, 406a spanning approximately 120° together defining a single, continuous sheet of light just above display area 410.
  • the fans overlap on the display area (which is economical as shadowing is most likely in the central region of the display area).
  • a display area 410 may be of order 1 m by 2m.
  • the side view of the system illustrates a combined projector 420 and touch image capture camera 422 either aligned side-by-side or sharing a portion of the projection optics.
  • the optical path between the projector/camera and display area is folded by a mirror 424.
  • the sheet of light generated by fans 402a, 404a, 406a is preferably close to the display area, for example less than 1 cm or 0.5cm above the display area.
  • the camera and projector 422, 420 are supported on a support 450 and may project light from a distance of up to around 0.5m from the display area.
  • One type of touch detection is described in US2009/0091553, in which a laser is raster scanned across a display from behind, beneath the display surface.
  • Another typical scanning-type touch panel is described in US2001/0028344, employing a polygonal mirror to angularly scan laser light for touch detection.
  • the system uses light scanned over the surface of a display screen and employs scanning angle/timing for touch detection but provides no distance information along the scanned beam, in a direction perpendicular to the sweep.
  • A still further technique is described in US4,811,004, which employs a pair of swept beams and detectors to sense an interruption in the beams.
  • a touch sensing system comprising: one or more optical beam sources to provide a pair of optical beams; a pair of controllable beam deflectors comprising at least first and second beam deflectors, wherein said first beam deflector is configured to deflect a first said optical beam through a first angle towards a touch sensing region, and wherein said second beam deflector is configured to deflect a second said optical beam through a second angle towards said touch sensing region; first and second optical detectors to detect first and second scattered light from each of said first and second beams, wherein said first and second optical detectors are configured to detect scattered light counter-propagating substantially along said first and second optical beams; and wherein said first and second beam deflectors are controlled in tandem to scan said touch sensing region.
  • the optical deflectors comprise facets of a rotating polygonal mirror, but alternatively a pair of synchronised, steerable mirrors may be employed, more particularly synchronised MEMS deflectors.
  • Where a polygonal mirror is employed, with its facets providing the first and second beam deflectors, this defines a fixed angle between the mirror facets.
  • the beam deflectors comprise MEMS deflectors they may be synchronised, for example, by synchronising phase locked loop driver circuits for the MEMS deflectors.
  • the beam deflectors scan a pair of beams across the touch sensing region to define a plane (or other shape), to provide a stereoscopic sensor with a rather short baseline.
  • although the beams are scanned in tandem with one another, the optical configuration, and in particular the physical separation of the beam deflectors, means that generally the beams will scan the same point within the touch sensing region at different times.
  • at least one of the beams is provided with means for defining a reference direction for the beam which, conveniently, may be a photodetector located within or towards one extreme of a beam's sweep.
  • a multi-touch detection system may comprise two or more touch sensing systems each as described above.
  • each touch sensing system scans a beam to provide angular information on the location of a touch object and then triangulation can be employed to determine the touch location.
  • each system scans a pair of beams in tandem so that each provides distance as well as angular information, which is employed by the touch signal processing system.
  • light counter-propagating along a beam may be slightly displaced from the beam or at a slight angle to the beam depending, for example, on the effective size of the 'aperture' collecting the light - which may be determined by the lateral dimension of a beam deflector, for example the size of a facet of the polygonal mirror.
  • the optical architecture of the touch sensing system comprises a pair of collimated optical beam sources, for example lasers, each illuminating a respective beam deflector.
  • a beam splitter is provided in each optical path between the source and deflector to split a return signal from the deflector into a photodetector such as a photodiode.
  • An imaging device such as a lens or mirror is provided to image the scattered light onto the photodetector; this may be focused at infinity but may have a depth of field sufficient to focus light from a closest design position of a touch object to the sensing system.
  • Embodiments of the system further comprise a signal processor, in either hardware, for example an ASIC (application specific integrated circuit), software, or a combination of the two, to process signals from the detectors to determine touch position data defining an angular position of a touch and distance information, in particular defining an approximate distance between the touch and a beam deflector.
  • the touch position data may define a touch position in any convenient co-ordinate system, for example either polar or rectangular co-ordinates - the skilled person will appreciate that even when in the form of rectangular co-ordinates the touch data will nonetheless define an angle (to a reference direction) and a distance (from a fiducial point such as the centre of a beam deflector).
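  • To make the coordinate equivalence noted above concrete, the following is a minimal sketch (Python) of converting between the two representations; the deflector position and reference direction defaults are illustrative assumptions, not values from the patent:

```python
import math

def polar_to_rect(d, theta, deflector_xy=(0.0, 0.0), reference_angle=0.0):
    """Convert touch data (d, theta), measured from a deflector relative to a
    reference direction, into rectangular (x, y) co-ordinates."""
    x0, y0 = deflector_xy
    a = reference_angle + theta
    return x0 + d * math.cos(a), y0 + d * math.sin(a)

def rect_to_polar(x, y, deflector_xy=(0.0, 0.0), reference_angle=0.0):
    """Recover the angle (to the reference direction) and distance (from the
    deflector) implicit in rectangular touch data."""
    x0, y0 = deflector_xy
    d = math.hypot(x - x0, y - y0)
    theta = math.atan2(y - y0, x - x0) - reference_angle
    return d, theta
```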
  • Embodiments of the signal processor provide touch position data for multiple touches within the touch sensing region.
  • a preferred embodiment of a multi-touch sensing system comprises a pair of touch sensing systems each as previously described together with signal processing to identify matched pairs of touches in the respective touch sensing regions of the systems (which are arranged to overlap). Further, the signal processing is configured to disambiguate angular positions of multiple simultaneous touches using the (approximate) distance information for the multiple touches from each of the pair of combined touch sensing systems.
  • the skilled person will appreciate that such a multi-touch system may have separate signal processing for each touch sensing system of the pair followed by an additional stage of signal processing to combine the touch position data from the pair of sensing systems, or the touch signal data from all of the detectors in the pair of touch sensing systems may be provided to a common signal processing system which processes the signals from these detectors in combination.
  • a multi-touch sensing system comprising: a pair of stereoscopic touch sensors, each configured to scan a touch sensing region with a pair of beams using a common beam deflection system and configured to detect a pair of return beams scattered from a plurality of touch objects to provide stereoscopic touch object location data defining a location of a touch object in 2D with respect to a said sensor; and at least one signal processor to process said touch object location data from said pair of stereoscopic touch sensors to match said touch objects detected by said pair of stereoscopic touch sensors dependent on said distance of said touch objects from said respective sensors, and to determine location data for said touch object from said angular positions from each of said sensors for said matched touch objects.
  • the distance between a touch object and a stereoscopic touch sensor has a greater error than an angular position of the touch object with respect to a reference direction
  • the signal processor processes the touch object location data to determine refined location data for the touch objects in which the distance information is more accurate than that provided by each stereoscopic touch sensor alone.
  • the angular error in position determined by a stereoscopic touch sensor may be defined in terms of a distance by defining the angular spread as a (circumferential) distance at a radial distance of the touch object; alternatively the error in radial distance may be compared with the error in angular position by simply comparing each as a percentage error.
  • each touch sensor may comprise a rotating polygonal mirror or a synchronised pair of MEMS deflectors.
  • the touch object location data from a stereoscopic sensor prior to processing comprises data defining, for a single touch position, a pair of peaks on a representation of signal against beam deflection angle.
  • the separation of such peaks defines the distance to a touch object and a timing of a peak defines the angular position of the object.
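  • As a hedged sketch (Python) of reducing such a peak pair to distance and angle: scipy.signal.find_peaks stands in for the peak detector, and the small-disparity approximation d ≈ s/(α − β), valid when the object is far compared with the deflector separation s, is an assumption rather than the patent's stated method:

```python
import numpy as np
from scipy.signal import find_peaks

def distance_and_angle(angles_rad, signal_a, signal_b, s):
    """angles_rad: common beam-angle axis, referenced to the 0 degree direction;
    signal_a, signal_b: detector signals for the two scanned beams;
    s: separation (baseline) of the two beam deflectors."""
    alpha = angles_rad[find_peaks(signal_a)[0][0]]  # peak angle seen by beam a
    beta = angles_rad[find_peaks(signal_b)[0][0]]   # peak angle seen by beam b
    disparity = abs(alpha - beta)                   # separation of the peak pair
    distance = s / disparity                        # approximate (noisy) range
    bearing = (alpha + beta) / 2.0                  # well-determined angle
    return distance, bearing
```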
  • Embodiments of the system described above may determine additional distance information for a touch object which may be used in a number of different ways.
  • the additional distance information may be combined with the distance information derived from the stereoscopic view of a touch sensing system to increase the accuracy of the distance information. Additionally or alternatively this additional distance information may be employed when other distance information is not available - for example when one touch object or finger obscures another (when a nearby object interrupts the scanning beam a further object in the same direction may not be visible). Further additionally or alternatively the additional information may be included in a combined cost function for use when matching pairs of touches detected by first and second stereoscopic touch sensing systems.
  • One source of additional distance information is the brightness of light scattered from a touch object which falls off with approximately the inverse of the distance of the object from the deflector.
  • Another is the angular spread of light scattered from a touch object. This may be determined by a photodetector arranged to be sensitive to this angular spread by, for example, having two or more light sensitive regions one displaced away from an optic axis at which the other is located. Alternatively this may be considered as the angle subtended by the return beam of the scattered light at an aperture viewing the return beam, detected by detecting the size of the spot where the return beam is focused onto the detector. Further additionally or alternatively distance information may be obtained from the width of a peak in a representation of signal against beam angle as this peak width again varies approximately inversely with distance.
  • the touch sensing region will define one or more planes, but in principle this region may have other shapes.
  • the touch sensing region defines a plane a short distance away from an interactive physical surface, for example the surface of an interactive whiteboard or electronic display.
  • the touch sensing region may define a touch surface in space (a 'virtual' touch surface), and where employed to provide an interactive physical surface preferably this touch surface is located within 1 cm or less of the physical surface.
  • a small correction mirror may be moved up and down in front of the beam deflectors as the beam is scanned and thereby define a touch surface approximating the surface of a frustrocone in embodiments.
  • the touch sensing region may comprise multiple touch sensitive planes (or other surfaces) to define a three dimensional touch sensing region.
  • the second beam deflector is laterally displaced from the first beam deflector, but the lateral displacement may be small in order that the touch sensing system is small.
  • the distance information may be relatively inaccurate (the inaccuracy increasing with increasing distance from the beam deflectors).
  • the system provides only an approximate value for the distance of the touch object, and this distance may be accompanied by error, standard deviation or other probability data defining a degree of accuracy of the distance determination (the error being larger with increasing distance).
  • embodiments of the system can be of great utility, in particular for multi-touch detection: With multi-touch detection two (or more) embodiments of the system may be employed to provide accurate object angle data, which can be used to determine an object position by triangulation. However with multi-touch detection it can be problematic to link a set of angles from one touch detector with the corresponding angles for the same touch objects from another touch detector because there is ambiguity in the data (two different configurations of multi-touch pattern can give rise to the same sets of angular data from the sensors). In embodiments the distance information, albeit approximate, can be employed to disambiguate the angular touch data. More particularly the distance information can be used to determine which sets of angular measurements should be paired to identify corresponding touch objects seen by the respective sensors.
  • the system may be arranged to project two (or more) planes of light at different angles to extend the touch sensing region into a third dimension perpendicular to the touch plane.
  • this may be achieved, for example, by providing a wedge or other optical deflector on pairs of adjacent mirrors (facets) so that, as the mirror rotates, the scanning beam is directed up (or down) at an angle, and the imaging system is arranged, by the corresponding wedge on the facet capturing light for the imaging system, to look in a direction tilted upwards (or downwards) along the scanning beam.
  • it is not essential that the imaging system looks precisely along the scanning beam as, in general, the detector will have a viewing angle perpendicular to the scanning plane which may be sufficient to encompass small tilts of this plane up and/or down (the detector will be responsive to light scattered from a "slice" having some thickness in the Z-direction).
  • the detector may be extended perpendicularly to its longitudinal direction and/or a two dimensional array may be employed to provide some additional field of view in a Z-direction perpendicular to the plane/parallel to the scanning axis.
  • a similar approach may be employed in a system with MEMS beam deflectors, when one or both deflectors may be arranged to tilt the scanning beam out of a plane perpendicular to the scanning axis.
  • Each of these approaches provides additional touch functionality, for example to detect degree and/or speed of approach, gestures and the like.
  • embodiments of the system include a signal processor to output data defining a location of one or more touch objects in two or three dimensions.
  • the signal processor may be shared amongst the touch systems and may also use angular information for triangulation to locate touches and use distance information for touch disambiguation as previously described.
  • the signal processor is coupled to the detector to receive an imaging signal and, from this, to determine an approximate distance to a touch object. This is described in more detail later.
  • Embodiments of the signal processor also receive a timing signal input for a timing signal defining a fiducial position of a beam deflector.
  • the timing signal defining the position of a beam deflector may be obtained from the drive electronics of a MEMS beam deflector or in the case of a polygonal mirror, from a detector such as a photodiode positioned to intersect the scanning beam at one edge of the touch sensitive region. Where multiple touch surfaces are defined by deflecting the scanning beam in the Z-direction the same or another photodetector may be employed to define the reference position of the scanning beam (since the tilt angle will in general be known).
  • a touch sensing system may be provided with a retro-reflector to facilitate such a procedure.
  • a retro-reflector may be employed with any type of touch sensing system which is able remotely to determine the location of one or more touch objects.
  • the touch sensing system may be provided with a power and/or safety control system to reduce an average optical output from the laser when a speed of the scanning is reduced.
  • the system may default to only switching the laser on, or up to its usual operating power, when scanning at full operating speed.
  • the touch sensing system may pulse the optical beam on and off during the scanning process, for example between successive mirror facets, to determine a background level of illumination which may then be subtracted from the signal, when scanning, for improved performance.
  • the invention provides a method of processing data from a touch sensing system as recited in any preceding claim, the method comprising: inputting first and second sets of touch data from candidate touch objects identified by respective stereoscopic touch sensors/sensing systems, said touch data defining touch object distance and angle data for a plurality of said touch objects; matching touch objects in said first and second sets of touch data; and using said angle data to determine object location data for each said touch object.
  • the matching comprises determining a set of match cost data for potential pairings of the touch objects detected by the sensors/sensing systems, and matching touch objects to reduce a total match cost defined by this data.
  • the cost can factor in other distance information such as that described previously.
  • one or more unmatched touch objects may be identified and positions determined for these, for example using the distance information from one stereoscopic touch sensing system, optionally supplemented by one or more additional sources of distance information as previously described and, in some preferred implementations, also supplemented by object tracking data.
  • a history of motion of each touch object is tracked, for example using Kalman filtering or the like, and then if, say, one object is briefly occluded by another the history of motion of an object may be employed to determine the object's location, albeit with decreasing probability of correctness over time.
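  • A minimal sketch (Python, numpy) of the kind of constant-velocity Kalman tracker alluded to above; the motion model and the noise magnitudes q and r are illustrative assumptions:

```python
import numpy as np

class TouchTrack:
    def __init__(self, x, y, dt=1e-3, q=1e-4, r=1e-3):
        self.s = np.array([x, y, 0.0, 0.0])   # state: x, y, vx, vy
        self.P = np.eye(4)                    # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt      # constant-velocity transition
        self.Q = q * np.eye(4)                # process noise
        self.H = np.eye(2, 4)                 # we observe position only
        self.R = r * np.eye(2)                # measurement noise

    def predict(self):
        """Propagate the track; usable alone while the object is occluded,
        with uncertainty (P) growing each step."""
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, z):
        """Fold in a new (x, y) measurement z from the touch sensor."""
        y = np.asarray(z, dtype=float) - self.H @ self.s
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.s[:2]
```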
  • object tracking is also employed to disambiguate multiple touch objects - for example should two objects as seen by the touch sensing system briefly coalesce and then separate motion tracking may be employed to follow the tracked objects (as we have previously described, for example in WO 2012/172364, hereby incorporated by reference).
  • three or more such systems may be employed depending, for example, on the size and/or shape of the touch sensing region.
  • the invention further provides processor control code to implement the above-described systems and methods, for example on a general purpose computer system or on a digital signal processor (DSP).
  • the code is provided on a physical data carrier such as a disk, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (Firmware).
  • Code (and/or data) to implement embodiments of the invention may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, or code for a hardware description language. As the skilled person will appreciate such code and/or data may be distributed between a plurality of coupled components in communication with one another.
  • Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device, and details of a sheet of light-based touch sensing system for the device;
  • Figures 2a and 2b show, respectively, a plan view and a side view of an interactive whiteboard incorporating a touch sensitive image display
  • Figure 3 shows a touch sensing system using a scanning laser beam and space-time object localisation
  • Figure 4 shows a schematic illustration of object distance perception in the system of figure 3
  • Figure 5 illustrates a stereoscopic touch sensing system using scanning laser beams, according to an embodiment of the invention, and its operation
  • Figure 6 illustrates techniques for improving the operation of the system of Figure 5
  • Figure 7 shows a flow diagram of a signal processing procedure for the system of figure 5, including multi-touch processing steps
  • Figure 8 shows an example of an interactive touch sensitive display incorporating a pair of touch sensing systems each as illustrated in figure 5, schematically illustrating multi-touch sensing.
  • FIG 3 shows a touch sensing system 300 comprising a polygonal rotating scanning mirror 302 illuminated by a laser 304 followed by optional collimation optics 306 providing a collimated beam 308 of, for example, infrared light.
  • Rotation of the polygonal scanner (mirror) 302 generates a scanned beam 310, the locus or envelope of the scanned beam defining a surface 312, typically a plane, for touch sensing.
  • a display surface 314 such as a wall or white board onto which an image may be projected, a flat panel display, and the like.
  • As the beam 310 scans through the touch surface 312, over the display surface 314, light 320 from one or more objects 318 on the surface, for example one or more fingers, is scattered back towards scanner 302.
  • This light hits an adjacent face or facet 302b of the scanner to the face or facet 302a which directed the scanning beam, and reflects the scattered light through the same angle towards imaging optics 322 which image the scattered light onto a photodiode array 324 tilted with respect to an optical axis 322a of the imaging system.
  • the photodiode array comprises a line of 16 sensors.
  • the photodiode array provides light sensing signals to a bank of analogue-to-digital converters 326 (or a multiplexed converter) which in turn provides signals to a digital signal processor 328 which processes the photodiode array information in conjunction with timing data, to provide location data for object 318 as described in more detail later.
  • the digital signal processor 328 may be implemented in hardware (circuitry), software, or a combination of the two.
  • a further photodetector 330 is directed towards the illumination face of the polygonal scanner 302 to allow measurement of the scanning angle at which the object 318 is detected. More particularly, photodetector 330 defines a reference direction 332 for scanning beam 310 and thus allows the scanning angle of the beam, θ, to be determined from the timing of the illumination of the photodiode array (knowing the rotation rate of scanner 302 or similar scan rate information). The skilled person will appreciate that a reference angle for scanner 302 may be determined in many different ways based on, for example, drive to and/or timing information from the scanner. In an example embodiment with a six-sided polygonal scanner spinning at 10,000 rpm there are 60,000 faces or sweeps per minute, i.e. 1,000 sweeps per second.
  • Data may be captured at, for example, a 1 KHz capture rate by a bank of 16 A/Ds, defining 1 Msamples/sec for DSP 328 to process.
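  • A worked sketch (Python) of the timing arithmetic above, for the example six-sided scanner at 10,000 rpm; the function and parameter names are illustrative, not from the patent:

```python
FACETS = 6
RPM = 10_000
SWEEPS_PER_SEC = RPM / 60 * FACETS        # 1,000 facet sweeps per second
SWEEP_PERIOD = 1.0 / SWEEPS_PER_SEC       # 1 ms per facet sweep
OPTICAL_SWEEP_DEG = 720.0 / FACETS        # reflection doubles the mechanical angle

def beam_angle_deg(t_event, t_reference):
    """Scanning angle relative to the reference direction 332, from the time
    t_reference at which the beam crossed reference photodetector 330 and the
    time t_event at which scatter was detected on the photodiode array."""
    dt = (t_event - t_reference) % SWEEP_PERIOD
    return OPTICAL_SWEEP_DEG * dt / SWEEP_PERIOD
```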
  • the optics and photodiode array are configured such that, in addition to the signal timing information from the scanner providing angular information ⁇ , scatter from the touch object is re-imaged onto the surface of the photodiode array 324.
  • The position of the imaged scatter on the photodiode array, d_PD, provides information related to the distance between the intersection of the scanning beam 310 with the respective polygon face 302a, and the touch object in the touch surface 312.
  • the angle of the photodiode array to the optic axis 322a can be adjusted such that the position on the photodiode array is substantially inversely proportional to the distance along the scanning beam to the touch object.
  • the arrangement of figure 3 can provide 2D position information as the beam scans for one or multiple touch events in a touch surface, either in measured, polar co-ordinates (d, θ) or in real (x, y) space in the touch surface, optionally referenced to the display surface 314.
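  • A minimal sketch (Python) of the inverse-proportional mapping just described; k and offset are per-unit calibration constants, assumed rather than taken from the patent:

```python
def distance_from_spot_position(d_pd, k, offset=0.0):
    """Distance along the scanning beam to the touch object, from the spot
    position d_pd on the tilted photodiode array, using the approximate
    inverse proportionality between spot position and distance."""
    return k / (d_pd - offset)
```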
  • Figure 3 illustrates an embodiment of the system employing a polygonal scanner 302
  • the rotating facets 302a, b of the scanner may be replaced by two synchronised MEMS (microelectromechanical system) scanners, one to scan the laser beam through the touch surface and the second to direct or 'scan' light scattered by a touch object onto the photodiode array 324.
  • MEMS scanning devices rather than a polygonal scanner can help to reduce the system size and acoustic noise.
  • the MEMS scanners may be synchronised by providing a synchronised electronic drive to the scanners, for example employing a phase locked loop to keep the scanners at the same frequency, preferably a resonant frequency of the scanners, and in phase synchronisation.
  • Figure 4 shows a simplified, schematic illustration of a system of the type shown in Figure 3, but with an octagonal scanner for the purpose of simplifying the explanation.
  • the scanning and return beams 310, 320 are scanned together by scanner 302, pointing in the same direction as the scanner rotates.
  • the return, 'imaging' beam is displaced to one side of the scanning beam and thus the imaging system, more particularly the photodiode array, sees the object at a different angle depending upon the distance of the object from the scanner.
  • the photodiode array images a one dimensional strip along the scanning beam starting at a closest observable object position, for example object position 318a, and extending towards infinity.
  • As the beam 310 scans, there is a point at which it intersects the touch object, which flashes its radial position onto the photodiode array, rather in the manner of a lighthouse beam. There is an approximate inverse proportionality between the touch object distance and the position of the imaged light scattered from the object.
  • the scanning and imaging directions are swept together by a pair of mirrors which keeps the imaging system looking down the scanned direction.
  • the number of mirror faces for a polygonal scanner may be varied according to the desired scan angle, fewer faces resulting in a larger scan angle. For a scanner with f faces each facet spans a mechanical angle of 2π/f and, since reflection doubles the angular deviation, the beam sweeps through an optical scan angle of 4π/f per facet (120° for a six-faceted mirror).
  • Figure 5 shows an embodiment of a touch sensing system 500 according to the invention employing a polygonal mirror scanner 502, rotating rapidly as described previously, to scan a pair of laser beams 504a, b over a touch sensitive region 506.
  • the beams scan over greater than 90°; this facilitates using a photodiode 508 to set a reference direction for the scanned beams.
  • mirror 502 has six facets but in principle any number of facets may be employed.
  • a pair of lasers 510a, b provide beams to respective beam splitters 512a, b which provide beams 504 to the touch area via mirror 502; the beam splitters 512 pass scattered light returning along the respective beam directions to respective lenses 514a, b which image the scattered light onto respective photodiodes 516a, b.
  • the photodiodes provide inputs to analogue-to-digital convertors 518 which provide a digitised version of the received scattered light from the scanned region to a digital signal processor 520 which outputs position data for one or more touches in rectangular and/or polar coordinates.
  • An example touch object is illustrated at 318; beams 504a, b make respective angles α and β with an arbitrarily defined 0° reference direction.
  • the contours 522 in figure 5a illustrate equal probability contours for the position of object 318 showing, schematically, that the angular position of the object is more precise than the radial distance from mirror 502.
  • the dotted contours 524 illustrate corresponding contours for a second stereoscopic touch sensing system (not shown in figure 5a), for example at a second corner of an interactive whiteboard.
  • Figure 5b illustrates, schematically, signals obtained from beams 504a (dotted line) and 504b (solid line), each of which is referenced to the 0° direction, showing the received signal on the vertical axis.
  • a touch object generates a pair of pulses with a separation Δ which determines the distance to object 318.
  • the timing, or equivalently angle, of a peak determines the angle of a beam to the object.
  • Figure 5c shows determination of object location from the touch data from the system of Figure 5a: locations 302a, b mark the locations where beams 504a, b impinge on mirror 502; these have separation s.
  • the location of object 318 can either be geometrically determined from the shape of the triangle illustrated or, more straightforwardly, because locations 302a, b are known together with the directions of lines 504a, b, intersection point 318 can be determined from the intersection point of the two lines, that is from the equations of the lines.
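  • A minimal sketch (Python) of the line-intersection calculation described above; the deflection points p_a, p_b (locations 302a, b) and the beam angles alpha, beta from the 0° reference are assumed known inputs:

```python
import math

def intersect(p_a, alpha, p_b, beta):
    """Intersect the two rays p_a + t*(cos alpha, sin alpha) and
    p_b + u*(cos beta, sin beta); returns the touch position (x, y)."""
    ax, ay = p_a
    bx, by = p_b
    da = (math.cos(alpha), math.sin(alpha))
    db = (math.cos(beta), math.sin(beta))
    denom = da[0] * db[1] - da[1] * db[0]      # 2D cross product of directions
    if abs(denom) < 1e-9:
        raise ValueError("beams (nearly) parallel - no reliable intersection")
    t = ((bx - ax) * db[1] - (by - ay) * db[0]) / denom
    return ax + t * da[0], ay + t * da[1]
```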
  • Figure 5d illustrates, schematically, multi-touch disambiguation. Two genuine touch events are shown at circles 318 but with only angular information these would be indistinguishable from an equivalent pair of touch events shown by squares 318'. It can be appreciated that this ambiguity can be resolved if even only approximate distance information for a touch object is available.
  • one simple approach is to determine that if the touch events detected by the two stereoscopic systems are within a threshold distance of one another they constitute the same touch event.
  • the object touch data from each system 500 may (but need not necessarily) be converted into a common rectangular co-ordinate system.
  • a simple multi-touch system steps through each candidate touch position from a first stereoscopic sensing system checking whether, for each, there is a matching touch detected by the second (and/or a third or subsequent) system, to identify matched pairs.
  • the system may step through all the candidate pairings to find the best matched pairings.
  • a combinatorial optimisation algorithm may be employed, for example the Kuhn-Munkres (Hungarian assignment) algorithm.
  • an array is constructed for example of the type illustrated in figure 5e in which m events are seen by a first stereoscopic sensing system A and n events are seen by a second stereoscopic system B.
  • the matrix is then populated by a set of matched costs where events can be paired and a set of unmatched costs (reflected in the two diagonally opposite quarters of the array) where an event seen by one system does not match with the other.
  • the remaining portion of the matrix is populated by zeroes.
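  • A hedged sketch (Python) of constructing and solving such an array, using scipy.optimize.linear_sum_assignment as a Kuhn-Munkres style solver; the Euclidean matched cost and the unmatched_cost constant are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_touches(events_a, events_b, unmatched_cost=0.05):
    """events_a, events_b: candidate (x, y) touch positions seen by the two
    stereoscopic systems A and B. Returns matched (i, j) index pairs; events
    left unmatched fall through to their diagonal 'unmatched' slots."""
    m, n = len(events_a), len(events_b)
    big = 1e9                                  # forbids off-diagonal unmatched pairings
    cost = np.zeros((m + n, m + n))            # bottom-right block stays zero
    for i, (ax, ay) in enumerate(events_a):    # top-left m x n block: matched costs
        for j, (bx, by) in enumerate(events_b):
            cost[i, j] = np.hypot(ax - bx, ay - by)
    cost[:m, n:] = big
    cost[m:, :n] = big
    cost[:m, n:][np.diag_indices(m)] = unmatched_cost   # A-event left unmatched
    cost[m:, :n][np.diag_indices(n)] = unmatched_cost   # B-event left unmatched
    rows, cols = linear_sum_assignment(cost)   # minimise the total match cost
    return [(i, j) for i, j in zip(rows, cols) if i < m and j < n]
```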
  • a variant of the system may be operated with the detector data from the stereoscopic touch sensing systems without first processing this to provide location data from each system - that is, for example, the separate peak positions shown in figure 5b may provide inputs to the algorithm.
  • the procedure may be run multiple times, for example once to identify peak pairings in each stereoscopic touch sensing system, to determine pairings for each system, and then a further time to determine pairings for the combination of stereoscopic touch sensing systems.
  • Figure 6a illustrates, conceptually, combining angular information from a stereo pair of sensors where the solid lines indicate a tolerance of the angular information and the cross-hatched area the uncertainty of the overlap. It can be seen that there is still some residual uncertainty. Optionally this can be reduced by including additional sources of distance information. Additionally or alternatively such additional sources of distance information may be used for multi-touch disambiguation where, as illustrated in figure 6b, one object 318a partially or wholly obscures a second object 318b. As illustrated in the accompanying curves, in such a situation the scattered light seen by the photodetector is the envelope (solid line) of what would be seen for each object by itself (dotted lines). In many cases separate peaks can be resolved but this is not always the case. Where distance information is only available from one stereoscopic device it can be helpful to attempt to determine a more accurate distance using just one device.
  • One technique is to rely on the brightness of the scattered light, which varies inversely with distance. With the assumption that, say, the reflective object is a finger this can provide a helpful level of uncertainty reduction.
  • the width of the signal versus angle peak also varies approximately inversely with distance, which again can be used to derive additional distance information.
  • a still further source of information can be derived from the size of the spot imaged onto the photo detector 516: although the laser beam 504 is collimated, in effect the size of the spot on the object looks larger when the object is closer (again approximately inverse proportionality to distance).
  • with a detector of the form illustrated in figure 6c, in which a central photodetector 516 is provided with one or more additional photodetectors 602, for example in the form of one or more rings around photodetector 516, the spot size imaged by lens 514 can be measured.
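  • A hedged sketch (Python) of fusing these auxiliary cues (scattered brightness, signal-peak width, imaged spot size), each of which varies roughly inversely with distance; the k_* calibration constants and the equal weighting are assumptions:

```python
def auxiliary_distance(brightness, peak_width, spot_size, k_b, k_w, k_s):
    """Average the three approximate inverse-law distance estimates into a
    single auxiliary distance value for the touch object."""
    estimates = (k_b / brightness, k_w / peak_width, k_s / spot_size)
    return sum(estimates) / len(estimates)
```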
  • circle 610 is the locus of all points forming the same angle (α - β) with deflection points 302a, b.
  • values of the angle (α - β) which are equal to or greater than that defined by circle 610 are unphysical.
  • detection of such angles can be flagged as errors.
  • other constraints on signal brightness and/or width may also be applied.
  • the sensitivity to a touch object varies with the angle of beam 504 - the sensitivity tends to be lower at grazing angles of incidence onto mirror 502 (and this affects the two beams in different ways).
  • the sensitivity of the system may be characterised and sensitivity characterising data stored so that, in operation, the system can be compensated for variations in sensitivity with beam angle. It will be appreciated that the system design is characterised, rather than each individual system needing characterisation.
  • Figure 6e shows a variant of the scanner 502 in which adjacent pairs of facets 620a, b are provided with a wedge or other structure (such as a diffraction grating) to change the angle of the scanning beam away from the touch surface 506/display surface 314.
  • pairs of facets 620 define a second touch surface or plane at an angle to the first touch surface or plane, which allows sensing of objects further from display surface 314 to provide, for example, gesture recognition or a 'hover touch' capability.
  • Other (pairs of) facets 622 direct the scanning beam within the first touch surface.
  • Figure 7 shows a flow diagram of a procedure for determining an object location using the system of Figure 5, which may be implemented in software on a storage medium 710 and/or hardware in digital signal processor 520.
  • the procedure inputs illumination data from photodiodes 516 and uses this data to determine 714, for each touch event, respective angle and distance information, d_1 ... d_N, θ_1 ... θ_N, as previously described. It will be appreciated that this may be represented in either polar or rectangular coordinates.
  • the system inputs 720 data from photodiode 508 and from the pulse timing determines 722 a reference direction for angles θ.
  • this information provides two-dimensional position data for one touch object and this may be combined 716 with two-dimensional position information for one or more other touch objects to match corresponding positions in 2D using the approximate distance information, then determining accurate distance information by triangulation using the azimuth angle data.
  • This multitouch 2D data may then optionally be tracked 718, for example using one or more tracking filters such as a Kalman filter, to provide tracked multi-touch object location data.
  • the tracking may employ additional data from the touch sensing system, for example photodiode illumination duration data which may be employed to give an indication of the size of a touch object - a large object may indicate, for example, a pair of adjacent fingers; similarly a two peaked illumination pattern may indicate multiple objects.
  • the size of an object as 'viewed' by the scanning beam may be determined from a knowledge of the beam width, angular velocity and illumination duration.
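  • A back-of-envelope sketch (Python) of that size estimate; treating the object width as the swept arc less the beam width is an assumption about how the beam and object profiles combine:

```python
def object_width(distance, angular_velocity, illumination_duration, beam_width):
    """Approximate width of the object as 'viewed' by the scanning beam: the
    beam sweeps angular_velocity * illumination_duration radians while the
    object scatters light; at the given distance that arc, less the beam's
    own width, approximates the object width."""
    return distance * angular_velocity * illumination_duration - beam_width
```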
  • Embodiments of the multi-touch object tracking may include tracking to provide continuity between touch locations where one touch object briefly occludes another in the scanning beam direction, then, for example, filling in by interpolation or extrapolation.
  • Figure 8 shows, schematically, a pair of sensing systems 500 of the type shown in Figure 5 used to locate two simultaneous touch objects 318 where, again, without distance information there would be ambiguity as to which touch was at which angle.
  • Figure 8 illustrates, schematically, the display surface 314 of an interactive whiteboard 800 (image projector not shown). It is helpful to locate one of the touch sensing modules near the top left corner of the display surface and one near the top right corner, to help reduce the risk of occlusion when there are multiple touch objects; optionally a third system may be located centrally along the top of the touch sensitive region. Since there is relatively low positional accuracy on a line directly between the touch sensing modules, preferably these are displaced laterally from the edge of the display surface so that touch events do not lie directly on a line between the modules.
  • the signal processing includes a calibration procedure 750.
  • the location of a touch object is determined by calculation as previously described, and then a correction is applied to compensate for small errors, for example caused by errors in the positioning of the scanning mirrors and other misalignments.
  • the correction may comprise, for example, an arbitrary polynomial mapping to a corrected position (x', y') ← (x, y), a translation/scaling/rotation, or some other correction, for example to correct for a response of the imaging optics, to calibrate a rotation or scanning rate of the scanner, and the like.
  • the calibration procedure may involve touching a known location in the touch surface.
  • a calibration may be applied for each separate mirror facet by recording the calibration for each facet, identifying the facets by a fiducial mark or other coding on the scanner - this can help to correct for periodic variation in the detected location.
  • the system may be self-calibrating/learning.
  • multiple modules may be placed on a surface and detect one another's respective positions in 2D, calibrating automatically by communicating this data between themselves. This may be facilitated by provision of a retro-reflector 802 on a touch sensing module.
  • the photodiode 508 of Figure 5 used to establish a reference angle/timing may also be used for power saving and safety.
  • the system may be shut down if the lack of a regular pulse at photodiode 508 is detected.
  • the laser may be operated in a default, eye safe mode where the laser is operated at low power (or with a short duty cycle) only switching the laser fully on when scanning at full speed.
  • the laser may operate at reduced power during an initial spin-up phase.
  • the power or duty cycle of the laser may be reduced when there are no active touch events, to provide a reduced power standby mode.
  • the rotation rate of the scanner may be reduced at the same time.
  • the laser may be pulsed on and off; more particularly the laser may be switched off during the transition between mirror facets (or in some similar portion of the cycle of a MEMS scanner) to provide a background illumination signal level from the photodiode array which may then be subtracted to reduce the background noise level.
  • Preferred implementations of the touch sensing system employ an infrared rather than a visible laser. Shorter wavelengths, for example 785 nm, are generally preferable for reduced cost, but longer wavelengths, for example 1.3 μm or longer, are preferable for improved eye safety. Whichever wavelength is employed, embodiments of the system include a band pass filter, for example a narrowband interference filter, in the optical path prior to the photodiode array to select for the wavelength of the laser and reduce the ambient light signal.
  • the scanning laser may be pulsed on and off. Then during the off period a background reading from the photodiode may be taken and subtracted from subsequent readings when the scanning laser is active. This subtraction may be carried out in the analogue and/or digital signal processing domain.
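  • A minimal sketch (Python, numpy) of that subtraction in the digital domain; the laser_on mask marking the off-periods (e.g. facet transitions) is an assumed input:

```python
import numpy as np

def subtract_background(samples, laser_on):
    """samples: 1-D array of photodiode readings; laser_on: boolean mask,
    False during the pulsed-off periods.

    The mean reading taken while the laser is off estimates the ambient
    background, which is then subtracted from the laser-on readings."""
    samples = np.asarray(samples, dtype=float)
    laser_on = np.asarray(laser_on, dtype=bool)
    background = samples[~laser_on].mean()
    return samples[laser_on] - background
```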
  • time-of-flight detection may be employed to provide further additional accuracy.
  • the time of flight to the photodetector may be employed for an additional measurement of distance to the one or more touch objects, and this additional distance information may be employed to augment that obtained from the offset on the photodiode array to provide improved two-dimensional position detection accuracy.
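  • For reference, a one-function sketch (Python) of the time-of-flight relation assumed above:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s):
    """Distance to the touch object from the round-trip time of the scattered
    light: out and back, hence the factor of two."""
    return C * round_trip_time_s / 2.0
```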
  • sufficient accuracy may be obtained, depending upon the requirements of the touch system, by a single touch sensing module rather than the pair of modules illustrated in Figure 8.
  • Alternatively a combination of time-of-flight and/or triangulation and/or photodetector-based position measurements may be employed with multiple touch sensing modules.
  • time-of-flight may be employed for greater distances where the offset-based approach is less accurate, and triangulation for shorter distances, in this way providing better performance than either alone.
  • the systems we describe may nonetheless provide sufficient 2D position accuracy on their own in some applications without the need for triangulation.
  • Touch sensing systems of the type we have described provide a number of advantages. They can be manufactured very inexpensively, even for systems covering a large LCD screen or interactive whiteboard (a larger display simply requires a slightly higher power laser); they have a relatively low profile, for example less than 10 mm or less than 5 mm above the display surface or potentially even thinner, and can provide a high refresh rate, for example 200 Hz to 1 kHz or greater. Further, apart from small touch scanning modules positioned near the corners of the display surface, no other hardware is required and, in particular, there is no need for a bezel around the display area.
  • In principle other types of scanning device may be employed, for example a diffractive optical element. However in this latter case the timing requirements make it preferable for the diffractive scanner to provide a comb of scanning beams rather than a single scanning beam.
  • the techniques we have described are particularly useful for implementing large touch-sensitive regions (say >0.5 m in one direction), for example for large touch-sensitive displays. Such displays may be based on any display technology including flat screen and projection technology.
  • the techniques we have described are particularly useful for an interactive whiteboard but also have advantages in smaller scale touch sensitive displays.

Abstract

We describe a touch sensing system comprising: one or more optical beam sources to provide a pair of optical beams; a pair of controllable beam deflectors comprising at least first and second beam deflectors, wherein said first beam deflector is configured to deflect a first said optical beam through a first angle towards a touch sensing region, and wherein said second beam deflector is configured to deflect a second said optical beam through a second angle towards said touch sensing region; first and second optical detectors to detect first and second scattered light from each of said first and second beams, wherein said first and second optical detectors are configured to detect scattered light counter-propagating along said first and second optical beams; and wherein said first and second beam deflectors are controlled in tandem to scan said touch sensing region.

Description

Touch sensing systems
FIELD OF THE INVENTION
This invention relates to touch sensing systems and methods which, in embodiments, can be used to provide a virtual touch sensing surface just above a display screen, whiteboard or the like.
BACKGROUND TO THE INVENTION
We have previously described touch sensing systems employing a plane or sheet of light, for example as shown in Figures 1 and 2. These techniques may be employed for detecting touches on or proximate to a surface.
Figure 1 shows an example touch sensitive image projection device 100 comprising an image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102. A proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device. The image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop; boundaries of the light forming the displayed image 150 are indicated by lines 150a, b. The touch sensing system 250, 258, 260 comprises an infrared laser illumination system 250 configured to project a sheet of infrared light 256 just above the surface of the displayed image 150 (for example ~1 mm above, although in principle the displayed image could be distant from the touch sensing surface). The laser illumination system 250 may comprise an IR LED or laser 252, collimated then expanded in one direction by light sheet optics 254 such as a cylindrical lens. A CMOS imaging sensor (touch camera) 260 is provided with an IR-pass lens 258 and captures light scattered by touching the displayed image 150, with an object such as a finger, through the sheet of infrared light 256 (the boundaries of the CMOS imaging sensor field of view are indicated by lines 257, 257a, b). The touch camera 260 provides an output to touch detect signal processing circuitry as described further later. These techniques may be employed with any type of image projection system.
Figure 2 shows plan and side views of an example interactive whiteboard touch sensitive image display device 400 incorporating such a system. In the illustrated example there are three IR fan sources 402, 404, 406, each providing a respective light fan 402a, 404a, 406a spanning approximately 120°, together defining a single, continuous sheet of light just above display area 410. The fans overlap on the display area (which is economical as shadowing is most likely in the central region of the display area). Typically such a display area 410 may be of order 1 m by 2 m. The side view of the system illustrates a combined projector 420 and touch image capture camera 422 either aligned side-by-side or sharing a portion of the projection optics. The optical path between the projector/camera and display area is folded by a mirror 424. The sheet of light generated by fans 402a, 404a, 406a is preferably close to the display area, for example less than 1 cm or 0.5 cm above the display area. However the camera and projector 422, 420 are supported on a support 450 and may project light from a distance of up to around 0.5 m from the display area.
There is, however, a desire for alternative/improved techniques. For example, in particular for large LCD display screens, it would be advantageous to be able to provide touch detection from a relatively low height above the display surface, for example less than 1 cm rather than by employing a camera looking down on the display surface. It is also desirable to reduce cost and improve the user experience, for example by improving the 'refresh rate' and reducing the noise.
One type of touch detection is described in US2009/0091553, in which a laser is raster scanned across a display from behind, beneath the display surface. Another typical scanning-type touch panel is described in US2001/0028344, employing a polygonal mirror to angularly scan laser light for touch detection. The system uses light scanned over the surface of a display screen and employs scanning angle/timing for touch detection but provides no distance information along the scanned beam, in a direction perpendicular to the sweep. A still further technique is described in US4,811,004, which employs a pair of swept beams and detectors to sense an interruption in the beams.
It is desirable to improve upon these known techniques. We have previously described one improved approach, in our GB1214465.5 filed on 14 Aug 2012. We now describe a further technique.
SUMMARY OF THE INVENTION
According to the present invention there is therefore provided a touch sensing system comprising: one or more optical beam sources to provide a pair of optical beams; a pair of controllable beam deflectors comprising at least first and second beam deflectors, wherein said first beam deflector is configured to deflect a first said optical beam through a first angle towards a touch sensing region, and wherein said second beam deflector is configured to deflect a second said optical beam through a second angle towards said touch sensing region; first and second optical detectors to detect first and second scattered light from each of said first and second beams, wherein said first and second optical detectors are configured to detect scattered light counter-propagating substantially along said first and second optical beams; and wherein said first and second beam deflectors are controlled in tandem to scan said touch sensing region.
In embodiments the optical deflectors comprise facets of a rotating polygonal mirror, but alternatively a pair of synchronised, steerable mirrors may be employed, more particularly synchronised MEMS deflectors. Where a polygonal mirror is employed, the facets providing the first and second beam deflectors, this defines a fixed angle between the mirror facets. Where the beam deflectors comprise MEMS deflectors they may be synchronised, for example, by synchronising phase locked loop driver circuits for the MEMS deflectors.
Broadly speaking the beam deflectors scan a pair of beams across the touch sensing region to define a plane (or other shape), to provide a stereoscopic sensor with a rather short baseline. The skilled person will appreciate that although the beams are scanned in tandem with one another the optical configuration, and in particular the physical separation of the beam deflectors, means that generally the beams will scan at the same point within the touch sensing region at different times. Preferably at least one of the beams is provided with means for defining a reference direction for the beam which, conveniently, may be a photodetector located within or towards one extreme of a beam's sweep.
The relatively short baseline for the pair of scanned beams results in relatively inaccurate information on distance to a touch object, but accurate angular information. However this is a useful form of information for a multi-touch sensing system, as described later, because the distance information, although of low accuracy, can be used to disambiguate multiple touches. Thus, broadly speaking, such as a multi-touch detection system may comprise two or more touch sensing systems each as described above. In broad terms each touch sensing system scans a beam to provide angular information on the location of a touch object and then triangulation can be employed to determine the touch location. However at a finer level of detail each system scans a pair of beams in tandem so that each provides distance as well as angular information, which is employed by the touch signal processing system.
When a scanning beam intersects a touch object, in general light is scattered through a range of different directions, but in preferred implementations only light scattered substantially counter-propagating along a beam is detected. This provides an inherent resilience to background noise from ambient light (we will later describe systems which in effect look at the beam from the side, but these tend to have larger ambient background noise). It will be appreciated that, depending upon the optical arrangement, light counter-propagating along a beam may be slightly displaced from the beam or at a slight angle to the beam depending, for example, on the effective size of the 'aperture' collecting the light - which may be determined by the lateral dimension of a beam deflector, for example the size of a facet of the polygonal mirror.
In one preferred implementation the optical architecture of the touch sensing system comprises a pair of collimated optical beam sources, for example lasers, each illuminating a respective beam deflector. A beam splitter is provided in each optical path between the source and deflector, to split off a return signal from the deflector onto a photodetector such as a photodiode. An imaging device such as a lens or mirror is provided to image the scattered light onto the photodetector; this may be focused at infinity, but may have a depth of field sufficient to focus light from a closest design position of a touch object to the sensing system.
Embodiments of the system further comprise a signal processor, in either hardware, for example an ASIC (application specific integrated circuit), software, or a combination of the two, to process signals from the detectors to determine touch position data defining an angular position of a touch and distance information, in particular defining an approximate distance between the touch and a beam deflector. The touch position data may define a touch position in any convenient co-ordinate system, for example either polar or rectangular co-ordinates - the skilled person will appreciate that even when in the form of rectangular co-ordinates the touch data will nonetheless define an angle (to a reference direction) and a distance (from a fiducial point such as the centre of a beam deflector). Embodiments of the signal processor provide touch position data for multiple touches within the touch sensing region.
A preferred embodiment of a multi-touch sensing system comprises a pair of touch sensing systems, each as previously described, together with signal processing to identify matched pairs of touches in the respective touch sensing regions of the systems (which are arranged to overlap). Further, the signal processing is configured to disambiguate angular positions of multiple simultaneous touches using the (approximate) distance information for the multiple touches from each of the pair of combined touch sensing systems. The skilled person will appreciate that such a multi-touch system may have separate signal processing for each touch sensing system of the pair followed by an additional stage of signal processing to combine the touch position data from the pair of sensing systems, or the touch signal data from all of the detectors in the pair of touch sensing systems may be provided to a common signal processing system which processes the signals from these detectors in combination. The signal data from these detectors may be combined in many different ways, for example by a cost matching process aiming to minimise a cost function for paired sensed touch positions (as described later) and/or using other techniques, for example Bayesian techniques.

In a related aspect of the invention there is therefore provided a multi-touch sensing system, comprising: a pair of stereoscopic touch sensors, each configured to scan a touch sensing region with a pair of beams using a common beam deflection system and configured to detect a pair of return beams scattered from a plurality of touch objects to provide stereoscopic touch object location data defining a location of a touch object in 2D with respect to a said sensor; and at least one signal processor to process said touch object location data from said pair of stereoscopic touch sensors to match said touch objects detected by said pair of stereoscopic touch sensors dependent on said distance of said touch objects from said respective sensors, and to determine location data for said touch object from said angular positions from each of said sensors for said matched touch objects.

In embodiments the distance between a touch object and a stereoscopic touch sensor has a greater error than an angular position of the touch object with respect to a reference direction, and the signal processor processes the touch object location data to determine refined location data for the touch objects in which the distance information is more accurate than that provided by each stereoscopic touch sensor alone. Thus the distance error in, say, percentage terms is reduced. The angular error in position determined by a stereoscopic touch sensor may be defined in terms of a distance by defining the angular spread as a (circumferential) distance at the radial distance of the touch object; alternatively the error in radial distance may be compared with the error in angular position by simply comparing each as a percentage error. In either case embodiments of the system provide multi-touch disambiguation as well as improved accuracy location information. As previously described, each touch sensor may comprise a rotating polygonal mirror or a synchronised pair of MEMS deflectors.
In embodiments the touch object location data from a stereoscopic sensor prior to processing (either alone or in combination with the data from a second sensor) comprises data defining, for a single touch position, a pair of peaks on a representation of signal against beam deflection angle. The separation of such peaks defines the distance to a touch object and the timing of a peak defines the angular position of the object. (The physical separation of the deflectors will result in slightly different angles being observed by the two detectors, but it will be appreciated that in principle only the position of a single peak and the separation of the pair of peaks need be known.)
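A minimal sketch of this peak-pair geometry, with hypothetical helper names and a small-angle approximation (the exact construction is the line intersection described later in the detailed description):

```python
import math

def touch_from_peak_pair(alpha, beta, baseline):
    """Recover (bearing, approximate distance) from a pair of scatter peaks.

    alpha, beta: beam angles (radians from the reference direction) at which
    the two detectors register the peak for one touch; baseline: lateral
    separation of the beam deflectors in metres.  The peak separation
    shrinks roughly as baseline/distance, so its inverse gives a coarse
    range estimate, while the mean gives an accurate bearing.
    """
    separation = abs(beta - alpha)
    bearing = 0.5 * (alpha + beta)        # angular position: accurate
    if separation < 1e-6:                 # object effectively at infinity
        return bearing, float("inf")
    distance = baseline / separation      # range: coarse, error grows with d
    return bearing, distance
```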
Embodiments of the system described above may determine additional distance information for a touch object, which may be used in a number of different ways. The additional distance information may be combined with the distance information derived from the stereoscopic view of a touch sensing system to increase the accuracy of the distance information. Additionally or alternatively this additional distance information may be employed when other distance information is not available - for example when one touch object or finger obscures another (when a nearby object interrupts the scanning beam a further object in the same direction may not be visible). Further additionally or alternatively the additional information may be included in a combined cost function for use when matching pairs of touches detected by first and second stereoscopic touch sensing systems. One source of additional distance information is the brightness of light scattered from a touch object, which falls off with approximately the inverse of the distance of the object from the deflector. Another is the angular spread of light scattered from a touch object. This may be determined by a photodetector arranged to be sensitive to this angular spread by, for example, having two or more light-sensitive regions, one displaced away from an optic axis at which the other is located. Alternatively this may be considered as the angle subtended by the return beam of the scattered light at an aperture viewing the return beam, detected by detecting the size of the spot where the return beam is focused onto the detector. Further additionally or alternatively distance information may be obtained from the width of a peak in a representation of signal against beam angle, as this peak width again varies approximately inversely with distance.
In general the touch sensing region will define one or more planes, but in principle this region may have other shapes. In embodiments the touch sensing region defines a plane a short distance away from an interactive physical surface, for example the surface of an interactive whiteboard or electronic display. Thus the touch sensing region may define a touch surface in space (a 'virtual' touch surface), and where employed to provide an interactive physical surface preferably this touch surface is located within 1 cm or less of the physical surface. In such a case, if the physical surface is for example slightly bowed, a small correction mirror may be moved up and down in front of the beam deflectors as the beam is scanned, thereby defining a touch surface approximating a frustum of a cone in embodiments. Optionally the touch sensing region may comprise multiple touch sensitive planes (or other surfaces) to define a three dimensional touch sensing region.

As previously described, the second beam deflector is laterally displaced from the first beam deflector, but the lateral displacement may be small in order that the touch sensing system is small. Thus the distance information may be relatively inaccurate (the inaccuracy increasing with increasing distance from the beam deflectors). In embodiments the system provides only an approximate value for the distance of the touch object, and this distance may be accompanied by error, standard deviation or other probability data defining a degree of accuracy of the distance determination (the error being larger with increasing distance). Nonetheless embodiments of the system can be of great utility, in particular for multi-touch detection: with multi-touch detection two (or more) embodiments of the system may be employed to provide accurate object angle data, which can be used to determine an object position by triangulation. However with multi-touch detection it can be problematic to link a set of angles from one touch detector with the corresponding angles for the same touch objects from another touch detector because there is ambiguity in the data (two different configurations of multi-touch pattern can give rise to the same sets of angular data from the sensors). In embodiments the distance information, albeit approximate, can be employed to disambiguate the angular touch data. More particularly the distance information can be used to determine which sets of angular measurements should be paired to identify corresponding touch objects seen by the respective sensors.

Optionally the system may be arranged to project two (or more) planes of light at different angles to extend the touch sensing region into a third dimension perpendicular to the touch plane. In a system with a polygonal mirror this may be achieved, for example, by providing a wedge or other optical deflector on pairs of adjacent mirrors (facets) so that as the mirror rotates the scanning beam is directed up (or down) at an angle, and the imaging system is arranged, by a corresponding wedge on the facet capturing light for the imaging system, to look in a direction tilted upwards (or downwards) along the scanning beam.
However it is not essential that the imaging system looks precisely along the scanning beam as, in general, the detector will have a viewing angle perpendicular to the scanning plane which may be sufficient to encompass small tilts of this plane up and/or down (the detector will be responsive to light scattered from a "slice" having some thickness in the Z-direction). Thus in a system with a polygonal mirror, as the mirror rotates there will be times when one scanning or imaging facet has a beam-tilting wedge and the other does not - but nonetheless useful signals can still be obtained at these times. Additionally or alternatively the detector may be extended perpendicularly to its longitudinal direction and/or a two dimensional array may be employed to provide some additional field of view in a Z-direction perpendicular to the plane/parallel to the scanning axis. A similar approach may be employed in a system with MEMS beam deflectors, when one or both deflectors may be arranged to tilt the scanning beam out of a plane perpendicular to the scanning axis. Each of these approaches provides additional touch functionality, for example to detect degree and/or speed of approach, gestures and the like.
As previously described, embodiments of the system include a signal processor to output data defining a location of one or more touch objects in two or three dimensions. In a multi-touch system the signal processor may be shared amongst the touch systems and may also use angular information for triangulation to locate touches and use distance information for touch disambiguation as previously described. In embodiments the signal processor is coupled to the detector to receive an imaging signal and, from this, to determine an approximate distance to a touch object. This is described in more detail later. Embodiments of the signal processor also receive a timing signal input for a timing signal defining a fiducial position of a beam deflector. This can be used to determine the azimuthal angle of the scanning beam; together, the distance and azimuthal angle define the location of a touch object in polar coordinates relative to the touch sensor. Optionally this may be converted to rectangular (or other) coordinates. The timing signal defining the position of a beam deflector may be obtained from the drive electronics of a MEMS beam deflector or, in the case of a polygonal mirror, from a detector such as a photodiode positioned to intersect the scanning beam at one edge of the touch sensitive region. Where multiple touch surfaces are defined by deflecting the scanning beam in the Z-direction, the same or another photodetector may be employed to define the reference position of the scanning beam (since the tilt angle will in general be known).

Potentially, in a multi-touch system similar signal processing may be employed by one touch sensing system to determine the location(s) of one or more other touch sensing system(s), and the touch sensing systems may then communicate with one another and/or a central controller so that the relative positions of the set of touch sensing systems are known to the system. This enables the system to be self-calibrating/self-learning. Optionally a touch sensing system may be provided with a retro-reflector to facilitate such a procedure. In principle such an approach may be employed with any type of touch sensing system which is able remotely to determine the location of one or more touch objects.

In embodiments the touch sensing system may be provided with a power and/or safety control system to reduce an average optical output from the laser when a speed of the scanning is reduced. Thus, for example, the system may default to only switching the laser on, or up to its usual operating power, when scanning at full operating speed.
Optionally in embodiments the touch sensing system may pulse the optical beam on and off during the scanning process, for example between successive mirror facets, to determine a background level of illumination which may then be subtracted from the signal, when scanning, for improved performance.
In a related aspect the invention provides a method of processing data from a touch sensing system as recited in any preceding claim, the method comprising: inputting first and second sets of touch data from candidate touch objects identified by respective stereoscopic touch sensors/sensing systems, said touch data defining touch object distance and angle data for a plurality of said touch objects; matching touch objects in said first and second sets of touch data; and using said angle data to determine object location data for each said touch object.
In some preferred implementations the matching comprises determining a set of match cost data for potential pairings of the touch objects detected by the sensors/sensing systems, and matching touch objects to reduce a total match cost defined by this data. Optionally the cost can factor in other distance information such as that described previously. In addition one or more unmatched touch objects may be identified and positions determined for these, for example using the distance information from one stereoscopic touch sensing system, optionally supplemented by one or more additional sources of distance information as previously described and, in some preferred implementations, also supplemented by object tracking data. Thus in preferred embodiments a history of motion of each touch object is tracked, for example using Kalman filtering or the like, and then if, say, one object is briefly occluded by another, the history of motion of an object may be employed to determine the object's location, albeit with decreasing probability of correctness over time. Preferably object tracking is also employed to disambiguate multiple touch objects - for example, should two objects as seen by the touch sensing system briefly coalesce and then separate, motion tracking may be employed to follow the tracked objects (as we have previously described, for example in WO 2012/172364, hereby incorporated by reference). The skilled person will appreciate that although, for simplicity, we have described combining pairs of stereoscopic touch sensing systems, three or more such systems may be employed depending, for example, on the size and/or shape of the touch sensing region.
The invention further provides processor control code to implement the above- described systems and methods, for example on a general purpose computer system or on a digital signal processor (DSP). The code is provided on a physical data carrier such as a disk, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (Firmware). Code (and/or data) to implement embodiments of the invention may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, or code for a hardware description language. As the skilled person will appreciate such code and/or data may be distributed between a plurality of coupled components in communication with one another.
In one preferred application a system as described above is incorporated into an interactive whiteboard, but many other applications may be envisaged.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will now be further described, by way of example only, with reference to the accompanying figures in which:
Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device, and details of a sheet-of-light based touch sensing system for the device;
Figures 2a and 2b show, respectively, a plan view and a side view of an interactive whiteboard incorporating a touch sensitive image display;
Figure 3 shows a touch sensing system using a scanning laser beam and space-time object localisation;

Figure 4 shows a schematic illustration of object distance perception in the system of figure 3;

Figure 5 illustrates a stereoscopic touch sensing system using scanning laser beams, according to an embodiment of the invention, and its operation;
Figure 6 illustrates techniques for improving the operation of the system of Figure 5;

Figure 7 shows a flow diagram of a signal processing procedure for the system of figure 5, including multi-touch processing steps; and
Figure 8 shows an example of an interactive touch sensitive display incorporating a pair of touch sensing systems each as illustrated in figure 5, schematically illustrating multi-touch sensing.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

Broadly speaking we will describe touch detection systems based upon a scanning laser beam for detecting touches on, or proximate to, a surface.
By way of background we first outline again the system we described in our GB1214465.5 filed on 14 Aug 2012. Thus referring to figure 3, this shows a touch sensing system 300 comprising a polygonal rotating scanning mirror 302 illuminated by a laser 304 followed by optional collimation optics 306 providing a collimated beam 308 of, for example, infrared light. Rotation of the polygonal scanner (mirror) 302 generates a scanned beam 310, the locus or envelope of the scanned beam defining a surface 312, typically a plane, for touch sensing. As illustrated, in a typical application this is located just above a display surface 314 such as a wall or white board onto which an image may be projected, a flat panel display, and the like. As the beam 310 scans through the touch surface 312, over the display surface 314, light 320 from one or more objects 318 on the surface, for example one or more fingers, is scattered back towards scanner 302. This light hits a face or facet 302b of the scanner adjacent to the face or facet 302a which directed the scanning beam; this facet reflects the scattered light through the same angle towards imaging optics 322, which image the scattered light onto a photodiode array 324 tilted with respect to an optical axis 322a of the imaging system. In one embodiment the photodiode array comprises a line of 16 sensors. The photodiode array provides light sensing signals to a bank of analogue-to-digital converters 326 (or a multiplexed converter) which in turn provides signals to a digital signal processor 328 which processes the photodiode array information in conjunction with timing data, to provide location data for object 318 as described in more detail later. The digital signal processor 328 may be implemented in hardware (circuitry), software, or a combination of the two.
A further photodetector 330 is directed towards the illumination face of the polygonal scanner 302 to allow measurement of the scanning angle at which the object 318 is detected. More particularly, photodetector 330 defines a reference direction 332 for scanning beam 310 and thus allows the scanning angle of the beam, θ, to be determined from the timing of the illumination of the photodiode array (knowing the rotation rate of scanner 302 or similar scan rate information). The skilled person will appreciate that a reference angle for scanner 302 may be determined in many different ways based on, for example, drive to and/or timing information from the scanner. In an example embodiment with a six-sided polygonal scanner spinning at 10,000rpm there are 60,000 faces or sweeps per minute, that is 1,000 sweeps per second. Data may be captured at, for example, a 1kHz capture rate by a bank of 16 A/Ds, defining 1 Msamples/sec for DSP 328 to process.

In figure 3 the optics and photodiode array are configured such that, in addition to the signal timing information from the scanner providing angular information θ, scatter from the touch object is re-imaged onto the surface of the photodiode array 324. With this arrangement the centroid position on the photodiode array, dPD, provides information related to the distance between the intersection of the scanning beam 310 with the respective polygon face 302a and the touch object in the touch surface 312. In embodiments the angle of the photodiode array to the optic axis 322a can be adjusted such that the position on the photodiode array is substantially inversely proportional to the distance along the scanning beam to the touch object. Thus the arrangement of figure 3 can provide 2D position information as the beam scans for one or multiple touch events in a touch surface, either in measured, polar co-ordinates (d, θ) or in real (x, y) space in the touch surface, optionally referenced to the display surface 314.
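As a sketch of the timing-to-angle conversion (a hypothetical helper using the six-facet, 10,000 rpm example figures above; the beam sweeps at twice the mirror's angular rate because the reflection doubles the optical angle):

```python
def beam_angle_deg(t, t_ref, rpm=10_000, n_facets=6, theta_ref_deg=0.0):
    """Scan angle of the beam at time t, given the reference-pulse time
    t_ref from photodetector 330 and the scanner rotation rate.

    Six facets at 10,000 rpm give 1,000 sweeps per second, i.e. a 1 ms
    facet period; the mirror reflection doubles the optical angle.
    """
    mirror_deg_per_s = (rpm / 60.0) * 360.0     # 60,000 deg/s here
    beam_deg_per_s = 2.0 * mirror_deg_per_s     # optical doubling
    facet_period_s = 60.0 / (rpm * n_facets)    # 1 ms per sweep here
    dt = (t - t_ref) % facet_period_s           # time into current sweep
    return theta_ref_deg + beam_deg_per_s * dt  # up to 120 deg per sweep
```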
Although Figure 3 illustrates an embodiment of the system employing a polygonal scanner 302, in alternative arrangements the rotating facets 302a, b of the scanner may be replaced by two synchronised MEMS (microelectromechanical system) scanners, one to scan the laser beam through the touch surface and the second to direct or 'scan' light scattered by a touch object onto the photodiode array 324. Employing MEMS scanning devices rather than a polygonal scanner can help to reduce the system size and acoustic noise. The MEMS scanners may be synchronised by providing a synchronised electronic drive to the scanners, for example employing a phase locked loop to keep the scanners at the same frequency, preferably a resonant frequency of the scanners, and in phase synchronisation.

Figure 4 shows a simplified, schematic illustration of a system of the type shown in Figure 3, but with an octagonal scanner for the purpose of simplifying the explanation. Conceptually the scanning and return beams 310, 320 are scanned together by scanner 302, pointing in the same direction as the scanner rotates. However although the beams 'look' in the same direction, the return, 'imaging' beam is displaced to one side of the scanning beam and thus the imaging system, more particularly the photodiode array, sees the object at a different angle depending upon the distance of the object from the scanner. In effect the photodiode array images a one dimensional strip along the scanning beam, starting at a closest observable object position, for example object position 318a, and extending towards infinity. As the beam 310 scans, there is a point at which it intersects the touch object, which flashes its radial position onto the photodiode array, rather in the manner of a lighthouse beam. There is an approximate inverse proportionality between the touch object distance and the position of the imaged light scattered from the object. The scanning and imaging directions are swept together by a pair of mirrors, which keeps the imaging system looking down the scanned direction. The number of mirror faces for a polygonal scanner may be varied according to the desired scan angle, fewer faces resulting in a larger scan angle. For a polygonal scanner with N faces the (maximum) scan angle is given by 2(2π/N), where the factor of 2 arises because the mirror reflection effectively doubles the scanned angle; a six-faceted mirror, for example, gives a maximum sweep of 120°.

Multiple stereoscopic optical touch sensors for multi-touch disambiguation
We now describe the use of multiple scanning beam touch sensors for providing accurate angle information for accurate touch location detection by triangulation, together with approximate distance information from each for multi-touch disambiguation. We will describe an embodiment of the system based on a polygonal scanner, in some respects similar to that previously described, and again providing approximately 1000 sweeps per second. However rather than employing an extended photodiode array, for improved signal-to-noise ratio the system we now describe employs a pair of beams and, in embodiments, single optical detectors looking for scattered light arriving in a direction back along the beam (though light will in general be scattered in other directions too). The photodetectors employed are preferably point-like, with the scattered light being focused onto the detector but, in principle, large area detectors may alternatively be employed. It will also be appreciated from the later description that although the detected light is substantially counter-propagating to the scanned beams, there will be a small angular tolerance to either side of this direction.

Thus referring to figure 5a, this shows an embodiment of a touch sensing system 500 according to the invention employing a polygonal mirror scanner 502 rotating rapidly, as described previously, to scan a pair of laser beams 504a, b over a touch sensitive region 506. Preferably the beams scan over greater than 90°; this facilitates using a photodiode 508 to set a reference direction for the scanned beams. As illustrated mirror 502 has six facets but in principle any number of facets may be employed.
As illustrated a pair of lasers 510a, b provide beams to respective beam splitters 512a, b which provide beams 504 to the touch area via mirror 502; the beam splitters 512 pass scattered light returning along the respective beam directions to respective lenses 514a, b which image the scattered light onto respective photodiodes 516a, b. The photodiodes provide inputs to analogue-to-digital convertors 518 which provide a digitised version of the received scattered light from the scanned region to a digital signal processor 520, which outputs position data for one or more touches in rectangular and/or polar coordinates. An example touch object is illustrated at 318; beams 504a, b make respective angles α and β with an arbitrarily defined 0° reference direction. The contours 522 in figure 5a illustrate equal probability contours for the position of object 318 showing, schematically, that the angular position of the object is more precise than the radial distance from mirror 502. The dotted contours 524 illustrate corresponding contours for a second stereoscopic touch sensing system (not shown in figure 5a), for example at a second corner of an interactive whiteboard.
In operation the beams 504 sweep the touch sensing area, though not at precisely the same time (one then the other - though this does not affect the operation of the device). Figure 5b illustrates, schematically, signals obtained from beams 504a (dotted line) and 504b (solid line), each of which is referenced to the 0° direction, showing the received signal on the vertical axis. A touch object generates a pair of pulses with a separation Δ which determines the distance to object 318. The timing, or equivalently angle, of a peak determines the angle of a beam to the object. In the illustrated example there are two touch events and the second (Δ2) is nearer than the first - the further away the touch event from mirror 502, the smaller the peak separation and the narrower the overall angular spread of the peak. It will be appreciated that for a single touch event the peaks of the two beams are at different angles because these beams are viewing the object from different positions.

Figure 5c shows determination of object location from the touch data from the system of Figure 5a: locations 302a, b mark the locations where beams 504a, b impinge on mirror 502; these have separation s. The location of object 318 can either be geometrically determined from the shape of the triangle illustrated or, more straightforwardly, because locations 302a, b are known together with the directions of lines 504a, b, intersection point 318 can be determined from the intersection point of the two lines, that is from the equations of the lines.
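The line-intersection route can be sketched as follows (a hypothetical helper; it assumes the deflection points 302a, b and the calibrated beam angles are known in a common frame):

```python
import math

def intersect_beams(p_a, theta_a, p_b, theta_b):
    """Touch location as the intersection of the two beam lines.

    p_a, p_b: (x, y) points where beams 504a, b leave the mirror
    (locations 302a, b, separation s); theta_a, theta_b: beam angles
    in radians from the 0-degree reference.  Solves
    p_a + t*u_a = p_b + v*u_b for the crossing point.
    """
    ax, ay = p_a
    bx, by = p_b
    uax, uay = math.cos(theta_a), math.sin(theta_a)
    ubx, uby = math.cos(theta_b), math.sin(theta_b)
    det = uax * uby - uay * ubx            # zero when the beams are parallel
    if abs(det) < 1e-9:
        return None                        # no reliable intersection
    t = ((bx - ax) * uby - (by - ay) * ubx) / det
    return ax + t * uax, ay + t * uay
```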
Figure 5d illustrates, schematically, multi-touch disambiguation. Two genuine touch events are shown at circles 318 but with only angular information these would be indistinguishable from an equivalent pair of touch events shown by squares 318'. It can be appreciated that this ambiguity can be resolved if even only approximate distance information for a touch object is available.
In one approach to matching touch objects detected by a pair of stereoscopic sensing systems 500, the system simply determines that if touch events detected by the two stereoscopic systems are within a threshold distance of one another they constitute the same touch event. To make this assessment the object touch data from each system 500 may (but need not necessarily) be converted into a common rectangular co-ordinate system. Then a simple multi-touch system steps through each candidate touch position from a first stereoscopic sensing system checking whether, for each, there is a matching touch detected by the second (and/or a third or subsequent) system, to identify matched pairs. In a refinement the system may step through all the candidate pairings to find the best matched pairings.
In a more sophisticated approach a combinatorial optimisation algorithm may be employed, for example the Kuhn-Munkres (Hungarian assignment) algorithm. In such an approach an array is constructed, for example of the type illustrated in figure 5e, in which m events are seen by a first stereoscopic sensing system A and n events are seen by a second stereoscopic system B. The matrix is then populated by a set of matched costs where events can be paired, and a set of unmatched costs (reflected in the two diagonally opposite quarters of the array) where an event seen by one system does not match with the other. The remaining portion of the matrix is populated by zeroes. This corresponds to assigning one row of the array to each column representing a touch event seen by system A and one column of the array to each row representing a touch event seen by system B. Any of a variety of cost functions may be employed but, broadly speaking, the cost defines how much it hurts the system to match, say, event A1 with event B2. One example cost function which may be employed is the square of the distance between two touch events. Another, more sophisticated, cost function is the integral of the product of the probability distributions (contours in figure 5a) of the two touch events. The events are matched to minimise total cost by an assignment algorithm of a type previously described (see, for example, Wikipedia - Hungarian algorithm).

We have described an approach in which touch events from separate stereoscopic touch sensing systems A, B (and possibly additional stereoscopic touch sensing systems) are matched, but alternatively a variant of the system may be operated on the detector data from the stereoscopic touch sensing systems without first processing this to provide location data from each system - that is, for example, the separate peak positions shown in figure 5b may provide inputs to the algorithm. Alternatively the procedure may be run multiple times, for example once to identify peak pairings in each stereoscopic touch sensing system, to determine pairings for each system, and then a further time to determine pairings for the combination of stereoscopic touch sensing systems.
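A sketch of this augmented cost matrix and assignment step, using SciPy's linear_sum_assignment (a solver of the Kuhn-Munkres family); the function name, the unmatched-cost value and the large "forbidden" cost are our illustrative choices, with the squared-distance cost function mentioned above:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_touches(pos_a, pos_b, unmatched_cost=0.05 ** 2, forbidden=1e9):
    """Pair touch events seen by stereoscopic systems A and B.

    pos_a: (m, 2) array of candidate (x, y) touches from system A;
    pos_b: (n, 2) array from system B; unmatched_cost: penalty for
    leaving an event unpaired (here (5 cm)^2, an illustrative figure).
    Builds the (m+n) x (m+n) array described in the text: squared
    distances in the matched block, unmatched penalties on the diagonals
    of the opposite quarters, zeroes in the remainder.
    """
    m, n = len(pos_a), len(pos_b)
    cost = np.zeros((m + n, m + n))
    cost[:m, :n] = ((pos_a[:, None, :] - pos_b[None, :, :]) ** 2).sum(-1)
    cost[:m, n:] = np.where(np.eye(m) > 0, unmatched_cost, forbidden)
    cost[m:, :n] = np.where(np.eye(n) > 0, unmatched_cost, forbidden)
    rows, cols = linear_sum_assignment(cost)     # minimise total cost
    return [(r, c) for r, c in zip(rows, cols) if r < m and c < n]
```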
Figure 6a illustrates, conceptually, combining angular information from a stereo pair of sensors, where the solid lines indicate a tolerance of the angular information and the cross-hatched area the uncertainty of the overlap. It can be seen that there is still some residual uncertainty. Optionally this can be reduced by including additional sources of distance information. Additionally or alternatively such additional sources of distance information may be used for multi-touch disambiguation where, as illustrated in figure 6b, one object 318a partially or wholly obscures a second object 318b. As illustrated in the accompanying curves, in such a situation the scattered light seen by the photodetector is the envelope (solid line) of what would be seen for each object by itself (dotted lines). In many cases separate peaks can be resolved, but this is not always the case. Where distance information is only available from one stereoscopic device it can be helpful to attempt to determine a more accurate distance using just one device.
One technique is to rely on the brightness of the scattered light, which varies inversely with distance. With the assumption that, say, the reflective object is a finger, this can provide a helpful level of uncertainty reduction. The width of the signal-versus-angle peak also varies approximately inversely with distance, which again can be used to derive additional distance information. A still further source of information can be derived from the size of the spot imaged onto the photodetector 516: although the laser beam 504 is collimated, in effect the size of the spot on the object looks larger when the object is closer (again with approximately inverse proportionality to distance). Thus by using a detector of the form illustrated in figure 6c, in which a central photodetector 516 is provided with one or more additional photodetectors 602, for example in the form of one or more rings around photodetector 516, the spot size imaged by lens 514 can be measured.
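The single-device cues above might be fused as in the following sketch (k_b, k_w and k_s are hypothetical calibration constants which would be characterised for a given optical design and target, e.g. a finger):

```python
def distance_from_cues(brightness=0.0, peak_width=0.0, spot_size=0.0,
                       k_b=1.0, k_w=1.0, k_s=1.0):
    """Coarse distance estimate from the three roughly 1/d cues.

    Each cue varies approximately inversely with distance, so each gives
    an estimate d ~ k / cue; unavailable cues are passed as zero and
    skipped.  Returns the mean of the available estimates, or None.
    """
    estimates = []
    if brightness > 0:
        estimates.append(k_b / brightness)   # scattered power ~ 1/d
    if peak_width > 0:
        estimates.append(k_w / peak_width)   # signal-vs-angle width ~ 1/d
    if spot_size > 0:
        estimates.append(k_s / spot_size)    # imaged spot size ~ 1/d
    return sum(estimates) / len(estimates) if estimates else None
```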
Additional constraints can be applied to the detected touch positions. One such example is conceptually illustrated in figure 6d: circle 610 is the locus of all points forming the same angle (β-α) with deflection points 302a, b. Thus it can be seen that for a system located at the edge of a display or interactive whiteboard 314 angles which are equal to or greater than that defined by circle 610 are unphysical. Thus detection of such angles can be flagged as errors. Optionally other constraints on signal brightness and/or width may also be applied.
In a further refinement, account may be taken of the fact that in a practical system the sensitivity to a touch object varies with the angle of beam 504 - the sensitivity tends to be lower at grazing angles of incidence onto mirror 502 (and this affects the two beams in different ways). Thus optionally the sensitivity of the system may be characterised and sensitivity characterising data stored so that, in operation, the system can be compensated for variations in sensitivity with beam angle. It will be appreciated that it is the system design which is characterised, rather than each individual system needing characterisation.
Figure 6e shows a variant of the scanner 502 in which adjacent pairs of facets 620a, b are provided with a wedge or other structure (such as a diffraction grating) to change the angle of the scanning beam away from the touch surface 506/display surface 314.
In this way the pairs of facets 620 define a second touch surface or plane at an angle to the first touch surface or plane, which allows sensing of objects further from display surface 314 to provide, for example, gesture recognition or a 'hover touch' capability. Other (pairs of) facets 622 direct the scanning beam within the first touch surface. The skilled person will appreciate that the effect of the wedges may straightforwardly be implemented by MEMS scanners in alternative embodiments.
Figure 7 shows a flow diagram of a procedure for determining an object location using the system of Figure 5, which may be implemented in software on a storage medium 710 and/or hardware in digital signal processor 520. Thus at step 712 the procedure inputs illumination data from photodiodes 516 and uses this data to determine 714, for each touch event, respective angle and distance information, d1 ... dN, θ1 ... θN, as previously described. It will be appreciated that this may be represented in either rectangular or polar coordinates. The system inputs 720 data from photodiode 508 and from the pulse timing determines 722 a reference direction for angles θ.
In a multitouch system this information provides two-dimensional position data for each touch object seen by one sensing system, and this may be combined 716 with the corresponding two-dimensional position information from one or more other sensing systems, matching corresponding positions in 2D using the approximate distance information and then determining accurate position information by triangulation using the azimuth angle data.
This multitouch 2D data may then optionally be tracked 718, for example using one or more tracking filters such as a Kalman filter, to provide tracked multi-touch object location data. Optionally the tracking may employ additional data from the touch sensing system, for example photodiode illumination duration data which may be employed to give an indication of the size of a touch object - a large object may indicate, for example, a pair of adjacent fingers; similarly a two-peaked illumination pattern may indicate multiple objects. The size of an object as 'viewed' by the scanning beam may be determined from a knowledge of the beam width, angular velocity and illumination duration. Embodiments of the multi-touch object tracking may include tracking to provide continuity between touch locations where one touch object briefly occludes another in the scanning beam direction, then, for example, filling in by interpolation or extrapolation.
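A minimal constant-velocity Kalman tracker of the kind referred to above (a sketch under our own tuning assumptions, not the application's filter; during occlusion predict() is called without update() so the track coasts on its motion history):

```python
import numpy as np

class TouchTrack:
    """Constant-velocity Kalman filter for one touch; state (x, y, vx, vy)."""

    def __init__(self, x, y, dt=1e-3, q=1.0, r=1e-4):
        self.x = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4) * 0.01                 # initial uncertainty (guess)
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt          # position += velocity * dt
        self.Q = np.eye(4) * q * dt               # process noise (tuning guess)
        self.H = np.eye(2, 4)                     # we only measure position
        self.R = np.eye(2) * r                    # measurement noise (guess)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                         # predicted (x, y)

    def update(self, z):
        y = np.asarray(z) - self.H @ self.x       # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```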
Figure 8 shows, schematically, a pair of sensing systems 500 of the type shown in Figure 5 used to locate two simultaneous touch objects 318 where, again, without distance information there would be ambiguity as to which touch was at which angle.
Figure 8 illustrates, schematically, the display surface 314 of an interactive whiteboard 800 (image projector not shown). It is helpful to locate one of the touch sensing modules near the top left corner of the display surface and one near the top right corner, to help reduce the risk of occlusion when there are multiple touch objects; optionally a third system may be located centrally along the top of the touch sensitive region. Since there is relatively low positional accuracy on a line directly between the touch sensing modules, preferably these are displaced laterally from the edge of the display surface so that touch events do not lie directly on a line between the modules.

Optionally, the signal processing includes a calibration procedure 750. In embodiments the location of a touch object is determined by calculation as previously described, and then a correction is applied to compensate for small errors, for example caused by errors in the positioning of the scanning mirrors and other misalignments. The correction may comprise, for example, an arbitrary polynomial mapping to a corrected position (x', y') <- (x, y), or a translation/scaling/rotation, or some other correction, for example to correct for a response of the imaging optics, to calibrate a rotation or scanning rate of the scanner, and the like. The calibration procedure may involve touching a known location in the touch surface. Optionally a calibration may be applied for each separate mirror facet by recording the calibration for each facet, identifying the facets by a fiducial mark or other coding on the scanner - this can help to correct for periodic variation in the detected location.
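The translation/scaling/rotation flavour of correction can be fitted by least squares from a few known touch points, for example (a hypothetical helper; an arbitrary polynomial mapping would simply add higher-order columns to the design matrix):

```python
import numpy as np

def fit_affine_correction(measured, actual):
    """Fit a 2D affine correction (x', y') <- (x, y) by least squares.

    measured, actual: (k, 2) arrays of raw and true positions gathered
    by touching known locations on the surface.  Returns a function
    mapping a raw (x, y) to the corrected position.
    """
    A = np.hstack([measured, np.ones((len(measured), 1))])  # rows [x, y, 1]
    M, *_ = np.linalg.lstsq(A, actual, rcond=None)          # 3x2 mapping
    return lambda x, y: tuple(np.array([x, y, 1.0]) @ M)
```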
In some implementations the system may be self-calibrating/learning. Thus multiple modules may be placed on a surface and detect one another's respective positions in 2D, calibrating automatically by communicating this data between themselves. This may be facilitated by provision of a retro reflector 802 on a touch sensing module.
In embodiments the photodiode 508 of Figure 5 used to establish a reference angle/timing may also be used for power saving and safety. For example the system may be shut down if the lack of a regular pulse at photodiode 508 is detected. Optionally the laser may be operated in a default, eye safe mode where the laser is operated at low power (or with a short duty cycle) only switching the laser fully on when scanning at full speed. Thus the laser may operate at reduced power during an initial spin-up phase. Additionally or alternatively the power or duty cycle of the laser may be reduced when there are no active touch events, to provide a reduced power standby mode. Additionally or alternatively the rotation rate of the scanner may be reduced at the same time. Optionally the laser may be pulsed on and off; more particularly the laser may be switched off during the transition between mirror facets (or in some similar portion of the cycle of a MEMS scanner) to provide a background illumination signal level from the photodiode array which may then be subtracted to reduce the background noise level.
Preferred implementations of the touch sensing system employ an infrared rather than a visible laser. Shorter wavelengths, for example 785nm, are generally preferable for reduced cost, but longer wavelengths, for example 1.3μm or longer, are preferable for improved eye safety. Whichever wavelength is employed, embodiments of the system include a band pass filter, for example a narrowband interference filter, in the optical path prior to the photodiode array to select for the wavelength of the laser and reduce the ambient light signal. Optionally, to compensate for the effect of background light, in particular light at the laser wavelength, the scanning laser may be pulsed on and off. Then during the off period a background reading from the photodiode may be taken and subtracted from subsequent readings when the scanning laser is active. This subtraction may be carried out in the analogue and/or digital signal processing domain.
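The background subtraction might look like the following sketch (hypothetical names; it assumes each detector sample is tagged with whether the laser was on, e.g. off between mirror facets):

```python
def subtract_background(samples, laser_on):
    """Remove the ambient level estimated from laser-off samples.

    samples: sequence of detector readings; laser_on: parallel sequence
    of booleans.  Off-period readings estimate the ambient light at the
    laser wavelength, which is subtracted from the active readings.
    """
    off = [s for s, on in zip(samples, laser_on) if not on]
    background = sum(off) / len(off) if off else 0.0
    return [s - background for s, on in zip(samples, laser_on) if on]
```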
In embodiments with a rotating mirror scanner it can be desirable to reduce the acoustic noise level. This can be achieved by, for example, employing air bearings rather than ball bearings and/or vacuum isolation and/or adaptive noise cancellation, and the like.
In embodiments of the system time-of-flight detection may be employed to provide further additional accuracy. Thus with a pulsed laser the time of flight to the photodetector may be employed for an additional measurement of distance to the one or more touch objects, and this additional distance information may be employed to augment that obtained from the offset on the photodiode array to provide improved two-dimensional position detection accuracy. In embodiments of such an approach sufficient accuracy may be obtained, depending upon the requirements of the touch system, by a single touch sensing module rather than the pair of modules illustrated in Figure 8. Alternatively a combination of time-of-flight and/or triangulation and/or photodetector-based position measurements may be employed with multiple touch sensing modules. For example in an arrangement of the type shown in Figure 8 time-of-flight may be employed for greater distances, where the offset-based approach is less accurate, and triangulation for shorter distances, in this way providing better performance than either alone. The skilled person will appreciate, however, that the systems we describe may nonetheless provide sufficient 2D position accuracy on their own in some applications without the need for triangulation.
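The time-of-flight contribution is the usual round-trip relation (our sketch, not a detail from the application; at 1m range the round trip is only about 6.7ns, which is what makes the timing electronics demanding):

```python
def tof_distance(round_trip_s, c=299_792_458.0):
    """Distance from pulse time of flight: out and back, so halve it."""
    return 0.5 * c * round_trip_s
```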
Touch sensing systems of the type we have described provide a number of advantages. They can be manufactured very inexpensively, even for systems covering a large LCD screen or interactive whiteboard (a larger display simply requires a slightly higher power laser); they have a relatively low profile, for example less than 10mm or less than 5mm above the display surface, or potentially even thinner; and they can provide a high refresh rate, for example 200Hz-1kHz or greater. Further, apart from small touch scanning modules positioned near the corners of the display surface, no other hardware is required and, in particular, there is no need for a bezel around the display area.
Although we have described systems based on scanning mirrors and MEMS scanners, in principle other types of scanning device, for example a diffractive optical element, may be employed. However in this latter case the timing requirements make it preferable for the diffractive scanner to provide a comb of scanning beams rather than a single scanning beam. The techniques we have described are particularly useful for implementing large touch-sensitive regions (say >0.5m in one direction), for example for large touch-sensitive displays. Such displays may be based on any display technology including flat screen and projection technology. The techniques we have described are particularly useful for an interactive whiteboard but also have advantages in smaller scale touch sensitive displays.
No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.

CLAIMS:
1. A touch sensing system comprising:
one or more optical beam sources to provide a pair of optical beams;
a pair of controllable beam deflectors comprising at least first and second beam deflectors, wherein said first beam deflector is configured to deflect a first said optical beam through a first angle towards a touch sensing region, and wherein said second beam deflector is configured to deflect a second said optical beam through a second angle towards said touch sensing region;
first and second optical detectors to detect first and second scattered light from each of said first and second beams, wherein said first and second optical detectors are configured to detect scattered light counter-propagating substantially along said first and second optical beams; and
wherein said first and second beam deflectors are controlled in tandem to scan said touch sensing region.
2. A touch sensing system as claimed in claim 1 wherein said first and second beam deflectors are controlled such that, in a plane, there is a fixed angle between deflectors of said first and second beam deflectors.
3. A touch sensing system as claimed in claim 1 or 2 wherein said first and second beam deflectors comprise faces of a rotatable polygonal mirror.
4. A touch sensing system as claimed in claim 1 or 2 wherein said first and second beam deflectors comprise phase locked MEMS deflectors.
5. A touch sensing system as claimed in claim 3 or 4 comprising a pair of collimated optical beam sources, a pair of beam splitters each in an optical path between each said beam source to a respective said beam deflector, and an imaging device in each of a pair of return optical paths split from respective said beam splitters to image said scattered light onto a respective said detector.
6. A touch sensing system as recited in any preceding claim further comprising a processor to process signals from said detectors to determine, for one or more touches in said touch sensing region, touch data defining an angular position of a said touch and distance information for a said touch defining an approximate distance between the touch and a said beam deflector.
7. A touch sensing system comprising a pair of touch sensing systems each as recited in claim 6 with overlapping touch sensing regions and a processor to process said touch data from each touch sensing system to identify matched pairs of touches in the respective touch sensing regions of the touch sensing systems, wherein said processor is configured to disambiguate angular positions of a plurality of said touches dependent on said distance information.
8. A touch sensing system as claimed in any one of claims 1 to 5 further comprising a signal processor, coupled to said detectors to receive beam detection signals, and having a timing input to receive a timing signal defining an angular position of a said beam deflector, wherein said signal processor is configured to process said timing signal and said beam detection signals to determine data defining a location of said object in said touch sensing region in two dimensions (2D).
9. A multi-touch sensing system comprising a plurality of sensing systems each as claimed in claim 8, with overlapping said touch sensing regions, wherein said location in 2D comprises data defining said location in polar coordinates including a distance from a said beam deflector and a respective azimuthal angle, the multi-touch sensing system further comprising a signal processor to determine locations of said multiple touches from said respective azimuthal angles and to disambiguate multiple touches in said overlapping touch sensing regions responsive to said distances determined for said multiple touches.
10. A multi-touch sensing system as claimed in claim 9 wherein each said touch sensing system is configured to determine the locations of the other touch sensing system(s), and to communicate respective locations of the other touch sensing system(s) to calibrate relative positions of said touch sensing systems.
11. A touch sensing system as recited in any one of claims 1 to 10 wherein said one or more optical beam sources comprise a laser, further comprising a control system to reduce an average optical output from said laser when a speed of said scanning is reduced.
12. A multi-touch sensing system, comprising:
a pair of stereoscopic touch sensors, each configured to scan a touch sensing region with a pair of beams using a common beam deflection system and configured to detect a pair of return beams scattered from a plurality of touch objects to provide stereoscopic touch object location data defining a location of a touch object in 2D with respect to a said sensor; and
at least one signal processor to process said touch object location data from said pair of stereoscopic touch sensors to match said touch objects detected by said pair of stereoscopic touch sensors dependent on said distance of said touch objects from said respective sensors, and to determine location data for said touch object from said angular positions from each of said sensors for said matched touch objects.
13. A multi-touch sensing system as claimed in claim 12 wherein each said stereoscopic touch sensor comprises a polygonal mirror illuminated by a pair of beams, each having a detector to detect scattered light from a touch object counter-propagating down a said beam.
14. A multi-touch sensing system as claimed in claim 13 wherein said touch object location data comprises data defining a pair of peaks for a said touch object, wherein a separation of said peaks defines said distance and wherein a timing of a said peak defines said angular position.
15. A touch sensing system as recited in any preceding claim comprising a signal processor to determine additional distance information from one or more of:
i) a brightness of light scattered from a said touch object;
ii) an angular spread of light scattered from a said touch object;
iii) an angle subtended by the return beam of said scattered light at an aperture viewing the return beam; and
iv) a width of a peak in a signal-angle curve of the touch sensing system.
16. A method of processing data from a touch sensing system as recited in any preceding claim, the method comprising:
inputting first and second sets of touch data from candidate touch objects identified by respective stereoscopic touch sensors/sensing systems, said touch data defining touch object distance and angle data for a plurality of said touch objects; matching touch objects in said first and second sets of touch data; and using said angle data to determine object location data for each said touch object.
17. A method of processing data from a touch sensing system as claimed in claim 16 wherein said matching comprises determining a set of match cost data for potential pairings of said touch objects detected by said sensors/sensing systems, and matching touch objects to reduce a total match cost defined by said match cost data.
18. A method of processing data from a touch sensing system as claimed in claim 16 or 17 further comprising identifying unmatched said touch objects, and using both said angle data and said touch object distance to determine object location data for each said unmatched touch object.
19. A method of processing data from a touch sensing system as claimed in any one of claims 16, 17 and 18 further comprising tracking a history of motion of each said touch object, and using said history to determine object location data for each said touch object.
20. A physical carrier carrying processor control code to implement the method of any one of claims 16 to 19.
21. An interactive whiteboard comprising a touch sensing system as recited in any one of claims 1 to 15.
PCT/GB2014/050406 2013-02-13 2014-02-12 Touch sensing systems WO2014125272A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1302481.5 2013-02-13
GB1302481.5A GB2515447B (en) 2013-02-13 2013-02-13 Touch sensing systems

Publications (1)

Publication Number Publication Date
WO2014125272A1 true WO2014125272A1 (en) 2014-08-21

Family

ID=47999014

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2014/050406 WO2014125272A1 (en) 2013-02-13 2014-02-12 Touch sensing systems

Country Status (2)

Country Link
GB (1) GB2515447B (en)
WO (1) WO2014125272A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3613066A (en) * 1968-10-22 1971-10-12 Cii Computer input equipment
US4811004A (en) 1987-05-11 1989-03-07 Dale Electronics, Inc. Touch panel system and method for using same
WO1992003807A1 (en) * 1990-08-22 1992-03-05 International Business Machines Corporation Calibration of a rotationally scanning position-tracking device
US20010028344A1 (en) 1998-11-20 2001-10-11 Fujitsu Limited Optical scanning-type touch panel
US20020145595A1 (en) * 2001-03-26 2002-10-10 Mitsuru Satoh Information input/output apparatus, information input/output control method, and computer product
US20090091553A1 (en) 2007-10-03 2009-04-09 Microsoft Corporation Detecting touch on a surface via a scanning laser
EP2498169A2 (en) * 2009-11-05 2012-09-12 Smart Sense Technology Co., Ltd. Apparatus for recognizing the position of an indicating object
WO2012172364A2 (en) 2011-06-16 2012-12-20 Light Blue Optics Ltd Touch-sensitive display devices

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7359041B2 (en) * 2003-09-04 2008-04-15 Avago Technologies Ecbu Ip Pte Ltd Method and system for optically tracking a target using a triangulation technique
KR101155923B1 (en) * 2010-07-23 2012-06-20 Samsung SDI Co., Ltd. Light scan type touch panel
US8669966B2 (en) * 2011-02-25 2014-03-11 Jonathan Payne Touchscreen displays incorporating dynamic transmitters

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3613066A (en) * 1968-10-22 1971-10-12 Cii Computer input equipment
US4811004A (en) 1987-05-11 1989-03-07 Dale Electronics, Inc. Touch panel system and method for using same
WO1992003807A1 (en) * 1990-08-22 1992-03-05 International Business Machines Corporation Calibration of a rotationally scanning position-tracking device
US20010028344A1 (en) 1998-11-20 2001-10-11 Fujitsu Limited Optical scanning-type touch panel
US20020145595A1 (en) * 2001-03-26 2002-10-10 Mitsuru Satoh Information input/output apparatus, information input/output control method, and computer product
US20090091553A1 (en) 2007-10-03 2009-04-09 Microsoft Corporation Detecting touch on a surface via a scanning laser
EP2498169A2 (en) * 2009-11-05 2012-09-12 Smart Sense Technology Co., Ltd. Apparatus for recognizing the position of an indicating object
WO2012172364A2 (en) 2011-06-16 2012-12-20 Light Blue Optics Ltd Touch-sensitive display devices

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2535429A (en) * 2014-11-14 2016-08-24 Light Blue Optics Ltd Touch sensing systems
US9854226B2 (en) 2014-12-22 2017-12-26 Google Inc. Illuminator for camera system having three dimensional time-of-flight capture with movable mirror element
US9918073B2 (en) 2014-12-22 2018-03-13 Google Llc Integrated camera system having two dimensional image capture and three dimensional time-of-flight capture with movable illuminated region of interest
US10306209B2 (en) 2014-12-22 2019-05-28 Google Llc Illuminator for camera system having three dimensional time-of-flight capture with movable mirror element
WO2017027190A1 (en) * 2015-08-07 2017-02-16 Sony Interactive Entertainment Inc. Systems and methods for using a mems projector to determine an orientation of a photosensor of an hmd or another controller
CN108027660A (en) * 2015-08-07 2018-05-11 Sony Interactive Entertainment Inc. System and method for determining the orientation of a photosensor of an HMD or another controller using a MEMS projector
US10007108B2 (en) 2015-08-07 2018-06-26 Sony Interactive Entertainment Inc. Systems and methods for using multiple MEMS projectors to determine a position of a photosensor of an HMD or another controller
US10095024B2 (en) 2015-08-07 2018-10-09 Sony Interactive Entertainment Inc. Systems and methods for using a MEMS projector to determine an orientation of a photosensor of an HMD or another controller
CN108027660B (en) * 2015-08-07 2021-04-27 Sony Interactive Entertainment Inc. System and method for determining orientation of a photosensor of an HMD or another controller using a MEMS projector
CN113209600A (en) * 2015-08-07 2021-08-06 Sony Interactive Entertainment Inc. System and method for determining position of photoelectric sensor using MEMS projector
CN105373266A (en) * 2015-11-05 2016-03-02 Shanghai Yinghuo Intelligent Technology Co., Ltd. Novel binocular vision based interaction method and electronic whiteboard system

Also Published As

Publication number Publication date
GB2515447B (en) 2021-01-20
GB201302481D0 (en) 2013-03-27
GB2515447A (en) 2014-12-31

Similar Documents

Publication Publication Date Title
US10303305B2 (en) Scanning touch systems
WO2014125272A1 (en) Touch sensing systems
US8681124B2 (en) Method and system for recognition of user gesture interaction with passive surface video displays
US9971455B2 (en) Spatial coordinate identification device
TWI453642B (en) Multiple-input touch panel and method for gesture recognition
CN101663637B (en) Touch screen system with hover and click input methods
WO2015107225A2 (en) Interactive system
US20100295821A1 (en) Optical touch panel
US20150253428A1 (en) Determining positional information for an object in space
JP2011525651A (en) Method for detecting the position of multiple objects on a contact surface
US9292133B2 (en) Spatial input device
US10877153B2 (en) Time of flight based 3D scanner
US20110115904A1 (en) Object-detecting system
JP2002055770A (en) Coordinate inputting device
WO2013108031A2 (en) Touch sensitive image display devices
US20170059710A1 (en) Object recognition device
JPWO2018216619A1 (en) Non-contact input device
US20140035812A1 (en) Gesture sensing device
WO2013016190A1 (en) Depth perception device and system
JP2019074933A (en) Non-contact input device
US9507462B2 (en) Multi-dimensional image detection apparatus
TW201327324A (en) Optical touch control module
US9213448B2 (en) Positioning module, optical touch system and method of calculating a coordinate of a touch medium
US8723837B1 (en) Scanning mirror touch screen
JP6315127B2 (en) Input device, aerial image interaction system, and input method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 14708082
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 14708082
Country of ref document: EP
Kind code of ref document: A1