WO2010073047A1 - Touch sensitive image display device - Google Patents
- Publication number
- WO2010073047A1 (PCT/GB2009/051770)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2202—Reconstruction geometries or arrangements
- G03H1/2205—Reconstruction geometries or arrangements using downstream optical component
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2294—Addressing the hologram to an active spatial light modulator
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3231—Monitoring the presence, absence or movement of users
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/12—Hotels or restaurants
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- H04N5/7416—Projection arrangements for image reproduction, e.g. using eidophor involving the use of a spatial light modulator, e.g. a light valve, controlled by a video signal
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
- G03H2001/0061—Adaptation of holography to specific applications in haptic applications when the observer interacts with the holobject
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2202—Reconstruction geometries or arrangements
- G03H1/2205—Reconstruction geometries or arrangements using downstream optical component
- G03H2001/2213—Diffusing screen revealing the real holobject, e.g. container filled with gel to reveal the 3D holobject
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2202—Reconstruction geometries or arrangements
- G03H1/2205—Reconstruction geometries or arrangements using downstream optical component
- G03H2001/2213—Diffusing screen revealing the real holobject, e.g. container filled with gel to reveal the 3D holobject
- G03H2001/2215—Plane screen
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2202—Reconstruction geometries or arrangements
- G03H1/2205—Reconstruction geometries or arrangements using downstream optical component
- G03H2001/2213—Diffusing screen revealing the real holobject, e.g. container filled with gel to reveal the 3D holobject
- G03H2001/2221—Screen having complex surface, e.g. a structured object
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2249—Holobject properties
- G03H2001/2263—Multicoloured holobject
- G03H2001/2271—RGB holobject
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2210/00—Object characteristics
- G03H2210/20—2D object
- G03H2210/22—2D SLM object wherein the object beam is formed of the light modulated by the SLM
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2210/00—Object characteristics
- G03H2210/40—Synthetic representation, i.e. digital or optical object decomposition
- G03H2210/44—Digital representation
- G03H2210/441—Numerical processing applied to the object data other than numerical propagation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Definitions
- This invention generally relates to a touch sensitive display device and to a consumer electronic device comprising such a device.
- Table down projection: projecting downwards and outwards onto a flat surface such as a tabletop entails projecting at an acute angle onto the display surface (taking this as the angle between the centre of the output of the projection optics and the middle of the displayed image; this angle, to a line in the surface, is less than 90°).
- Table down projection is not readily achievable by conventional image display techniques: scanning image display systems have a narrow throw angle and thus find it difficult to achieve a useful image size, whilst projection systems, especially those based on LEDs (light emitting diodes), which have a wide light output angle, find it difficult to achieve a useful depth of field.
- a touch sensitive image display device for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed, the device comprising: a light source to project a substantially two-dimensional first light distribution in a first plane; a light source to project a substantially two-dimensional second light distribution in a second plane, wherein said second plane is different to said first plane; a multi-pixel sensor system to remotely detect touch of an area of said surface within or adjacent to said displayed image by detecting light from said first distribution, and having an output to provide a detected touch signal; said multi-pixel sensor system to remotely detect presence of an object at least partially within said second light distribution by detecting light from said second distribution, and having an output to provide a detected presence signal; and a controller having an input to receive said detected touch signal and an input to receive said detected presence signal and configured to control said touch sensitive image display device responsive to said signals, wherein said device is configured to multiplex projection of the first light distribution and projection of the second light distribution.
- an embodiment of the first aspect may allow a reduction in the number of components required for a touch sensitive image display device.
- the projection multiplexing is in the time domain so that, e.g., the light sources may be the same light source, i.e., a single light source is used to project both distributions.
- this may be the case where light of different wavelengths obtained from a single broadband light source is used to project both distributions. This may be achieved using filters.
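One way the time-domain multiplexing described above might be scheduled can be sketched in Python. This is an illustrative sketch only; the class and callback names are assumptions, not taken from the patent. A single light source is alternately steered to project the touch plane and the presence plane, and each sensor frame is tagged with the plane that was active when it was captured.

```python
# Illustrative sketch: time-domain multiplexing of two light
# distributions from a single shared light source (names assumed).

from dataclasses import dataclass
from itertools import cycle

@dataclass
class Frame:
    distribution: str   # which plane was illuminated: "touch" or "presence"
    pixels: list        # raw multi-pixel sensor readout

class TimeMultiplexer:
    """Alternately projects the first (touch) and second (presence)
    distributions and tags each sensor frame with the active plane."""

    def __init__(self, project, read_sensor):
        self._project = project          # project(plane): steer the source
        self._read_sensor = read_sensor  # returns a list of pixel values
        self._planes = cycle(["touch", "presence"])

    def next_frame(self):
        plane = next(self._planes)
        self._project(plane)             # only one distribution on at a time
        return Frame(plane, self._read_sensor())

# Stub hardware callbacks for demonstration.
mux = TimeMultiplexer(project=lambda plane: None,
                      read_sensor=lambda: [0] * 16)
frames = [mux.next_frame() for _ in range(4)]
assert [f.distribution for f in frames] == ["touch", "presence",
                                            "touch", "presence"]
```

Because the frames are tagged at capture time, a downstream controller can demultiplex touch and presence detections without any extra signalling.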
- the or each multi-pixel sensor system may comprise one or more multi-pixel sensors such as a camera or CCD array.
- a multi-pixel sensor allows detection of location of touch or presence.
- Any one of the sensors described herein may indicate light intensity and/or location, and may further give an indication of received light intensity for judging a distance, the output for intensity being binary, M-ary (where M>2) or analogue.
- each substantially two-dimensional distribution in embodiments defines a substantially laminar distribution, for example having a divergence in a vertical plane of less than 10°.
- the display device may comprise image projection element(s).
- This may comprise a projector, in particular one having an input to receive data defining the displayed image.
- the displayed image may change dynamically in response to the touch and/or presence detections.
- SLM spatial light modulator
- the device may comprise an SLM, image projection optics comprising at least one light source to illuminate the SLM, and output optics to project modulated light from the SLM onto the surface at the acute angle.
- the light source of the image projection element(s) may comprise a coherent light source, a light emitting diode, a filament lamp and/or a fluorescent lamp.
- holographic projection may be advantageous for obtaining a longer depth of field, a wider throw angle, and/or very substantial distortion correction without substantial loss of brightness/efficiency, compared with non-holographic projection, and this may require the light source of the image projection element(s) to be a coherent light source such as a laser.
- the device having the SLM may have holographic image projection optics as the image projection optics mentioned above, the light source to illuminate the SLM may comprise a coherent light source so that the modulated light is then the coherent light, and the display device may then be configured to holographically project a touch sensitive displayed image at the acute angle.
- an image displayed on a pixellated SLM itself may be a hologram.
- the responsive control may be to "wake up" an element of the display device upon receiving a touch and/or presence signal.
- one of the light sources may be switched on or the power emitted therefrom increased, or a screen saver or further (e.g., sub-) image may be displayed over a portion of the displayed image or adjacent thereto.
- the above touch sensitive image display device comprising: at least one further light source to project a further substantially two-dimensional second light distribution in a further plane, wherein said further plane is different to each other said plane; and at least one further multi-pixel sensor system, each said further multi-pixel sensor system to remotely detect presence of an object at least partially within at least one said further light distribution by detecting light from said at least one further distribution, and having an output to provide a detected presence signal.
- each of the first, second and further planes is different in that they do not substantially coincide, even if, for example, any of the planes intersect.
- the distributions are stacked one above the other, preferably substantially parallel to the surface where the image is displayed.
- touch sensitive image display device configured to multiplex projection of the first light distribution, the second light distribution and at least one said further light distribution.
- detections relating to each distribution may occur in a predetermined sequence where the multiplexing is in time, and/or a shared light source may be used for each distribution. Further similarly, the light distributions may be projected using different wavelengths derived from a shared broadband light source.
- said multi-pixel sensor system to detect touch comprises a plurality of multi-pixel sensors each to remotely detect touch of an area of said surface within or adjacent to said displayed image, and having an output to provide a detected touch signal.
- with more than one multi-pixel sensor it may advantageously be possible to detect two or more simultaneous touches, even where one touching object (e.g., a finger) is hidden behind the other, since the touch sensors sense touching of the displayed image from different angles/viewpoints/locations.
- multiple sensors may allow sensing of relative movement, distance or direction of touches such as the bringing of two fingers together to perform an action responsive to relative position, speed or direction.
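The relative-movement sensing mentioned above, such as bringing two fingers together to trigger an action, could be implemented by comparing the distance between two detected touch points across consecutive frames. The following is a minimal, hypothetical sketch; the function name and threshold value are assumptions for illustration.

```python
import math

def pinch_direction(prev_points, curr_points, threshold=1.0):
    """Classify a two-finger gesture from two consecutive pairs of
    touch coordinates: 'pinch_in', 'pinch_out' or 'none'."""
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    delta = spread(curr_points) - spread(prev_points)
    if delta < -threshold:
        return "pinch_in"    # fingers moving together
    if delta > threshold:
        return "pinch_out"   # fingers moving apart
    return "none"

# Fingers 10 units apart move to 6 units apart: a pinch-in.
assert pinch_direction([(0, 0), (10, 0)], [(2, 0), (8, 0)]) == "pinch_in"
assert pinch_direction([(0, 0), (6, 0)], [(0, 0), (9, 0)]) == "pinch_out"
```

Speed and direction of the gesture could be derived the same way, from the per-frame change in spread and in the midpoint of the two touch locations.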
- said multi-pixel sensor system to detect presence comprises a plurality of multi-pixel sensors each to remotely detect presence of an object at least partially within said second light distribution, and having an output to provide a detected presence signal.
- the use of multiple multi-pixel sensors to detect presence may allow simultaneous detection of more than one presence, especially in the hidden-object scenario described above, and may be advantageous in relation to the relative movement/distance/direction of presences mentioned above.
- the light sources may comprise a shared light source.
- the or each light source may be an infrared light source or may emit visible light.
- first plane and said second plane are substantially parallel planes.
- the second distribution may be located, e.g., about 1 to about 2 centimetres above the first distribution.
- the first and second light distributions may be substantially parallel.
- the distributions are at least non-intersecting.
- the above touch sensitive image display device wherein said multiplexing is wavelength multiplexing. Additionally or alternatively, there may be provided the above touch sensitive image display device, wherein said multiplexing is time multiplexing.
- the first and second light distributions may be projected using different respective wavelengths, e.g., wavelength division multiplexing, e.g., using different infra-red wavelengths emitted from the first and second light sources.
- each light source may be pulsed on when no other of the light sources to project a distribution is on.
- the controller may be configured to control the timing of the emission from the light sources or of projecting the distributions (in the case of a shared light source).
- the input to receive the output to provide a detected touch signal may be the input to receive the output to provide a detected presence signal.
- the detected touch signal and detected presence signal may be provided on the same input line.
- said responsive controller is configured to read said detected touch signal and said detected presence signal in synchronism with said time multiplexing.
- Thus, the controller may be configured to read in the detected touch signal at a predetermined instant of time when the first distribution is being projected and a valid touch signal may be available; the same applies to reading in the detected presence signal.
- the controller may read the touch and presence signals alternately and may further control the synchronised detection using a phase locked loop (PLL).
- PLL phase locked loop
- the use of a PLL may further enable a detection signal to be filtered to select components having the same frequency as the projection of a particular corresponding distribution. This may advantageously improve the signal-to-noise ratio of the detection input signals.
- the controller is configured to perform said responsive control by distinguishing between said receiving of said detected touch signal and said receiving of a detected presence signal on the basis of a timing of receiving a said signal.
- the above touch sensitive image display device configured to determine an action to be initiated by said responsive control on the basis of said distinguishing.
- a particular action may be selected when a corresponding location of the displayed image is touched and/or a different action (e.g., wake up or power up of a light source) may be selected when presence is detected.
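The timing-based demultiplexing just described, i.e. reading a shared input line in synchronism with the projection sequence and dispatching either a touch or a presence action, might look like the following. This is a sketch under the assumption of a simple alternating half-period schedule; the function names and action names are illustrative, not from the patent.

```python
def classify_detection(sample_time, period, order=("touch", "presence")):
    """Return which distribution a detection belongs to, assuming the
    projector alternates distributions every half period and the sensor
    is read in synchronism with that schedule."""
    phase = (sample_time % period) / period
    return order[0] if phase < 0.5 else order[1]

# Hypothetical mapping from detection type to responsive action.
actions = {"touch": "select_menu_item", "presence": "wake_up_display"}

# A detection in the first half-period is a touch detection;
# one in the second half-period is a presence detection.
assert classify_detection(0.2, period=1.0) == "touch"
assert classify_detection(0.7, period=1.0) == "presence"
assert actions[classify_detection(0.7, period=1.0)] == "wake_up_display"
```

In a real device the phase reference would come from the controller's own projection timing (or a PLL locked to it) rather than an absolute clock.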
- the above touch sensitive image display device further comprising moveable optics to alternately project said first light distribution and said second light distribution.
- the moveable optics may comprise a rotatable or tiltable mirror.
- said multi-pixel sensor system to detect touch is to detect a location of said touch and said responsive control performs an action determined on the basis of said detected touch location and controls said device to perform said action.
- said multi-pixel sensor system to detect presence is to detect a location of said presence and said responsive control performs an action determined on the basis of said detected presence location and controls said device to perform said action.
- coordinates or pixel identifiers may be used to express the detected location, and distortion compensation may be applied to them.
- the action comprises selecting a menu on the basis of said detected presence location and displaying said selected menu.
- the sensor system to detect touch is configured to detect light scattered from said first light distribution.
- said sensor system to detect presence is configured to detect light scattered from said second light distribution.
- the above detected light in either case may have been, e.g., reflected or diffracted by an object in the distribution.
- the detection may alternatively occur by detecting attenuation of light received at the sensor system, due to an object attenuating or blocking a portion of light in the distribution.
- the object may be for example the user's finger or a stylus or any other object that is touching the displayed image or the surface area on which the image is displayed.
- the above touch sensitive image display device wherein said controller is configured to detect hovering of a said object by detecting absence of a said touch detection and occurrence of a said presence detection on the basis of said signals.
- responsive actions may be performed on the basis of the action of hovering as opposed to touching, and even, advantageously, on the basis of the location or movement of the hovering object within the second or a further distribution.
- Hovering may correspond to a user remaining present but having paused manual action so that the finger or other object remains suspended, e.g., in the second distribution.
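The hover logic above (a presence detection with no accompanying touch detection) reduces to a simple predicate over the two signals. A minimal sketch, with assumed signal and state names:

```python
def interaction_state(touch_detected, presence_detected):
    """Combine the detected touch and detected presence signals into a
    single interaction state, per the hovering definition above."""
    if touch_detected:
        return "touch"     # object reaches the first (touch) plane
    if presence_detected:
        return "hover"     # object only within the second (presence) plane
    return "idle"          # no object in either distribution

assert interaction_state(touch_detected=False, presence_detected=True) == "hover"
assert interaction_state(touch_detected=True, presence_detected=True) == "touch"
assert interaction_state(touch_detected=False, presence_detected=False) == "idle"
```

A practical controller would likely also debounce these states over several frames before, for example, waking the display or dismissing a menu.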
- touch sensitive image display device further comprising a spatial light modulator (SLM) and a controller to control said SLM, on the basis of data defining said displayed image, to replay a target image distorted to compensate for projection onto said surface at said acute angle.
- the controller may process input data defining the (preferably un-distorted) image to be displayed to generate data suitable for modulation of the SLM to project the image such that the image as projected on the surface appears substantially un-distorted.
- the replay of the target image may mean driving a pixellated SLM according to target image data.
- the acute angle may mean an angle of less than 90 degrees to the surface where the image is displayed (this applies throughout the present specification).
- Data defining an image for display may comprise data defining at least one image for display on an SLM to replay a target image distorted to compensate for projection onto the surface at the acute angle.
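Projecting onto the surface at an acute angle foreshortens the image (keystone distortion), so the target image driven onto the SLM must be pre-distorted with the inverse mapping. One common way to express such a planar pre-distortion is a 3x3 homography; the sketch below uses an illustrative matrix whose values are assumptions, not taken from the patent.

```python
# Minimal keystone pre-compensation sketch: map each target-image point
# on the surface back to SLM coordinates with a planar homography.

def apply_homography(H, x, y):
    """Apply a 3x3 homography (nested lists) to a point (x, y)."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# A homography whose perspective row shrinks rows further from the
# projector, so the image appears undistorted on the tilted surface
# (the 0.5 foreshortening coefficient is an assumed example value).
H_inverse = [[1.0, 0.0, 0.0],
             [0.0, 1.0, 0.0],
             [0.0, 0.5, 1.0]]

x, y = apply_homography(H_inverse, 10.0, 4.0)
# w = 0.5 * 4 + 1 = 3, so the point is scaled down by a factor of 3.
assert abs(x - 10.0 / 3.0) < 1e-9 and abs(y - 4.0 / 3.0) < 1e-9
```

In the holographic case the same compensation can instead be folded into the hologram calculation itself, which is one reason the specification notes that holographic projection achieves distortion correction without substantial loss of brightness.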
- a multi-pixel sensor of said multi-pixel sensor system to detect touch is a multi-pixel sensor of said multi-pixel sensor system to detect presence.
- the two systems may share a multi-pixel sensor.
- the touch sensor system may comprise at least one pixel of a multi-pixel sensor and the presence sensor system may comprise at least one other pixel of the shared multi-pixel sensor.
- At least one pixel of the presence sensor may have a greater optical aperture than the at least one pixel of the touch sensor.
- the touch sensor system may use a substantially central portion of the multi-pixel sensor while the presence system may use boundary pixels of the sensor, and those pixels used for presence detection may be larger than the central ones, i.e., have greater optical aperture and thus provide lower resolution detections.
- said multi-pixel sensor system to detect touch is configured to use pixels of said multi-pixel sensor and said multi-pixel sensor system to detect presence is configured to use other pixels of said multi-pixel sensor. This is broadly consistent with the description above concerning a shared sensor. Furthermore, this allows the presence detection to detect proximity or movement towards (approach of) the displayed image, particularly if the pixels used for touch detect light scattered only from touching the displayed image but the pixels used for presence detect light scattered only from near the displayed image.
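Partitioning one sensor between the two systems, with a central block for touch and lower-resolution boundary pixels for presence, can be sketched as below. The one-pixel border and the square frame are simplifying assumptions for illustration.

```python
def split_sensor(frame):
    """Split a square multi-pixel sensor frame (a list of rows) into
    the central block used for touch detection and the one-pixel
    border used for (lower-resolution) presence detection."""
    n = len(frame)
    centre = [row[1:n - 1] for row in frame[1:n - 1]]
    border = [frame[r][c] for r in range(n) for c in range(n)
              if r in (0, n - 1) or c in (0, n - 1)]
    return centre, border

frame = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
centre, border = split_sensor(frame)
assert centre == [[5]]                          # central touch pixels
assert sorted(border) == [1, 2, 3, 4, 6, 7, 8, 9]  # boundary presence pixels
```

Giving the border pixels a greater optical aperture, as the text suggests, trades spatial resolution for sensitivity, which suits presence detection since only coarse location is needed there.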
- a consumer electronic device such as a camcorder, camera, music player, mobile phone, media player or computer, comprising the above touch sensitive image display device.
- the acute angle projection may allow preview of a recorded video stream to be observed on a surface on which the camcorder is placed.
- a touch sensitive image display device for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed, the device comprising: a light source to project a first light distribution; a light source to project a second light distribution; a multi-pixel sensor system to remotely detect location of touch of an area of said surface within or adjacent to said displayed image by detecting change of said first distribution, and having an output to provide a detected touch location signal; a multi-pixel sensor system to remotely detect location of an object at least partially within said second light distribution, and having an output to provide a detected object location signal; a controller having an input to receive said detected touch location signal and an input to receive said detected object location signal, and configured to control said device responsive to said detected touch location signal and to control said device responsive to said detected object location signal.
- an embodiment may thus provide a user interface for touch and presence detection which may use respective light distributions at different levels 'above' the displayed image (taking the displayed image and surface as being 'below', even if the projection is not downwards as such), and further advantageously with more than one presence detection distribution (i.e., a plurality of further light distributions that enable presence detections in the same manner as for the second distribution).
- a user interface may allow a user to control the device by presence or movement in different layers, the actions performed by the device in response to the user's control advantageously depending on which particular light distribution the user touch/presence is detected in.
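By way of an illustrative sketch of such layer-dependent control (the layer names and the actions associated with each distribution below are assumptions for illustration, not taken from any described embodiment), the device might dispatch a different action depending on the light distribution in which the object is detected:

```python
# Hypothetical sketch: selecting a device action based on which light
# distribution (layer) an object was detected in. Layer names and actions
# are illustrative assumptions.

LAYER_ACTIONS = {
    "d1": "select",        # touch layer: activate the item under the finger
    "d2": "highlight",     # first presence layer: highlight / hover
    "d3": "wake_display",  # outer presence layer: wake the projector
}

def select_action(layer, location):
    """Return (action, location) for a detection in the given layer."""
    action = LAYER_ACTIONS.get(layer)
    if action is None:
        raise ValueError(f"unknown light distribution: {layer}")
    return action, location

print(select_action("d2", (120, 45)))  # ('highlight', (120, 45))
```

The same detected location thus triggers different behaviour at different heights above the surface, which is what gives the interface its layered, joystick-like quality.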
- the user interface may allow the user to control the device at least similarly as when using a joystick, the user being able to achieve this merely by use of his finger or a stylus, for example.
- the display device may further comprise one or more image projection elements, such as a projector.
- the device may further have a spatial input for receiving data defining the displayed image.
- a spatial light modulator (SLM)
- liquid crystal on silicon (LCOS)
- a dynamic image may be displayed, i.e., the displayed image may change or be updated (e.g., scrolled, panned, zoomed) in response to detection of touch and/or presence (or movement thereof).
- the above touch sensitive image display device may further comprise an SLM, image projection optics comprising at least one light source to illuminate the SLM, and output optics to project modulated light from the SLM onto the surface at the acute angle.
- the projector may have at least one light source comprising a coherent light source, a light emitting diode, a filament lamp and/or fluorescent lamp.
- a coherent light source such as a laser.
- the above image projection optics may be holographic image projection optics
- the light source to illuminate the SLM may comprise a coherent light source so that the modulated light is coherent light
- the touch sensitive image display device may then be configured to holographically project a touch sensitive displayed image at the acute angle.
- the image displayed on the SLM itself which may be pixellated, may then be a hologram.
- data defining an image for display may comprise data to define at least one image for display on the SLM to replay a target image distorted to compensate for projection onto the surface at the acute angle.
- Advantages of holographic projection in an embodiment may include a wider throw angle, longer depth of field, and/or very substantial distortion correction without substantial loss of brightness/efficiency, compared to a non-holographic embodiment.
- the or each light source may include projection optics such as a lens which may be, e.g., cylindrical, holographic or lenticular. This is particularly advantageous where the or each light distribution is substantially two-dimensional e.g. a sheet distribution.
- the second light distribution may be above the first light distribution. Compared to the case where they are on the same level, this may be advantageous in avoiding any need to perform a touch to initiate an action (e.g., to wake the device up or an element of the touch sensing system such as the light source for the first light distribution).
- the first light distribution is very close to the surface where the image is displayed, for example less than about 3mm above the surface.
- the first distribution is advantageously sufficiently close to the surface that the user feels that the device is controllable by touching rather than by merely positioning an object in the first distribution.
- the second distribution may be about 1 to about 2 cm above the first distribution.
- the use of a multi-pixel sensor system may allow the location of a touch or of a presence to be detected.
- a location may be provided from the system as identifiers of pixels detecting light from the object or may be co-ordinate values e.g. (x,y).
- the or each said detection may be achieved by detecting light scattered, reflected or diffracted from a light distribution.
- the remote touch detection may detect light of the first distribution that has been re-directed.
- a detection of blocking or attenuating of light of the distribution may be performed.
- Any one of the sensors described herein may indicate light intensity and/or location, wherein any such indication of intensity may be binary, M-ary (where M>2) or analogue.
- the above touch sensitive image display device wherein at least one said output to provide a detected location signal is configured to indicate a light distribution of said location detection.
- the outputs may be physically separate outputs, in which case the indication of distribution may be at least implicit from the physical output on which the signal appears. If the outputs are provided on the same signal line, an identifier of distribution may be explicitly provided with the signal.
- there may be one output signal line indicating the detection signal and the particular distribution in which the object was detected e.g., by providing a detection indicator and a distribution identifier
- the distribution may be determined by timing of the detection signal if the projections of light distributions are multiplexed in time in a pre-determined sequence.
- the provided detected location signal configured to indicate the light distribution may be the detected touch location signal or the detected presence location signal.
- the above touch sensitive image display device comprising: at least one light source to project at least one further light distribution; said multi-pixel sensor system to remotely detect location of an object at least partially within the or each said further light distribution, and having at least one output to indicate a said further distribution and to provide a signal indicating a said location of said object in said indicated further distribution.
- the or each of the at least one light sources may then include projection optics such as a lens, which may be, e.g., cylindrical, holographic or lenticular, particularly advantageously for providing a substantially two-dimensional distribution such as a sheet distribution.
- the feature of projecting at least one further light distribution may mean that any number of light distributions can be provided and used similarly to the above second light distribution for detecting presence.
- the provision of the multi-pixel sensor system may allow detection of location of presence in the or each further light distribution.
- the system may then comprise one multi-pixel sensor per further light distribution or such a sensor may be shared for detections in different distributions, for example, where multiplexing is used.
- the indication of further distribution may be achieved by an output on a signal line indicating the detection and a distribution identifier, or the distribution may be determined by timing of the detection signal on the signal line if the projecting of light distributions is multiplexed in time in a predetermined sequence.
- the controller is configured to select an action on the basis of at least one of said signals, and to control said touch sensitive image display device to perform said action.
- the at least one signal may comprise a detected touch location signal, a detected object location signal relating to the second distribution, or detected object location signal(s) relating to any further distribution.
- the controller may be configured to select said action further on the basis of at least one said indication of light distribution. This may advantageously allow control of the display device by the user moving an object such as his finger in different levels/layers corresponding to distributions of the user interface.
- the above touch sensitive image display device wherein said touch sensitive image display is operable by a user's finger as a joystick.
- the light sources, sensor systems and controller may detect and respond to movement of the user's finger or stylus or similar object in a joystick-like manner.
- the touch is a touch by a user's finger or other object.
- the object may be a user's finger or other object.
- the controller is further configured to perform a said responsive control on the basis of a rate of change of at least one said detected location.
- the detected location may be the location of a touch and/or object presence, in any combination of the first, second and further distributions. This further applies to the above touch sensitive image display device, wherein the controller is further configured to perform a said responsive control on the basis of a direction of change of at least one said detected location.
- the controller is further configured to perform a said responsive control on the basis of a locus of change of at least one said detected location.
- the detected location may be of a touch or an object presence, in any combination of the distributions.
- the controller is further configured to perform a said responsive control on the basis of difference between two said detected locations.
- the controller is further configured to perform a said responsive control on the basis of a rate of change of difference between two said detected locations.
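As an illustrative sketch of acting on the rate of change of the difference between two detected locations (e.g., a two-finger pinch/spread gesture), assuming co-ordinate locations are available from the multi-pixel sensor system; the function name and units are assumptions:

```python
import math

def pinch_rate(p1_t0, p2_t0, p1_t1, p2_t1, dt):
    """Rate of change of the separation between two detected locations.

    A positive value may be interpreted as a 'spread' (zoom in), a negative
    value as a 'pinch' (zoom out). Units: sensor pixels per second.
    """
    d0 = math.dist(p1_t0, p2_t0)   # separation at the earlier time
    d1 = math.dist(p1_t1, p2_t1)   # separation dt seconds later
    return (d1 - d0) / dt

# Two fingers moving apart by 30 pixels over 0.1 s:
print(pinch_rate((10, 10), (50, 10), (0, 10), (70, 10), 0.1))  # 300.0
```

A controller could map the sign of this rate to zooming the displayed image in or out, and its magnitude to the zoom speed.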
- a said responsive control controls said touch sensitive image display device to update or move at least a portion of said displayed image.
- a said responsive control controls said touch sensitive image display device to display a further image adjacent to or at least partially overlapping said displayed image.
- a further image may be a sub-image at least partly within the displayed image.
- Such a sub-image may be, e.g. a menu or button (e.g. stop, start, fast forward, rewind, etc.).
- the further image may replace at least a portion of the displayed image or be displayed in a watermark style with the displayed image.
- the further image may substantially replace the entire displayed image. For example, if the further image fully overlaps the displayed image, the further image could be a low-power screen saver, the further image being adapted to reduce the required projection power relative to a higher complexity image.
- a said multi-pixel sensor system comprises at least one multi-pixel sensor.
- the multi-pixel sensor system may be the touch system or the object location system or both, and the multi-pixel sensor there may be a camera or CCD array or any other means for detecting a location.
- a said sensor of said multi-pixel sensor system to detect touch is a said sensor of said multi-pixel sensor system to remotely detect location of an object.
- the systems may each comprise a sensor shared with the other system. Furthermore, where each system comprises a single sensor, there may be only one sensor provided in the display device for all sensing. This may apply similarly where further distributions are used as described above.
- touch sensitive image display device further comprising an anti-distortion system to map a said detected location of touch to a portion of said input image.
- the said touch or object location and/or the portion may correspond to one or more pixels of the input image.
- the above touch sensitive image display device wherein at least one said light source is another of said light sources.
- the light source may comprise any combination of the light sources projecting the first, second or any further distribution. Thus, there may be fewer light sources than distributions, and even a single light source for projecting all distributions, if the distributions are multiplexed in wavelength or time.
- any combination of the sources may comprise an infra-red light source.
- either or both of the sources (and/or the sensors or shared sensor) may have a filter to pass infra-red but not visible light.
- At least two light sources may emit light at different wavelengths.
- different infra-red wavelengths may be used for each light source so that the distributions can be used simultaneously.
- Any number of light sources (and/or the sensors or shared sensor) may comprise a filter e.g. a notch filter, for passing a pre-determined wavelength.
- a consumer electronic device such as a camcorder, camera, music player, mobile phone, media player or computer, comprising the touch sensitive image display device according to any preceding claim.
- a camcorder may use the display device to allow a user to preview a recorded video sequence by observing an image projected at an acute angle from the camcorder onto a surface on which the camcorder stands.
- the carrier may be, for example, a disk, CD- or DVD-ROM, or programmed memory such as read-only memory (Firmware).
- the code (and/or data) may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, for example for a general purpose computer system or a digital signal processor (DSP), or the code may comprise code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (Trade Mark) or VHDL (Very high speed integrated circuit Hardware Description Language).
- Fig. 1 shows a display device of an embodiment
- Fig. 2 shows a display device of an embodiment
- Fig. 3a shows a basic block diagram of elements relating to Figures 1 and 2;
- Fig. 3b shows a more detailed block diagram of an embodiment having features discussed in relation to Figures 1 - 2 and 3a;
- Fig. 3c shows a more detailed block diagram of an embodiment
- Fig. 4 shows a timing diagram for illustrating a first mode of operation of an embodiment
- Fig. 5 shows a timing diagram of multiplexing of detection in the time domain in an embodiment
- Fig. 6 shows an arrangement including a phase locked loop for synchronising distribution projection and receiving of detection signals in an embodiment
- Fig. 7 shows how a controller C may determine a proximity detect signal in an embodiment
- Fig. 8 illustrates a shared multi-pixel sensor in an embodiment
- Fig. 9 shows a consumer electronics device such as a camcorder including a touch sensitive image display device.
- the embodiment includes a projector for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed.
- the acute angle is generally less than 90° from the surface where the image is displayed.
- the angle may be pre-determined or may be adjustable by, for example, moveable optics.
- the projection optics may comprise a Texas Instruments (registered trademark) digital light processor, an LCOS imager and/or a laser scanner.
- the image may be projected indirectly to the surface, by first projecting the image to a mirror attached to the housing of the projector, the mirror being arranged to be further from the surface than the projector itself so that the resulting displayed image may be larger.
- the projector projects the image on the basis of data defining a desired image.
- the data may be received in an image processor of the projector from a data input receiving data from a source internal or external to the display device.
- the image processor may have code for calculating a target image for projection, such that the displayed image as observed by the user does not suffer distortion resulting from the acute angle. Additionally or alternatively, such anti-distortion calculations may account for distortion due to curvature of the surface on which the image is displayed. Such compensation may be carried out by forward or reverse mapping between the (preferably undistorted) input image data and the target image, and may involve cropping of image data.
- the acute angle projection may be achieved by providing a stand for supporting the display device, or at least the projector thereof, at a pre-determined or adjustable angle.
- the projector may be used to scan, e.g., known shapes, patterns and/or lines onto a surface and any described multi-pixel sensing system may be operated 'synchronously' to capture an image to correlate.
- the captured images may be used to compute a 3-D surface profile.
- the profile information may then be used to create a correction function, e.g., look-up table, for images projected onto that surface.
- Figure 1 shows the display device 1 comprising a projector 2, sensor systems 3 and 4, light source 5 associated with output optics 5a and light source 6 associated with output optics 6a.
- the sensor systems 3 and 4 may be combined such that a single sensor 7a or 7b with optional anti-distortion processor AD1, AD2 can be shared for both presence and touch sensing.
- This is shown by the dashed lines in Figure 1, particularly in relation to the anti-distortion units AD1 and AD2, which may be comprised in the sensor system(s) or in the controller C.
- Figure 1 further shows that the light source 6 projects a first light distribution d1 over the displayed image Im, and light source 5 projects a second distribution d2 over the first distribution.
- the first distribution is preferably less than about 3mm from the surface where the image is displayed and the second distribution d2 may be about 1 to about 2cm above the first distribution.
- Figure 1 further shows that a further distribution, or any number of further distributions d3, may be used to allow detections at different levels above the displayed image.
- Figure 1 shows that a finger f may be detected touching the displayed image on the basis of sensing light from the first distribution d1. Similarly, presence of the finger f in the distribution d2 may be detected, and this applies further to any further distributions d3. Thus, the user may be able to control the device by "touching" at different levels. Furthermore, detection using the distribution d2 can be used to detect proximity or approach towards the displayed image. In more detail, regarding the equations shown in Figure 1, the difference between two positions such as p1 and p2, which positions are detectable by sensors 7a and 7b if these are multi-pixel sensors, may result in the selection of a function to be performed by the controller C.
- the positions may be locations of touch or presence in one particular distribution or may be locations in different distributions.
- a versatile user interface may be provided such that, for example, the user can control the device in a joystick-like manner by using his finger, a stylus, pen, etc.
- Figure 2 shows an embodiment of a display device 11 with corresponding elements as described above in relation to Figure 1.
- the combination of light source 5 (with any required output optics 5a) and sensor 7a may be provided in the form of a transceiver, including, e.g., a laser and camera.
- the embodiment of Figure 2 illustrates that the second optical distribution d2 may be a three-dimensional volume such as a cone.
- sensors 7a and 7b may detect light reflected or scattered from an object such as finger f.
- the embodiment of Figure 2 may provide a less complex apparatus when it is merely required to detect whether or not an object is near, i.e., proximate to a displayed image, whether this is for allowing power management, an improved user interface or even for safety (e.g., if any light source such as that of the projector is a laser). This may be achieved by detecting the intensity of light scattered from an object. A suitable action taken by the controller C in response to such a detection may be to turn on any element of the device 11, such as the light source 6 and/or projector 2 (in particular the light source of the projector). For example, the touch sensing system may be disabled when a user is not interacting with the display device.
- power hungry devices may be switched on, or at least have their power increased, only when proximity is detected.
- a further, e.g., sub-, image can be displayed with the above-mentioned displayed image in response to the detection.
- absence of detections of touch in the distribution d1 can be used to switch off or reduce the above powers. Consequently, it may be possible in any one of the embodiments described herein to reduce power consumption and/or to display further images with the above mentioned displayed image only when the user is interacting with the device.
- a further image may be a sub-image that replaces, or is overlaid on (similarly to a watermark), a portion of the displayed image, or it may be an image that substantially replaces the entire displayed image.
- the further image could be a low-power screen saver, the further image being adapted to reduce the required projection power relative to a higher complexity image.
- this may be a menu or a control button such as stop, fast forward, rewind, start, pause etc.
- Such a further image may be displayed (e.g., under control of the controller) in response to any detection signal described herein.
- embodiments may provide power saving features and/or a better interface for the user.
- Figure 3a shows a basic block diagram of elements described above in relation to Figures 1 or 2. It is noted that, in some embodiments, the sensor systems 3 and 4 may be a combined system that multiplexes in wavelength or time sensing and/or projecting of light of the distributions. In more detail, Figure 3a shows that the light sources may be infra-red. (It is noted that any indication of direction(s) of signal transmission between blocks in any block diagram of this specification is purely for example only and does not require bidirectional transmission on any link).
- the sensor system 3 may be a multi-pixel power sensor such as a camera for calculating location (position).
- a second sensor is provided for the sensor system 4, this may be a single pixel or at least have a lower number of pixels than the sensor(s) of system 3.
- the sensor system 3 may be suitable for detecting location of a touch while the system using the second sensor may be suitable merely for detecting presence or motion.
- the light source 6 may provide a substantially two-dimensional sheet just above the surface where the image is displayed, this sheet being used to do the touch sensing.
- a further substantially two-dimensional sheet of light may be projected or alternatively the distribution may be a three-dimensional cone or volume of light.
- a more detailed block diagram of a further embodiment, having any of the above features discussed in relation to Figures 1 - 3 and corresponding elements, is shown in Figure 3b.
- This detailed arrangement may be implemented within a consumer electronic device such as a camcorder, computer, mobile phone etc.
- a memory M is provided for image or video data.
- the image data/pages may be derived from a video input, non-volatile RAM storage or any other source.
- the image sensor IS1 is optional and may be present in a device for recording video or still images, such as a camera or camcorder; the image processor IP is a further optional part associated with the image sensor IS1.
- the memory may be a self-contained unit or found within the projector or the controller C.
- the output optics 2a are generally associated with the projector 2.
- the display processor Ca, power management Cb and detect unit Cc are all associated in Figure 3b with the controller C. However, they may be provided in other units coupled to the controller C.
- a proximity detect block corresponding to a sensor system 3 is shown coupled to the power management block Cb and may comprise the light source 5 and detector 7a.
- an output from a detector 7b in the form of a camera for detecting location of a touch is shown coupled to the power management block Cb.
- the power management can switch off illumination of the first distribution for touch sensing and/or of the displayed image, or at least reduce the power used for these processes. The illumination may be switched off after a period of absence of touch detections, and the proximity detector may be polled thereafter until proximity is detected, so that the illumination can be turned back up to full power.
- Figure 3b further shows a memory M1 that may be used to map locations of detected touches to locations of the preferably un-distorted input image so that anti-distortion compensation can be performed.
- the touch map memory M1 may be used to implement a look up table for anti-distortion compensation.
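A minimal sketch of how such a look up table might be used, assuming the table maps sensor pixel co-ordinates to input-image pixel co-ordinates; the resolutions and the linear placeholder calibration below are assumptions, whereas a real table would be measured to undo the keystone distortion of the acute-angle projection:

```python
# Hypothetical sketch of the touch-map lookup table M1: each sensor pixel
# is pre-mapped (e.g. during a calibration step) to a pixel of the
# undistorted input image, so a detected touch location can be compensated
# with a single lookup.

SENSOR_W, SENSOR_H = 64, 48    # assumed touch-sensor resolution
IMAGE_W, IMAGE_H = 320, 240    # assumed input-image resolution

# Placeholder calibration: simple linear scaling stands in for the
# measured anti-distortion mapping.
touch_map = {
    (sx, sy): (sx * IMAGE_W // SENSOR_W, sy * IMAGE_H // SENSOR_H)
    for sx in range(SENSOR_W)
    for sy in range(SENSOR_H)
}

def touch_to_image(sensor_xy):
    """Map a detected touch (sensor co-ordinates) to an input-image pixel."""
    return touch_map[sensor_xy]

print(touch_to_image((32, 24)))  # (160, 120)
```

One lookup per detected touch keeps the compensation cheap enough to run on every detection event.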
- Figure 3c shows a yet more detailed block diagram that may correspond to elements of any one of the embodiments described in this specification.
- Figure 4 shows a timing diagram for illustrating a first mode of operation of any one of the embodiments described herein. The mode is particularly advantageous in an embodiment where the above power management is implemented.
- the upper trace of Fig. 4 shows relatively high power in the first distribution when projection from the light source 6 is active.
- the lower trace shows lower power in the second light distribution due to low duty cycle of projection from the light source 5.
- Figure 4 shows that the first light source for touch sensing by means of the distribution d1 may provide a relatively high power light distribution when active, in comparison to the average power of the second distribution generated by the other light source.
- the lower power may be due to a lower duty cycle where the second distribution is pulsed on and off repetitively.
- Pulsing of one or more of the distributions in any embodiment described herein may enable rejection of at least some of the ambient light in a corresponding sensor system detection, particularly where a PLL is used as further described herein.
- the background/ambient signal level may be read as frequently as, preferably twice as frequently as, the pulsing of the corresponding light source.
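An illustrative sketch of this ambient rejection, assuming the sensor is read twice per pulse period so that one background reading precedes each illuminated reading; the sample layout and function name are assumptions:

```python
def ambient_rejected(samples):
    """Subtract interleaved background readings from illuminated readings.

    `samples` alternates [background, illuminated, background, illuminated,
    ...] because the sensor is read twice as frequently as the light source
    is pulsed. Returns the ambient-corrected signal for each pulse.
    """
    corrected = []
    for i in range(0, len(samples) - 1, 2):
        background, illuminated = samples[i], samples[i + 1]
        corrected.append(max(illuminated - background, 0))  # clamp at zero
    return corrected

# Sensor counts with a slowly varying ambient level of ~100:
print(ambient_rejected([100, 160, 102, 158, 98, 99]))  # [60, 56, 1]
```

Because each illuminated reading is paired with an immediately preceding background reading, slowly varying ambient light is removed even without optical filtering.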
- while an object such as a hand is detected, the high-power first distribution source is maintained on. However, when absence of such an object is detected, the higher power source is turned down or switched off and polling for proximity begins by pulsing the lower power second distribution d2. When proximity is detected in the distribution d2, because the hand or other object has returned, the relatively higher power first distribution source for touch sensing is switched back to full power operation once more.
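The polling behaviour described above might be sketched as the following small state machine; the class, state names and idle timeout are illustrative assumptions:

```python
# Hypothetical sketch of the power management: while touches are seen the
# high-power first distribution stays on; after a period with no touches it
# is switched off and the low-power second distribution polls for proximity.

IDLE_TIMEOUT = 5.0  # assumed: seconds without a touch before powering down

class PowerManager:
    def __init__(self):
        self.touch_source_on = True
        self.last_touch_time = 0.0

    def on_touch(self, t):
        self.last_touch_time = t
        self.touch_source_on = True       # full-power touch sensing

    def on_proximity(self, t):
        if not self.touch_source_on:
            self.touch_source_on = True   # object returned: power back up
            self.last_touch_time = t      # grace period before next power-down

    def tick(self, t):
        if self.touch_source_on and t - self.last_touch_time > IDLE_TIMEOUT:
            self.touch_source_on = False  # start low-power proximity polling

pm = PowerManager()
pm.on_touch(0.0)
pm.tick(6.0)
print(pm.touch_source_on)  # False: polling for proximity
pm.on_proximity(7.0)
print(pm.touch_source_on)  # True: touch sensing restored
```

The high-power source is thus only driven while the user is plausibly interacting, which is the power saving the embodiment aims for.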
- Figure 5 shows a timing diagram of multiplexing of detection in the time domain. This may be used in any one of the embodiments to reduce the component count required for the device, e.g., by requiring a single light source and/or single sensor system.
- the upper trace shows projection in the lower (preferably infra-red (IR)) layer.
- the lower trace shows projection in the upper (IR) layer.
- the light distributions d1 (lower IR layer) and d2 (upper IR layer) are switched to full power alternately.
- a single sensing system or even sensor may be configured to provide touch and presence detection signals synchronous with the alternating, such that the identification of each signal as being of touch or presence may be identified at least by the timing of the detection.
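A minimal sketch of attributing a detection to a distribution purely from its timing under such alternation; the fixed frame period and the layer order are assumptions:

```python
# Hypothetical sketch of time-domain multiplexing: the two IR layers are
# switched to full power alternately, so a detection made by the shared
# sensor is attributed to a layer from the frame in which it occurs.

FRAME_PERIOD = 0.01                    # assumed: seconds per layer slot
LAYERS = ("d1_touch", "d2_presence")   # assumed: even frames d1, odd frames d2

def layer_for_detection(timestamp):
    """Identify the active light distribution at the detection time."""
    frame = int(timestamp / FRAME_PERIOD)
    return LAYERS[frame % 2]

print(layer_for_detection(0.004))  # d1_touch (frame 0)
print(layer_for_detection(0.013))  # d2_presence (frame 1)
```

This is what allows a single sensor, and even a single output signal line, to serve both touch and presence detection.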
- Figure 6 shows that a phase locked loop (PLL) may be used to synchronise the distribution projection and receiving of light signals for detection such that, for example, laser power may be controlled accordingly.
- An advantage of such an implementation may be improvement of the signal-to-noise ratio of the detection signals, since the sensor output can be filtered to obtain detection signals occurring at the same frequency and/or phase as the distribution pulsing, for example.
- the PLL may be part of the controller C for acting on the detection, or may be in a different device component.
- Figure 7 shows that, in any of the embodiments described herein, the controller C may determine a proximity detect signal and on this basis control one or more elements of the device to be switched on/off or switched to a lower/higher power mode.
- a plurality of such control signals may be provided, for example for switching off predetermined touch sensor elements, such as the light source for the distribution d1, or for turning the projector 2, or at least the light source thereof, off.
- Figure 8 illustrates a principle applicable in any one of the embodiments described herein, wherein the sensor systems 3 and 4 may share a multi-pixel sensor.
- the camera functions as the sensor 7a, 7b and has an array of pixels that may have a grid format such as that shown in the lower portion of Figure 8.
- a signal processor, which may be found in the controller C, outputs on the basis of the camera output a location such as co-ordinates (x, y) and a detection signal. (The location may even indicate which distribution the location has been detected in.)
- the resolution of the pixels used for proximity detection, shown at the top of the grid, may be lower than that of the pixels used for touch location detection.
- the controller may have knowledge of co-ordinate or locations corresponding to mere proximity and those corresponding to a detected touch location on the displayed image. Furthermore, the controller may have memory and/or processing for mapping the displayed image co-ordinates to input image regions including to compensate for distortion due to the acute angle or curvature of the surface.
- Figure 9 shows a consumer electronics device REC such as a camcorder including a touch sensitive image display device of any one of the embodiments described herein.
- shown are an optional lens of the device for recording video, an output optics lens of a projector 2 of the touch sensitive image display device, the displayed image Im, and light sources 5, 6 (which may be combined, i.e. comprise a single, shared light source).
- Such a device may include elements of any embodiment described herein in any combination, in particular those of the block diagrams described.
- the following relates to the above-mentioned anti-distortion compensation and is applicable in any embodiment described herein, in particular those using an SLM and in particular holographic projection.
- light from the entire illuminated area of the SLM may be directed into the distorted target image field.
- the displayed image is substantially focus-free; that is the focus of the displayed image does not substantially depend upon the distance from the image projection system to the display surface.
- a demagnifying optical system may be employed to increase the divergence of the modulated light to form the displayed image, thus allowing an image of a useful size to be displayed at a practical distance.
- the field of the displayed image may suffer from keystone distortion, the trapezoidal distortion of a nominally rectangular input image field caused by projection onto a surface at an angle which is not perpendicular to the axis of the output optics.
- the image projection system internally generates a target image to which the inverse distortion has been applied so that when this target image is projected the keystone distortion is compensated.
- the target image is the image to which a transform is applied to generate data for display on the SLM.
- the system also includes non- volatile memory storing mapping data for mapping between the input image and the target image.
- either forward or reverse mapping may be employed, but preferably the latter, in which pixels of the target image are mapped to pixels of the input image, a value for a pixel of the target image then being assigned based upon lookup of the value of the corresponding pixel in the input image.
- the trapezoidal shape of the target image field is located in a larger, for example rectangular target image (memory) space and then each pixel of the target image field is mapped back to a pixel of the (undistorted) input image and this mapping is then used to provide values for the pixels of the target image field.
- This is preferable to a forward mapping from the input image field to the distorted target image field for reasons which are explained below.
- the transform is only applied to the distorted, generally trapezoidal target image field rather than to the entire (rectangular) target image memory space, to avoid performing unnecessary calculations.
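The backward-mapping construction described above can be sketched in code. The following Python sketch is illustrative only: the inverse-mapping callable and the nearest-neighbour lookup are assumptions, not details from this specification.

```python
import numpy as np

def backward_map(input_img, inv_map, target_shape):
    """Populate a target image by stepping through target pixels and
    looking each one up in the (undistorted) input image.

    inv_map(xp, yp) -> (x, y) is an assumed callable implementing the
    target-to-input (anti-distortion) mapping.
    """
    h_in, w_in = input_img.shape
    target = np.zeros(target_shape, dtype=input_img.dtype)
    for yp in range(target_shape[0]):
        for xp in range(target_shape[1]):
            x, y = inv_map(xp, yp)
            xi, yi = int(round(x)), int(round(y))
            # Target pixels whose source falls outside the input image
            # stay zero: they lie outside the trapezoidal target field
            # within the larger rectangular target memory space.
            if 0 <= xi < w_in and 0 <= yi < h_in:
                target[yp, xp] = input_img[yi, xi]
    return target
```

Because every target pixel is assigned exactly once, this avoids the unpopulated-pixel problem that arises with forward mapping.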
- a compensation is also applied for variations in per unit area brightness of the projected image due to the acute angle projection.
- diffraction from a given pixel of the SLM may contribute to substantially the entire displayed image (where holographic projection is used in an embodiment)
- the diffracted light from this pixel will be distorted resulting in more illumination per unit area at the short-side end of the trapezoid as compared with the long-side end of the trapezoid.
- an amplitude or intensity scale factor is applied, the value of which depends upon the location (in two dimensions) of a pixel in the target image space.
- This amplitude/intensity compensation may be derived from a stored amplitude/intensity map determined, for example, by a calibration procedure or it may comprise one or a product of partial derivatives of a mapping function from the input image to the anti-distorted target image.
- the amplitude/intensity correction may be dependent on a value indicating what change of area in the original, input image results from a change of area in the anti-distorted target image space (at the corresponding position) by the same amount.
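The "product of partial derivatives" correction mentioned above amounts to a Jacobian determinant of the target-to-input mapping. A minimal numerical sketch, assuming the mapping is supplied as two callables f and g (illustrative names, not from the specification):

```python
def intensity_scale(f, g, xp, yp, eps=1e-3):
    """Estimate |d(x, y)/d(x', y')|, the local area ratio of the
    mapping (x, y) = (f(x', y'), g(x', y')), by central finite
    differences. This ratio may serve as a per-pixel amplitude or
    intensity correction factor; a calibrated map is an alternative.
    """
    dfdx = (f(xp + eps, yp) - f(xp - eps, yp)) / (2 * eps)
    dfdy = (f(xp, yp + eps) - f(xp, yp - eps)) / (2 * eps)
    dgdx = (g(xp + eps, yp) - g(xp - eps, yp)) / (2 * eps)
    dgdy = (g(xp, yp + eps) - g(xp, yp - eps)) / (2 * eps)
    # Jacobian determinant: how much input-image area corresponds to a
    # unit of area at this position in the target image space.
    return abs(dfdx * dgdy - dfdy * dgdx)
```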
- mapping pixels of the input image to pixels of the target image may not populate all the pixels of the target image with values.
- One approach to address this issue is to map a pixel of the input image to an extended region of the target image, for example, a regular or irregular extended spot. In this case a single pixel of the input image may map to a plurality of pixels of the target image.
- pixels of the target image which remain unpopulated may be given values by interpolation between pixels of the target image populated with pixel values.
- these extended regions or spots may overlap in the target image, in which case the value of a target image pixel may be determined by combining, more particularly summing, the overlapping values (so that multiple input image pixels may contribute to the value of a single target image pixel).
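A forward-mapping sketch of the extended-spot approach: each input pixel is spread over a small square spot in the target image and overlapping contributions are summed. The square spot shape and the forward-mapping callable are illustrative assumptions.

```python
import numpy as np

def forward_map_spots(input_img, fwd_map, target_shape, radius=1):
    """Forward-map each input pixel to an extended square spot in the
    target image, summing overlapping contributions so that several
    input pixels may contribute to one target pixel.

    fwd_map(x, y) -> (xp, yp) is an assumed callable giving the
    (distorted) target-image position of input pixel (x, y).
    """
    target = np.zeros(target_shape)
    h_t, w_t = target_shape
    for y in range(input_img.shape[0]):
        for x in range(input_img.shape[1]):
            xc, yc = fwd_map(x, y)
            # Spread the pixel over a (2*radius+1)^2 spot.
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    xt, yt = int(round(xc)) + dx, int(round(yc)) + dy
                    if 0 <= xt < w_t and 0 <= yt < h_t:
                        target[yt, xt] += input_img[y, x]
    return target
```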
- Preferred embodiments of the image projection system provide a multi-colour, more particularly a full colour display.
- red, green and blue laser illumination of the SLM may be employed, time multiplexed to display three colour planes of the input image in turn.
- the blue light diverges less than the red light and thus in preferred embodiments the target image also has three colour planes in which a different scaling is employed for each colour, to compensate for the differing sizes of the projected colour image planes. More particularly, since the red light diverges most, the target image field of the red colour plane is the smallest target image field of the three target image planes (since the target image has "anti-distortion" applied).
- the size of the target image field for a colour is inversely proportional to the wavelength of light used for that colour.
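The inverse-wavelength scaling rule can be expressed directly; the wavelength values below are illustrative assumptions, not figures from the specification.

```python
# Illustrative laser wavelengths in nanometres (assumed values).
WAVELENGTHS_NM = {"red": 640, "green": 532, "blue": 450}

def field_scale(colour, reference="blue"):
    """Relative size of a colour plane's target image field, taking
    field size inversely proportional to wavelength: red, which
    diverges most, gets the smallest (anti-distorted) target field."""
    return WAVELENGTHS_NM[reference] / WAVELENGTHS_NM[colour]
```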
- the distortion (more correctly anti-distortion) of each colour image plane may be mapped to a corresponding colour plane of the target image field using a calibration process which corrects for chromatic aberration within the projection system such as chromatic aberration within the projection optics, chromatic aberration caused by slight misalignment between rays for different colours within the optics, and the like.
- the techniques employed in preferred embodiments of the projector facilitate miniaturisation of the projector. These techniques also facilitate handling of extreme distortion caused by projection onto a surface on which the projector is placed, this extreme distortion resulting from the geometry illustrated in figure 1c in combination with the small size of the projector.
- the surface onto which the image is projected is no more than 1m, 0.5m, 0.3m, 0.2m, 0.15m, or 0.1m away from the output of the projection optics 102.
- the distance from the output of the projection optics to the furthest edge of the displayed image is substantially greater than the distance from the output of the projection optics to the nearest edge of the displayed image, for example 50%, 100%, 150%, 200% or 250% greater.
- the acute projection angle may be less than 70°, 65°, 60°, 55°, 50°, or even 45°.
- the device may also provide a forward projection mode and incorporate a stand such as a bipod or tripod stand, and preferably also a sensor to automatically detect when the device is in its table-down projection configuration, automatically applying distortion compensation in response to such detection.
- the projection optics may be adjusted to alter between forward and table-down projection. This could be achieved with a moveable or switchable mirror, but an alternative approach employs a wide angle or fisheye lens which, when translated perpendicular to the output axis of the optics, may be employed to move from forward projection to table-down projection at an acute angle.
- a mapping between the input image and the anti-distorted target image may comprise either an analytical mapping, based on a mathematical function, or a numerical mapping, for example, derived from a calibration procedure or both.
- target image pixels are mapped to input image pixels to lookup target image pixel values.
- the target image is also corrected for area mapping distortion and, in a colour system, preferably the different colour planes are appropriately scaled so that they are reproduced on the projection surface at substantially the same size.
- in devices and methods, preferably an (AD)OSPR-type procedure (WO2007/031797) is employed to generate the hologram data.
- a single displayed image or image frame is generated using a plurality of temporal holographic subframes displayed in rapid succession such that the corresponding images average in an observer's eye to give the impression of a single, noise-reduced displayed image.
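The temporal-averaging principle behind this subframe scheme (not the hologram-generation algorithm itself) can be illustrated numerically: averaging N independently noisy subframes reduces the residual noise roughly by a factor of sqrt(N). All values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
target = np.full((8, 8), 0.5)       # the intended image
N = 16                               # number of temporal subframes
# Each subframe reproduces the target plus independent noise.
subframes = target + 0.1 * rng.standard_normal((N, 8, 8))
perceived = subframes.mean(axis=0)   # the eye averages rapid subframes
err_one = np.abs(subframes[0] - target).mean()
err_avg = np.abs(perceived - target).mean()
# Averaging reduces the perceived noise, roughly by sqrt(N).
assert err_avg < err_one
```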
- buttons for play, stop etc. may be overlaid on the displayed video image only when necessary so that the video can be comfortably viewed by the user when the user is not manually interacting with the device.
- a further advantage is a reduction in the power consumption of a touch sensitive display device, achieved by using a lower power proximity sensing system that allows the higher power touch sensing system to operate at lower power or even "sleep" when the user is not manually interacting but, e.g., merely passively observing the displayed image. Such proximity detection may further allow the device to be activated merely by a user's hand or other object approaching the displayed image rather than only by direct touch activation. Thus, power saving and/or a better interface may be achievable.
- the projector in any one of the embodiments described herein may be holographic since this may advantageously provide a wide throw angle, long depth of field and very substantial distortion correction with less loss of brightness/efficiency than in non-holographic projectors.
- These techniques are described in our UK patent application number GB0822336.4 filed on December 8, 2008 hereby incorporated by reference in its entirety.
- the mapping between a target image for display and an input image is described by a pair of polynomial expansions and, more particularly by two sets of polynomial coefficients for these expansions.
- a location (x, y) in the input image space is expressed as a pair of functions f', g' of the coordinates (x', y') in the (anti-distorted) target image space, as follows: x = f'(x', y'), y = g'(x', y').
- mapping from the target to the input image is employed.
- An example pair of polynomial expansions is given below:
- a single pixel of the target image may map to a plurality of pixels in the input image. This can be appreciated because the distortion effectively shortens the nearer edge of the input image as compared with the more distant edge from the output optics. Therefore in some preferred embodiments the target image is constructed by stepping through the (x', y') positions in the target image and, for each, looking up the addresses of the corresponding pixels in the input image and using the values from these pixels to assign a value to the corresponding pixel in the target image. Where multiple input image pixels correspond to a single target image pixel the values of the input image pixels may, for example, be summed, or some other approach may be employed, for example selecting a value such as a mean, median or mode value.
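The pair of polynomial expansions can be evaluated as follows; the coefficient layout (c[i][j] multiplying x'^i y'^j) is an assumed convention, since the specification does not fix one.

```python
def eval_poly2d(coeffs, xp, yp):
    """Evaluate sum_{i,j} coeffs[i][j] * xp**i * yp**j."""
    return sum(c * xp**i * yp**j
               for i, row in enumerate(coeffs)
               for j, c in enumerate(row))

def target_to_input(f_coeffs, g_coeffs, xp, yp):
    """(x, y) = (f'(x', y'), g'(x', y')): map a target-image position
    back to the input-image position whose value should be looked up."""
    return eval_poly2d(f_coeffs, xp, yp), eval_poly2d(g_coeffs, xp, yp)
```

With identity coefficients (f' = x', g' = y') the mapping returns the position unchanged; a real device would use coefficients derived from calibration.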
- mapping from the input image to the target image can leave holes in the target image, that is pixels with unpopulated values.
- a single pixel of the input image may be mapped to a regular or irregular spot with an extended size (over multiple pixels) in the target image, optionally with a superimposed intensity distribution such as a Gaussian distribution.
- a hologram H(X, Y) of the target image is generated to approximate the following expression: where N represents the number of pixels in the hologram in the X and Y directions (here, for simplicity, the same number).
- the region of the target image space outside the image may be filled with zeros and therefore in some preferred implementations the evaluation of H(X, Y) is performed over a window of target image space defined by the target image, for efficiency.
- a set of functions fR', gR', fG', gG', fB', gB' is employed to correct for chromatic aberration, positioning of the different coloured lasers and the like.
- when mapping using a forward function from the input image to the target image space, the scaling applied is to multiply rather than divide by wavelength, and the above approaches are adapted mutatis mutandis.
- a different approach may, however, be employed when forward mapping from the input image to the target image space.
- where an input image pixel is mapped to an extended area or spot in the target image space, area correction may be performed automatically by adding the contributions from overlapping spots in the target image space; that is, a target image pixel value may be determined by adding pixel values from input image pixels whose extended spots overlap that target image pixel.
- a spatial light modulator such as a pixellated liquid crystal device, which may be transmissive or reflective, for displaying a target image based on the input image may allow the displayed image to be updated (e.g., scrolled, panned, zoomed) dynamically in response to touch and/or proximity/presence (or movement thereof).
- touch-sensitive displays for the following: mobile phone; PDA; laptop; digital camera; digital video camera; games console; in-car cinema; navigation systems (in-car or personal e.g. wristwatch GPS); head-up and helmet-mounted displays e.g. for automobiles and aviation; watch; personal media player (e.g. photo/video viewer/player, MP3 player, personal video player); dashboard mounted display; laser light show box; personal video projector (a "video iPod (RTM)" concept); advertising and signage systems; computer (including desktop); remote control unit; an architectural fixture incorporating an image display system; more generally any device where it is desirable to share pictures and/or for more than one person at once to view an image.
- an embodiment may advantageously provide a combined input-output, i.e., interactive, display having a 3-D touch interface (i.e., one that is able to detect and respond to touch/presence of an object in different spatial regions).
- any above-described presence sensor system which is not arranged to detect touch of the displayed image but merely presence and/or approach and/or location in a corresponding second distribution, may be provided to allow sleep and/or wake-up detection for any element of the device, in particular of the touch sensing system comprising light source for the first distribution and touch sensor system.
Abstract
We describe a touch sensitive image display device for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed, the device comprising: a light source to project a substantially two-dimensional first light distribution in a first plane; a light source to project a substantially two- dimensional second light distribution in a second plane, wherein said second plane is different to said first plane; a multi-pixel sensor system to remotely detect touch of an area of said surface within or adjacent to said displayed image by detecting light from said first distribution, and having an output to provide a detected touch signal; said multi-pixel sensor system to remotely detect presence of an object at least partially within said second light distribution by detecting light from said second distribution, and having an output to provide a detected presence signal; and said controller having an input to receive said detected touch signal and an input to receive said detected presence signal and configured to control said touch sensitive image display device responsive to said signals, wherein said device is configured to multiplex projection of the first light distribution and projection of the second light distribution.
Description
TOUCH SENSITIVE IMAGE DISPLAY DEVICE
FIELD OF THE INVENTION
This invention generally relates to a touch sensitive display device and to a consumer electronic device comprising such a device.
BACKGROUND TO THE INVENTION
We have previously described techniques for displaying an image holographically - see, for example, WO 2005/059660 (Noise Suppression Using One Step Phase Retrieval), WO 2006/134398 (Hardware for OSPR), WO 2007/031797 (Adaptive Noise Cancellation Techniques), WO 2007/110668 (Lens Encoding), and WO 2007/141567 (Colour Image Display), and GB application GB0823457.7 filed on December 24, 2008 (Holographic Image Display Systems). These are all hereby incorporated by reference in their entirety.
Projecting downwards and outwards onto a flat surface such as a tabletop entails projecting at an acute angle onto the display surface (taking this as the angle between the centre of the output of the projection optics and the middle of the displayed image - this angle, to a line in the surface, is less than 90°). We conveniently refer to this as "table down projection". Table down projection is not readily achievable by conventional image display techniques; scanning image display systems have a narrow throw angle and thus find it difficult to achieve a useful image size whilst projection systems, especially those based on LEDs (light emitting diodes) which have a wide light output angle, find it difficult to achieve a useful depth of field. Moreover table down projection can often involve very substantial distortion of an image which can result in inefficient use of the area of an image display device, resulting in major reductions in image brightness and overall system efficiency. Background information relating to compensating for keystone distortion in an LCD projector can be found in US6,367,933 (WO00/21282); further background prior art can be found in: WO02/101443; US6,491,400; and US7,379,619.
Holographic image display techniques are described in our UK Patent Application number GB0822336.4 filed on 8 December 2008 hereby incorporated by reference in its entirety.
The inventors have further recognised that "tabledown" projectors of the type we have previously described can be combined with touch sensing technology to provide a touch sensitive image display with many advantages, including where the image display technique is non-holographic or holographic.
Background prior art relating to touch sensing can be found, for example, in patent applications filed by Lumio Inc (such as WO2008/038275) and VKB Inc (such as US2007/222760), as well as in patent applications filed by Canesta Inc (for example US6,323,942), and patent applications filed by Sensitive Object (such as WO2006/108443 and WO2008/146098).
SUMMARY Multiplexed systems
According to a first aspect of the present invention, there is provided a touch sensitive image display device for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed, the device comprising: a light source to project a substantially two-dimensional first light distribution in a first plane; a light source to project a substantially two-dimensional second light distribution in a second plane, wherein said second plane is different to said first plane; a multi-pixel sensor system to remotely detect touch of an area of said surface within or adjacent to said displayed image by detecting light from said first distribution, and having an output to provide a detected touch signal; said multi-pixel sensor system to remotely detect presence of an object at least partially within said second light distribution by detecting light from said second distribution, and having an output to provide a detected presence signal; and said controller having an input to receive said detected touch signal and an input to receive said detected presence signal and configured to control said touch sensitive image display device responsive to said signals, wherein said device is configured to multiplex projection of the first light distribution and projection of the second light distribution.
Advantageously, an embodiment of the first aspect may allow a reduction in the number of components required for a touch sensitive image display device. This may be the case where the projection multiplexing is in the time domain so that, e.g., the light sources may be the same light source, i.e., a single light source is used to project both distributions. Similarly, this may be the case where light of different wavelengths obtained from a single broadband light source is used to project both distributions. This may be achieved using filters.
- The or each multi-pixel sensor system may comprise one or more multi-pixel sensors such as a camera or CCD array. In particular, a multi-pixel sensor allows detection of location of touch or presence. (Any one of the sensors described herein may indicate light intensity and/or location, and may further give an indication of received light intensity for judging a distance, the output for intensity being binary, M-ary (where M>2) or analogue).
- Here and in the description below, the or each substantially two-dimensional first distribution in embodiments defines a substantially laminar distribution, for example having a divergence in a vertical plane of less than 10°.
- To project the touch sensitive displayed image, the display device may comprise image projection element(s). This may comprise a projector, in particular one having an input to receive data defining the displayed image. Furthermore, the displayed image may change dynamically in response to the touch and/or presence detections. This may be achieved using a spatial light modulator (SLM), which may for example be a pixellated liquid crystal device such as an LCOS (liquid crystal on silicon) device and may be a transmissive or reflective device. More specifically, the device may comprise an SLM, image projection optics comprising at least one light source to illuminate the SLM, and output optics to project modulated light from the SLM onto the surface
at the acute angle. Furthermore, in any embodiment, the light source of the image projection element(s) may comprise a coherent light source, a light emitting diode, a filament lamp and/or a fluorescent lamp.
- However, holographic projection may be advantageous for obtaining a longer depth of field, a wider throw angle, and/or very substantial distortion correction without substantial loss of brightness/efficiency, compared with non-holographic projection, and this may require the light source of the image projection element(s) to be a coherent light source such as a laser. In such an embodiment, the device having the SLM may have holographic image projection optics as the image projection optics mentioned above, the light source to illuminate the SLM may comprise a coherent light source and the modulated light is then the coherent light, and the display device may then be configured to holographically project a touch sensitive displayed image at the acute angle. Thus an image displayed on a pixellated SLM itself may be a hologram.
In particular embodiments, the responsive control may be to "wake up" an element of the display device upon receiving a touch and/or presence signal. For example, one of the light sources may be switched on or the power emitted therefrom increased, or a screen saver or further (e.g., sub-) image may be displayed over a portion of the displayed image or adjacent thereto.
There may further be provided the above touch sensitive image display device, comprising: at least one further light source to project a further substantially two-dimensional second light distribution in a further plane, wherein said further plane is different to each other said plane; and at least one further multi-pixel sensor system, each said further multi-pixel sensor system to remotely detect presence of an object at least partially within at least one said further light distribution by detecting light from said at least one further distribution, and having an output to provide a detected presence signal.
- Thus, there may be any number of second light distributions. This may advantageously allow presence sensing on multiple levels. Particularly advantageously, each of the first, second and further planes is different in that they do not substantially coincide, even if, for example, any of the planes intersect. Preferably, the distributions are stacked one above the other, preferably substantially parallel to the surface where the image is displayed.
There may further be provided the above touch sensitive image display device, wherein said device is configured to multiplex projection of the first light distribution, the second light distribution and at least one said further light distribution.
Thus, similarly as described above in relation to time and wavelength, detections relating to each distribution may occur in a predetermined sequence where the multiplexing is in time, and/or a shared light source may be used for each distribution. Further similarly, the light distributions may be projected using different wavelengths derived from a shared broadband light source.
There may further be provided the above touch sensitive image display device, wherein said multi-pixel sensor system to detect touch comprises a plurality of multi-pixel sensors each to remotely detect touch of an area of said surface within or adjacent to said displayed image, and having an output to provide a detected touch signal.
Thus, where more than one multi-pixel sensor is used, it may advantageously be possible to detect two or more simultaneous touches, especially if one touching object (e.g. finger) is hidden behind the other and the touch sensors sense touching of the displayed image from different angles/view points/locations. Furthermore, multiple sensors may allow sensing of relative movement, distance or direction of touches such as the bringing of two fingers together to perform an action responsive to relative position, speed or direction.
There may further be provided the above touch sensitive image display device, wherein said multi-pixel sensor system to detect presence comprises a plurality of multi-pixel sensors each to remotely detect presence of an object at least partially within said second light distribution, and having an output to provide a detected presence signal.
Similarly as described above in relation to simultaneous detection of more than one touch, the use of multiple multi-pixel sensors to detect presence may allow simultaneous detection of more than one presence especially regarding the above hidden scenario and may be advantageous in relation to the above relative movement/distance/direction of presences.
There may further be provided the above touch sensitive image display device, wherein said light source to project said first light distribution is said light source to project said second light distribution. Thus, the light sources may comprise a shared light source. The or each light source may be an infrared light source or may emit visible light.
There may further be provided the above touch sensitive image display device, wherein said first plane and said second plane are substantially parallel planes. For example, when the second plane is 'above' the first plane when the touch sensitive image display device is in use ('above' assumes the displayed image and surface are 'below', even when the direction of projection is not downwards as such), the second distribution may be located, e.g., about 1 to about 2 centimetres above the first distribution. Thus, the first and second light distributions may be substantially parallel. However, preferably, the distributions are at least non-intersecting.
There may further be provided the above touch sensitive image display device, wherein said multiplexing is wavelength multiplexing. Additionally or alternatively, there may be provided the above touch sensitive image display device, wherein said multiplexing is time multiplexing.
For the wavelength multiplexing, the first and second light distributions may be projected using different respective wavelengths, e.g., wavelength division multiplexing, e.g., using different infra-red wavelengths emitted from the first and second light sources.
- Regarding the above time multiplexing, this may be such that, when the first light distribution is being projected, the second light distribution is not being projected. Where such multiplexing occurs, each light source may be pulsed on when no other of the light sources to project a distribution is on. The controller may be configured to control the timing of the emission from the light sources or of projecting the distributions (in the case of a shared light source). In such cases, the input to receive the output to provide a detected touch signal may be the input to receive the output to provide a detected presence signal. In other words, the detected touch signal and detected presence signal may be provided on the same input line. Of further advantage to a time multiplexing embodiment, there may be provided the above touch sensitive image display device, wherein said responsive controller is configured to read said detected touch signal and said detected presence signal in synchronism with said time multiplexing. Thus, the controller may be configured to read in the signal at a predetermined instant of time when the first distribution is being projected and when a valid touch signal may be available. The same applies to reading in the detected presence signal. In particular, the controller may read the touch and presence signals alternately and may further control the synchronous detection using a phase locked loop (PLL). The use of a PLL may further enable a detection signal having the same frequency as a particular corresponding distribution projection to be isolated by filtering. This may advantageously improve the signal-to-noise ratio of detection input signals.
There may further be provided the above touch sensitive image display device, wherein the controller is configured to perform said responsive control by distinguishing between said receiving of said detected touch signal and said receiving of a detected presence signal on the basis of a timing of receiving a said signal.
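A toy sketch of distinguishing the two signals by read timing in such a time-multiplexed system; the class and slot names are illustrative, not from the specification.

```python
import itertools

class MultiplexedReader:
    """Interpret a single detector input line according to which light
    distribution is currently being projected: readings taken in the
    'touch' slot are touch signals, those in the 'presence' slot are
    presence signals. A real controller would synchronise the slots to
    the light-source pulses (e.g. via a PLL)."""

    def __init__(self):
        self._slots = itertools.cycle(["touch", "presence"])

    def read(self, raw_signal):
        slot = next(self._slots)
        return slot, raw_signal
```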
There may still further be provided the above touch sensitive image display device, configured to determine an action to be initiated by said responsive control on the basis of said distinguishing. Thus, for example, a particular action may be selected when a corresponding location of the displayed image is touched and/or a different action (e.g., wake up or power up of a light source) may be selected when presence is detected.
- There may yet further be provided the above touch sensitive image display device, further comprising moveable optics to alternately project said first light distribution and said second light distribution. In such an embodiment, the moveable optics may comprise a rotatable or tiltable mirror.
There may further be provided the above touch sensitive image display device, wherein said multi-pixel sensor system to detect touch is to detect a location of said touch and said responsive control performs an action determined on the basis of said detected touch location and controls said device to perform said action.
The detected touch location may be provided as coordinate values, e.g., (x,y), or as pixel identifiers indicating the pixels of a multi-pixel sensor that detected a change in a light distribution resulting from the touch. Determining the action on the basis of the location may involve applying a reverse distortion to map the actual
detected location onto a corresponding location in the original, preferably un-distorted, input image. This may compensate for distortion resulting from the acute angle.
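Hit-testing a detected touch against controls defined in the undistorted input image might look like the following; the inverse-mapping callable and the button rectangles are illustrative assumptions.

```python
def hit_test(px, py, inv_map, buttons):
    """Apply the reverse distortion to a detected touch location and
    return the name of the control it falls inside, or None.

    inv_map(px, py) -> (x, y) maps the detected location back to
    input-image coordinates; buttons maps a name to an
    (x0, y0, x1, y1) rectangle in input-image coordinates
    (all names illustrative)."""
    x, y = inv_map(px, py)
    for name, (x0, y0, x1, y1) in buttons.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```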
Similarly, there may be provided the above touch sensitive image display device, wherein said multi-pixel sensor system to detect presence is to detect a location of said presence and said responsive control performs an action determined on the basis of said detected presence location and controls said device to perform said action. Again, coordinates or pixel identifiers and distortion compensation may be applied.
There may further be provided the above touch sensitive image display device, wherein the action comprises selecting a menu on the basis of said detected presence location and displaying said selected menu. This is of particular advantage where an action is determined on the basis of a detected touch/presence location as described above. Thus, context-sensitive menus may be displayed.
There may be further provided the above touch sensitive image display device, wherein the sensor system to detect touch is configured to detect light scattered from said first light distribution. Similarly, there may be provided the above touch sensitive image display device, wherein said sensor system to detect presence is configured to detect light scattered from said second light distribution.
Alternatively, the above detected light in either case may have been, e.g., reflected or diffracted by an object in the distribution. Further alternatively, the detection may occur by detecting attenuation of light received in the sensor system due to the presence of an object attenuating or blocking a portion of light in the distribution. The object may be for example the user's finger or a stylus or any other object that is touching the displayed image or the surface area on which the image is displayed.
There may further be provided the above touch sensitive image display device, wherein said controller is configured to detect hovering of a said object by detecting absence of a said touch detection and occurrence of a said presence detection on the basis of said signals. Thus, responsive actions may be performed on the basis of the action of hovering as opposed to touching, and even advantageously on the basis of the location or movement of the hovering object within a second or further distribution. Hovering may correspond to a user remaining present but having paused manual action so that the finger or other object remains suspended, e.g., in the second distribution.
There may still further be provided the above touch sensitive image display device, further comprising a spatial light modulator (SLM) and a controller to control said SLM, on the basis of data defining said displayed image, to replay a target image distorted to compensate for projection onto said surface at said acute angle.
Thus, the controller may process input data defining the (preferably un-distorted) image to be displayed to generate data suitable for modulation of the SLM to project the image such that the image as projected on the surface appears substantially un-distorted. The replay of the target image may mean driving a pixellated SLM
according to target image data. The acute angle may mean an angle of less than 90 degrees to the surface where the image is displayed (this applies throughout the present specification). Data defining an image for display may comprise data defining at least one image for display on an SLM to replay a target image distorted to compensate for projection onto the surface at the acute angle.
There may further be provided the above touch sensitive image display device, wherein a multi-pixel sensor of said multi-pixel sensor system to detect touch is a multi-pixel sensor of said multi-pixel sensor system to detect presence. Thus, the two systems may share a multi-pixel sensor. In this case, the touch sensor system may comprise at least one pixel of a multi-pixel sensor and the presence sensor system may comprise at least one other pixel of the shared multi-pixel sensor. At least one pixel of the presence sensor may have a greater optical aperture than the at least one pixel of the touch sensor. Thus, the touch sensor system may use a substantially central portion of the multi-pixel sensor while the presence system may use boundary pixels of the sensor, and those pixels used for presence detection may be larger than the central ones, i.e., have greater optical aperture and thus provide lower resolution detections.
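The partition of a shared multi-pixel sensor described above — central pixels for touch, larger boundary pixels for presence — could be sketched as follows (the border width of 2 pixels is an arbitrary illustrative choice, not taken from the specification):

```python
def classify_pixel(row, col, rows, cols, border=2):
    """Assign a pixel of the shared sensor to the touch or presence system.

    Pixels within `border` of the sensor edge serve the lower resolution
    presence system; the substantially central pixels serve touch detection.
    """
    on_edge = (row < border or col < border or
               row >= rows - border or col >= cols - border)
    return "presence" if on_edge else "touch"
```

A controller reading the sensor could then route each pixel's detection signal to the appropriate sensing system according to this classification.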
There may further be provided the above touch sensitive image display device, wherein said multi-pixel sensor system to detect touch is configured to use pixels of said multi-pixel sensor and said multi-pixel sensor system to detect presence is configured to use other pixels of said multi-pixel sensor. This is broadly consistent with the description above concerning a shared sensor. Furthermore, this allows the presence detection to detect proximity or movement towards (approach) the displayed image, particularly if the pixels used for touch detect light scattered only from touching the displayed image but the pixels used for presence detect light scattered only from near the displayed image.
According to a second aspect of the present invention, there is provided a consumer electronic device such as a camcorder, camera, music player, mobile phone, media player or computer, comprising the above touch sensitive image display device. For example, when implemented in a camcorder, the acute angle projection may allow preview of a recorded video stream to be observed on a surface on which the camcorder is placed.
According to a further aspect of the present invention, there is provided a method corresponding to the above first aspect, optionally with any combination of the above features which may be provided in the first aspect.
According to a further aspect of the present invention, there is provided a method corresponding to the above second aspect, optionally with any combination of the above features which may be provided in the second aspect.
Joystick-type control
According to another aspect of the present invention, there is provided a touch sensitive image display device for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed, the device comprising: a light source to project a first light distribution; a light source to project a second light
distribution; a multi-pixel sensor system to remotely detect location of touch of an area of said surface within or adjacent to said displayed image by detecting change of said first distribution, and having an output to provide a detected touch location signal; a multi-pixel sensor system to remotely detect location of an object at least partially within said second light distribution, and having an output to provide a detected object location signal; a controller having an input to receive said detected touch location signal and an input to receive said detected object location signal, and configured to control said device responsive to said detected touch location signal and to control said device responsive to said detected object location signal.
Advantageously, an embodiment may thus provide a user interface for touch and presence detection which may use respective light distributions at different levels 'above' the displayed image (taking the displayed image and surface as being 'below', even if the projection is not downwards as such), and further advantageously with more than one presence detection distribution (i.e., a plurality of further light distributions that enable presence detections in the same manner as for the second distribution). Particularly advantageously, such an interface may allow a user to control the device by presence or movement in different layers, the actions performed by the device in response to the user's control advantageously depending on which particular light distribution the user touch/presence is detected in. Moreover, the user interface may allow the user to control the device at least similarly as when using a joystick, the user being able to achieve this merely by use of his finger or a stylus, for example.
For projecting the displayed image, the display device may further comprise an image projection element(s) such as a projector. The device may further have a data input for receiving data defining the displayed image. Furthermore, and particularly where a spatial light modulator (SLM), which may for example be a pixellated liquid crystal device such as an LCOS (liquid crystal on silicon) device and may be a transmissive or reflective device, is provided in the display device, a dynamic image may be displayed, i.e., the displayed image may change or be updated (e.g., scrolled, panned, zoomed) in response to detection of touch and/or presence (or movement thereof).
Thus, the above touch sensitive image display device may further comprise an SLM, image projection optics comprising at least one light source to illuminate the SLM, and output optics to project modulated light from the SLM onto the surface at the acute angle. The projector may have at least one light source comprising a coherent light source, a light emitting diode, a filament lamp and/or a fluorescent lamp. However, it may be advantageous to use a holographic projector, and this may require a coherent light source such as a laser. For the holographic projector, the above image projection optics may be holographic image projection optics, the light source to illuminate the SLM may comprise a coherent light source so that the modulated light is coherent light, and the touch sensitive image display device may then be configured to holographically project a touch sensitive displayed image at the acute angle. Thus, the image displayed on the SLM itself, which may be pixellated, may then be a hologram. Where an SLM is used in a holographic or non-holographic embodiment, data defining an image for display may comprise data to define at least one image for display on the SLM to replay a target image distorted to compensate for projection onto the surface at the acute angle. Advantages of holographic projection
in an embodiment may include a wider throw angle, longer depth of field, and/or very substantial distortion correction without substantial loss of brightness/efficiency, compared to a non-holographic embodiment.
Further considering the above first aspect, the or each light source may include projection optics such as a lens which may be, e.g., cylindrical, holographic or lenticular. This is particularly advantageous where the or each light distribution is substantially two-dimensional e.g. a sheet distribution.
When the display device is in use, the second light distribution may be above the first light distribution. Compared to the case where they are on the same level, this may be advantageous in avoiding any need to perform a touch to initiate an action (e.g., to wake the device up or an element of the touch sensing system such as the light source for the first light distribution). Preferably, the first light distribution is very close to the surface where the image is displayed, for example less than about 3mm above the surface. The first distribution is advantageously sufficiently close to the surface that the user feels that the device is controllable by touching rather than by merely positioning an object in the first distribution. The second distribution may be about 1 to about 2 cm above the first distribution.
Further regarding the first aspect, the use of a multi-pixel sensor system may allow the location of a touch or of a presence to be detected. Such a location may be provided from the system as identifiers of pixels detecting light from the object or may be co-ordinate values e.g. (x,y). The or each such detection may be achieved by detecting light scattered, reflected or diffracted from a light distribution. Thus, for example, the remote touch detection may detect light of the first distribution that has been re-directed. Alternatively, detection of attenuation due to blocking or attenuating of light of the distribution may be performed. (Any one of the sensors described herein may indicate light intensity and/or location, wherein any such indication of intensity may be binary, M-ary (where M>2) or analogue).
There may further be provided the above touch sensitive image display device, wherein at least one said output to provide a detected location signal is configured to indicate a light distribution of said location detection. The outputs may be physically separate outputs, in which case the indication of distribution may be at least implicit from the physical output on which the signal appears. If the outputs are provided on the same signal line, an identifier of distribution may be explicitly provided with the signal. Thus there may be one output signal line indicating the detection signal and the particular distribution in which the object was detected (e.g., by providing a detection indicator and a distribution identifier), or the distribution may be determined by timing of the detection signal if the projections of light distributions are multiplexed in time in a pre-determined sequence. Alternatively, there may be one output signal line per distribution. The provided detected location signal configured to indicate the light distribution may be the detected touch location signal or the detected presence location signal.
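Where the projections of the light distributions are multiplexed in time in a pre-determined sequence as mentioned above, the distribution in which a detection occurred could be recovered from the detection timing; the following sketch assumes, purely for illustration, equal-duration slots in a fixed repeating sequence:

```python
def distribution_for_time(t, sequence, slot_duration):
    """Identify which light distribution was active at detection time t.

    Assumes the distributions are projected in a fixed repeating sequence
    of equal-duration time slots; the specification only requires a
    pre-determined sequence, so this is one illustrative scheme.
    """
    slot = int(t / slot_duration) % len(sequence)
    return sequence[slot]

# Illustrative sequence: first (touch), second and a further distribution
# projected in successive 10 ms slots.
SEQUENCE = ["d1", "d2", "d3"]
```

The controller can thus tag each detected location signal with a distribution identifier derived from the shared signal line's timing alone.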
There may further be provided the above touch sensitive image display device, comprising: at least one light source to project at least one further light distribution; said multi-pixel sensor system to remotely detect location
of an object at least partially within the or each said further light distribution, and having at least one output to indicate a said further distribution and to provide a signal indicating a said location of said object in said indicated further distribution. The or each of the at least one light sources may then include projection optics such as a lens, which may be, e.g., cylindrical, holographic or lenticular, particularly advantageously for providing a substantially two-dimensional distribution such as a sheet distribution.
The feature of projecting at least one further light distribution may mean that any number of light distributions can be provided and used similarly to the above second light distribution for detecting presence. In this regard, there may be provided one physical light source per further distribution, or at least some further distributions may share the same physical light source. The provision of the multi-pixel sensor system may allow detection of location of presence in the or each further light distribution. The system may then comprise one multi-pixel sensor per further light distribution, or such a sensor may be shared for detections in different distributions, for example where multiplexing is used. The indication of further distribution may be achieved by an output on a signal line indicating the detection and a distribution identifier, or the distribution may be determined by timing of the detection signal on the signal line if the projecting of light distributions is multiplexed in time in a predetermined sequence.
There may further be provided the above touch sensitive image display device, wherein the controller is configured to select an action on the basis of at least one of said signals, and to control said touch sensitive image display device to perform said action. The at least one signal may comprise a detected touch location signal, a detected object location signal relating to the second distribution, or detected object location signal(s) relating to any further distribution.
In such a touch sensitive image display device, the controller may be configured to select said action further on the basis of at least one said indication of light distribution. This may advantageously allow control of the display device by the user moving an object such as his finger in different levels/layers corresponding to distributions of the user interface.
Particularly advantageously, there may be provided the above touch sensitive image display device, wherein said touch sensitive image display device is operable by a user's finger as a joystick. Thus, the light sources, sensor systems and controller may detect and respond to movement of the user's finger or stylus or similar object in a joystick-like manner.
There may further be provided the above touch sensitive image display device, wherein the touch is a touch by a user's finger or other object. Similarly, the object may be a user's finger or other object. As an alternative to the user's finger in either case, the user may use a stylus or pen or other stick-like object or any other object instead of his finger.
Still further, there may be provided the above touch sensitive image display device, wherein the controller is further configured to perform a said responsive control on the basis of a rate of change of at least one said detected location. The detected location may be the location of a touch and/or object presence, in any combination of the first, second and further distributions. This further applies to the above touch sensitive image display device, wherein the controller is further configured to perform a said responsive control on the basis of a direction of change of at least one said detected location.
There may further be provided the above touch sensitive image display device, wherein the controller is further configured to perform a said responsive control on the basis of a locus of change of at least one said detected location. Again, the detected location may be of a touch or an object presence, in any combination of the distributions.
Furthermore, there may be provided the above touch sensitive image display device, wherein the controller is further configured to perform a said responsive control on the basis of difference between two said detected locations.
Furthermore, there may be provided the above touch sensitive image display device, wherein the controller is further configured to perform a said responsive control on the basis of a rate of change of difference between two said detected locations.
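The rate and direction of change of a detected location, on which the responsive controls above may be based, could be derived from two timed location samples roughly as follows (a minimal sketch; sampling and filtering details are left open by the specification):

```python
import math

def motion_parameters(p1, t1, p2, t2):
    """Rate and direction of change between two detected locations.

    p1, p2 are (x, y) locations detected at times t1, t2. Returns
    (speed, angle_in_radians); a controller might, for example, scale a
    scroll or pan action by the speed and set its direction from the angle,
    giving the joystick-like behaviour described above.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dt = t2 - t1
    speed = math.hypot(dx, dy) / dt
    return speed, math.atan2(dy, dx)
```

The difference between two simultaneous detected locations (e.g. in different distributions) could be treated analogously, with its own rate of change obtained by differencing over time.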
There may further be provided the above touch sensitive image display device, wherein a said responsive control controls said touch sensitive image display device to update or move at least a portion of said displayed image.
Furthermore, there may be provided the above touch sensitive image display device, wherein a said responsive control controls said touch sensitive image display device to display a further image adjacent to or at least partially overlapping said displayed image. Such a further image may be a sub-image at least partly within the displayed image. Such a sub-image may be, e.g., a menu or button (e.g. stop, start, fast forward, rewind, etc.). Furthermore, the further image may replace at least a portion of the displayed image or be displayed in a watermark style with the displayed image. The further image may substantially replace the entire displayed image. For example, if the further image fully overlaps the displayed image, the further image could be a low-power screen saver, the further image being adapted to reduce the required projection power relative to a higher complexity image.
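Displaying a further image that replaces a portion of the displayed image, or overlays it in a watermark style, amounts to a simple compositing step; the grey-level nested-list frame representation below is purely illustrative:

```python
def overlay_sub_image(frame, sub_image, origin, alpha=1.0):
    """Composite a sub-image (e.g. a menu or button) onto the displayed image.

    With alpha < 1 the sub-image is blended in, watermark style; with
    alpha = 1 it replaces the covered portion. Frames are nested lists of
    grey levels (an illustrative representation, not the device's format).
    """
    ox, oy = origin
    for r, row in enumerate(sub_image):
        for c, v in enumerate(row):
            old = frame[oy + r][ox + c]
            frame[oy + r][ox + c] = (1 - alpha) * old + alpha * v
    return frame
```

A full-frame overlay with alpha = 1 would correspond to the low-power screen saver case, the further image substantially replacing the entire displayed image.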
There may further be provided the above touch sensitive image display device, wherein a said multi-pixel sensor system comprises at least one multi-pixel sensor. The multi-pixel sensor system may be the touch system or the object location system or both, and the multi-pixel sensor therein may be a camera or CCD array or any other means for detecting a location.
There may further be provided the above touch sensitive image display device, wherein a said sensor of said multi-pixel sensor system to detect touch is a said sensor of said multi-pixel sensor system to remotely detect location of an object. The systems may each comprise a sensor shared with the other system. Furthermore, where each system comprises a single sensor, there may be only one sensor provided in the display device for all sensing. This may apply similarly where further distributions are used as described above.
There may further be provided the above touch sensitive image display device, further comprising an anti-distortion system to map a said detected location of touch to a portion of said input image. The said touch or object location and/or the portion may correspond to one or more pixels of the input image.
There may further be provided the above touch sensitive image display device, wherein at least one said light source is another of said light sources. The light source may comprise any combination of the light sources projecting the first, second or any further distribution. Thus, there may be fewer light sources than distributions, and even a single light source for projecting all distributions, if the distributions are multiplexed in wavelength or time.
Any combination of the sources may comprise an infra-red light source. In this case, either or both of the sources (and/or the sensors or shared sensor) may have a filter to pass infra-red but not visible light. At least two light sources may emit light at different wavelengths. In particular, different infra-red wavelengths may be used for each light source so that the distributions can be used simultaneously. Any number of light sources (and/or the sensors or shared sensor) may comprise a filter, e.g., a notch filter, for passing a pre-determined wavelength.
According to a second aspect of the present invention, there is provided a consumer electronic device such as a camcorder, camera, music player, mobile phone, media player or computer, comprising the touch sensitive image display device according to any preceding claim. For example, a camcorder may use the display device to allow a user to preview a recorded video sequence by observing an image projected at an acute angle from the camcorder onto a surface on which the camcorder stands.
According to a further aspect of the present invention, there is provided a method corresponding to the above first aspect, optionally with any combination of the above features which may be provided in the first aspect.
According to a further aspect of the present invention, there is provided a method corresponding to the above second aspect, optionally with any combination of the above features which may be provided in the second aspect.
Thus the invention generally provides methods corresponding to the above-described devices, and processor control code, in particular on a carrier, to implement the methods.
The carrier may be, for example, a disk, CD- or DVD-ROM, or programmed memory such as read-only memory (firmware). The code (and/or data) may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, for example for a general purpose computer system or a digital signal processor (DSP), or the code may comprise code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (Trade Mark) or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, such code and/or data may be distributed between a plurality of coupled components in communication with one another.
The skilled person will appreciate that features of the above-described aspects and embodiments of the invention may be combined.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the invention and to show how the same may be carried into effect, reference will now be made, by way of example, to the accompanying drawings, in which:
Fig. 1 shows a display device of an embodiment;
Fig. 2 shows a display device of an embodiment;
Fig. 3a shows a basic block diagram of elements relating to Figures 1 and 2;
Fig. 3b shows a more detailed block diagram of an embodiment having features discussed in relation to Figures 1 - 2 and 3a;
Fig. 3c shows a more detailed block diagram of an embodiment;
Fig. 4 shows a timing diagram for illustrating a first mode of operation of an embodiment;
Fig. 5 shows a timing diagram of multiplexing of detection in the time domain in an embodiment;
Fig. 6 shows an arrangement including a phase locked loop for synchronising distribution projection and receiving of detection signals in an embodiment;
Fig. 7 shows how a controller C may determine a proximity detect signal in an embodiment;
Fig. 8 illustrates a shared multi-pixel sensor in an embodiment; and
Fig. 9 shows a consumer electronics device such as a camcorder including a touch sensitive image display device.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
The following describes an embodiment of a touch sensitive image display device. The embodiment includes a projector for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed. The acute angle is generally less than 90° from the surface where the image is displayed. The angle may be pre-determined or may be adjustable by, for example, moveable optics. The projection optics may comprise a Texas Instruments (registered trademark) digital light processor, an LCOS imager and/or a laser scanner. Furthermore, there may be a projection lens that is moveable or set at a pre-determined angle for projecting the image. Optionally, the image may be projected indirectly to the surface, by first projecting the image to a mirror attached to the housing of the projector, the mirror being arranged to be further from the surface than the projector itself so that the resulting displayed image may be larger.
The projector projects the image on the basis of data defining a desired image. The data may be received in an image processor of the projector from a data input receiving data from a source internal or external to the display device. Furthermore, the image processor may have code for calculating a target image for projection, such that the displayed image as observed by the user does not suffer distortion resulting from the acute angle. Additionally or alternatively, such anti-distortion calculations may account for distortion due to curvature of the surface on which the image is displayed. Such compensation may be carried out by forward or reverse mapping between the (preferably undistorted) input image data and the target image, and may involve cropping of image data. (In alternative embodiments, the acute angle projection may be achieved by providing a stand for supporting the display device, or at least the projector, at a pre-determined or adjustable angle.)
Specifically regarding compensating for curvature, the projector may be used to scan, e.g., known shapes, patterns and/or lines onto a surface and any described multi-pixel sensing system may be operated 'synchronously' to capture an image to correlate. The captured images may be used to compute a 3-D surface profile. The profile information may then be used to create a correction function, e.g., look-up table, for images projected onto that surface.
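The correction function mentioned above could take the form of a look-up table built from a calibration scan; the sketch below assumes, for illustration only, that the calibration yields paired expected and measured pattern positions (a real implementation would interpolate between calibration points):

```python
def build_correction_lut(expected, measured):
    """Build a per-point correction look-up table from a calibration scan.

    `expected` and `measured` are parallel lists of (x, y) positions of the
    projected calibration pattern as intended and as captured by the
    multi-pixel sensing system; the LUT stores the offset to apply when
    projecting onto the profiled surface.
    """
    return {e: (e[0] - m[0], e[1] - m[1])
            for e, m in zip(expected, measured)}
```

At projection time, each image point would be shifted by its stored offset before being driven onto the SLM or other imager.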
Figure 1 shows the display device 1 comprising a projector 2, sensor systems 3 and 4, light source 5 associated with output optics 5a and light source 6 associated with output optics 6a. (Alternatively, the sensor systems 3 and 4 may be combined such that a single sensor 7a or 7b with optional anti-distortion processor AD1, AD2 can be shared for both presence and touch sensing. This is shown by the dashed lines in Figure 1, particularly in relation to the anti-distortion units AD1 and AD2, which may be comprised in the sensor system(s) or in the controller C). Figure 1 further shows that the light source 6 projects a first light distribution d1 over the displayed image Im, and light source 5 projects a second distribution d2 over the first distribution. The first distribution is preferably less than about 3mm from the surface where the image is displayed and the second distribution d2 may be about
1 to about 2cm above the first distribution. (Figure 1 further shows that a further distribution, or any number of further distributions d3, may be used to allow detections at different levels above the displayed image).
Regarding the user interface, Figure 1 shows that a finger f may be detected touching the displayed image on the basis of sensing light from the first distribution d1. Similarly, presence of the finger f in the distribution d2 may be detected, and this applies further to any further distributions d3. Thus, the user may be able to control the device by "touching" at different levels. Furthermore, detection using the distribution d2 can be used to detect proximity or approach towards the displayed image. In more detail, regarding the equations shown in Figure 1, the difference between two positions such as p1 and p2, which positions are detectable by sensors 7a and 7b if these are multi-pixel sensors, may result in the selection of a function to be performed by the controller C. The positions may be locations of touch or presence in one particular distribution or may be locations in different distributions. In this way, a versatile user interface may be provided such that, for example, the user can control the device in a joystick-like manner by using his finger, a stylus, pen, etc.
Figure 2 shows an embodiment of a display device 11 with corresponding elements as described above in relation to Figure 1. Specifically, Figure 2 shows that the combination of light source 5 (with any required output optics 5a) and sensor 7a may be provided in the form of a transceiver, including, e.g., a laser and camera. Moreover, the embodiment of Figure 2 illustrates that the second optical distribution d2 may be a three-dimensional volume such as a cone. As with Figure 1, sensors 7a and 7b may detect light reflected or scattered from an object such as finger f. Particularly in the case where the sensor 7a is a low resolution sensor such as a single pixel sensor, the embodiment of Figure 2 may provide a less complex apparatus when it is merely required to detect whether or not an object is near, i.e., proximate to a displayed image, whether this is for allowing power management, an improved user interface or even for safety (e.g., if any light source such as that of the projector is a laser). This may be achieved by detecting the intensity of light scattered from an object. A suitable action taken by the controller C in response to such a detection may be to turn on any element of the device 11, such as the light source 6 and/or projector 2 (in particular the light source of the projector). For example, the touch sensing system may be disabled when a user is not interacting with the display device.
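Detecting whether an object is proximate from the intensity of scattered light, as just described for a low resolution sensor such as sensor 7a, reduces to a threshold test; the threshold value here is a device-specific assumption:

```python
def proximity_detected(pixel_intensities, threshold):
    """Simple proximity test for a low resolution (even single pixel) sensor.

    An object entering distribution d2 scatters light toward the sensor,
    raising the summed intensity above the (assumed, device specific)
    threshold; no location information is needed for this test.
    """
    return sum(pixel_intensities) > threshold
```

On a positive result the controller might, for example, enable the touch sensing light source or restore the projector to full power.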
Thus, as in any one of the embodiments of the present invention described herein, power hungry devices may be switched on, or at least have their power increased, only when proximity is detected. Similarly, a further, e.g., sub-, image can be displayed with the above-mentioned displayed image in response to the detection. For the power management, it is further noted that absence of detections of touch in distribution d1 can be used to switch off or reduce the above powers. Consequently, it may be possible in any one of the embodiments described herein to reduce power consumption and/or to display further images with the above-mentioned displayed image only when the user is interacting with the device. (Such a further image may be a sub-image that replaces, or is overlaid on (similarly to a watermark), a portion of the displayed image, or it may be an image that substantially replaces the entire displayed image. For example, if the further image fully overlaps the displayed image, the further image could be a low-power screen saver, the further image being adapted to reduce the required projection power relative to a higher complexity image. Where the further image is a sub-image, this
may be a menu or a control button such as stop, fast forward, rewind, start, pause etc. Such a further image may be displayed (e.g., under control of the controller) in response to any detection signal described herein). Thus, embodiments may provide power saving features and/or a better interface for the user.
Figure 3a shows a basic block diagram of elements described above in relation to Figures 1 or 2. It is noted that, in some embodiments, the sensor systems 3 and 4 may be a combined system that multiplexes, in wavelength or time, the sensing and/or projecting of light of the distributions. In more detail, Figure 3a shows that the light sources may be infra-red. (It is noted that any indication of direction(s) of signal transmission between blocks in any block diagram of this specification is purely by way of example and does not require bidirectional transmission on any link).
The sensor system 3 may be a multi-pixel sensor such as a camera for calculating location (position). A second sensor is provided for the sensor system 4; this may be a single-pixel sensor or at least have fewer pixels than the sensor(s) of system 3. Thus, the sensor system 3 may be suitable for detecting location of a touch while the system using the second sensor may be suitable merely for detecting presence or motion. The light source 6 may provide a substantially two-dimensional sheet just above the surface where the image is displayed, this sheet being used to perform the touch sensing. For the second light source 5, a further substantially two-dimensional sheet of light may be projected, or alternatively the distribution may be a three-dimensional cone or volume of light.
A more detailed block diagram of a further embodiment having any of the above features discussed in relation to Figures 1 - 3, and having corresponding elements, is shown in Figure 3b. This detailed arrangement may be implemented within a consumer electronic device such as a camcorder, computer, mobile phone etc.
As may be found in any one of the embodiments described herein, a memory M is provided for image or video data. The image data/pages may be derived from a video input, non-volatile RAM storage or any other source. (The image sensor IS1 is optional and may be present in a device for recording video or still images, such as a camera or camcorder; the image processor IP is a further optional part associated with the image sensor IS1). The memory may be a self-contained unit or found within the projector or the controller C.
The output optics 2a are generally associated with the projector 2. The display processor Ca, power management Cb and detect unit Cc are all associated in Figure 3b with the controller C. However, they may be provided in other units coupled to the controller C. A proximity detect block corresponding to a sensor system 3 is shown coupled to the power management block Cb and may comprise the light source 5 and detector 7a. Furthermore, an output from a detector 7b in the form of a camera for detecting the location of a touch is shown coupled to the power management block Cb. On the basis of these two inputs, the power management can switch off illumination of the first distribution for touch and/or of the displayed image, or at least reduce the power used for these processes. The illumination may be switched off after a period without touch detections, and the proximity detector may be polled thereafter until proximity is detected, so that the illumination can be turned back up to full power.
Figure 3b further shows a memory M1 that may be used to map locations of detected touches to locations of the preferably un-distorted input image so that anti-distortion compensation can be performed. The touch map memory M1 may be used to implement a look-up table for anti-distortion compensation.
Figure 3c shows a yet more detailed block diagram that may correspond to elements of any one of the embodiments described in this specification.
Figure 4 shows a timing diagram for illustrating a first mode of operation of any one of the embodiments described herein. The mode is particularly advantageous in an embodiment where the above power management is implemented. The upper trace of Fig. 4 shows relatively high power in the first distribution when projection from the light source 6 is active. The lower trace shows lower power in the second light distribution due to the low duty cycle of projection from the light source 5. In other words, Figure 4 shows that the first light source for touch sensing by means of the distribution d1 may provide a relatively high power light distribution when active, in comparison to the average power of the second distribution generated by the other light source. In the case shown in Figure 4, the lower power may be due to a lower duty cycle where the second distribution is pulsed on and off repetitively. (Pulsing of one or more of the distributions in any embodiment described herein may enable rejection of at least some of the ambient light in a corresponding sensor system detection, particularly where a PLL is used as further described herein. Where pulsing is used, the background/ambient signal level may be read as frequently as, preferably twice as frequently as, the pulsing of the corresponding light source).
As further shown in Figure 4, while a hand or other object is present as detected by the touch and/or presence sensor system, the high-power first distribution source is maintained on. However, when absence of such an object is detected, the higher power source is turned down or switched off and polling for proximity begins by pulsing the lower power second distribution d2. When proximity is detected in the distribution d2, because the hand or other object has returned, the relatively higher power first distribution source for touch sensing is switched back to full power operation once more.
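Purely by way of illustration (this sketch is not part of the specification, and the class and method names are invented), the Figure 4 behaviour can be summarised as a simple state machine: the high-power touch source d1 stays on while an object is present, absence switches the device to low-duty-cycle proximity polling with d2, and a proximity detect restores full power.

```python
# Illustrative sketch of the Figure 4 power management behaviour.
# Names and structure are hypothetical, not from the specification.

class PowerManager:
    def __init__(self):
        self.touch_source_on = True   # high-power d1 source active
        self.polling = False          # low-power pulsed d2 polling inactive

    def on_absence_detected(self):
        # No object detected: turn the high-power source down/off and
        # begin polling for proximity with the low-power distribution d2.
        self.touch_source_on = False
        self.polling = True

    def on_proximity_detected(self):
        # Object has returned: restore full-power touch sensing.
        self.touch_source_on = True
        self.polling = False


pm = PowerManager()
pm.on_absence_detected()
assert not pm.touch_source_on and pm.polling
pm.on_proximity_detected()
assert pm.touch_source_on and not pm.polling
```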
Figure 5 shows a timing diagram of multiplexing of detection in the time domain. This may be used in any one of the embodiments to reduce the component count required for the device, e.g., by requiring a single light source and/or single sensor system. The upper trace shows projection in the lower (preferably infra-red (IR)) layer. The lower trace shows projection in the upper (IR) layer. As shown in Figure 5, the light distributions d1 (lower IR layer) and d2 (upper IR layer) are switched to full power alternately. In this case, a single sensing system or even a single sensor may be configured to provide touch and presence detection signals synchronous with the alternating, such that each signal may be identified as being of touch or presence at least by the timing of the detection.
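As a hedged sketch of this time multiplexing (the even/odd frame convention here is an invented example, not taken from the specification): if d1 and d2 are driven on alternate frames, a detection can be labelled as touch or presence purely from the frame index at which it occurred.

```python
# Hypothetical illustration of the Figure 5 time multiplexing: the
# lower (d1, touch) and upper (d2, presence) IR layers are projected on
# alternate frames, so one sensor can classify each detection by timing.

def active_distribution(frame_index):
    # Assumed convention: even frames project d1, odd frames project d2.
    return "d1" if frame_index % 2 == 0 else "d2"

def classify_detection(frame_index):
    # A detection is "touch" if it occurred while d1 was projected,
    # otherwise "presence" (d2 projected).
    return "touch" if active_distribution(frame_index) == "d1" else "presence"

assert classify_detection(0) == "touch"
assert classify_detection(1) == "presence"
```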
With further regard to pulsing of the distribution d2, or for that matter of any other distribution such as d1 or any further distribution where implemented, for example to save power, Figure 6 shows that a phase locked loop (PLL) may be used to synchronise the distribution projection and the receiving of light signals for detection such that, for example, a laser power may be controlled accordingly. An advantage of such an implementation may be an improvement of the signal-to-noise ratio in the detection signals, since the sensor output can be filtered to obtain detection signals occurring at the same frequency and/or phase as the distribution pulsing, for example. The PLL may be part of the controller C for acting on the detection, or may be in a different device component.
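One conventional way to realise this kind of synchronous filtering is lock-in style detection: the sensor samples are multiplied by a reference at the pulsing frequency and averaged, which rejects unmodulated ambient light. The following is only an illustrative sketch under that assumption; the frequencies and amplitudes are invented.

```python
import math

# Illustrative lock-in style filtering, one possible realisation of the
# PLL-synchronised detection described above. The source is pulsed at
# f_mod; multiplying the sensor samples by a reference at the same
# frequency and averaging rejects the constant ambient component.

def lock_in(samples, f_mod, f_sample):
    acc = 0.0
    for n, s in enumerate(samples):
        ref = math.cos(2 * math.pi * f_mod * n / f_sample)
        acc += s * ref
    return 2.0 * acc / len(samples)  # recovered modulation amplitude

f_mod, f_sample, n = 1000.0, 16000.0, 1600
# Simulated sensor output: 0.5-amplitude modulated signal plus a large
# constant ambient offset of 3.0 (both values invented).
signal = [0.5 * math.cos(2 * math.pi * f_mod * k / f_sample) + 3.0
          for k in range(n)]
amplitude = lock_in(signal, f_mod, f_sample)
assert abs(amplitude - 0.5) < 1e-6  # ambient offset is rejected
```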
Figure 7 shows that, in any of the embodiments described herein, the controller C may determine a proximity detect signal and on this basis control one or more elements of the device to be switched on/off or switched to a lower/higher power mode. A plurality of such control signals may be provided, for example for switching off predetermined touch sensor elements such as the light source of the distribution d1, or for turning off the projector 2 or at least the light source thereof.
Figure 8 illustrates a principle applicable in any one of the embodiments described herein, wherein the sensor systems 3 and 4 may share a multi-pixel sensor. The camera functions as the sensor 7a, 7b and has an array of pixels that may have a grid format such as that shown in the lower portion of Figure 8. A signal processor, which may be found in the controller C, outputs on the basis of the camera output a location such as co-ordinates (X, Y) and a detection signal. (The location may even indicate which distribution the location has been detected in). As shown in Figure 8, the resolution of the pixels used for proximity detection (shown at the top of the grid) may be lower than that of those used for touch location sensing. As shown in the top right diagram of Figure 8, the controller may have knowledge of co-ordinates or locations corresponding to mere proximity and those corresponding to a detected touch location on the displayed image. Furthermore, the controller may have memory and/or processing for mapping the displayed image co-ordinates to input image regions, including to compensate for distortion due to the acute angle or curvature of the surface.
Figure 9 shows a consumer electronics device REC such as a camcorder including a touch sensitive image display device of any one of the embodiments described herein. In Figure 9, there is shown an optional lens of the device for recording video, an output optics lens of a projector 2 of the touch sensitive image display device, the displayed image Im, and light sources 5, 6 (which may be combined, i.e., comprise a single, shared light source). Such a device may include elements of any embodiment described herein in any combination, in particular those of the block diagrams described.
The following relates to the above-mentioned anti-distortion compensation and is applicable in any embodiment described herein, in particular those using an SLM and in particular holographic projection.
In embodiments of a device using an SLM as described above, especially where diffraction is employed, light from the entire illuminated area of the SLM may be directed into the distorted target image field. Moreover, the displayed image is substantially focus-free; that is, the focus of the displayed image does not substantially depend upon the distance from the image projection system to the display surface. A demagnifying optical system may
be employed to increase the divergence of the modulated light to form the displayed image, thus allowing an image of a useful size to be displayed at a practical distance.
The field of the displayed image may suffer from keystone distortion, the trapezoidal distortion of a nominally rectangular input image field caused by projection onto a surface at an angle which is not perpendicular to the axis of the output optics. Thus the image projection system internally generates a target image to which the inverse distortion has been applied so that when this target image is projected the keystone distortion is compensated. The target image is the image to which a transform is applied to generate data for display on the SLM. Thus in some preferred embodiments the system also includes non-volatile memory storing mapping data for mapping between the input image and the target image.
To convert from the input image to the target image either forward or reverse mapping may be employed, but preferably the latter, in which pixels of the target image are mapped to pixels of the input image, a value for a pixel of the target image then being assigned based upon lookup of the value of the corresponding pixel in the input image. Thus in some preferred embodiments the trapezoidal shape of the target image field is located in a larger, for example rectangular, target image (memory) space and then each pixel of the target image field is mapped back to a pixel of the (undistorted) input image and this mapping is then used to provide values for the pixels of the target image field. This is preferable to a forward mapping from the input image field to the distorted target image field for reasons which are explained below. In either case, however, in some preferred embodiments the transform is only applied to the distorted, generally trapezoidal target image field rather than to the entire (rectangular) target image memory space, to avoid performing unnecessary calculations.
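The reverse-mapped construction of the target image can be sketched as follows. This is an illustrative sketch only: the function names, the simple 2x scaling used for the inverse map, and the all-true field predicate are invented stand-ins for the functions f', g' and the trapezoidal field test.

```python
# Sketch of reverse (target-to-input) mapping: each pixel of the target
# image field looks up the value of its corresponding input image pixel.

def build_target(input_img, width, height, inverse_map, in_field):
    target = [[0] * width for _ in range(height)]
    for yp in range(height):
        for xp in range(width):
            if not in_field(xp, yp):
                continue  # skip pixels outside the trapezoidal field
            x, y = inverse_map(xp, yp)      # stands for f'(x',y'), g'(x',y')
            target[yp][xp] = input_img[y][x]
    return target

input_img = [[1, 2], [3, 4]]
# Hypothetical anti-distortion: the target is a 2x enlargement, so the
# inverse map halves the target coordinates to find the input pixel.
inv = lambda xp, yp: (xp // 2, yp // 2)
tgt = build_target(input_img, 4, 4, inv, lambda xp, yp: True)
assert tgt[0][0] == 1 and tgt[3][3] == 4
```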
Where reverse mapping, as described above, is employed, preferably compensation is also applied for variations in per unit area brightness of the projected image due to the acute angle projection. Thus while diffraction from a given pixel of the SLM may contribute to substantially an entire displayed hologram (where holographic projection is used in an embodiment), nonetheless the diffracted light from this pixel will be distorted, resulting in more illumination per unit area at the short-side end of the trapezoid as compared with the long-side end of the trapezoid. Thus in preferred embodiments an amplitude or intensity scale factor is applied, the value of which depends upon the location (in two dimensions) of a pixel in the target image space. This amplitude/intensity compensation may be derived from a stored amplitude/intensity map determined, for example, by a calibration procedure, or it may comprise one or a product of partial derivatives of a mapping function from the input image to the anti-distorted target image. Thus, broadly speaking, the amplitude/intensity correction may be dependent on a value indicating what change of area in the original, input image results from a change of area in the anti-distorted target image space (at the corresponding position) by the same amount.
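Numerically, the area scale factor as a product of partial derivatives of the mapping can be estimated by finite differences, as in the following sketch. The affine mapping used here is an invented example, and the helper names are not from the specification; intensity values are then scaled by 1/A and amplitude values by the square root of 1/A (since intensity is proportional to amplitude squared).

```python
import math

# Sketch of the per-unit-area brightness compensation: A(x', y') is
# estimated as the product of partial derivatives of the target-to-input
# mapping (f', g'), evaluated here by central finite differences.

def area_factor(f, g, xp, yp, h=1e-5):
    dfdx = (f(xp + h, yp) - f(xp - h, yp)) / (2 * h)  # dx/dx'
    dgdy = (g(xp, yp + h) - g(xp, yp - h)) / (2 * h)  # dy/dy'
    return dfdx * dgdy

# Invented mapping: input coordinates are half the target coordinates,
# so a unit target area corresponds to 0.25 units of input area.
f = lambda xp, yp: 0.5 * xp
g = lambda xp, yp: 0.5 * yp

A = area_factor(f, g, 10.0, 20.0)
assert abs(A - 0.25) < 1e-6
intensity_scale = 1.0 / A              # applied to intensity values
amplitude_scale = math.sqrt(1.0 / A)   # applied to amplitude values
assert abs(intensity_scale - 4.0) < 1e-4 and abs(amplitude_scale - 2.0) < 1e-4
```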
As mentioned above, rather than a reverse mapping, a forward mapping from the input image space to the distorted target image space may alternatively be employed. This is in general less preferable because such a mapping can leave holes in the (anti-)distorted target image where, in effect, the target image is stretched. Thus mapping pixels of the input image to pixels of the target image may not populate all the pixels of the target
image with values. One approach to address this issue is to map a pixel of the input image to an extended region of the target image, for example a regular or irregular extended spot. In this case a single pixel of the input image may map to a plurality of pixels of the target image. Alternatively, once pixel values of the target image have been populated using pixels of the input image, pixels of the target image which remain unpopulated may be given values by interpolation between pixels of the target image populated with pixel values. Where a single input image pixel is mapped to an extended region of the target image, these extended regions or spots may overlap in the target image, in which case the value of a target image pixel may be determined by combining, more particularly summing, the overlapping values (so that multiple input image pixels may contribute to the value of a single target image pixel). With this approach compensation for per unit area brightness variation is achieved automatically by the summing of the values of the extended spots where these spots overlap in the target image field.
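The forward-mapping alternative with extended spots can be sketched as below. The spot shape, its weights, and the identity forward map are all invented for illustration; the point shown is only that overlapping contributions are summed into the target pixel.

```python
# Sketch of forward mapping with extended spots: each input pixel is
# spread over a small spot in the target image, and contributions from
# overlapping spots are summed, which also compensates brightness.

def forward_map_with_spots(input_img, width, height, fwd, spot):
    target = [[0.0] * width for _ in range(height)]
    for y, row in enumerate(input_img):
        for x, v in enumerate(row):
            cx, cy = fwd(x, y)              # forward map to target space
            for dx, dy, w in spot:          # offsets and weights of spot
                tx, ty = cx + dx, cy + dy
                if 0 <= tx < width and 0 <= ty < height:
                    target[ty][tx] += w * v  # sum overlapping contributions
    return target

spot = [(0, 0, 1.0), (1, 0, 0.5)]  # hypothetical 2-pixel weighted spot
fwd = lambda x, y: (x, y)          # identity forward map for the demo
tgt = forward_map_with_spots([[2.0, 4.0]], 3, 1, fwd, spot)
# Target pixel (1,0) receives 0.5*2.0 (spot overlap) + 1.0*4.0 = 5.0.
assert tgt[0] == [2.0, 5.0, 2.0]
```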
Preferred embodiments of the image projection system provide a multi-colour, more particularly a full colour, display. Thus red, green and blue laser illumination of the SLM may be employed, time multiplexed to display three colour planes of the input image in turn. However, since the projection system may operate by diffraction, the blue light diverges less than the red light and thus in preferred embodiments the target image also has three colour planes in which a different scaling is employed for each colour, to compensate for the differing sizes of the projected colour image planes. More particularly, since the red light diverges most, the target image field of the red colour plane is the smallest target image field of the three target image planes (since the target image has "anti-distortion" applied). In general the size of the target image field for a colour is inversely proportional to the wavelength of light used for that colour. In some preferred embodiments, however, rather than a simple scaling by wavelength being applied, the distortion (more correctly anti-distortion) of each colour image plane may be mapped to a corresponding colour plane of the target image field using a calibration process which corrects for chromatic aberration within the projection system, such as chromatic aberration within the projection optics, chromatic aberration caused by slight misalignment between rays for different colours within the optics, and the like.
The techniques employed in preferred embodiments of the projector, in particular the holographic techniques, facilitate miniaturisation of the projector. These techniques also facilitate handling of extreme distortion caused by projection onto a surface on which the projector is placed, this extreme distortion resulting from the geometry illustrated in later Figure 1c in combination with the small size of the projector. Thus in some preferred embodiments the surface onto which the image is projected is no more than 1m, 0.5m, 0.3m, 0.2m, 0.15m, or 0.1m away from the output of the projection optics 102. Similarly in embodiments the distance from the output of the projection optics to the furthest edge of the displayed image is substantially greater than the distance from the output of the projection optics to the nearest edge of the displayed image, for example 50%, 100%, 150%, 200% or 250% greater. Depending upon the geometry the acute projection angle may be less than 70°, 65°, 60°, 55°, 50°, or even 45°.
The device may also provide a forward projection mode and incorporate a stand such as a bipod or tripod stand, and preferably also a sensor to automatically detect when the device is in its table-down projection configuration, automatically applying distortion compensation in response to such detection. However in some alternative arrangements rather than mechanically tilting the device, instead the projection optics may be adjusted to alter between forward and table-down projection. This could be achieved with a moveable or switchable mirror, but an alternative approach employs a wide angle or fisheye lens which when translated perpendicular to the output axis of the optics may be employed to move from forward projection to table-down projection at an acute angle.
A mapping between the input image and the anti-distorted target image may comprise either an analytical mapping, based on a mathematical function, or a numerical mapping, for example derived from a calibration procedure, or both. As previously mentioned, in some preferred embodiments target image pixels are mapped to input image pixels to look up target image pixel values. Preferably the target image is also corrected for area mapping distortion and, in a colour system, preferably the different colour planes are appropriately scaled so that they are reproduced on the projection surface at substantially the same size.
In preferred embodiments of the above described systems, devices and methods preferably an (AD)OSPR-type procedure (WO2007/031797) is employed to generate the hologram data. Thus in preferred embodiments a single displayed image or image frame is generated using a plurality of temporal holographic subframes displayed in rapid succession such that the corresponding images average in an observer's eye to give the impression of a single, noise-reduced displayed image.
Advantages of the above described embodiments may include providing a video application device wherein buttons for play, stop etc. may be overlaid on the displayed video image only when necessary, so that the video can be comfortably viewed by the user when the user is not manually interacting with the device. The same applies to devices which may have pop-up menus, which may in particular be context sensitive depending on a location of touch, and it is preferred that these menus only appear when the user is manually interacting. A further advantage is a reduction in the power consumption of a touch sensitive display device, this being achieved by using a lower power proximity sensing system that allows the higher power touch sensing system to operate at lower power or even "sleep" when the user is not manually interacting but, e.g., merely passively observing the displayed image. Such proximity detection may further allow the device to be activated merely by a user's hand or other object approaching the displayed image rather than only by direct touch activation. Thus, power saving and/or a better interface may be achievable.
It is particularly noted that the projector in any one of the embodiments described herein may be holographic, since this may advantageously provide a wide throw angle, long depth of field and very substantial distortion correction with less loss of brightness/efficiency than in non-holographic projectors. These techniques are described in our UK patent application number GB0822336.4 filed on December 8, 2008, hereby incorporated by reference in its entirety.
Thus in embodiments the mapping between a target image for display and an input image is described by a pair of polynomial expansions and, more particularly, by two sets of polynomial coefficients for these expansions. If we refer to the target image space using coordinates (x', y'), and the input image using coordinates (x, y), then we can define a location (x, y) in the input image space as a pair of functions f', g' of the coordinates in the (anti-distorted) target image space, as follows:

f'(x', y') → x
g'(x', y') → y
Likewise:

f(x, y) → x'
g(x, y) → y'.
For reasons explained further below, it is preferable that the mapping from the target to the input image rather than vice-versa is employed. An example pair of polynomial expansions is given below:
f'(x', y') = a00 + a10x' + a01y' + a11x'y' + a20x'^2 + ... where, broadly speaking, coefficient a00 defines position, a10 and a01 define scale, a11 defines skew, and a20 and so forth are higher order coefficients. The value of aij is dependent on the angle of projection θ, on i and on j; the value of bij is similarly dependent on θ, i and j. It can be helpful to consider (x, y) space as being "camera" space, that is, defining what it is desired to project.
In embodiments a single pixel of the target image may map to a plurality of pixels in the input image. This can be appreciated because the distortion effectively shortens the nearer edge of the input image as compared with the more distant edge from the output optics. Therefore in some preferred embodiments the target image is constructed by stepping through the (x', y') positions in the target image and for each looking up the addresses of the corresponding pixels in the input image and using the values from these pixels to assign a value to the corresponding pixel in the target image. Where multiple input image pixels correspond to a single target image pixel the values of the input image pixels may, for example, be summed, or some other approach may be employed, for example selecting a value such as a mean, median or mode value. Thus preferred embodiments apply an inverse mapping, from the target to the input image space. By contrast mapping from the input image to the target image can leave holes in the target image, that is pixels with unpopulated values. In this case a single pixel of the input image may be mapped to a regular or irregular spot with an extended size (over multiple pixels) in the target image, optionally with a superimposed intensity distribution such as a Gaussian distribution.
Once the target image T(x', y') has been created, a hologram H(X, Y) of the target image is generated to approximate the following expression:

H(X, Y) = (1/N) Σx' Σy' T(x', y') exp(i2π(x'X + y'Y)/N)

where N represents the number of pixels in the hologram in the X and Y directions (here, for simplicity, the same number). The region of the target image space outside the image may be filled with zeros and therefore in some preferred implementations the evaluation of H(X, Y) is performed over a window of target image space defined by the target image, for efficiency.
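One common form of such a hologram computation is a two-dimensional DFT of the target image; the following sketch assumes that form and evaluates it directly for a tiny N so the code stays dependency-free. Real implementations would use an FFT and restrict the evaluation to the window of target image space holding the image field, as noted above.

```python
import cmath

# Illustrative sketch only: a direct 2-D DFT of an N x N target image
# (zero outside the image field), assumed as the hologram expression.

def hologram(target):
    n = len(target)
    H = [[0j] * n for _ in range(n)]
    for X in range(n):
        for Y in range(n):
            acc = 0j
            for xp in range(n):
                for yp in range(n):
                    acc += target[xp][yp] * cmath.exp(
                        2j * cmath.pi * (X * xp + Y * yp) / n)
            H[X][Y] = acc / n
    return H

T = [[1, 0], [0, 0]]  # single bright target pixel at the origin
H = hologram(T)
# A single pixel at the origin yields a uniform hologram of value 1/N.
assert all(abs(H[X][Y] - 0.5) < 1e-9 for X in range(2) for Y in range(2))
```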
In the context of table-down holographic image projection, to provide a multicolour/full colour display preferred embodiments of the system employ three differently scaled and/or distorted target images, one for each of the three laser colours red, green and blue, denoted R, G and B in the figure. Thus in embodiments separate functions f', g' are provided for each colour, although in other embodiments a single target image/distortion map is employed and scaled according to the wavelength of the laser light used for the respective colour plane, more particularly scaled by 1/λ. It will be understood that each pixel of a hologram calculated from the target image contributes to substantially the whole displayed image; the displayed image is scaled in inverse proportion to wavelength, that is, the blue image would be smaller because the blue light is diffracted less, and therefore the blue target image is enlarged so that the projected images for the three colour planes substantially match in size.
Referring again to the polynomial expansions described above, for an inverse mapping, that is from target to input image space, where scaling is applied the (0,0) coefficients are not scaled, the (1,0) and (0,1) coefficients are scaled by reciprocal wavelength, and optionally the coefficients of higher power are scaled accordingly, for example the (1,1), (2,0), and (0,2) coefficients being scaled by 1/λ^2 and so forth. Thus, for example, for 440nm blue light and 640nm red light, a10^B = (640/440)·a10^R.
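A sketch of this reciprocal-wavelength coefficient scaling follows: a coefficient of combined order i + j is scaled by (λ_ref/λ)^(i+j), so the (0,0) term is unscaled, the (1,0) and (0,1) terms scale by the wavelength ratio, and the (1,1), (2,0) and (0,2) terms by its square. The function name and the numeric coefficient values are invented for illustration.

```python
# Sketch of scaling inverse-mapping polynomial coefficients between
# colour planes by reciprocal wavelength, order by order.

def scale_coefficients(coeffs, lam_ref, lam):
    # coeffs maps (i, j) -> a_ij for the reference colour plane.
    return {(i, j): a * (lam_ref / lam) ** (i + j)
            for (i, j), a in coeffs.items()}

# Invented reference (red, 640 nm) coefficients, rescaled for blue (440 nm).
red = {(0, 0): 1.0, (1, 0): 2.0, (0, 1): 3.0, (1, 1): 4.0}
blue = scale_coefficients(red, 640.0, 440.0)
assert blue[(0, 0)] == 1.0                              # position unscaled
assert abs(blue[(1, 0)] - 2.0 * 640 / 440) < 1e-9       # scaled by ratio
assert abs(blue[(1, 1)] - 4.0 * (640 / 440) ** 2) < 1e-9  # ratio squared
```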
In other embodiments, however, a set of functions fR', gR', fG', gG', fB', gB' is employed to correct for chromatic aberration, positioning of the different coloured lasers and the like. When mapping using a forward function from the input image to the target image space, the scaling applied is to multiply rather than divide by wavelength, and the above approaches are adapted mutatis mutandis.
It is further desirable to correct for changes in brightness per unit area which result from the distortion in acute-angle image projection. One approach would be to calibrate for this change and provide an anti-distortion calibration map to apply similarly to that for spatial distortion. Another approach, however, is to determine an intensity scale factor as a function of position, for example by determining what change of area in the original, input image results from a change of corresponding area in the anti-distorted target image space by the same amount. This can be determined by determining the derivative of the target image with respect to the input image in each of two orthogonal directions in the image plane, more particularly by calculating an intensity scale factor A(x', y') according to
A(x', y') = (∂x/∂x')(∂y/∂y') = (∂f'(x', y')/∂x') · (∂g'(x', y')/∂y')
The skilled person will appreciate that in going from an input image pixel value to a target image pixel value, if the pixel value defines an intensity then this should be multiplied by (1/A), whereas if the pixel value defines amplitude then in going from the input image to the target image the amplitude is multiplied by √(1/A).
A different approach may, however, be employed when forward mapping from the input image to the target image space. In this case, where an input image pixel is mapped to an extended area or spot in the target image space, area correction may be performed automatically by adding the contributions from overlapping spots in the target image space - that is, a target image pixel value may be determined by adding pixel values from input image pixels whose extended spots overlap that target image pixel.
Furthermore, the use of a spatial light modulator such as a pixellated liquid crystal device, which may be transmissive or reflective, for displaying a target image based on the input image may allow the displayed image to be updated (e.g., scrolled, panned, zoomed) dynamically in response to touch and/or proximity/presence (or movement thereof).
The above-described methods, and responsive device control, may be implemented using processor control code on a data carrier, as previously described.
The techniques described herein have many applications which include, but are not limited to, touch-sensitive displays for the following: mobile phone; PDA; laptop; digital camera; digital video camera; games console; in-car cinema; navigation systems (in-car or personal, e.g. wristwatch GPS); head-up and helmet-mounted displays, e.g. for automobiles and aviation; watch; personal media player (e.g. photo/video viewer/player, MP3 player, personal video player); dashboard mounted display; laser light show box; personal video projector (a "video iPod (RTM)" concept); advertising and signage systems; computer (including desktop); remote control unit; an architectural fixture incorporating an image display system; more generally any device where it is desirable to share pictures and/or for more than one person at once to view an image.
Furthermore, the features described in the detailed description above may be present in any permutation in any embodiment. As described above, an embodiment may advantageously provide a combined input-output, i.e., interactive, display having a 3-D touch interface (i.e., one that is able to detect and respond to touch/presence of an object in different spatial regions). In particular, any above-described presence sensor system, which is not arranged to detect touch of the displayed image but merely presence and/or approach and/or location in a corresponding second distribution, may be provided to allow sleep and/or wake-up detection for any element of the device, in particular of the touch sensing system comprising light source for the first distribution and touch sensor system. Any number, e.g., 3, of presence sensing systems comprising a further distribution as described above may be present.
No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.
Claims
1. A touch sensitive image display device for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed, the device comprising: a light source to project a substantially two-dimensional first light distribution in a first plane; a light source to project a substantially two-dimensional second light distribution in a second plane, wherein said second plane is different to said first plane; a multi-pixel sensor system to remotely detect touch of an area of said surface within or adjacent to said displayed image by detecting light from said first distribution, and having an output to provide a detected touch signal; said multi-pixel sensor system to remotely detect presence of an object at least partially within said second light distribution by detecting light from said second distribution, and having an output to provide a detected presence signal; and a controller having an input to receive said detected touch signal and an input to receive said detected presence signal and configured to control said touch sensitive image display device responsive to said signals, wherein said device is configured to multiplex projection of the first light distribution and projection of the second light distribution.
2. The touch sensitive image display device according to claim 1, comprising: at least one further light source to project a further substantially two-dimensional second light distribution in a further plane, wherein said further plane is different to each other said plane; and at least one further multi-pixel sensor system, each said further multi-pixel sensor system to remotely detect presence of an object at least partially within at least one said further light distribution by detecting light from said at least one further distribution, and having an output to provide a detected presence signal.
3. The touch sensitive image display device according to claim 2, wherein said device is configured to multiplex projection of the first light distribution, the second light distribution and at least one said further light distribution.
4. The touch sensitive image display device according to any preceding claim, wherein said multi-pixel sensor system to detect touch comprises a plurality of multi-pixel sensors each to remotely detect touch of an area of said surface within or adjacent to said displayed image, and having an output to provide a detected touch signal.
5. The touch sensitive image display device according to any preceding claim, wherein said multi-pixel sensor system to detect presence comprises a plurality of multi-pixel sensors each to remotely detect presence of an object at least partially within said second light distribution, and having an output to provide a detected presence signal.
6. The touch sensitive image display device according to any preceding claim, wherein said light source to project said first light distribution is said light source to project said second light distribution.
7. The touch sensitive image display device according to any preceding claim, wherein said first plane and said second plane are substantially parallel planes.
8. The touch sensitive image display device according to any preceding claim, wherein said multiplexing is wavelength multiplexing.
9. The touch sensitive image display device according to any preceding claim, wherein said multiplexing is time multiplexing.
10. The touch sensitive image display device according to claim 9, wherein said responsive controller is configured to read said detected touch signal and said detected presence signal in synchronism with said time multiplexing.
11. The touch sensitive image display device according to any one of claims 9 and 10, wherein said controller is configured to perform said responsive control by distinguishing between said receiving of said detected touch signal and said receiving of a detected presence signal on the basis of a timing of receiving a said signal.
12. The touch sensitive image display device according to claim 11, configured to determine an action to be initiated by said responsive control on the basis of said distinguishing.
13. The touch sensitive image display device according to any one of claims 9 to 12, further comprising moveable optics to alternately project said first light distribution and said second light distribution.
14. The touch sensitive image display device according to claim 13, wherein said moveable optics comprises a rotatable mirror.
15. The touch sensitive image display device according to any preceding claim, wherein said multi-pixel sensor system to detect touch is to detect a location of said touch and said responsive control performs an action determined on the basis of said detected touch location and controls said device to perform said action.
16. The touch sensitive image display device according to any preceding claim, wherein said multi-pixel sensor system to detect presence is to detect a location of said presence and said responsive control performs an action determined on the basis of said detected presence location and controls said device to perform said action.
17. The touch sensitive image display device according to claim 15 or 16, wherein said action comprises selecting a menu on the basis of said detected presence location and displaying said selected menu.
18. The touch sensitive image display device according to any preceding claim, wherein said sensor system to detect touch is configured to detect light scattered from said first light distribution.
19. The touch sensitive image display device according to any preceding claim, wherein said sensor system to detect presence is configured to detect light scattered from said second light distribution.
20. The touch sensitive image display device according to any preceding claim, wherein said controller is configured to detect hovering of a said object by detecting absence of a said touch detection and occurrence of a said presence detection on the basis of said signals.
21. The touch sensitive image display device according to any preceding claim, further comprising a spatial light modulator (SLM) and a controller to control said SLM, on the basis of data defining said displayed image, to replay a target image distorted to compensate for projection onto said surface at said acute angle.
22. The touch sensitive image display device according to any preceding claim, wherein a multi-pixel sensor of said multi-pixel sensor system to detect touch is a multi-pixel sensor of said multi-pixel sensor system to detect presence.
23. The touch sensitive image display device according to any preceding claim, wherein said multi-pixel sensor system to detect touch is configured to use pixels of said multi-pixel sensor and said multi-pixel sensor system to detect presence is configured to use other pixels of said multi-pixel sensor.
24. A consumer electronic device such as a camcorder, camera, music player, mobile phone, media player or computer, comprising the touch sensitive image display device according to any preceding claim.
25. A touch sensitive image display device for projecting a touch sensitive displayed image at an acute angle onto a surface on which the device is placed, the device comprising: a light source to project a first light distribution; a light source to project a second light distribution; a multi-pixel sensor system to remotely detect location of touch of an area of said surface within or adjacent to said displayed image by detecting change of said first distribution, and having an output to provide a detected touch location signal; a multi-pixel sensor system to remotely detect location of an object at least partially within said second light distribution, and having an output to provide a detected object location signal; a controller having an input to receive said detected touch location signal and an input to receive said detected object location signal, and configured to control said device responsive to said detected touch location signal and to control said device responsive to said detected object location signal.
26. The touch sensitive image display device of claim 25, wherein at least one said output to provide a detected location signal is configured to indicate a light distribution of said location detection.
27. The touch sensitive image display device of claim 25 or 26, comprising: at least one light source to project at least one further light distribution; said multi-pixel sensor system to remotely detect location of an object at least partially within the or each said further light distribution, and having at least one output to indicate a said further distribution and to provide a signal indicating a said location of said object in said indicated further distribution.
28. The touch sensitive image display device of any one of claims 25 to 27, wherein the controller is configured to select an action on the basis of at least one of said signals, and to control said touch sensitive image display device to perform said action.
29. The touch sensitive image display device of claim 28, wherein the controller is configured to select said action further on the basis of at least one said indication of light distribution.
30. The touch sensitive image display device of any one of claims 25 to 29, wherein said touch sensitive image display is operable by a user's finger as a joystick.
31. The touch sensitive image display device of any one of claims 25 to 30, wherein said touch is a touch by a user's finger.
32. The touch sensitive image display device of any one of claims 25 to 31, wherein said object is a user's finger.
33. The touch sensitive image display device of any one of claims 25 to 32, wherein the controller is further configured to perform a said responsive control on the basis of a rate of change of at least one said detected location.
34. The touch sensitive image display device of any one of claims 25 to 33, wherein the controller is further configured to perform a said responsive control on the basis of a direction of change of at least one said detected location.
35. The touch sensitive image display device of any one of claims 25 to 34, wherein the controller is further configured to perform a said responsive control on the basis of a locus of change of at least one said detected location.
36. The touch sensitive image display device of any one of claims 25 to 35, wherein the controller is further configured to perform a said responsive control on the basis of difference between two said detected locations.
37. The touch sensitive image display device of any one of claims 25 to 36, wherein the controller is further configured to perform a said responsive control on the basis of a rate of change of difference between two said detected locations.
38. The touch sensitive image display device of any one of claims 25 to 37, wherein a said responsive control controls said touch sensitive image display device to update or move at least a portion of said displayed image.
39. The touch sensitive image display device of any one of claims 25 to 38, wherein a said responsive control controls said touch sensitive image display device to display a further image adjacent to or at least partially overlapping said displayed image.
40. The touch sensitive image display device of any one of claims 25 to 39, wherein a said multi-pixel sensor system comprises at least one multi-pixel sensor.
41. The touch sensitive image display device of any one of claims 25 to 40, wherein a said sensor of said multi-pixel sensor system to detect touch is a said sensor of said multi-pixel sensor system to remotely detect location of an object.
42. The touch sensitive image display device of any one of claims 25 to 41, further comprising an anti-distortion system to map a said detected location of touch to a portion of said input image.
43. The touch sensitive image display device of any one of claims 25 to 42, wherein at least one said light source is another of said light sources.
44. A consumer electronic device such as a camcorder, camera, music player, mobile phone, media player or computer, comprising the touch sensitive image display device of any one of claims 25 to 43.
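Claims 9 to 11 and 20 together describe a controller that reads the touch and presence signals in synchronism with the time multiplexing of the two light distributions, distinguishes the two signals by the timing of their arrival, and reports a hover when a presence detection occurs without a touch detection. The following sketch illustrates that classification logic under the assumption of a simple two-slot alternating schedule; the `State`, `classify` and `run` names are illustrative assumptions, not part of the patent:

```python
from enum import Enum

class State(Enum):
    NONE = "none"
    HOVER = "hover"   # presence detected without touch (claim 20)
    TOUCH = "touch"

def classify(touch: bool, presence: bool) -> State:
    """Claim 20: hovering is detected as the absence of a touch
    detection combined with the occurrence of a presence detection."""
    if touch:
        return State.TOUCH
    return State.HOVER if presence else State.NONE

def run(samples):
    """Read detections in synchronism with the time multiplexing
    (claim 10): consecutive slots alternate between the first
    (touch-plane) and second (presence-plane) light distributions,
    so slot timing alone tells the controller which signal it is
    receiving (claim 11)."""
    it = iter(samples)
    # Pair each touch-slot reading with the following presence-slot reading.
    return [classify(touch, presence) for touch, presence in zip(it, it)]
```

For example, `run([False, True, True, True])` classifies the first frame as a hover (presence only) and the second as a touch, which is the distinction the controller uses to select different actions (claim 12).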
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/130,742 US8947402B2 (en) | 2008-12-24 | 2009-12-23 | Touch sensitive image display |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0823457.7 | 2008-12-24 | ||
GB0823457A GB2466497B (en) | 2008-12-24 | 2008-12-24 | Touch sensitive holographic displays |
GB0909314.7 | 2009-05-29 | ||
GB0909313.9 | 2009-05-29 | ||
GBGB0909313.9A GB0909313D0 (en) | 2008-12-24 | 2009-05-29 | Display device |
GBGB0909314.7A GB0909314D0 (en) | 2008-12-24 | 2009-05-29 | Display device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010073047A1 true WO2010073047A1 (en) | 2010-07-01 |
Family
ID=40344121
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2009/051638 WO2010073024A1 (en) | 2008-12-24 | 2009-12-03 | Touch sensitive holographic displays |
PCT/GB2009/051768 WO2010073045A2 (en) | 2008-12-24 | 2009-12-23 | Display device |
PCT/GB2009/051770 WO2010073047A1 (en) | 2008-12-24 | 2009-12-23 | Touch sensitive image display device |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2009/051638 WO2010073024A1 (en) | 2008-12-24 | 2009-12-03 | Touch sensitive holographic displays |
PCT/GB2009/051768 WO2010073045A2 (en) | 2008-12-24 | 2009-12-23 | Display device |
Country Status (3)
Country | Link |
---|---|
US (4) | US8514194B2 (en) |
GB (5) | GB2466497B (en) |
WO (3) | WO2010073024A1 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012172364A2 (en) | 2011-06-16 | 2012-12-20 | Light Blue Optics Ltd | Touch-sensitive display devices |
WO2012172360A2 (en) | 2011-06-16 | 2012-12-20 | Light Blue Optics Ltd | Touch-sensitive display devices |
WO2012172363A2 (en) | 2011-06-16 | 2012-12-20 | Light Blue Optics Ltd | Touch sensitive display devices |
WO2013054096A1 (en) | 2011-10-11 | 2013-04-18 | Light Blue Optics Limited | Touch-sensitive display devices |
WO2013108032A1 (en) | 2012-01-20 | 2013-07-25 | Light Blue Optics Limited | Touch sensitive image display devices |
WO2013108031A2 (en) | 2012-01-20 | 2013-07-25 | Light Blue Optics Limited | Touch sensitive image display devices |
CN103299259A (en) * | 2011-03-15 | 2013-09-11 | 株式会社尼康 | Detection device, input device, projector, and electronic apparatus |
WO2013144599A2 (en) | 2012-03-26 | 2013-10-03 | Light Blue Optics Ltd | Touch sensing systems |
US8902484B2 (en) | 2010-12-15 | 2014-12-02 | Qualcomm Mems Technologies, Inc. | Holographic brightness enhancement film |
US8947401B2 (en) | 2008-12-24 | 2015-02-03 | Light Blue Optics Ltd | Display device |
US9019240B2 (en) | 2011-09-29 | 2015-04-28 | Qualcomm Mems Technologies, Inc. | Optical touch device with pixilated light-turning features |
Families Citing this family (133)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8381135B2 (en) | 2004-07-30 | 2013-02-19 | Apple Inc. | Proximity detector in handheld device |
US9158116B1 (en) | 2014-04-25 | 2015-10-13 | Osterhout Group, Inc. | Temple and ear horn assembly for headworn computer |
US9298007B2 (en) | 2014-01-21 | 2016-03-29 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9965681B2 (en) | 2008-12-16 | 2018-05-08 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US20150205111A1 (en) | 2014-01-21 | 2015-07-23 | Osterhout Group, Inc. | Optical configurations for head worn computing |
US20150277120A1 (en) | 2014-01-21 | 2015-10-01 | Osterhout Group, Inc. | Optical configurations for head worn computing |
US9366867B2 (en) | 2014-07-08 | 2016-06-14 | Osterhout Group, Inc. | Optical systems for see-through displays |
US9952664B2 (en) | 2014-01-21 | 2018-04-24 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9715112B2 (en) | 2014-01-21 | 2017-07-25 | Osterhout Group, Inc. | Suppression of stray light in head worn computing |
US9400390B2 (en) | 2014-01-24 | 2016-07-26 | Osterhout Group, Inc. | Peripheral lighting for head worn computing |
US9229233B2 (en) | 2014-02-11 | 2016-01-05 | Osterhout Group, Inc. | Micro Doppler presentations in head worn computing |
GB0920754D0 (en) * | 2009-11-27 | 2010-01-13 | Compurants Ltd | Inamo big book 1 |
US9110495B2 (en) | 2010-02-03 | 2015-08-18 | Microsoft Technology Licensing, Llc | Combined surface user interface |
EP2369443B1 (en) * | 2010-03-25 | 2017-01-11 | BlackBerry Limited | System and method for gesture detection and feedback |
US9760123B2 (en) | 2010-08-06 | 2017-09-12 | Dynavox Systems Llc | Speech generation device with a projected display and optical inputs |
KR101784517B1 (en) * | 2010-08-17 | 2017-10-11 | 엘지이노텍 주식회사 | Optical touch screen and method for assembling the same |
US8542218B2 (en) * | 2010-08-19 | 2013-09-24 | Hyundai Motor Company | Electronic switch apparatus for vehicle |
US8576243B2 (en) * | 2010-10-08 | 2013-11-05 | Hewlett-Packard Development Company, L.P. | Display-color function image conversion |
US20120176341A1 (en) * | 2011-01-11 | 2012-07-12 | Texas Instruments Incorporated | Method and apparatus for camera projector system for enabling an interactive surface |
FR2971066B1 (en) * | 2011-01-31 | 2013-08-23 | Nanotec Solution | THREE-DIMENSIONAL MAN-MACHINE INTERFACE. |
US8847919B2 (en) * | 2011-02-02 | 2014-09-30 | Apple Inc. | Interactive holographic display device |
EP2697720B1 (en) * | 2011-04-13 | 2023-02-22 | Razer (Asia-Pacific) Pte. Ltd. | Computer peripheral display and communication device providing an adjunct 3d user interface |
US8937588B2 (en) * | 2011-06-15 | 2015-01-20 | Smart Technologies Ulc | Interactive input system and method of operating the same |
TWI490733B (en) * | 2011-11-01 | 2015-07-01 | Pixart Imaging Inc | Handwriting system and sensing method thereof |
JP5927867B2 (en) * | 2011-11-28 | 2016-06-01 | セイコーエプソン株式会社 | Display system and operation input method |
US9002058B2 (en) | 2011-12-01 | 2015-04-07 | Microvision, Inc. | Scanned image projection system with gesture control input |
DE102011056006B4 (en) * | 2011-12-01 | 2016-03-10 | Seereal Technologies S.A. | Method for coding a hologram in a light modulation device |
WO2013106606A1 (en) * | 2012-01-10 | 2013-07-18 | Maxim Integrated Products, Inc. | Method and apparatus for activating electronic devices with gestures |
JP5924020B2 (en) * | 2012-02-16 | 2016-05-25 | セイコーエプソン株式会社 | Projector and projector control method |
US20130257825A1 (en) * | 2012-03-31 | 2013-10-03 | Smart Technologies Ulc | Interactive input system and pen tool therefor |
US9633186B2 (en) | 2012-04-23 | 2017-04-25 | Apple Inc. | Systems and methods for controlling output of content based on human recognition data detection |
TW201346516A (en) * | 2012-05-11 | 2013-11-16 | Pixart Imaging Inc | Sensing assembly having power saving capability and sensing method thereof |
US9110527B2 (en) | 2012-06-08 | 2015-08-18 | Apple Inc. | Condition based controls for a display based on at least one operating parameter |
KR101385438B1 (en) | 2012-06-12 | 2014-04-15 | 삼성디스플레이 주식회사 | Touch screen panel |
US20150160741A1 (en) * | 2012-06-20 | 2015-06-11 | 3M Innovative Properties Company | Device allowing tool-free interactivity with a projected image |
DE102012014910A1 (en) * | 2012-07-27 | 2014-01-30 | Volkswagen Aktiengesellschaft | User interface, method for displaying information and program facilitating operation of an operator interface |
GB2506849A (en) | 2012-09-26 | 2014-04-16 | Light Blue Optics Ltd | A touch sensing system using a pen |
KR101691633B1 (en) * | 2012-11-01 | 2017-01-09 | 아이캠, 엘엘씨 | Wireless wrist computing and control device and method for 3d imaging, mapping, networking and interfacing |
US9146668B2 (en) | 2013-01-31 | 2015-09-29 | Hewlett-Packard Development Company, L.P. | Graphical element placement on a display surface |
FR3002052B1 (en) | 2013-02-14 | 2016-12-09 | Fogale Nanotech | METHOD AND DEVICE FOR NAVIGATING A DISPLAY SCREEN AND APPARATUS COMPRISING SUCH A NAVIGATION |
WO2015017346A1 (en) * | 2013-07-30 | 2015-02-05 | Dolby Laboratories Licensing Corporation | Projector display systems having non-mechanical mirror beam steering |
US9778546B2 (en) * | 2013-08-15 | 2017-10-03 | Mep Tech, Inc. | Projector for projecting visible and non-visible images |
US9529195B2 (en) | 2014-01-21 | 2016-12-27 | Osterhout Group, Inc. | See-through computer display systems |
US10684687B2 (en) | 2014-12-03 | 2020-06-16 | Mentor Acquisition One, Llc | See-through computer display systems |
US9841599B2 (en) | 2014-06-05 | 2017-12-12 | Osterhout Group, Inc. | Optical configurations for head-worn see-through displays |
US9575321B2 (en) | 2014-06-09 | 2017-02-21 | Osterhout Group, Inc. | Content presentation in head worn computing |
US9448409B2 (en) | 2014-11-26 | 2016-09-20 | Osterhout Group, Inc. | See-through computer display systems |
US20160019715A1 (en) | 2014-07-15 | 2016-01-21 | Osterhout Group, Inc. | Content presentation in head worn computing |
US9299194B2 (en) | 2014-02-14 | 2016-03-29 | Osterhout Group, Inc. | Secure sharing in head worn computing |
US11103122B2 (en) | 2014-07-15 | 2021-08-31 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
US9671613B2 (en) | 2014-09-26 | 2017-06-06 | Osterhout Group, Inc. | See-through computer display systems |
US9366868B2 (en) | 2014-09-26 | 2016-06-14 | Osterhout Group, Inc. | See-through computer display systems |
US9939934B2 (en) | 2014-01-17 | 2018-04-10 | Osterhout Group, Inc. | External user interface for head worn computing |
US10254856B2 (en) | 2014-01-17 | 2019-04-09 | Osterhout Group, Inc. | External user interface for head worn computing |
US11227294B2 (en) | 2014-04-03 | 2022-01-18 | Mentor Acquisition One, Llc | Sight information collection in head worn computing |
US9810906B2 (en) | 2014-06-17 | 2017-11-07 | Osterhout Group, Inc. | External user interface for head worn computing |
US9829707B2 (en) | 2014-08-12 | 2017-11-28 | Osterhout Group, Inc. | Measuring content brightness in head worn computing |
US10191279B2 (en) | 2014-03-17 | 2019-01-29 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US20150277118A1 (en) | 2014-03-28 | 2015-10-01 | Osterhout Group, Inc. | Sensor dependent content position in head worn computing |
US9746686B2 (en) | 2014-05-19 | 2017-08-29 | Osterhout Group, Inc. | Content position calibration in head worn computing |
US9594246B2 (en) | 2014-01-21 | 2017-03-14 | Osterhout Group, Inc. | See-through computer display systems |
US10649220B2 (en) | 2014-06-09 | 2020-05-12 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
US9766463B2 (en) | 2014-01-21 | 2017-09-19 | Osterhout Group, Inc. | See-through computer display systems |
US11669163B2 (en) | 2014-01-21 | 2023-06-06 | Mentor Acquisition One, Llc | Eye glint imaging in see-through computer display systems |
JP6398248B2 (en) * | 2014-01-21 | 2018-10-03 | セイコーエプソン株式会社 | Position detection system and method for controlling position detection system |
US11892644B2 (en) | 2014-01-21 | 2024-02-06 | Mentor Acquisition One, Llc | See-through computer display systems |
US9740280B2 (en) | 2014-01-21 | 2017-08-22 | Osterhout Group, Inc. | Eye imaging in head worn computing |
JP6375672B2 (en) * | 2014-01-21 | 2018-08-22 | セイコーエプソン株式会社 | Position detecting apparatus and position detecting method |
US20150205135A1 (en) | 2014-01-21 | 2015-07-23 | Osterhout Group, Inc. | See-through computer display systems |
US11737666B2 (en) | 2014-01-21 | 2023-08-29 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US9836122B2 (en) | 2014-01-21 | 2017-12-05 | Osterhout Group, Inc. | Eye glint imaging in see-through computer display systems |
US9523856B2 (en) | 2014-01-21 | 2016-12-20 | Osterhout Group, Inc. | See-through computer display systems |
US9651784B2 (en) | 2014-01-21 | 2017-05-16 | Osterhout Group, Inc. | See-through computer display systems |
US11487110B2 (en) | 2014-01-21 | 2022-11-01 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US9532715B2 (en) | 2014-01-21 | 2017-01-03 | Osterhout Group, Inc. | Eye imaging in head worn computing |
US9494800B2 (en) | 2014-01-21 | 2016-11-15 | Osterhout Group, Inc. | See-through computer display systems |
US9310610B2 (en) | 2014-01-21 | 2016-04-12 | Osterhout Group, Inc. | See-through computer display systems |
US9753288B2 (en) | 2014-01-21 | 2017-09-05 | Osterhout Group, Inc. | See-through computer display systems |
US9846308B2 (en) | 2014-01-24 | 2017-12-19 | Osterhout Group, Inc. | Haptic systems for head-worn computers |
US9401540B2 (en) | 2014-02-11 | 2016-07-26 | Osterhout Group, Inc. | Spatial location presentation in head worn computing |
US20150241963A1 (en) | 2014-02-11 | 2015-08-27 | Osterhout Group, Inc. | Eye imaging in head worn computing |
EP3111299A4 (en) * | 2014-02-28 | 2017-11-22 | Hewlett-Packard Development Company, L.P. | Calibration of sensors and projector |
US9823623B2 (en) * | 2014-03-27 | 2017-11-21 | City University Of Hong Kong | Conversion of complex holograms to phase holograms |
US20160187651A1 (en) | 2014-03-28 | 2016-06-30 | Osterhout Group, Inc. | Safety for a vehicle operator with an hmd |
US9672210B2 (en) | 2014-04-25 | 2017-06-06 | Osterhout Group, Inc. | Language translation with head-worn computing |
US9651787B2 (en) | 2014-04-25 | 2017-05-16 | Osterhout Group, Inc. | Speaker assembly for headworn computer |
US10853589B2 (en) | 2014-04-25 | 2020-12-01 | Mentor Acquisition One, Llc | Language translation with head-worn computing |
US9423842B2 (en) | 2014-09-18 | 2016-08-23 | Osterhout Group, Inc. | Thermal management for head-worn computer |
CN103995637B (en) * | 2014-04-28 | 2015-08-12 | 京东方科技集团股份有限公司 | Based on the touch control identification device of Doppler effect, method and touch-screen |
US10663740B2 (en) | 2014-06-09 | 2020-05-26 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
US9508195B2 (en) | 2014-09-03 | 2016-11-29 | Microsoft Technology Licensing, Llc | Management of content in a 3D holographic environment |
CN104243882B (en) * | 2014-09-09 | 2018-06-01 | 联想(北京)有限公司 | A kind of projecting method and wearable electronic equipment |
US9778736B2 (en) | 2014-09-22 | 2017-10-03 | Rovi Guides, Inc. | Methods and systems for calibrating user devices |
US9937793B2 (en) * | 2014-09-30 | 2018-04-10 | Continental Automotive Systems, Inc. | Three dimensional view interactive activation system to deploy information |
US10134082B2 (en) | 2014-10-13 | 2018-11-20 | Paypal, Inc. | Virtual display device for an interactive merchant sales environment |
US9684172B2 (en) | 2014-12-03 | 2017-06-20 | Osterhout Group, Inc. | Head worn computer display systems |
US10757382B2 (en) * | 2014-12-18 | 2020-08-25 | Nec Corporation | Projection apparatus and interface apparatus |
USD743963S1 (en) | 2014-12-22 | 2015-11-24 | Osterhout Group, Inc. | Air mouse |
USD751552S1 (en) | 2014-12-31 | 2016-03-15 | Osterhout Group, Inc. | Computer glasses |
USD753114S1 (en) | 2015-01-05 | 2016-04-05 | Osterhout Group, Inc. | Air mouse |
US20160239985A1 (en) | 2015-02-17 | 2016-08-18 | Osterhout Group, Inc. | See-through computer display systems |
US9959658B2 (en) | 2015-02-26 | 2018-05-01 | Rovi Guides, Inc. | Methods and systems for generating holographic animations |
JP6477130B2 (en) * | 2015-03-27 | 2019-03-06 | セイコーエプソン株式会社 | Interactive projector and interactive projection system |
US10200701B2 (en) * | 2015-10-14 | 2019-02-05 | Qualcomm Incorporated | HDR and WCG coding architecture with SDR backwards compatibility in a single bitstream for video coding |
US10824253B2 (en) | 2016-05-09 | 2020-11-03 | Mentor Acquisition One, Llc | User interface systems for head-worn computers |
US10466491B2 (en) | 2016-06-01 | 2019-11-05 | Mentor Acquisition One, Llc | Modular systems for head-worn computers |
US9910284B1 (en) | 2016-09-08 | 2018-03-06 | Osterhout Group, Inc. | Optical systems for head-worn computers |
US10684478B2 (en) | 2016-05-09 | 2020-06-16 | Mentor Acquisition One, Llc | User interface systems for head-worn computers |
JP6780315B2 (en) * | 2016-06-22 | 2020-11-04 | カシオ計算機株式会社 | Projection device, projection system, projection method and program |
US10164631B2 (en) * | 2016-11-09 | 2018-12-25 | Ford Global Technologies, Llc | Holographic proximity switch |
US11755152B2 (en) * | 2017-03-23 | 2023-09-12 | Sony Corporation | Projector with detection function for stabilizing intensity distribution of an irradiation beam |
US10404306B2 (en) | 2017-05-30 | 2019-09-03 | International Business Machines Corporation | Paint on micro chip touch screens |
US10578869B2 (en) | 2017-07-24 | 2020-03-03 | Mentor Acquisition One, Llc | See-through computer display systems with adjustable zoom cameras |
US11409105B2 (en) | 2017-07-24 | 2022-08-09 | Mentor Acquisition One, Llc | See-through computer display systems |
US10422995B2 (en) | 2017-07-24 | 2019-09-24 | Mentor Acquisition One, Llc | See-through computer display systems with stray light management |
JP7024981B2 (en) | 2017-08-01 | 2022-02-24 | シグマ ラボズ,インコーポレイテッド | Systems and methods for measuring radiant thermal energy during additive manufacturing operations |
US10969584B2 (en) | 2017-08-04 | 2021-04-06 | Mentor Acquisition One, Llc | Image expansion optic for head-worn computer |
US11517984B2 (en) | 2017-11-07 | 2022-12-06 | Sigma Labs, Inc. | Methods and systems for quality inference and control for additive manufacturing processes |
TR201718174A2 (en) * | 2017-11-17 | 2019-06-21 | Arcelik As | A HOME APPLIANCE |
KR102568792B1 (en) * | 2017-12-04 | 2023-08-21 | 삼성전자주식회사 | Multi-image display apparatus including diffractive optical element |
DE112019000521B4 (en) | 2018-02-21 | 2022-02-03 | Sigma Labs, Inc. | Additive manufacturing system and additive manufacturing process |
DE112019000498B4 (en) | 2018-02-21 | 2022-06-09 | Sigma Labs, Inc. | Additive manufacturing process |
EP3556702A1 (en) | 2018-03-13 | 2019-10-23 | Otis Elevator Company | Augmented reality car operating panel |
US10712990B2 (en) | 2018-03-19 | 2020-07-14 | At&T Intellectual Property I, L.P. | Systems and methods for a customer assistance station |
US11188154B2 (en) * | 2018-05-30 | 2021-11-30 | International Business Machines Corporation | Context dependent projection of holographic objects |
GB2575658B (en) * | 2018-07-18 | 2020-12-23 | Envisics Ltd | Head-up display |
US11796959B2 (en) | 2019-01-25 | 2023-10-24 | International Business Machines Corporation | Augmented image viewing with three dimensional objects |
CN110308817B (en) * | 2019-06-10 | 2023-04-07 | 青岛小鸟看看科技有限公司 | Touch action identification method and touch projection system |
US11287655B2 (en) * | 2019-06-21 | 2022-03-29 | Samsung Electronics Co., Ltd. | Holographic display apparatus and method for providing expanded viewing window |
US10880528B1 (en) * | 2019-10-31 | 2020-12-29 | Christie Digital Systems Usa, Inc. | Device, system and method for modulating light using a phase light modulator and a spatial light modulator |
US20220335698A1 (en) * | 2019-12-17 | 2022-10-20 | Ashley SinHee Kim | System and method for transforming mapping information to an illustrated map |
JP7358018B2 (en) * | 2019-12-19 | 2023-10-10 | アルパイン株式会社 | Proximity detection device |
US11487400B1 (en) | 2021-08-13 | 2022-11-01 | International Business Machines Corporation | Aggregated multidimensional user interface display with electronic pen for holographic projection |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4384201A (en) * | 1978-04-24 | 1983-05-17 | Carroll Manufacturing Corporation | Three-dimensional protective interlock apparatus |
DE4121180A1 (en) * | 1991-06-27 | 1993-01-07 | Bosch Gmbh Robert | Finger input type interactive screen display system for road vehicle navigation - has panel screen with matrix of sensing elements that can be of infrared or ultrasonic proximity devices or can be touch foil contacts |
US6323942B1 (en) | 1999-04-30 | 2001-11-27 | Canesta, Inc. | CMOS-compatible three-dimensional image sensor IC |
US20020021287A1 (en) * | 2000-02-11 | 2002-02-21 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US6367933B1 (en) | 1998-10-02 | 2002-04-09 | Macronix International Co., Ltd. | Method and apparatus for preventing keystone distortion |
US6491400B1 (en) | 2000-10-24 | 2002-12-10 | Eastman Kodak Company | Correcting for keystone distortion in a digital image displayed by a digital projector |
WO2002101443A2 (en) | 2001-06-12 | 2002-12-19 | Silicon Optix Inc. | System and method for correcting keystone distortion |
WO2005059660A2 (en) | 2003-12-15 | 2005-06-30 | Cambridge University Technical Services Limited | Video holographic apparatus and method |
US20060187199A1 (en) * | 2005-02-24 | 2006-08-24 | Vkb Inc. | System and method for projection |
WO2006108443A1 (en) | 2005-04-13 | 2006-10-19 | Sensitive Object | Method for determining the location of impacts by acoustic imaging |
WO2006134398A2 (en) | 2005-06-14 | 2006-12-21 | Light Blue Optics Ltd | Signal processing system for synthesizing holograms |
WO2007031797A2 (en) | 2005-09-16 | 2007-03-22 | Light Blue Optics Ltd | Methods and apparatus for displaying images using holograms |
US20070222760A1 (en) | 2001-01-08 | 2007-09-27 | Vkb Inc. | Data input device |
WO2007110668A2 (en) | 2006-03-28 | 2007-10-04 | Light Blue Optics Ltd | Holographic display devices |
WO2007141567A1 (en) | 2006-06-02 | 2007-12-13 | Light Blue Optics Ltd | Methods and apparatus for displaying colour images using holograms |
WO2008038275A2 (en) | 2006-09-28 | 2008-04-03 | Lumio Inc. | Optical touch panel |
US7379619B2 (en) | 2005-03-09 | 2008-05-27 | Texas Instruments Incorporated | System and method for two-dimensional keystone correction for aerial imaging |
WO2008146098A1 (en) | 2007-05-28 | 2008-12-04 | Sensitive Object | Method for determining the position of an excitation on a surface and device for implementing such a method |
Family Cites Families (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4827085A (en) * | 1987-11-19 | 1989-05-02 | Ovonic Imaging Systems, Inc. | Voice and image teleconferencing system including paperless facsimile means |
US5498867A (en) | 1993-11-22 | 1996-03-12 | Sachio Uehara | Wavelength-division multiplex digital optical position sensor |
US6281878B1 (en) * | 1994-11-01 | 2001-08-28 | Stephen V. R. Montellese | Apparatus and method for inputing data |
US6031519A (en) * | 1997-12-30 | 2000-02-29 | O'brien; Wayne P. | Holographic direct manipulation interface |
US6690357B1 (en) | 1998-10-07 | 2004-02-10 | Intel Corporation | Input device using scanning sensors |
GB2343023A (en) | 1998-10-21 | 2000-04-26 | Global Si Consultants Limited | Apparatus for order control |
US7050177B2 (en) | 2002-05-22 | 2006-05-23 | Canesta, Inc. | Method and apparatus for approximating depth of an object's placement onto a monitored region with applications to virtual interface devices |
US20030132921A1 (en) | 1999-11-04 | 2003-07-17 | Torunoglu Ilhami Hasan | Portable sensory input device |
US6567190B1 (en) * | 1999-11-05 | 2003-05-20 | Eastman Kodak Company | Multi-functional scanner and method of assembling same |
US8482535B2 (en) | 1999-11-08 | 2013-07-09 | Apple Inc. | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US6611252B1 (en) | 2000-05-17 | 2003-08-26 | Dufaux Douglas P. | Virtual data input device |
US6650318B1 (en) * | 2000-10-13 | 2003-11-18 | Vkb Inc. | Data input device |
EP1316055A4 (en) | 2000-05-29 | 2006-10-04 | Vkb Inc | Virtual data entry device and method for input of alphanumeric and other data |
US6803906B1 (en) | 2000-07-05 | 2004-10-12 | Smart Technologies, Inc. | Passive touch system and method of detecting user input |
US7259747B2 (en) | 2001-06-05 | 2007-08-21 | Reactrix Systems, Inc. | Interactive video display system |
US7710391B2 (en) | 2002-05-28 | 2010-05-04 | Matthew Bell | Processing an image utilizing a spatially varying pattern |
US7307661B2 (en) | 2002-06-26 | 2007-12-11 | Vkb Inc. | Multifunctional integrated image sensor and application to virtual interface technology |
US7151530B2 (en) | 2002-08-20 | 2006-12-19 | Canesta, Inc. | System and method for determining an input selected by a user through a virtual interface |
US7671843B2 (en) * | 2002-11-12 | 2010-03-02 | Steve Montellese | Virtual holographic input method and device |
DE10260305A1 (en) | 2002-12-20 | 2004-07-15 | Siemens Ag | HMI setup with an optical touch screen |
US7173605B2 (en) | 2003-07-18 | 2007-02-06 | International Business Machines Corporation | Method and apparatus for providing projected user interface for computing device |
US7317954B2 (en) | 2003-12-12 | 2008-01-08 | Conmed Corporation | Virtual control of electrosurgical generator functions |
US7317955B2 (en) | 2003-12-12 | 2008-01-08 | Conmed Corporation | Virtual operating room integration |
US7355593B2 (en) | 2004-01-02 | 2008-04-08 | Smart Technologies, Inc. | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US20050156952A1 (en) * | 2004-01-20 | 2005-07-21 | Orner Edward E. | Interactive display systems |
US7432917B2 (en) | 2004-06-16 | 2008-10-07 | Microsoft Corporation | Calibration of an interactive display system |
US7453419B2 (en) * | 2004-11-24 | 2008-11-18 | Microsoft Corporation | Edge lighting system for interactive display surface |
US7248151B2 (en) | 2005-01-05 | 2007-07-24 | General Motors Corporation | Virtual keypad for vehicle entry control |
TW200627244A (en) | 2005-01-17 | 2006-08-01 | Era Optoelectronics Inc | Data input device |
JP4689684B2 (en) | 2005-01-21 | 2011-05-25 | ジェスチャー テック,インコーポレイテッド | Tracking based on movement |
US20060244720A1 (en) * | 2005-04-29 | 2006-11-02 | Tracy James L | Collapsible projection assembly |
JPWO2007013608A1 (en) | 2005-07-28 | 2009-02-12 | パナソニック株式会社 | Laser light source and display device |
KR100631779B1 (en) | 2005-10-07 | 2006-10-11 | 삼성전자주식회사 | Data input apparatus and method for data input detection using the same |
US7599561B2 (en) | 2006-02-28 | 2009-10-06 | Microsoft Corporation | Compact interactive tabletop with projection-vision |
US7515822B2 (en) * | 2006-05-12 | 2009-04-07 | Microsoft Corporation | Imaging systems' direct illumination level adjusting method and system involves adjusting operation of image sensor of imaging system based on detected level of ambient illumination |
US8588862B2 (en) * | 2006-08-28 | 2013-11-19 | Motorola Mobility Llc | Alert sleep and wakeup for a mobile station |
US8022941B2 (en) * | 2006-10-12 | 2011-09-20 | Disney Enterprises, Inc. | Multi-user touch screen |
GB2445164A (en) | 2006-12-21 | 2008-07-02 | Light Blue Optics Ltd | Holographic image display systems |
GB2448132B (en) | 2007-03-30 | 2012-10-10 | Light Blue Optics Ltd | Optical Systems |
US8094137B2 (en) | 2007-07-23 | 2012-01-10 | Smart Technologies Ulc | System and method of detecting contact on a display |
US8072448B2 (en) | 2008-01-15 | 2011-12-06 | Google Inc. | Three-dimensional annotations for street view data |
US8384005B2 (en) | 2008-06-17 | 2013-02-26 | The Invention Science Fund I, Llc | Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface |
EP2335138A4 (en) | 2008-08-15 | 2012-12-19 | Qualcomm Inc | Enhanced multi-touch detection |
KR101548997B1 (en) | 2008-09-03 | 2015-09-01 | 엘지전자 주식회사 | Projection display device |
GB2466023A (en) | 2008-12-08 | 2010-06-09 | Light Blue Optics Ltd | Holographic Image Projection Systems |
GB2466497B (en) | 2008-12-24 | 2011-09-14 | Light Blue Optics Ltd | Touch sensitive holographic displays |
US8547327B2 (en) | 2009-10-07 | 2013-10-01 | Qualcomm Incorporated | Proximity object tracker |
GB201110156D0 (en) * | 2011-06-16 | 2011-07-27 | Light Blue Optics Ltd | Touch-sensitive display devices |
2008
- 2008-12-24 GB GB0823457A patent/GB2466497B/en not_active Expired - Fee Related
2009
- 2009-05-29 GB GBGB0909311.3A patent/GB0909311D0/en active Pending
- 2009-05-29 GB GBGB0909315.4A patent/GB0909315D0/en active Pending
- 2009-05-29 GB GBGB0909313.9A patent/GB0909313D0/en active Pending
- 2009-05-29 GB GBGB0909314.7A patent/GB0909314D0/en active Pending
- 2009-12-03 WO PCT/GB2009/051638 patent/WO2010073024A1/en active Application Filing
- 2009-12-03 US US13/130,738 patent/US8514194B2/en not_active Expired - Fee Related
- 2009-12-23 US US13/130,741 patent/US8947401B2/en not_active Expired - Fee Related
- 2009-12-23 WO PCT/GB2009/051768 patent/WO2010073045A2/en active Application Filing
- 2009-12-23 US US13/130,742 patent/US8947402B2/en not_active Expired - Fee Related
- 2009-12-23 WO PCT/GB2009/051770 patent/WO2010073047A1/en active Application Filing
2013
- 2013-07-03 US US13/934,223 patent/US9557855B2/en not_active Expired - Fee Related
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4384201A (en) * | 1978-04-24 | 1983-05-17 | Carroll Manufacturing Corporation | Three-dimensional protective interlock apparatus |
DE4121180A1 (en) * | 1991-06-27 | 1993-01-07 | Bosch Gmbh Robert | Finger input type interactive screen display system for road vehicle navigation - has panel screen with matrix of sensing elements that can be of infrared or ultrasonic proximity devices or can be touch foil contacts |
US6367933B1 (en) | 1998-10-02 | 2002-04-09 | Macronix International Co., Ltd. | Method and apparatus for preventing keystone distortion |
US6323942B1 (en) | 1999-04-30 | 2001-11-27 | Canesta, Inc. | CMOS-compatible three-dimensional image sensor IC |
US20020021287A1 (en) * | 2000-02-11 | 2002-02-21 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US6491400B1 (en) | 2000-10-24 | 2002-12-10 | Eastman Kodak Company | Correcting for keystone distortion in a digital image displayed by a digital projector |
US20070222760A1 (en) | 2001-01-08 | 2007-09-27 | Vkb Inc. | Data input device |
WO2002101443A2 (en) | 2001-06-12 | 2002-12-19 | Silicon Optix Inc. | System and method for correcting keystone distortion |
WO2005059660A2 (en) | 2003-12-15 | 2005-06-30 | Cambridge University Technical Services Limited | Video holographic apparatus and method |
US20060187199A1 (en) * | 2005-02-24 | 2006-08-24 | Vkb Inc. | System and method for projection |
US7379619B2 (en) | 2005-03-09 | 2008-05-27 | Texas Instruments Incorporated | System and method for two-dimensional keystone correction for aerial imaging |
WO2006108443A1 (en) | 2005-04-13 | 2006-10-19 | Sensitive Object | Method for determining the location of impacts by acoustic imaging |
WO2006134398A2 (en) | 2005-06-14 | 2006-12-21 | Light Blue Optics Ltd | Signal processing system for synthesizing holograms |
WO2007031797A2 (en) | 2005-09-16 | 2007-03-22 | Light Blue Optics Ltd | Methods and apparatus for displaying images using holograms |
WO2007110668A2 (en) | 2006-03-28 | 2007-10-04 | Light Blue Optics Ltd | Holographic display devices |
WO2007141567A1 (en) | 2006-06-02 | 2007-12-13 | Light Blue Optics Ltd | Methods and apparatus for displaying colour images using holograms |
WO2008038275A2 (en) | 2006-09-28 | 2008-04-03 | Lumio Inc. | Optical touch panel |
WO2008146098A1 (en) | 2007-05-28 | 2008-12-04 | Sensitive Object | Method for determining the position of an excitation on a surface and device for implementing such a method |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9557855B2 (en) | 2008-12-24 | 2017-01-31 | Promethean Limited | Touch sensitive holographic displays |
US8947401B2 (en) | 2008-12-24 | 2015-02-03 | Light Blue Optics Ltd | Display device |
US8902484B2 (en) | 2010-12-15 | 2014-12-02 | Qualcomm Mems Technologies, Inc. | Holographic brightness enhancement film |
EP2687959A4 (en) * | 2011-03-15 | 2014-10-08 | Nikon Corp | Detection device, input device, projector, and electronic apparatus |
CN103299259A (en) * | 2011-03-15 | 2013-09-11 | 株式会社尼康 | Detection device, input device, projector, and electronic apparatus |
EP2687959A1 (en) * | 2011-03-15 | 2014-01-22 | Nikon Corporation | Detection device, input device, projector, and electronic apparatus |
WO2012172364A2 (en) | 2011-06-16 | 2012-12-20 | Light Blue Optics Ltd | Touch-sensitive display devices |
WO2012172363A2 (en) | 2011-06-16 | 2012-12-20 | Light Blue Optics Ltd | Touch sensitive display devices |
WO2012172360A2 (en) | 2011-06-16 | 2012-12-20 | Light Blue Optics Ltd | Touch-sensitive display devices |
US9019240B2 (en) | 2011-09-29 | 2015-04-28 | Qualcomm Mems Technologies, Inc. | Optical touch device with pixilated light-turning features |
WO2013054096A1 (en) | 2011-10-11 | 2013-04-18 | Light Blue Optics Limited | Touch-sensitive display devices |
WO2013108032A1 (en) | 2012-01-20 | 2013-07-25 | Light Blue Optics Limited | Touch sensitive image display devices |
WO2013108031A2 (en) | 2012-01-20 | 2013-07-25 | Light Blue Optics Limited | Touch sensitive image display devices |
WO2013144599A2 (en) | 2012-03-26 | 2013-10-03 | Light Blue Optics Ltd | Touch sensing systems |
Also Published As
Publication number | Publication date |
---|---|
US8947402B2 (en) | 2015-02-03 |
US20110254811A1 (en) | 2011-10-20 |
US20130293516A1 (en) | 2013-11-07 |
WO2010073045A2 (en) | 2010-07-01 |
US8514194B2 (en) | 2013-08-20 |
WO2010073045A3 (en) | 2010-08-26 |
WO2010073024A1 (en) | 2010-07-01 |
US20110248963A1 (en) | 2011-10-13 |
GB0909311D0 (en) | 2009-07-15 |
GB0909314D0 (en) | 2009-07-15 |
US9557855B2 (en) | 2017-01-31 |
GB2466497B (en) | 2011-09-14 |
GB2466497A (en) | 2010-06-30 |
GB0909315D0 (en) | 2009-07-15 |
US20110251905A1 (en) | 2011-10-13 |
GB0823457D0 (en) | 2009-01-28 |
US8947401B2 (en) | 2015-02-03 |
GB0909313D0 (en) | 2009-07-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8947402B2 (en) | Touch sensitive image display | |
US20140362052A1 (en) | Touch Sensitive Image Display Devices | |
Hirsch et al. | BiDi screen: a thin, depth-sensing LCD for 3D interaction using light fields | |
US11330193B2 (en) | Imaging device based on lens assembly with embedded filter | |
EP2127367B1 (en) | Multimedia player displaying 2 projection images | |
JP6059223B2 (en) | Portable projection capture device | |
KR101861393B1 (en) | Integrated low power depth camera and projection device | |
US20150083917A1 (en) | Infrared light director for gesture or scene sensing fsc display | |
US20060103811A1 (en) | Image projection system and method | |
JP2007514241A (en) | Built-in interactive video display system | |
JP2007514242A (en) | Built-in interactive video display system | |
US8970693B1 (en) | Surface modeling with structured light | |
US20150084928A1 (en) | Touch-enabled field sequential color display using in-cell light sensors | |
US20150084994A1 (en) | Touch-enabled field-sequential color (fsc) display using a light guide with light turning features | |
US9454265B2 (en) | Integration of a light collection light-guide with a field sequential color display | |
KR102304308B1 (en) | Electronic system with gaze alignment mechanism and method of operation thereof | |
KR102166590B1 (en) | Multiple laser drive system | |
US20140247249A1 (en) | Touch Sensitive Display Devices | |
KR101507458B1 (en) | Interactive display | |
CN104714769B (en) | data processing method and electronic equipment | |
US20120062518A1 (en) | Touch Sensing Systems | |
JP6807286B2 (en) | Imaging device and imaging method | |
JP2011124678A (en) | Projector | |
CN117581149A (en) | Aerial suspension image display device | |
CN109413239B (en) | Electronic device, control method for electronic device, and control device for electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09803894 Country of ref document: EP Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 13130742 Country of ref document: US |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 09803894 Country of ref document: EP Kind code of ref document: A1 |