|Publication number||US6456262 B1|
|Application number||US 09/568,196|
|Publication date||24 Sep 2002|
|Filing date||9 May 2000|
|Priority date||9 May 2000|
|Inventors||Cynthia S. Bell|
|Original Assignee||Intel Corporation|
This invention relates generally to microdisplays.
A microdisplay is a small display that normally cannot be viewed unless the user's eye is proximate to the display. Thus, conventionally, a microdisplay is viewed with the user's eye constrained in an eyepiece for magnified viewing of the small display. The user may actually see a large virtual display, floating at the eye's natural resting focus distance of about one meter.
Microdisplays have many advantages because the display takes up relatively small space in an electronic device. Microdisplays may be utilized in electronic devices such as cameras, telephones, and the like. In addition, microdisplays may be included in head-mounted devices wherein the display is positioned in front of the user's eye.
Microdisplays may also offer low power consumption and the capability to display very high-resolution content, and they may cost less than traditional displays. Microdisplays are advantageous for portable information devices because of their low power consumption, portability, and pocketability.
Manually operated controls on the portable electronic device may be awkward to use in conjunction with the microdisplay. This is because the microdisplay needs to be held up to the user's eye. When the microdisplay is in position for viewing, it may make manual manipulations difficult to perform because the user's head and nose may be in the way.
Because existing microdisplays convey an impression of a flat, two-dimensional field, viewing such displays may be visually uninteresting and may result in fatigue.
Thus, there is a need for better ways to use devices with microdisplays.
FIG. 1 is a side elevational view of an electronic device in position in front of the user's eye in accordance with one embodiment of the present invention;
FIG. 2 is a schematic side elevational view of one embodiment of the optical system for the device shown in FIG. 1;
FIG. 3 is a top plan view of one embodiment of the optical system for the device shown in FIG. 1;
FIG. 4 is a depiction of software in one embodiment of the invention;
FIG. 5 is a flow chart for software useful in one embodiment of the invention;
FIG. 6 is a screen display that may be utilized with one embodiment of the present invention;
FIG. 7 is another screen display in accordance with an embodiment of the present invention;
FIG. 8 is still another screen display in accordance with one embodiment of the present invention;
FIG. 9 is another screen display in accordance with one embodiment of the present invention;
FIG. 10 is another screen display in accordance with one embodiment of the present invention;
FIG. 11 is a diagram showing how the microdisplay may display one element in clear focus and other elements deliberately blurred in accordance with one embodiment of the invention;
FIG. 12 is a schematic depiction of the user's eye relative to its plane of focus;
FIG. 13 is a flow chart for software in one embodiment of the present invention; and
FIG. 14 is a block diagram showing hardware for one embodiment of the present invention.
Referring to FIG. 1, in accordance with one embodiment of the present invention, an electronic device 10 may be fitted with a microdisplay that is viewable through an eyepiece 12 positioned proximate to the user's eye. With the user's head “H” arranged near the top of the electronic device 10, the user can view a microdisplay that is mounted within the electronic device 10.
The electronic device 10 is illustrated as being a cellular phone having an antenna 14. However, the present invention is applicable to a wide variety of portable electronic devices including cameras, personal digital assistants, appliances, games, and calculators, as just a few examples. In general, these devices may have an electronic microdisplay controlled by a processor-based system.
Microdisplays are well known and may use a wide variety of technologies. For example, liquid crystal over semiconductor (LCOS) technologies may be utilized to form a liquid crystal display of very small size directly on an integrated circuit. The integrated circuit may include the display as well as other components such as a processor, as one example.
An optical system for the device 10, shown in FIG. 2, may be utilized in one embodiment of the present invention. The user's eye, indicated at "E," is positioned in the eyepiece 12 where the user views the microdisplay 16. To enhance compactness, the microdisplay 16 may be viewed through a folded optical system, which may include a front lens 24 and a back lens 26. The front lens 24 may be useful for image element magnification and the back lens 26 may be useful for image element magnification and reflection.
In FIG. 3, the user's eye E is also positioned in the eyepiece 12 where infrared light may be reflected off the user's eye E and used to determine what the user is looking at. A pair of infrared eye illuminators 28 may be arranged to emit infrared light into the user's eye E. The infrared light reflections from the user's eye E may be reflected off a dichroic mirror 18, as shown in FIG. 2.
The dichroic mirror 18 is wavelength selective. It tends to pass the visible and reflect the infrared light portion of the spectrum according to the direction arrows shown in the diagram. The front lens 24 may also be useful for eye gaze detection.
An infrared imaging subsystem 20 may capture infrared light that is reflected off the user's eye E and the dichroic mirror 18. A capture lens 22 may be positioned between the infrared imaging subsystem 20 and the dichroic mirror 18. The eye image is recorded on the infrared imaging subsystem 20 through the capture lens 22. The infrared imaging subsystem 20 and infrared eye illuminators 28 may constitute a gaze data module 32.
Software 35, shown in FIG. 4, determines what the user is looking at and adjusts the display accordingly. The eye gaze module 34 receives gaze data indicating what the user is looking at from the imaging subsystem 20. In turn, the eye gaze software module 34 generates control signals for controlling the illuminators 28. The module 34 may also provide information to a gaze resolution module 36, which provides information to a depth calculation module 37.
The eye gaze module 34, which may process images reflected from the eye's surfaces, known as the Purkinje images, may spatially resolve the user's gaze direction from the points receiving the strongest infrared reflections. Depending on the amount of eye relief and the size of the exit pupil, the eye gaze module 34 may also detect the sclera-iris boundary.
The gaze resolution module 36 may determine if the spatial position of the user's gaze developed by the module 34 is coincident with any user selectable activation point displayed on the microdisplay 16. If the user's gaze coincides with a user selectable activation point, the depth calculation module 37 may determine an altered depth of field based upon that information.
Timing information 38 may be utilized by the gaze resolution module 36 to determine if the user has gazed upon an activation point on the microdisplay 16 long enough to indicate a voluntary activation request. A display controller (not shown in FIG. 4) may provide display information 33 to the gaze resolution module 36 about the x y coordinates for image elements displayed on the microdisplay 16. Moreover, the depth calculation module 37 may generate display control signals 39 to adjust the altered depth of field on the microdisplay 16 based on the user's gaze.
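The coincidence test performed by the gaze resolution module 36 can be illustrated with a short sketch. The following Python fragment is a hypothetical rendering, not the patent's implementation: it hit-tests a resolved gaze position against the x y coordinates of displayed activation points and applies a dwell-time threshold. The names, the radius field, and the 0.3-second threshold are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class ActivationPoint:
    name: str
    x: float        # display x coordinate (pixels)
    y: float        # display y coordinate (pixels)
    radius: float   # hit-test radius around the point (pixels)

def resolve_gaze(gaze_x, gaze_y, points, dwell_s, threshold_s=0.3):
    """Return the activation point the user has gazed at long enough
    to indicate a voluntary activation request, or None."""
    if dwell_s < threshold_s:
        return None  # too brief: likely an involuntary saccade
    for p in points:
        if (gaze_x - p.x) ** 2 + (gaze_y - p.y) ** 2 <= p.radius ** 2:
            return p
    return None
```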
Techniques for tracking the various features of the eye, including the sclera-iris boundary, the iris-pupil boundary, and the infrared reflections from the cornea and lens surfaces commonly known as the Purkinje images, are all well known. These features may be tracked to determine what the user is actually looking at. In particular, the Purkinje image element measurements are well suited to the type of eye gaze measurement possible with microdisplays.
A number of techniques are known for coordinating what the user is looking at with what is displayed in front of the user. For example, in U.S. Pat. No. 5,857,120, the x y coordinates of a watching point are determined. The accumulated stagnant time that the viewer gazed at each watching point is also determined. The eye movement trajectory between fixation periods is tracked. Likewise, the frequency of gazing at various watching points may be evaluated. A weighting may be developed to give emphasis to the points watched closest to the time of a pertinent activity, such as a picture capture request. A correlation factor may be developed between subsequent watching points.
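A hedged sketch of that weighting idea follows; the exponential recency emphasis and the tuple layout are illustrative assumptions, not the cited patent's method.

```python
import math

def weight_watching_points(fixations, activity_time_s, tau_s=1.0):
    """Weight each watching point by its accumulated dwell time,
    emphasizing fixations closest in time to a pertinent activity
    such as a picture capture request.

    fixations: list of (x, y, dwell_s, end_time_s) tuples.
    """
    weighted = []
    for x, y, dwell_s, t_s in fixations:
        recency = math.exp(-abs(activity_time_s - t_s) / tau_s)
        weighted.append((x, y, dwell_s * recency))
    return weighted
```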
Software 40, shown in FIG. 5, may be utilized to control what is displayed on the microdisplay 16 in response to what the user is looking at. The software 40 begins by activating the microdisplay 16 and initializing eye tracking, as indicated in block 42. The illuminators 28 are enabled to detect and process the eye reflection image, as indicated in block 44. The sclera-iris boundary may be detected by the imaging subsystem 20 in one embodiment of the present invention, as indicated in block 46. Additionally, the first Purkinje image x y positions are sent to the eye gaze module 34, as also indicated in block 46.
Thereafter, as shown in block 48, the illuminator 28 and imaging subsystem 20 may be disabled. The gaze resolution module 36 then calculates the eye axis of view or gaze, as indicated in block 50. If the eye axis of view intersects with a known activation point being displayed, as shown in diamond 52, it is assumed that the user may be indicating that he or she wishes to select the gazed upon activation point. A time threshold may be useful to separate voluntary eye fixation periods from involuntary eye saccades, in which the eye rapidly traverses the visual field. However, the threshold time may be short enough to feel natural to the user.
A check in diamond 54 determines whether this is the first time that the user has gazed at the activation point. If it is, and the user has gazed at the activation point for more than a threshold period, the appearance of the activation point is adjusted in a first way to give the user feedback that his or her gaze is being assessed, as indicated in block 60.
After the user has gazed at the activation point a given number of times, such as two, and the minimum threshold time per gaze is met, as determined in diamond 62, the displayed content is changed as indicated in block 64.
If the user's gaze has not met the first or second voluntary gaze duration time threshold, the system waits for another opportunity to check the gaze position, as indicated in block 58. Similarly, if the user's eye gaze did not intersect with a known activation point (diamond 52) the system goes back into a waiting mode, as indicated in block 56.
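The flow of blocks 42 through 64 can be condensed into a small state machine. The sketch below reuses the hypothetical ActivationPoint/resolve_gaze() helpers above and is only an approximation of FIG. 5; the two-gaze activation count mirrors the example in the text.

```python
def process_gaze(point, dwell_s, gaze_counts, threshold_s=0.3,
                 required_gazes=2):
    """One polling pass of the FIG. 5 logic (approximate sketch).

    point       -- activation point hit by the gaze axis, or None
    dwell_s     -- duration of the current fixation
    gaze_counts -- dict of qualifying-gaze counts per point name
    Returns "wait", "highlight", or "activate".
    """
    if point is None or dwell_s < threshold_s:
        return "wait"                      # blocks 56/58: keep polling
    gaze_counts[point.name] = gaze_counts.get(point.name, 0) + 1
    if gaze_counts[point.name] == 1:
        return "highlight"                 # block 60: first-gaze feedback
    if gaze_counts[point.name] >= required_gazes:
        gaze_counts[point.name] = 0
        return "activate"                  # block 64: change displayed content
    return "wait"
```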
In one embodiment of the invention, the display alterations may be utilized as an indication to the user that a highlighted activation point has been selected. For example, the selected activation point may be brought into clear focus, whereas other image elements on the same display 16 are blurred. In effect, the gazed upon activation point has the depth of field centered around it, whereas the other displayed image elements fall outside the depth of field and are out of focus. That is, the image element the user is interested in automatically defines the depth of field of the display 16, and the other image elements are thereby defined as being in front of or in back of the current depth of field. As a result, when the user is viewing the microdisplay 16, the user may be given a three-dimensional viewing effect. Thus, the selected activation point is identified as being clearly in focus compared to other items.
The assignment of the depth of field, which may be centered around various displayed image elements, can be accomplished in a number of ways. First, each of the displayed image elements may have pre-stored x y coordinates. This is particularly viable in cases when the image element is generated by the display's own processor-based system. In such cases, the programmer may assign x y coordinates to each of the image elements. Thus, the display 16 may assume one depth of field before any image element is selected and another depth of field thereafter. In effect, some image elements may be blurred while the image element gazed upon is enlarged and placed in focus.
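As a minimal sketch of that first approach, pre-stored coordinates and an assigned depth per image element might look like the following; the icon names and z values are hypothetical.

```python
# Programmer-assigned x y coordinates and z depths for generated
# image elements; selecting an element recenters the depth of field
# on its z value, leaving the others in front of or behind it.
ICONS = {
    "call":   {"x": 60, "y": 200, "z": 1.0},  # foreground, largest
    "record": {"x": 60, "y": 140, "z": 2.0},
    "time":   {"x": 60, "y":  90, "z": 3.0},
}

def focus_depth(selected_icon):
    """The gazed-upon element defines the center of the depth of field."""
    return ICONS[selected_icon]["z"]
```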
Second, in other embodiments, depth of field information may be conveyed with information that is received from a video source. For example, depth information is encoded for each object captured by cameras made by 3DV Systems, Ltd. (Yokneam 20692, Israel). Those cameras produce a video stream for television shows and movies. This real time 3D camera employs a scanning laser beam and a time-of-flight detection system to capture the third dimension.
The 3DV Systems' device captures video with a camera having x, y and z information. To use natural video context, the depth of field (z information) introduced by the camera's taking lens may be taken into account. This can be managed by recording a scene multiple times or simultaneously with multiple cameras. In each recording, the taking lens is focused at a different subject distance so that in focus image element slices of the scene are available, each slice at a different focal distance. When the user gazes upon a specific image element, the recording with that image element in best focus is retrieved and displayed.
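A sketch of that slice selection, under the assumption that the recordings are indexed by the taking lens's focal distance:

```python
def select_best_focus_slice(slices, gazed_z):
    """Return the recording whose taking-lens focus distance is
    closest to the gazed-upon image element's z distance.

    slices: dict mapping focal distance (meters) to the frame
    recorded with the lens focused at that distance.
    """
    best = min(slices, key=lambda focal: abs(focal - gazed_z))
    return slices[best]
```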
In sum, once it is known what the user is gazing at, and the depth of field information has been obtained, the item gazed at can be automatically placed in the center of the depth of field and other objects may be placed out of the depth of field and thereby automatically blurred.
Alternatively, image elements that are gazed upon may be enlarged relative to other image elements. In some embodiments, gazed upon image elements may be both more focused and enlarged relative to image elements that are not gazed upon.
Referring to FIG. 6, an exemplary microdisplay screen 68 is shown, which may include a plurality of graphical user interface icons 70, 72, 74 and 76. In the embodiment in which the electronic device 10 is a cellular telephone, the icons 70, 72, 74 may be provided to enable the user to initiate a call, to hang up a call, or to select a menu of items. Icons 74 for the most important items may be shown in the foreground of the display, in larger size. Icons 72 and 70 for items that amount to submenu possibilities may be aligned above a particular icon 74 and shown in progressively smaller size. For example, the call icon 74 is in the foreground and enlarged whereas the record 72 and time 70 icons are aligned above the call icon 74 and are progressively smaller in size.
In FIG. 6, the user has gazed on the icon 76 that has been highlighted (indicated by the double line) in response to the user's gaze. As a result of the user's selection of the icon 76, a new display screen 68 a may be viewed, as indicated in FIG. 7. In this case, the microdisplay screen 68 a has been altered so that the icon 76 a has been enlarged and has been brought into clear focus. In addition, the particular telephone number which the user desires to store, is displayed in the enlarged icon 76 a. Conversely, all the other displayed items have been automatically blurred and have been taken out of focus relative to the icon 76 a, as indicated by their reduced size and by dashed lines in FIG. 7. Thus, in relation to the selected image element, the non-selected image elements may be reduced in size and may be physically blurred.
Objects 78 and 80 are viewed by a camera C as shown in FIG. 8. The distance of each object 78 and 80 from the camera C is designated as distances “Q” and “R”, respectively.
A display screen 77, shown in FIG. 9, displays the image elements 78 a and 80 a of the objects 78 and 80 captured by the camera C. If the user gazes on the image element 78 a, as displayed on the microdisplay 16, the depth of field of the screen 77 may automatically shift to center around image element 78 a. As a result, image element 78 a may be clear and in focus, as indicated by solid lines. Conversely, the image element 80 a may be depicted as extending beyond the far boundary of the depth of field. As a result, the image element 80 a may be blurred and presented as out of focus, as indicated by the dashed lines in FIG. 9.
A display screen 79, shown in FIG. 10, may result when the user gazes on the image element 80 b instead of the image element 78 a. In this case, the depth of field may be centered around the image element 80 b, bringing it into clear focus (as indicated by solid lines). Because image element 78 b may be depicted as being outside of the near boundary of the depth of field, it may be blurred and appear out of focus (as indicated by dashed lines).
In sum, the user may view a two-dimensional image element of a three-dimensional scene. The depth of field of the scene, and hence the in-focus image element, may be determined by the image element the user gazes upon.
The blurring operation, illustrated in FIG. 11, may be implemented by the depth calculation module 37. The user's eye E is at a distance "s," measured from the entrance pupil of the eye's lens, to the "best plane of focus." The distance "L2" is the distance to the far boundary of what is perceived to be sharp. The distance "L1" is the distance to the near boundary of what is perceived to be sharp. The distance "d" is the diameter of the lens aperture (its focal length divided by the F number). The letter "c" indicates the circle of confusion in the object plane. The circle of confusion is the size of a circle that is indistinguishable from a true point by the human eye. Typically, it is in the range of 1.5 to 3.4 minutes of arc, depending on the viewing conditions and observer acuity.
From simple geometry then L2 is equal to sd/(d−c) and L1 is equal to sd/(d+c). Any pixels in the image element with a z value distance falling between L1 and L2 need not be modified as they would be perceived as being sharp by someone viewing the scene in person. Pixels with z values in the front or back of the desired focus plane may need to be blurred based on their distance from the plane of focus.
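A worked example of this geometry, as a sketch: the helper converts the cited angular circle of confusion into an object-plane size at distance s, then computes the near and far sharpness boundaries. The units and the 2 arc-minute choice are assumptions within the stated 1.5 to 3.4 range.

```python
import math

def confusion_circle(s, arc_minutes=2.0):
    """Object-plane circle of confusion at distance s for a given
    angular acuity limit (the text cites 1.5 to 3.4 arc minutes)."""
    return s * math.tan(math.radians(arc_minutes / 60.0))

def dof_boundaries(s, d, c):
    """Near and far limits of perceived sharpness from the geometry
    above: L1 = s*d/(d + c), L2 = s*d/(d - c). Requires d > c."""
    return s * d / (d + c), s * d / (d - c)

# Example: eye focused at s = 1 m through a d = 3 mm pupil.
s, d = 1.0, 0.003
c = confusion_circle(s)            # about 0.58 mm
L1, L2 = dof_boundaries(s, d, c)   # roughly 0.84 m and 1.24 m
# Pixels with z between L1 and L2 stay unmodified; the rest are blurred.
```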
The amount of blur can be calculated by geometry. Any point appears to the eye spread to the diameter in which it intercepts the plane of best focus. A point at a distance L3 appears with the blur circle diameter c′ as shown at the plane of best focus in FIG. 12. Thus, a digital image element blur filter may be applied to the pixels at plane L3 to render them for proper appearance. Any conventional blur filter function may be used here such as a neighborhood averaging, low pass filtering or the like.
The radius of pixels to be used in an averaging function can be readily correlated to the blur transform of the original dot or line pair. Thus, blur functions can be designed with calibrated impact. As a result, a lookup table may be utilized to expedite the processing. In summary, a pixel's z distance from the focus plane can be used as an index into a lookup table to access the blur function control parameter needed to achieve the desired blur appearance.
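The lookup-table approach might be sketched as follows. The blur-circle formula c′ = d·|z − s|/z follows from the same similar-triangles geometry as FIG. 12, while the pixel-radius scaling is an assumption; the patent leaves the exact blur transform to the designer.

```python
def build_blur_lut(s, d, c, z_values):
    """Map each quantized z distance to a blur radius (in pixels)."""
    near, far = s * d / (d + c), s * d / (d - c)
    lut = {}
    for z in z_values:
        if near <= z <= far:
            lut[z] = 0                       # inside the depth of field: sharp
        else:
            c_prime = d * abs(z - s) / z     # blur circle at the best-focus plane
            lut[z] = max(1, round(c_prime / c))  # assumed pixel scaling
    return lut

def blur_pixel(image, x, y, radius):
    """Neighborhood averaging over a (2*radius + 1)^2 window."""
    rows, cols = len(image), len(image[0])
    vals = [image[j][i]
            for j in range(max(0, y - radius), min(rows, y + radius + 1))
            for i in range(max(0, x - radius), min(cols, x + radius + 1))]
    return sum(vals) / len(vals)
```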
Referring to FIG. 13, software 82 may be utilized to accomplish the adjustment in the depth of field, according to one embodiment of the present invention. Initially, the image element that is the object of the user's gaze is identified using the eye gaze module 34 and gaze resolution module 36 as indicated in block 84. The relative depth information is acquired (block 86). The subject image element is brought into clear focus, as indicated in block 88, and may be enlarged. The other image elements are subjected to digital processing, using software 114, to blur them as described above and as indicated in block 90.
Referring to FIG. 14, in one embodiment of the present invention, the electronic device 10 is a cellular telephone. The device 10 includes a bus 92 that couples a processor 94 and a system memory 96. In one embodiment, the processor 94 may execute the software modules 34, 36 and 37 of FIG. 4. An image element processing device 98 may likewise be coupled to the bus 92 together with a radio frequency (RF) processing chip device 100. The device 98 may be used for image element processing tasks including implementing the blur function.
The infrared eye illuminators 28 and infrared imaging subsystem 20 may also be coupled to the bus 92. The display controller 110 may be coupled to the microdisplay 16 and the bus 92. A storage device 112, coupled to the bus 92, may be a semiconductor storage such as a flash memory. It may store the software 40 and 82 in accordance with one embodiment of the present invention. In addition, the storage device 112 may store software 114 for implementing the blur function and the blur function lookup table. A keypad 116 for manual entry may be provided as well. Other components such as microphones, and the like are omitted for clarity purposes.
While the present invention has been described with respect to a limited number of embodiments, those skilled in the art will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations as fall within the true spirit and scope of this present invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4028725 *||21 Apr 1976||7 Jun 1977||Grumman Aerospace Corporation||High-resolution vision system|
|US4479784 *||3 Mar 1981||30 Oct 1984||The Singer Company||Eye line-of-sight responsive wide angle visual system|
|US4513317 *||28 Sep 1982||23 Apr 1985||The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration||Retinally stabilized differential resolution television display|
|US5731805 *||25 Jun 1996||24 Mar 1998||Sun Microsystems, Inc.||Method and apparatus for eyetrack-driven text enlargement|
|US5857120||25 Jan 1995||5 Jan 1999||Canon Kabushiki Kaisha||Eye axis detector and camera with eye axis detector|
|US5892624||2 Mar 1998||6 Apr 1999||Siliscape||Compact display system with two stage magnification and immersed beam splitter|
|US5905525 *||11 Jul 1996||18 May 1999||Minolta Co., Ltd.||Image display apparatus having a display controlled by user's head movement|
|US5912721 *||12 Mar 1997||15 Jun 1999||Kabushiki Kaisha Toshiba||Gaze detection apparatus and its method as well as information display apparatus|
|US6175352 *||26 Jun 1997||16 Jan 2001||Sharp Kabushiki Kaisha||Address generator display and spatial light modulator|
|US6315273 *||16 May 2000||13 Nov 2001||Lionel Davies||Gas and liquid contact apparatus: illuminated|
|1||"3-D Displays", Heinrich-Hertz Institut, pp. 1-3, Feb. 5, 2000.|
|2||"ZCAM(TM) Depth Camera Enables a Host of Unique Applications Previously Unavailabe to Production Professional", 3DV Systema Ltd., pp. 1-3, Feb. 5, 2000.|
|3||"ZCAM™ Depth Camera Enables a Host of Unique Applications Previously Unavailabe to Production Professional", 3DV Systema Ltd., pp. 1-3, Feb. 5, 2000.|
|4||Kay Talmi, Jin Liu, "Eye and Gaze Tracking for Visually Controlled Interactive Stereoscopic Displays", Heinrich-Hertz Institut, p. 1-5, Feb. 5, 2000.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7091867||20 Feb 2004||15 Aug 2006||Agilent Technologies, Inc.||Wavelength selectivity enabling subject monitoring outside the subject's field of view|
|US7113170 *||8 Nov 2002||26 Sep 2006||Swisscom Mobile Ag||Method and terminal for entering instructions|
|US7245273 *||30 Jan 2002||17 Jul 2007||David Parker Dickerson||Interactive data view and command system|
|US7306337||26 Feb 2004||11 Dec 2007||Rensselaer Polytechnic Institute||Calibration-free gaze tracking under natural head movement|
|US7639208 *||13 May 2005||29 Dec 2009||University Of Central Florida Research Foundation, Inc.||Compact optical see-through head-mounted display with occlusion support|
|US7762665||21 Mar 2003||27 Jul 2010||Queen's University At Kingston||Method and apparatus for communication between humans and devices|
|US7857700||12 Sep 2003||28 Dec 2010||Igt||Three-dimensional autostereoscopic image display for a gaming apparatus|
|US7872635||14 May 2004||18 Jan 2011||Optimetrics, Inc.||Foveated display eye-tracking system and method|
|US7878910 *||13 Sep 2005||1 Feb 2011||Igt||Gaming machine with scanning 3-D display system|
|US7928926 *||14 Jun 2007||19 Apr 2011||Panasonic Corporation||Display apparatus and method for hands free operation that selects a function when window is within field of view|
|US7948451 *||17 Jun 2005||24 May 2011||Totalförsvarets Forskningsinstitut||Interactive method of presenting information in an image|
|US7969383||9 Apr 2007||28 Jun 2011||Metaio Gmbh||Interactive data view and command system|
|US8025404 *||30 Sep 2009||27 Sep 2011||Eyetect, L.L.C.||Method and apparatus for monitoring eye tremor|
|US8032842||25 Jul 2007||4 Oct 2011||Korea Institute Of Science & Technology||System and method for three-dimensional interaction based on gaze and system and method for tracking three-dimensional gaze|
|US8155479||28 Mar 2008||10 Apr 2012||Intuitive Surgical Operations Inc.||Automated panning and digital zooming for robotic surgical systems|
|US8292433||19 Sep 2005||23 Oct 2012||Queen's University At Kingston||Method and apparatus for communication between humans and devices|
|US8322856||27 Jun 2012||4 Dec 2012||Queen's University At Kingston||Method and apparatus for communication between humans and devices|
|US8487838||30 Aug 2011||16 Jul 2013||John R. Lewis||Gaze detection in a see-through, near-eye, mixed reality display|
|US8500282||22 Sep 2011||6 Aug 2013||Boston Brainstem, Inc.||Method and apparatus for monitoring eye tremor|
|US8593375 *||27 May 2011||26 Nov 2013||Gregory A Maltz||Eye gaze user interface and method|
|US8614673||30 May 2012||24 Dec 2013||May Patents Ltd.||System and method for control based on face or hand gesture detection|
|US8614674||18 Jun 2012||24 Dec 2013||May Patents Ltd.||System and method for control based on face or hand gesture detection|
|US8672482||19 Apr 2013||18 Mar 2014||Queen's University At Kingston||Method and apparatus for communication between humans and devices|
|US8686923||17 May 2011||1 Apr 2014||Metaio Gmbh||Interactive data view and command system|
|US8808164||28 Mar 2008||19 Aug 2014||Intuitive Surgical Operations, Inc.||Controlling a robotic surgical tool with a display monitor|
|US8810401||16 Mar 2009||19 Aug 2014||Nokia Corporation||Data processing apparatus and associated user interfaces and methods|
|US8872762||8 Dec 2011||28 Oct 2014||Primesense Ltd.||Three dimensional user interface cursor control|
|US8881051||5 Jul 2012||4 Nov 2014||Primesense Ltd||Zoom-based gesture user interface|
|US8928558||12 Jul 2013||6 Jan 2015||Microsoft Corporation||Gaze detection in a see-through, near-eye, mixed reality display|
|US8933876||8 Dec 2011||13 Jan 2015||Apple Inc.||Three dimensional user interface session control|
|US8959013||25 Sep 2011||17 Feb 2015||Apple Inc.||Virtual keyboard for a non-tactile three dimensional user interface|
|US8988519 *||20 Mar 2012||24 Mar 2015||Cisco Technology, Inc.||Automatic magnification of data on display screen based on eye characteristics of user|
|US8998414||26 Sep 2011||7 Apr 2015||Microsoft Technology Licensing, Llc||Integrated eye tracking and display system|
|US9030498||14 Aug 2012||12 May 2015||Apple Inc.||Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface|
|US9110504||15 Mar 2013||18 Aug 2015||Microsoft Technology Licensing, Llc||Gaze detection in a see-through, near-eye, mixed reality display|
|US9122311||23 Aug 2012||1 Sep 2015||Apple Inc.||Visual feedback for tactile and non-tactile user interfaces|
|US9153195||30 Jan 2012||6 Oct 2015||Microsoft Technology Licensing, Llc||Providing contextual personal information by a mixed reality device|
|US9158375||23 Dec 2012||13 Oct 2015||Apple Inc.||Interactive reality augmentation for natural interaction|
|US20020101568 *||30 Jan 2002||1 Aug 2002||Eberl Heinrich A.||Interactive data view and command system|
|US20040125283 *||30 Dec 2002||1 Jul 2004||Samson Huang||LCOS imaging device|
|US20040174496 *||26 Feb 2004||9 Sep 2004||Qiang Ji||Calibration-free gaze tracking under natural head movement|
|US20040179155 *||24 Mar 2004||16 Sep 2004||Samson Huang||LCOS imaging device|
|US20040183749 *||21 Mar 2003||23 Sep 2004||Roel Vertegaal||Method and apparatus for communication between humans and devices|
|US20040196399 *||1 Apr 2003||7 Oct 2004||Stavely Donald J.||Device incorporating retina tracking|
|US20040212626 *||27 Jun 2002||28 Oct 2004||Urban Lyxzen||System and a method for user interaction|
|US20050047629 *||25 Aug 2003||3 Mar 2005||International Business Machines Corporation||System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking|
|US20050185243 *||20 Feb 2004||25 Aug 2005||Wenstrand John S.||Wavelength selectivity enabling subject monitoring outside the subject's field of view|
|US20050243054 *||29 Apr 2004||3 Nov 2005||International Business Machines Corporation||System and method for selecting and activating a target object using a combination of eye gaze and key presses|
|US20060093998 *||19 Sep 2005||4 May 2006||Roel Vertegaal||Method and apparatus for communication between humans and devices|
|US20070132663 *||11 Dec 2006||14 Jun 2007||Olympus Corporation||Information display system|
|US20070136064 *||8 Dec 2006||14 Jun 2007||Carroll David W||Mobile personal computer with movement sensor|
|US20070296646 *||14 Jun 2007||27 Dec 2007||Kakuya Yamamoto||Display apparatus and control method thereof|
|US20080024392 *||17 Jun 2005||31 Jan 2008||Torbjorn Gustafsson||Interactive Method of Presenting Information in an Image|
|US20080157946 *||9 Apr 2007||3 Jul 2008||David Parker Dickerson||Interactive data view and command system|
|US20080181452 *||25 Jul 2007||31 Jul 2008||Yong-Moo Kwon||System and method for Three-dimensional interaction based on gaze and system and method for tracking Three-dimensional gaze|
|US20090113297 *||25 Oct 2007||30 Apr 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Requesting a second content based on a user's reaction to a first content|
|US20090113298 *||24 Oct 2007||30 Apr 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Method of selecting a second content based on a user's reaction to a first content|
|US20090245600 *||28 Mar 2008||1 Oct 2009||Intuitive Surgical, Inc.||Automated panning and digital zooming for robotic surgical systems|
|US20090248036 *||28 Mar 2008||1 Oct 2009||Intuitive Surgical, Inc.||Controlling a robotic surgical tool with a display monitor|
|US20090315869 *||24 Dec 2009||Olympus Corporation||Digital photo frame, information processing system, and control method|
|US20100007601 *||10 Jul 2007||14 Jan 2010||Koninklijke Philips Electronics N.V.||Gaze interaction for information display of gazed items|
|US20100049075 *||30 Sep 2009||25 Feb 2010||Ciaran Bolger||Method and apparatus for monitoring eye tremor|
|US20100182232 *||22 Jan 2009||22 Jul 2010||Alcatel-Lucent Usa Inc.||Electronic Data Input System|
|US20120019662 *||26 Jan 2012||Telepatheye, Inc.||Eye gaze user interface and method|
|US20120173999 *||13 Sep 2010||5 Jul 2012||Paolo Invernizzi||Method and apparatus for using generic software applications by means of ocular control and suitable methods of interaction|
|US20120278745 *||20 Apr 2012||1 Nov 2012||Samsung Electronics Co., Ltd.||Method and apparatus for arranging icon in touch screen terminal|
|US20130250086 *||20 Mar 2012||26 Sep 2013||Cisco Technology, Inc.||Automatic magnification of data on display screen based on eye characteristics of user|
|US20130321265 *||7 Aug 2013||5 Dec 2013||Primesense Ltd.||Gaze-Based Display Control|
|US20130321270 *||6 Aug 2013||5 Dec 2013||Tobii Technology Ab||Arrangement, method and computer program for controlling a computer apparatus based on eye-tracking|
|US20130326431 *||6 Aug 2013||5 Dec 2013||Tobii Technology Ab||Arrangement, method and computer program for controlling a computer apparatus based on eye-tracking|
|US20140049452 *||29 Oct 2013||20 Feb 2014||Telepatheye, Inc.||Eye gaze user interface and calibration method|
|US20140118243 *||26 Feb 2013||1 May 2014||University Of Seoul Industry Cooperation Foundation||Display section determination|
|US20140198033 *||13 Jan 2014||17 Jul 2014||Seiko Epson Corporation||Head-mounted display device, control method for head-mounted display device, and image display system|
|US20150049012 *||23 Jan 2014||19 Feb 2015||Qualcomm Incorporated||Visual, audible, and/or haptic feedback for optical see-through head mounted display with user interaction tracking|
|US20150185832 *||30 Dec 2013||2 Jul 2015||Lenovo (Singapore) Pte, Ltd.||Display alignment based on eye tracking|
|DE102013226973A1 *||20 Dec 2013||25 Jun 2015||Siemens Aktiengesellschaft||Method and device for the simultaneous display of a medical image and/or a graphical operating element|
|EP2325722A1||22 Mar 2004||25 May 2011||Queen's University At Kingston||Method and apparatus for communication between humans and devices|
|WO2007026368A2 *||3 Sep 2006||8 Mar 2007||Arthur Rabner||Multi-functional optometric - ophthalmic system for testing, diagnosing, or treating, vision or eyes of a subject, and methodologies thereof|
|WO2012107892A2 *||9 Feb 2012||16 Aug 2012||Primesense Ltd.||Gaze detection in a 3d mapping environment|
|WO2013036237A1 *||8 Sep 2011||14 Mar 2013||Intel Corporation||Eye gaze based location selection for audio visual playback|
|WO2013170073A1 *||9 May 2013||14 Nov 2013||Nokia Corporation||Method and apparatus for determining representations of displayed information based on focus distance|
|U.S. Classification||345/8, 345/472, 348/61, 351/209, 348/63, 351/210|
|International Classification||A61B3/113, G06F3/033, G06F3/048, G06T1/00|
|Cooperative Classification||G02B2027/0127, G06K9/00604, A61B3/113, G02B2027/014, H04N9/3173, G02B2027/0138, G02B2027/0187, G02B27/017|
|European Classification||H04N9/31R9, A61B3/113, G02B27/01C|
|9 May 2000||AS||Assignment|
|17 Mar 2006||FPAY||Fee payment|
Year of fee payment: 4
|3 May 2010||REMI||Maintenance fee reminder mailed|
|24 Sep 2010||LAPS||Lapse for failure to pay maintenance fees|
|16 Nov 2010||FP||Expired due to failure to pay maintenance fee|
Effective date: 20100924