WO1986001963A1 - Improvements relating to an apparatus and method related to control through eye gaze direction - Google Patents
- Publication number
- WO1986001963A1 WO1986001963A1 PCT/AU1985/000232 AU8500232W WO8601963A1 WO 1986001963 A1 WO1986001963 A1 WO 1986001963A1 AU 8500232 W AU8500232 W AU 8500232W WO 8601963 A1 WO8601963 A1 WO 8601963A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- eye
- calibration
- viewing
- eye gaze
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- Ophthalmology & Optometry (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
- Eye Examination Apparatus (AREA)
Abstract
An eye gaze direction control arrangement using a light source (4) and a detector (5) giving the direction of eye gaze by detecting the position on a user's eye (3) of the reflection of the light source, using such information to locate a cursor position on a video screen (2), and providing for a recalibration initiation achievable by a user to ensure consistent alignment of such a cursor with actual viewing direction. The calibration may be initiated by the user (1) providing multiple interruptions of the incident light within a predetermined time period (e.g. eye closures). The user (1) is then required to attempt to direct his gaze at various predetermined locations (7, 8, 9, 10) from which an average provides the calibration.
Description
IMPROVEMENTS RELATING TO AN APPARATUS AND METHOD RELATED TO CONTROL THROUGH EYE GAZE DIRECTION
This invention relates to a method and apparatus associated with control through direction of eye gaze.
The use of the direction in which a person is looking to effect control of some operation, while not new in itself, has hitherto incurred some difficulties.
In order to ensure that there is an adequate correlation between the direction that a person is looking and a target location, it has been found that an act effecting a calibration provides significant advantage.
As a major application of this method and apparatus will be to assist handicapped persons, who may have difficulty keeping the head still or even activating a controlling switch through limb pressure, the problem this invention addresses is to provide reliable means by which reasonable alignment can be maintained.
The problem is to interpret whether a user has a temporary difficulty, such as for instance a tremor, or whether a major shift in relative positions has occurred so that the original alignment characteristics no longer apply.
An object of this invention accordingly is to provide means which will maintain through even difficult situations, good correspondence between a viewing direction of a user and a screen based cursor.
According to a preferred arrangement, there is a source of radiant energy providing a beam of light which, when directed into a person's eye, will result in a reflection, and the position of this reflection with respect to the eye is responsive to the direction of gaze of that person.
If a beam of radiant energy is shone into a person's eye, it is possible to identify the reflection of that radiant energy from an appropriate portion of the eye and, by knowing the position and direction of the beam of radiant energy and determining the location of intersection onto a target which has a known relationship with respect to the source of the beam, to make some assumptions as to the gaze direction of the eye.
Unfortunately, however, unless a person's head is fully restrained so as to maintain a certain selected eye position, and unless the eye itself had identical characteristics with respect to different people, any assumptions made with respect to the eye direction have to be very approximate indeed.
The purpose of this invention, according to its broadest concept, is to provide a technique for effecting a calibration procedure so as to provide, for any proposed user and system, either a higher accuracy than has hitherto been available or a method of achieving such accuracy which has advantage over any system that has hitherto been used.
In the application of the invention, especially in the preferred embodiment, the techniques involve both computers, hardware items of appropriate type, and software control of the hardware items.
In broader references to the invention it is accordingly understood that such techniques can be achieved either by software controlling hardware, or by hardware in which the techniques are integral with respect to the hardware items.
Accordingly the invention can be said to reside in an eye gaze direction control arrangement including -
(1) a light source,
(2) a detector adapted to detect the position of the reflection of the light source on the user's eye,
(3) and means responsive to such detected position and adapted to control the position of a cursor on a screen and maintain this in an approximate alignment with a viewing direction of the user, said means being further adapted to effect a recalibration attempt sequence for calibration of the cursor viewing direction alignment upon an activation signal being effected by the user.
It has been found, according to a preferred technique, that such an activation signal as can be effected by the user can include interrupting the reflected light pathway a selected number of times within a selected time period.
This can simply mean closing the eyelids twice within a two-second period, which would not be a normal closure and opening rate, but is something that a user can easily and quickly achieve even in the most extreme circumstances.
A turning away of the eye so that reflection communication is entirely lost, can also be used to achieve this result.
For more able users, any limb movement can be used either to intersect a communicating beam or in another instance such movement can be used to activate a switch which can be interrogated from time to time in any controlling programme.
When such a recalibration attempt sequence has been activated, a problem is that a user may not momentarily be looking at the first calibration viewing target location.
Assistance can be achieved by using a second calibration viewing target location and then comparing the correction factors that need to be applied in the case of one location as compared with the second and, if there is not a sufficiently close correlation, proceeding to a further calibration viewing target location and comparing its correction factor again with previous results.
Accordingly it may be preferred that recalibration is effected by causing a first calibration viewing target location to be noticeable by a user, and causing information to be stored identifying the angular direction then detected of a reflection of the radiant beam from the eye when it is expected to be viewing the calibration viewing target location; then causing a second calibration viewing target location to be noticeable by the user, at a position different from the first said calibration position and such as to cause the eye of a user to assume a different angular relationship with respect to the detection equipment, and causing information identifying such second angular position to be stored; and then effecting a correction factor to be used for maintaining alignment of the position of the cursor on the screen with a viewing direction of the user.
There is advantage if the calibration viewing targets are located in an axial alignment, such as along a horizontal axis, or in a further case along a vertical axis.
In this way it is easier to compare results, in which an x, y co-ordinate location should, with a further calibration viewing target location on a common horizontal axis, have a very similar y co-ordinate reading, and likewise with locations on a vertical axis, in which the co-ordinates will have a common or very close x co-ordinate reading.
Some people have difficulty maintaining a consistent eye direction. This can be because of an essential instability in the eye, or it may be a body tremor, or it may be because of difficulties resulting from a disease such as Parkinson's disease.
While an accurate correlation between eye direction and cursor position on a screen is desirable, a greater tolerance of position can be accommodated with the apparatus still being useful. Accordingly, for each recalibration process it is proposed that the said means are adapted to compare the correction factor calculated for the first of the target locations with a correction factor calculated for the second target location and, if the difference between the two is larger than a selected allowable difference, provide for a further target location to be offered to the user and continue the calculation of correction factor and comparison until at least two target locations provide a correction factor within the selected range.
However, if the number of target locations needing to be offered exceeds a selected number, the range of acceptable difference in correction factor will be enlarged and the offering of target locations continued.
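The comparison-and-enlargement sequence described above can be sketched as follows; the function names, the numeric defaults and the enlargement factor are illustrative assumptions, not values taken from the specification:

```python
def recalibrate(measure_offset, targets, tolerance=5.0,
                max_targets_per_pass=4, enlarge_factor=2.0):
    """Offer calibration targets until two consecutive correction
    factors agree within `tolerance`; after each group of
    `max_targets_per_pass` targets, enlarge the tolerance and
    continue, so that some correction is always obtained.

    `measure_offset(target)` returns the (dx, dy) correction factor
    observed while the user fixates `target` (hypothetical helper).
    """
    offered = 0
    prev = None
    for target in targets:
        cur = measure_offset(target)
        offered += 1
        if prev is not None:
            diff = max(abs(cur[0] - prev[0]), abs(cur[1] - prev[1]))
            if diff <= tolerance:
                # Two agreeing measurements: keep their average.
                return ((cur[0] + prev[0]) / 2, (cur[1] + prev[1]) / 2)
        if offered % max_targets_per_pass == 0:
            tolerance *= enlarge_factor  # accept a coarser fit
        prev = cur
    return prev  # best available correction in the circumstances
```

The enlargement step mirrors the later passage of the description in which, after repeated unsuccessful comparisons, the allowable tolerance is widened so that at least an approximate correction factor is always delivered.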
The method of this invention can be said to reside in a method of determining eye gaze direction control which includes the steps of directing light from a light source into a user's eye, detecting the reflection thereof with means to enable the viewing direction of the eye to then be determinable subject to calibration, and providing a calibration of a cursor viewing direction alignment upon an activating signal being effected by the user.
Reference has not been made thus far to the specific type of hardware that is being used as the target for viewing and while a video display unit can be used, at least according to the broader concept of this invention such calibration locations can be located in any environment which is useful for the purpose.
The invention will be better understood by reference to a preferred embodiment, which shall be described with the assistance of a drawing in which FIG. 1 is a schematic drawing showing a user arranged with a viewing screen and with a target light being directed into the eye of the user and being detected by a video type camera.
Accordingly, the user 1 is positioned facing a video screen 2 with eyes 3 in approximate horizontal alignment.
A small target light 4 radiates light onto the eyes 3 so that the position of the reflection of this light from the eyes is detectable within the video camera 5.
Such an arrangement is a known arrangement.
However, to provide for consistent alignment of the apparent direction of sight of the eyes 3 of the user 1, there is separately provided software within a computer which, acting on previously provided factors such as the relative position and direction of the light source 4 and the camera 5 and the position of the viewing screen 2, applies a correction factor which will modify each received target light reflection and cause a cursor to be aligned on the video screen 2 in a perceived alignment with the viewing direction of the eye 3.
However, substantial shifts in the location of the head 6 of the user 1 can alter the relative angular relationships of the light 4, the eye 3 and the screen 2.
Such a substantial realignment is best followed by altering the correction factor so that the viewing direction can once again be brought into alignment with a cursor, and therefore with the location of interest at any time on the viewing video screen 2.
Accordingly, a calibration technique is provided in which a first calibration viewing target location 7 is displayed on the video screen 2, this comprising a cross which is rotated about its centre fast enough to be noticeable but not so fast as to be indistinguishable. Such a rotation speed can be two rotations each second. The programme provides for the first calibration viewing target location 7 to be maintained for a sufficient period of time for calculation of a new correction factor, and also to ensure consistent viewing direction during a first calibration period of time.
Such a period for the first location 7 can be three seconds after which a second location at 8 is then used, after which a third location 9 is used, after which a fourth location 10 is used.
For each of the further locations, a further correction factor is calculated, such a correction factor being the discernible difference between a first standard x and y co-ordinate position of a cursor and the change that would appear necessary to bring the cursor to a new position, being the actual viewing impact position of the eye 3 of the user.
Subsequent to a second viewing target location being proposed at 8, this being at an inclined relative relationship to the position of target 7, there will then be two correction factors available.
The programme is adapted to compare the respective correction factors and if necessary compare any difference between the two correction factors to a preselected maximum tolerance.
By using two targets which have an inclined relationship with respect to both x and y axes, a single correction factor can then be proposed taking into account both x and y deviations and also deviations in inclination.
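Deriving a single correction factor from two inclined targets, covering x and y deviations together with the deviation in inclination, amounts to fitting a translation, rotation and scale between the measured pair of gaze points and the known pair of target positions. A minimal sketch, with all names illustrative and not drawn from the specification:

```python
import math

def correction_from_two_targets(measured, actual):
    """Given two measured gaze points and the two true target
    positions (inclined with respect to both axes), derive a single
    correction mapping measured readings onto true positions:
    an x/y translation plus a rotation and scale.
    """
    (m1, m2), (a1, a2) = measured, actual
    # Vector between the two measured points and between the targets.
    dmx, dmy = m2[0] - m1[0], m2[1] - m1[1]
    dax, day = a2[0] - a1[0], a2[1] - a1[1]
    # Inclination deviation (rotation) and scale between the vectors.
    angle = math.atan2(day, dax) - math.atan2(dmy, dmx)
    scale = math.hypot(dax, day) / math.hypot(dmx, dmy)
    cos_a, sin_a = math.cos(angle), math.sin(angle)

    def apply(p):
        # Rotate and scale about measured point m1, then translate
        # so that m1 lands on the true first target a1.
        x, y = p[0] - m1[0], p[1] - m1[1]
        rx = scale * (x * cos_a - y * sin_a)
        ry = scale * (x * sin_a + y * cos_a)
        return (a1[0] + rx, a1[1] + ry)

    return apply
```

With only two point pairs the fit is exact, which is why the description then checks it against a second pair (targets 9 and 10) before trusting it.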
A further check is then available by performing two further calculations on the new locations 9 and 10, obtaining a proposed correction factor for, once again, both x and y axes and orientation, and then comparing such correction factor with that obtained with targets 7 and 8.
If these are within a selected tolerance, preferably the average of the two correction factors can be calculated and placed in memory for continuing application thereafter to each of the incoming detected reflected eye angular relationships.
As stated previously, however, it is not always clear that a user has been looking precisely at the target location, or within an acceptable range for such a target location, and accordingly a further check is made comparing the y axis deviation between perceived eye angles for target locations 7 and 9, which are indeed on a common horizontal axis, and 8 and 10, which are also on a common horizontal axis, and likewise the tolerance of x axis alignment for target locations 7 and 10 in one instance and 9 and 8 in the other.
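This cross-check can be expressed directly: targets 7 and 9 (and 8 and 10) share a horizontal axis, so their perceived y readings should agree, while targets 7 and 10 (and 9 and 8) share a vertical axis, so their x readings should agree. A hypothetical sketch, with the tolerance value assumed:

```python
def axis_consistency(p7, p8, p9, p10, tol=3.0):
    """Cross-check the four perceived gaze points against the known
    target layout: 7-9 and 8-10 lie on common horizontal axes (y
    readings should agree), while 7-10 and 9-8 lie on common vertical
    axes (x readings should agree). Names follow the reference
    numerals in the description; `tol` is illustrative.
    """
    checks = {
        "y(7,9)":  abs(p7[1] - p9[1]),
        "y(8,10)": abs(p8[1] - p10[1]),
        "x(7,10)": abs(p7[0] - p10[0]),
        "x(9,8)":  abs(p9[0] - p8[0]),
    }
    return all(d <= tol for d in checks.values()), checks
```

A failed check suggests the user was not fixating the offered target, so a further target location would be offered rather than the correction factor accepted.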
It is not always easy for a user 1 to call up a recalibration attempt sequence, and accordingly it is provided that if, within any two seconds of time while the apparatus is in use and a target light reflection from the eye of the user is being detected other than during a recalibration attempt sequence, the path of light is intersected twice, for a period which lies within the range of 0.1 to 0.5 of a second on each occasion, this will activate a recalibration sequence in which the apparatus will revert from whatever other task it is performing to commence showing on the screen a recalibration target location, continuing with the sequence as previously described.
The cutting off of reflection can be achieved in many different ways, but according to this embodiment it is achieved by the eyelids of the user being closed twice within the two-second period.
The apparatus will not readily be able to interpret whether the shut-off of reflection is by way of an eyelid or by way of a full turning of the head away from the viewing apparatus, but of course the result can be the same provided the same time constraint is met.
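The two-closures-in-two-seconds trigger, with each interruption lasting 0.1 to 0.5 of a second, reduces to a simple timing test over recently observed losses of the reflection. A sketch under those stated parameters; the data representation is an assumption:

```python
def is_recalibration_signal(interruptions, now,
                            window=2.0, count=2,
                            min_len=0.1, max_len=0.5):
    """Decide whether recent losses of the eye reflection form the
    activation signal: `count` interruptions, each lasting between
    `min_len` and `max_len` seconds, all beginning within the last
    `window` seconds. `interruptions` is a list of
    (start_time, end_time) pairs, hypothetical representation.
    """
    recent = [
        (s, e) for s, e in interruptions
        if now - s <= window and min_len <= (e - s) <= max_len
    ]
    return len(recent) >= count
```

Note that the test deliberately cannot distinguish an eyelid closure from a brief turning away of the head; as the description observes, either produces the same result provided the timing is met.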
Likewise, in a further embodiment, a separately activated switch can be operated and interrogated to call for a recalibration attempt sequence.
In the case of some users, it will be difficult for them to ensure a sufficiently consistent eye direction for a sufficient period of time to enable adequate recalibration at any instance.
The apparatus is intended to be of value to even the most highly handicapped persons and it is well understood that in some instances accuracy of alignment of eye direction is less important than continuing ability to control the position of a cursor and activities controlled by such position, even though such control within an adequate correction factor might be quite approximate.
In other words, a person using the apparatus could be placed at risk if at least some control is not available, even if generally inaccurate.
Accordingly, there is proposed that during a recalibration attempt, a first comparison of correction factors is made against a first selected tolerance.
If the comparison is not within the selected tolerance, a further two inclined orientated target viewing locations will be provided and, once again, such correction factors as are calculated are compared with the immediately preceding calculated correction factor.
Such a sequence will be repeated four times, after which the allowable tolerance will be enlarged by a preselected factor and a further recalibration attempt sequence will be effected.
Such enlargement, upon a selected number of unsuccessful recalibration attempt sequences, will be continued until at least the best possible correction factor in the circumstances has been achieved.
This then describes the embodiments presently put into practice.
The apparatus and techniques are applied by computer programmes.
The techniques of computer programming are commonplace, and the embodiment can be effected by appropriate programming of any appropriate computer with sufficient memory and with appropriate means to interpret results from a video camera, and to control the positioning of a cursor and other illustrations on a video display screen.
While reference has been made to an activation signal for a recalibration attempt sequence being achievable by the user comprising breaking the beam, a further signal can include detection of a selected eye movement of the user, or sequence of movements.
If such a sequence is not that which is likely to occur in normal usage, such activation can then be voluntarily effected.
In a preferred instance, the movement of the eye from a first target to a second target, which is horizontally coincident with the first target but spaced well apart from it, and then back again, with the detection for a recalibration attempt sequence effected upon this movement being repeated at least three times within a three-second period, can be appropriate.
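This back-and-forth gaze gesture can be reduced to counting target changes within a sliding window: a there-and-back movement is two changes, and three repetitions within three seconds trigger the sequence. All names and the sampling representation are illustrative:

```python
def detect_gaze_gesture(fixations, now, window=3.0, repeats=3):
    """Detect the voluntary trigger described above: the gaze moving
    from a first target to a second and back at least `repeats` times
    within `window` seconds. `fixations` is a time-ordered list of
    (timestamp, target_id) samples (hypothetical representation); a
    movement is counted each time the fixated target changes.
    """
    recent = [t_id for ts, t_id in fixations if now - ts <= window]
    # Count alternations between the two targets.
    changes = sum(1 for a, b in zip(recent, recent[1:]) if a != b)
    # A full there-and-back movement is two changes.
    return changes // 2 >= repeats
```

Because three rapid there-and-back movements are unlikely to occur in ordinary viewing, the gesture serves the same role as the double eyelid closure: a signal the user can produce voluntarily but will rarely produce by accident.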
Obviously, alternative sequences intended not to duplicate a sequence ordinarily occurring can be used.
Claims
1. An eye gaze direction control arrangement including -
(1) a light source,
(2) a detector adapted to detect the position of light from the light source being reflected from a user's eye,
(3) and means responsive to such detected position and adapted to control the position of a cursor on a screen and maintain this in an approximate alignment with a viewing direction of the user, said means being further adapted to effect a recalibration attempt sequence for calibration of the cursor viewing direction alignment upon an activation signal being effected by the user.
2. An eye gaze direction control arrangement as in Claim 1 further characterised in that the said means are adapted to effect a recalibration attempt sequence upon a selected number of interruptions of the reflected light within a selected period of time.
3. An eye gaze direction control arrangement as in Claim 1 or 2 further characterised in that recalibration is effected by causing a first calibration viewing target location to be noticeable by a user, and causing information to be stored identifying the angular direction then detected of a reflection of the radiant beam from the eye when it is expected to be viewing the calibration viewing target location; then causing a second calibration viewing target location to be noticeable by the user, at a position different from the first said calibration position and such as to cause the eye of a user to assume a different angular relationship with respect to the detection equipment, and causing information identifying such second angular position to be stored; and then effecting a correction factor to be used for maintaining alignment of the position of the cursor on the screen with a viewing direction of the user.
4. An eye gaze direction control arrangement as in Claim 2 or 3 further characterised in that the said means are adapted to enter a recalibration attempt sequence upon a selected eye of the user being closed and opened a selected number of times within a selected period of time.
5. An eye gaze direction control arrangement as in either of Claims 3 or 4 further characterised in that the said means are adapted to effect a recalibration attempt sequence upon an initial triggering by the user in which the sequence includes at least three calibration viewing targets, two of which are aligned in a vertical direction and two of which are aligned in a horizontal direction.
6. An eye gaze direction control arrangement as in Claim 5 further characterised in that the said means are adapted to compare horizontal and vertical alignment data from the calibration viewing targets and if the difference is larger than a selected allowable difference, provide for a further target location to be offered to the user and continue the calculation of correction factor and comparison until at least three target locations provide a comparison within the selected range.
7. An eye gaze direction control arrangement as in Claim 6 further characterised in that the said means are adapted to enlarge the range of acceptable difference in horizontal and vertical alignment of the detected eye gaze direction when viewing the calibration target locations after more than a selected number of comparisons have not been within the selected range.
8. A method of determining eye gaze direction control which includes the steps of directing light from a light source into a user's eye, detecting the reflection thereof with means to enable the viewing direction of the eye to then be determinable subject to calibration, and providing a calibration of a cursor viewing direction alignment upon an activating signal being effected by the user.
9. An eye gaze direction control arrangement substantially as described in the specification with reference to and as illustrated by the accompanying drawings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB8612298A GB2177276A (en) | 1984-09-24 | 1985-09-24 | Improvements relating to an apparatus and method related to control through eye gaze direction |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AUPG729284 | 1984-09-24 | ||
AUPG7292 | 1984-09-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1986001963A1 true WO1986001963A1 (en) | 1986-03-27 |
Family
ID=3770772
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU1985/000232 WO1986001963A1 (en) | 1984-09-24 | 1985-09-24 | Improvements relating to an apparatus and method related to control through eye gaze direction |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPS62500493A (en) |
GB (1) | GB2177276A (en) |
WO (1) | WO1986001963A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3186072B2 (en) * | 1991-01-08 | 2001-07-11 | キヤノン株式会社 | Equipment with gaze detection device |
CN1071030C (en) * | 1992-06-02 | 2001-09-12 | 佳能株式会社 | Optical instrument with sight detector |
DE4330265B4 (en) * | 1992-09-07 | 2004-07-29 | Canon K.K. | Device for detecting the visual axis of an eye of a person operating an optical device |
GB2273991B (en) * | 1992-09-07 | 1996-04-17 | Canon Kk | Optical equipment with device for detecting direction of visual axis |
US6035054A (en) * | 1992-10-29 | 2000-03-07 | Canon Kabushiki Kaisha | Visual axis detection apparatus and optical apparatus provided therewith |
GB2369673B (en) * | 2000-06-09 | 2004-09-15 | Canon Kk | Image processing apparatus |
GB2412431B (en) * | 2004-03-25 | 2007-11-07 | Hewlett Packard Development Co | Self-calibration for an eye tracker |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB1175945A (en) * | 1967-08-23 | 1970-01-01 | Honeywell Inc | Improvements in or relating to Optical Systems |
US3986030A (en) * | 1975-11-03 | 1976-10-12 | Teltscher Erwin S | Eye-motion operable keyboard-accessory |
US4034401A (en) * | 1975-04-22 | 1977-07-05 | Smiths Industries Limited | Observer-identification of a target or other point of interest in a viewing field |
US4109145A (en) * | 1974-05-20 | 1978-08-22 | Honeywell Inc. | Apparatus being controlled by movement of the eye |
FR2382056A1 (en) * | 1977-02-28 | 1978-09-22 | Inst Nat Sante Rech Med | Eye movement tracking system - uses IR beam and TV camera to register reflections from cornea and retina |
AU544490B2 (en) * | 1980-12-31 | 1985-05-30 | Ibm Corp. | Eye controlled information transfer |
-
1985
- 1985-09-24 WO PCT/AU1985/000232 patent/WO1986001963A1/en unknown
- 1985-09-24 GB GB8612298A patent/GB2177276A/en not_active Withdrawn
- 1985-09-24 JP JP60504275A patent/JPS62500493A/en active Pending
Non-Patent Citations (1)
Title |
---|
IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, Vol. BME-21, No. 4, July 1974, New York (US), J. MERCHANT et al.: "Remote Measurement Of Eye Direction Allowing Subject Motion Over One Cubic Foot Of Space", pages 309-317 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1987007497A1 (en) * | 1986-06-04 | 1987-12-17 | Paalsgaard Goete | An optical device |
EP0294812A2 (en) * | 1987-06-10 | 1988-12-14 | Research Development Foundation | Calibration controller for controlling electrically operated machines |
EP0294812A3 (en) * | 1987-06-10 | 1989-10-25 | Research Development Foundation | Calibration controller for controlling electrically operated machines |
FR2648243A1 (en) * | 1989-06-09 | 1990-12-14 | Metrovision Sarl | METHOD FOR SERVICING THE MEANS FOR ADJUSTING AN OPTICAL APPARATUS, MEANS FOR CARRYING OUT SAID METHOD, AND APPARATUS EQUIPPED THEREWITH |
US5094522A (en) * | 1989-06-09 | 1992-03-10 | Philippe Sourdille | Method of automatic control of means for adjusting an apparatus for observing a subject, means for performing the method, and apparatus equipped with these means |
WO1991006263A1 (en) * | 1989-11-07 | 1991-05-16 | Paalsgaard Goete | A communication device |
US5767821A (en) * | 1989-11-07 | 1998-06-16 | Paalsgaard; Goete | Communication device |
US5621424A (en) * | 1992-08-24 | 1997-04-15 | Olympus Optical Co., Ltd. | Head mount display apparatus allowing easy switching operation from electronic image to external field image |
DE19906706A1 (en) * | 1999-02-18 | 2000-08-24 | Bayerische Motoren Werke Ag | Instrument panel with virtual display for motor vehicle dashboard |
US7158097B2 (en) | 2002-10-09 | 2007-01-02 | Canon Kabushiki Kaisha | Gaze tracking system |
Also Published As
Publication number | Publication date |
---|---|
GB2177276A (en) | 1987-01-14 |
GB8612298D0 (en) | 1986-06-25 |
JPS62500493A (en) | 1987-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO1986001963A1 (en) | Improvements relating to an apparatus and method related to control through eye gaze direction | |
CN109765994B (en) | Improvements in protection and access to data on computing devices | |
US5138304A (en) | Projected image light pen | |
US6367932B1 (en) | Apparatus and method for visual field testing | |
US7600873B2 (en) | Method of determining the spatial relationship of an eye of a person with respect to a camera device | |
US6152563A (en) | Eye gaze direction tracker | |
US6364486B1 (en) | Method and apparatus for training visual attention capabilities of a subject | |
US5872594A (en) | Method for open loop camera control using a motion model to control camera movement | |
US7705876B2 (en) | Stereoscopic image display | |
US8094122B2 (en) | Guides and indicators for eye movement monitoring systems | |
US8328691B2 (en) | Feedback device for guiding and supervising physical excercises | |
US8350906B2 (en) | Control method based on a voluntary ocular signal particularly for filming | |
US20040046711A1 (en) | User-controlled linkage of information within an augmented reality system | |
US20140201844A1 (en) | Detection of and privacy preserving response to observation of display screen | |
US20030142068A1 (en) | Selective real image obstruction in a virtual reality display apparatus and method | |
US6426740B1 (en) | Visual-axis entry transmission apparatus and method therefor | |
US11399173B2 (en) | System and method for receiving user input in virtual/augmented reality | |
WO1999056274A1 (en) | Vision pointer method and apparatus | |
JP2006323769A (en) | Facing input device | |
JPH074345B2 (en) | Medical diagnostic device by masking the gazing point | |
CN104715213A (en) | Public viewing security for public computer users | |
US6033072A (en) | Line-of-sight-information input apparatus and method | |
JP2006309291A (en) | Pointing device and method based on pupil detection | |
AU4956685A (en) | Improvements relating to an apparatus and method related to control through eye gaze direction | |
JP2991134B2 (en) | Attention point detection system on screen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AU CH DE DK FI GB JP NO SE US |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |