US20080096643A1 - System and method for using image analysis of user interface signals for program control - Google Patents

System and method for using image analysis of user interface signals for program control

Info

Publication number
US20080096643A1
Authority
US
United States
Prior art keywords
program
user
user interface
level
visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/502,298
Inventor
Shalini Venkatesh
Srinivasan Lakshmanan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Avago Technologies General IP Singapore Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Avago Technologies General IP Singapore Pte Ltd filed Critical Avago Technologies General IP Singapore Pte Ltd
Priority to US11/502,298 priority Critical patent/US20080096643A1/en
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LAKSHMANAN, SRINIVASAN, VENKATESH, SHALINI
Priority to JP2007207612A priority patent/JP2008043760A/en
Publication of US20080096643A1 publication Critical patent/US20080096643A1/en
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1087 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/6027 Methods for processing data by generating or executing the game program using adaptive systems learning from user actions, e.g. for skill level adjustment

Abstract

Advantage is taken of the fact that as a player becomes uncomfortable, visually observable aspects of the player's physiological state, such as the frequency of eye blinks, may change. By analyzing such changes and/or other image changes, as obtained from signals generated by an input device, the system is able to determine changes in a player's comfort level and change the game accordingly.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is related to concurrently filed, co-pending, and commonly-assigned U.S. patent application Ser. No. ______, Attorney Docket No. 70051558-01, entitled “SYSTEM AND METHOD FOR USING WAVELET ANALYSIS OF A USER INTERFACE SIGNAL FOR PROGRAM CONTROL,” the disclosure of which is hereby incorporated herein by reference.
  • TECHNICAL FIELD
  • The present invention relates generally to interactive computer systems and methods, and more particularly to systems and methods for using image analysis of a user interface signal to control a program.
  • BACKGROUND OF THE INVENTION
  • Electronic games using computers, game consoles or handheld consoles typically employ an input device, a processor, and a visual display. The input device can be a mouse, joystick or other form of controller which allows the player to input responses into the processor. The processor can be a part of a computer system or it can be a dedicated game system, such as XBOX® and PLAYSTATION® game systems. The processor communicates with the display to present the visual and/or audio aspects of the game.
  • These electronic games typically have different levels of play in order to provide a level of play which is entertaining for the player. For many games, the level of play is selected by the user prior to playing the game, and changing the level of play will require the user to restart the game. A player will often have to try out a variety of levels before finding a level which is suitable for his/her level of play.
  • Games typically place the player in situations where quick and/or agile response is required and a player's score is typically reflective of his/her ability to respond quickly to the challenges presented by the game. In many situations, a mouse, joystick or a controller is used to relay positional information which is used by the game to move an object on the display.
  • Playing at an appropriate level of play ensures that the player is challenged by the game, and minimizes the risk of overwhelming or boring the player. However, when user skill levels must be set before a game is started, there is no convenient way for the user to adjust the skill level short of restarting the game. Often, the user does not even know that the skill level is wrong for that user.
  • BRIEF SUMMARY OF THE INVENTION
  • Advantage is taken of the fact that as a player becomes uncomfortable, visually observable aspects of the player's physiological state, such as the frequency of eye blinks, may change. By analyzing such changes and/or other image changes, as obtained from signals generated by an input device, the system is able to determine changes in a player's comfort level and change the game accordingly.
  • The foregoing has outlined rather broadly the features and technical advantages of the present invention in order that the detailed description of the invention that follows may be better understood. Additional features and advantages of the invention will be described hereinafter which form the subject of the claims of the invention. It should be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present invention. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the invention as set forth in the appended claims. The novel features which are believed to be characteristic of the invention, both as to its organization and method of operation, together with further objects and advantages, will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a more complete understanding of the present invention, reference is now made to the following description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a diagram of one embodiment of the present invention;
  • FIGS. 2A and 2B are overviews of embodiments of the invention;
  • FIG. 3 shows one example of image processing used in the invention; and
  • FIG. 4 illustrates a block diagram of one embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • As shown in FIG. 1, one embodiment of the present invention includes a user visual interface, such as device 22, which communicates with processor 12 via processing circuit 30. The program code or algorithm for a game program may be stored in a memory device, such as memory 14, and executed by processor 12. The processing shown in circuit 30 (as will be discussed with respect to FIG. 3) can be accomplished in circuitry, in software as part of an algorithm, or partially in each. The processing or circuitry can be separate from, or integral to, processor 12.
  • User interface 22 may be a stand-alone device as shown in FIG. 2 where the visual input device is shown marked on top of display 201, or it can be incorporated into a computer mouse, a game controller, a joystick, or any other device, such as keyboard 202, through which the player is able to input his/her actions into processor 12. Device 22 can be an analog or digital camera or an LED device for capturing images, such as eye blinks, head movements, frequency of face touches, swinging motion of arms or legs, head nods, etc. Processor 12 may include memory devices, microprocessors, and any components which are utilized to execute the program code for a game program. The processor system may be a part of a computer system, a video game console, or a handheld gaming device.
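  • As a minimal sketch of how the FIG. 1 arrangement could be modeled in software, the Python classes below mirror the roles of device 22, circuit 30, and processor 12; the class names and stubbed behavior are illustrative assumptions and are not taken from the patent disclosure.

```python
# Minimal object sketch of the FIG. 1 arrangement.  Class names and the
# stubbed behavior are hypothetical; numerals in comments mirror the figure.

class VisualInputDevice:                  # device 22 (e.g., a camera)
    def capture(self):
        return [[0] * 4 for _ in range(4)]    # stand-in image frame

class ProcessingCircuit:                  # circuit 30
    def condition(self, frame):
        return frame                      # amplification/filtering stubbed out

class Processor:                          # processor 12, program held in memory 14
    def __init__(self, device, circuit):
        self.device, self.circuit = device, circuit

    def read_interface(self):
        return self.circuit.condition(self.device.capture())

frame = Processor(VisualInputDevice(), ProcessingCircuit()).read_interface()
print(len(frame), "rows captured")        # -> 4 rows captured
```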
  • The visual image, for example, eye blinks, head nods, changes in facial images, etc., can be used as an indicator of a player's response level. For example, fast eye blinks may suggest a player becoming frustrated or confused. Changes of a player's response level during game play can be used to change a game program aspect, for example the play difficulty could be adjusted.
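  • The mapping from an observed blink rate to a coarse response level could be as simple as the following sketch; the threshold values and labels are assumptions chosen for illustration and do not come from the patent.

```python
# Hypothetical mapping from an observed blink rate to a coarse response
# level.  Threshold values and labels are illustrative, not from the patent.

def classify_response_level(blinks_per_minute: float) -> str:
    if blinks_per_minute > 30:
        return "frustrated"      # unusually rapid blinking
    if blinks_per_minute < 8:
        return "bored"           # unusually slow blinking
    return "comfortable"

print(classify_response_level(35))   # -> frustrated
```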
  • As shown in FIG. 3, in one embodiment, amplifier 31 of image processing circuit 30 receives the image signal, for example, from device 22 shown in FIGS. 2A and 2B. This signal would typically, but not always, be in analog form and would be amplified by amplifier 31 and filtered by filter 32. The filtering would remove signals outside certain ranges as desired. This filtered signal would be conditioned by circuit 33 and, if necessary, converted to digital format by A/D converter 34. This signal would then be presented to processor 12.
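  • A software analogue of the amplify/filter/condition/convert chain of FIG. 3 might look like the following; the gain, clipping range, and quantization depth are placeholder values chosen purely for illustration, not parameters specified by the patent.

```python
# Software analogue of image-processing circuit 30 (amplifier 31, filter 32,
# conditioner 33, A/D converter 34).  All parameter values are placeholders.

def amplify(samples, gain=2.0):
    return [s * gain for s in samples]

def bandlimit(samples, lo=0.0, hi=255.0):
    # Crude stand-in for filtering: discard energy outside the range of
    # interest by clipping each sample.
    return [min(max(s, lo), hi) for s in samples]

def condition(samples):
    # Normalize to the 0..1 range before digitization.
    peak = max(samples) or 1.0
    return [s / peak for s in samples]

def analog_to_digital(samples, levels=256):
    return [int(round(s * (levels - 1))) for s in samples]

raw = [0.4, 10.2, 97.3, 200.5, 45.0]        # stand-in analog samples
digital = analog_to_digital(condition(bandlimit(amplify(raw))))
print(digital)                               # values quantized to 0..255
```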
  • Note that as part of the filtering and/or conditioning, changes of the image could be detected and signaled to the processor, or, if desired, the entire conditioned signal could be presented to the processor for a determination of image change and a determination of what such an image change means in terms of user discomfort, etc.
  • Once the player's response level is determined, processor 12 may modify the system, for example by adjusting the difficulty level of the game being played. In one embodiment of the invention, this includes switching between preset difficulty levels defined in the game, or using a difficulty level that a database or algorithm indicates is appropriate to the determined response level. In another embodiment, the processing system may increase or decrease the speed with which it executes the game algorithm. In a still further embodiment, the difficulty can be continuously adjusted until the user response moves within a defined range.
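  • The continuous-adjustment embodiment could be realized as a simple feedback loop that nudges the difficulty until the measured response falls inside a target band; the band, step size, and bounds below are assumptions for illustration, not values from the patent.

```python
# Sketch of the "adjust until the user response is within a defined range"
# embodiment.  Target band, step size, and bounds are illustrative assumptions.

def adjust_difficulty(difficulty, response, target=(0.4, 0.7), step=0.1,
                      bounds=(0.0, 1.0)):
    """response: normalized measure of user stress (0 = calm, 1 = overwhelmed)."""
    lo, hi = target
    if response > hi:                 # user struggling -> ease off
        difficulty -= step
    elif response < lo:               # user under-challenged -> push harder
        difficulty += step
    return min(max(difficulty, bounds[0]), bounds[1])

level = 0.5
for measured in (0.9, 0.85, 0.6, 0.3):
    level = adjust_difficulty(level, measured)
    print(round(level, 2))            # 0.4, 0.3, 0.3, 0.4
```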
  • FIG. 4 represents a flowchart of one embodiment 40 of the system used to adjust the skill level of a game, or other application, based on received visual images. Process 402 acquires a signal from user interface 22, and through image analysis of that signal as discussed with respect to FIG. 3, performed by processes 403 and 404, a determination is made by process 405 as to whether some significant feature of the image has changed. If it has changed from a previous reading (or from a previous set of readings stored in memory), then process 406 determines the response level of the player based on the nature of the received visual images. The skill level of the game, or other parameters, is then adjusted based on the received visual image data. Images can be stored for a period of time in a memory (not shown), and an image can be compared against the previous image, or against a composite (or average) of previous images, to determine if the user is agitated, bored, wondering, intense, etc. Rapid eye changes could, for example, mean the user is agitated. If desired, data can be maintained on a user-by-user basis such that, for a particular user, a correlation can be made between visual image changes and the action (poor, good, etc.) taken by that user. This correlation can then be the foundation of changes made to the program upon future detection of similar visual changes.
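  • The loop below sketches the FIG. 4 flow under stated assumptions: each reading is reduced to a single blink-rate feature, a "significant change" means departing from a running average by a fixed delta, and the per-user correlation data is kept in an ordinary dictionary. None of these details are specified by the patent.

```python
# Sketch of the FIG. 4 flow: acquire -> analyze -> detect change ->
# determine response level -> adjust skill.  Feature extraction is stubbed;
# thresholds, the running average, and the per-user log are assumptions.

from collections import defaultdict, deque

history = deque(maxlen=10)              # recent feature readings (memory)
user_log = defaultdict(list)            # per-user correlation data

def significant_change(reading, threshold=5.0):
    if not history:
        return False
    baseline = sum(history) / len(history)   # composite of previous readings
    return abs(reading - baseline) > threshold

def run_loop(user_id, readings, skill=1):
    for blink_rate in readings:              # acquire signal (process 402)
        if significant_change(blink_rate):   # analyze and compare (403-405)
            level = "agitated" if blink_rate > 25 else "calm"   # process 406
            skill = max(1, skill - 1) if level == "agitated" else skill + 1
            user_log[user_id].append((blink_rate, level, skill))
        history.append(blink_rate)
    return skill

print(run_loop("player1", [12, 13, 14, 30, 31, 12]))   # -> 2
```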
  • Note that a table of user levels could be established such that the received visual images, averaged (or otherwise accumulated) over a period of time, can be used to determine an actual graded level of user comfort. Based on the gradation at any point in time, the same (or another) program can be adjusted to be harder or easier.
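  • Such a graded table could be as simple as a lookup keyed on a time-averaged measurement; the grade boundaries below are invented purely for illustration.

```python
# Hypothetical graded comfort table keyed on a time-averaged measurement.
# Grade boundaries are invented for illustration.

COMFORT_GRADES = [              # (upper bound of averaged blink rate, grade)
    (10.0, "very comfortable"),
    (18.0, "comfortable"),
    (26.0, "uneasy"),
    (float("inf"), "stressed"),
]

def grade_comfort(samples):
    average = sum(samples) / len(samples)
    for upper, grade in COMFORT_GRADES:
        if average <= upper:
            return grade

print(grade_comfort([14, 16, 19, 15]))   # -> comfortable
```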
  • Note that while a game system has been described, the concepts taught herein can be used to control operational aspects of any program running on any processor. For example, a spreadsheet program can present different instructions based on the frequency of eye movement or other visual image.
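  • As a toy illustration of the spreadsheet example, the eye-movement threshold and instruction text below are purely invented; the point is only that the guidance presented changes with the measured visual signal.

```python
# Toy sketch of the spreadsheet example: show more detailed instructions
# when frequent eye movement suggests the user is searching or confused.
# Threshold and messages are invented for illustration.

def help_text(eye_movements_per_minute: float) -> str:
    if eye_movements_per_minute > 40:
        return "Step-by-step: select a cell, type '=', then enter a formula."
    return "Tip: press '=' to start a formula."

print(help_text(55))
```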
  • Although the present invention and its advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims. Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods, and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the disclosure of the present invention, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same results as the corresponding embodiments described herein may be utilized according to the present invention. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.

Claims (20)

1. A method of controlling a program running on a processor comprising:
using visual imaging analysis on a signal from a user interface;
determining a user's response level; and
adjusting said program in response to a determined user response.
2. The method of claim 1 wherein said using comprises:
matching a received image against a previously received image.
3. The method of claim 1 wherein said determined response is based on a change in received images.
4. The method of claim 2 wherein the determining is based on a rapid change in received images.
5. The method of claim 1 wherein said program is a game program and said adjusting results in a change of a skill level presented to said user.
6. A system comprising:
a user interface to a program running on a processor;
means for performing visual image processing on a signal from said user interface during execution of a program on said processor; and
means for adjusting at least one operational aspect of said program in response to performed visual imaging analysis.
7. The system of claim 6 wherein said program is a game and said input interface is a camera.
8. The system of claim 6 wherein said operational aspect is selected from the list of:
difficulty level;
instructional level; and
amount of information displayed.
9. The system of claim 6 wherein said signal is a series of visual images.
10. The system of claim 9 wherein said visual images are selected from the list of: eye blinks, face touches, leg movement, arm movement and head movement.
11. The system of claim 6 wherein said image processing means comprises means for measuring the present video images from said user interface against at least one previous video image from said user interface.
12. The system of claim 6 wherein said visual image processing means comprises:
circuitry for amplifying and filtering analog signals.
13. The system of claim 12 wherein said analog signals are from a camera.
14. The system of claim 6 wherein said visual image performing means comprises:
an algorithm for processing input signals from a digital camera.
15. A method of using a program comprising:
setting a program level;
determining changes in a user response level during program use by monitoring visual images of said user as obtained from a user interface signal; and
adjusting said program level based on determined changes in said user's video images over time.
16. The method of claim 15 wherein said determining comprises using differences between visual images as received from said user interface signal.
17. The method of claim 16 wherein said user interface signal represents user image changes while said user is operating a program input device of a game controller component.
18. The method of claim 17 wherein said program level is a difficulty level of said game program.
19. The method of claim 18 wherein said difficulty level is decreased when said visual image monitoring indicates an increase in the rate of visual image change between interface signals.
20. The method of claim 18 wherein said difficulty level is increased when said visual image monitoring indicates substantially no changes between user interface signals.
US11/502,298 2006-08-10 2006-08-10 System and method for using image analysis of user interface signals for program control Abandoned US20080096643A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/502,298 US20080096643A1 (en) 2006-08-10 2006-08-10 System and method for using image analysis of user interface signals for program control
JP2007207612A JP2008043760A (en) 2006-08-10 2007-08-09 System and method for use of image analysis of user interface signal for program control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/502,298 US20080096643A1 (en) 2006-08-10 2006-08-10 System and method for using image analysis of user interface signals for program control

Publications (1)

Publication Number Publication Date
US20080096643A1 2008-04-24

Family

ID=39178026

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/502,298 Abandoned US20080096643A1 (en) 2006-08-10 2006-08-10 System and method for using image analysis of user interface signals for program control

Country Status (2)

Country Link
US (1) US20080096643A1 (en)
JP (1) JP2008043760A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5555461B2 (en) * 2009-09-08 2014-07-23 株式会社タイトー Audio output device, audio output program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04325180A (en) * 1991-04-24 1992-11-13 Sanyo Electric Co Ltd Emotional motion sensitive type game machine
JPH07163758A (en) * 1993-12-16 1995-06-27 Matsushita Electric Ind Co Ltd Information processor
JPH09330158A (en) * 1996-06-11 1997-12-22 Omron Corp Data processor, game controller, learning device, data processing method, game control method, and learning method
JPH1165422A (en) * 1997-08-22 1999-03-05 Omron Corp Method for evaluating mental and bodily state of opeator, and method and system for controlling contents of operation using equipment
JP2001252265A (en) * 2000-03-08 2001-09-18 Sharp Corp Biofeedback apparatus
JP2002018135A (en) * 2000-07-06 2002-01-22 Atlus Co Ltd Game device

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4332566A (en) * 1979-09-04 1982-06-01 Mazeski Conrad A Monitoring attention and cognition and the effect of sensory motor, nutritional, and other biochemical factors thereon
US5370399A (en) * 1981-11-12 1994-12-06 Richard Spademan, M.D. Game apparatus having incentive producing means
US5362069A (en) * 1992-12-03 1994-11-08 Heartbeat Corporation Combination exercise device/video game
US5377100A (en) * 1993-03-08 1994-12-27 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method of encouraging attention by correlating video game difficulty with attention level
US5583795A (en) * 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5570698A (en) * 1995-06-02 1996-11-05 Siemens Corporate Research, Inc. System for monitoring eyes for detecting sleep behavior
US5772508A (en) * 1995-09-28 1998-06-30 Amtex Co., Ltd. Game or play facilities controlled by physiological information
US5774591A (en) * 1995-12-15 1998-06-30 Xerox Corporation Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images
US5944530A (en) * 1996-08-13 1999-08-31 Ho; Chi Fai Learning method and system that consider a student's concentration level
US6648822B2 (en) * 2000-07-24 2003-11-18 Sharp Kabushiki Kaisha Communication apparatus and communication method for outputting an estimate of a patient's mental state
US6827579B2 (en) * 2000-11-16 2004-12-07 Rutgers, The State University Of Nj Method and apparatus for rehabilitation of neuromotor disorders
US6913536B2 (en) * 2001-03-23 2005-07-05 Nintendo Co., Ltd. Game machine and program therefor
US20020136435A1 (en) * 2001-03-26 2002-09-26 Prokoski Francine J. Dual band biometric identification system
US6890262B2 (en) * 2001-07-19 2005-05-10 Konami Corporation Video game apparatus, method and recording medium storing program for controlling viewpoint movement of simulated camera in video game
US7367882B2 (en) * 2001-10-11 2008-05-06 Konami Corporation Game system and computer program for permitting user selection of game difficulty and setting of control character ability parameter
US20040193413A1 (en) * 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US7488294B2 (en) * 2004-04-01 2009-02-10 Torch William C Biosensors, communicators, and controllers monitoring eye movement and methods for using them
US20060056509A1 (en) * 2004-09-16 2006-03-16 Tooru Suino Image display apparatus, image display control method, program, and computer-readable medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190143216A1 (en) * 2017-11-15 2019-05-16 International Business Machines Corporation Cognitive user experience optimization
US10632387B2 (en) * 2017-11-15 2020-04-28 International Business Machines Corporation Cognitive user experience optimization
US11185781B2 (en) 2017-11-15 2021-11-30 International Business Machines Corporation Cognitive user experience optimization

Also Published As

Publication number Publication date
JP2008043760A (en) 2008-02-28

Similar Documents

Publication Publication Date Title
US9069441B2 (en) Method and apparatus for adjustment of game parameters based on measurement of user performance
JP7183281B2 (en) Dynamic allocation of contextual assistance during gameplay
US9095775B2 (en) User interface and method of user interaction
CA2683728C (en) Vision cognition and coordination testing and training
US8123602B2 (en) Game device and program
US9233302B2 (en) Game program, game device and game control method
JP7286656B2 (en) Identifying player engagement to generate contextual gameplay assistance
JP7267291B2 (en) Assignment of contextual gameplay aids to player reactions
TWI664995B (en) Virtual reality multi-person board game interacting system, initeracting method, and server
US20210113916A1 (en) User adaptation system and method
JP5816213B2 (en) GAME DEVICE AND PROGRAM
US10747308B2 (en) Line-of-sight operation apparatus, method, and medical device
US20080096643A1 (en) System and method for using image analysis of user interface signals for program control
Cuaresma et al. A comparison between tilt-input and facial tracking as input methods for mobile games
JP2011183074A (en) Game system, game server, game device, method of controlling game system, method of controlling game server, method of controlling game device, and program
US20200306624A1 (en) Peripersonal boundary-based augmented reality game environment
US20230201719A1 (en) Dynamic game models
JP2008043751A (en) System and method for performing program control using wavelet analysis of user interface signal
JP5285100B2 (en) GAME SYSTEM, GAME SYSTEM CONTROL METHOD, AND PROGRAM
JP2010220689A (en) Program, information storage medium, and game device
JP2022061352A (en) Game system and game control method
CN112203733B (en) Dynamically configuring contextual assistance during gameplay
US11908097B2 (en) Information processing system, program, and information processing method
JP5816224B2 (en) GAME DEVICE AND PROGRAM
JP2023166352A (en) Game program, game system, game device and game processing method

Legal Events

Date Code Title Description
AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VENKATESH, SHALINI;LAKSHMANAN, SRINIVASAN;REEL/FRAME:018652/0464;SIGNING DATES FROM 20060728 TO 20060801

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION