US20030062675A1 - Image experiencing system and information processing method - Google Patents

Image experiencing system and information processing method

Info

Publication number
US20030062675A1
US20030062675A1
Authority
US
United States
Prior art keywords
attitude
player
game
image
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/254,789
Inventor
Hideo Noro
Hiroaki Sato
Taichi Matsui
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2001300544A
Priority claimed from JP2001300545A
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NORO, HIDEO, MATSUI, TAICHI, SATO, HIROAKI
Publication of US20030062675A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F3/00Board games; Raffle games
    • A63F3/00643Electric board games; Electric features of board games
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F3/00Board games; Raffle games
    • A63F3/00895Accessories for board games
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F9/00Games not otherwise provided for
    • A63F9/24Electric games; Games using electronic circuits not otherwise provided for
    • A63F2009/2401Detail of input, input devices
    • A63F2009/243Detail of input, input devices with other kinds of input
    • A63F2009/2432Detail of input, input devices with other kinds of input actuated by a sound, e.g. using a microphone
    • A63F2009/2433Voice-actuated
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6676Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Definitions

  • the present invention relates to a mixed reality technology for providing a game which proceeds by positioning items on a game board.
  • a board game which utilizes a board including areas divided thereon and proceeds by placing, removing or displacing pieces on such areas.
  • as games utilizing objects of certain three-dimensional shapes as such pieces, there are well known, for example, chess, checkers, backgammon, igo, Japanese chess, Japanese backgammon etc.
  • games utilizing cards as such pieces are called card games.
  • the games utilizing playing cards are known in numerous kinds, such as bridge, stud poker, draw poker, blackjack etc.
  • Such board game or card game itself often assumes a certain event, and the item (for example piece) often assumes a particular animal or a particular person.
  • the board or piece has a shape determined in advance, and the pattern thereof does not change according to the proceeding of the game.
  • the shape or pattern of the pieces does not change according to the situation of the game. For example, in a battle scene, it is not that an actual battle takes place in front of the player, or, in case the player draws a card indicating “the angel gives an instruction”, it is not that an angel actually speaks up.
  • the conventional MR game mentioned in the foregoing provides sufficient feeling of reality but involves a very tedious setting of the game environment. There is often required a large-scale work for preparing the scene setting, and the positions of the objects in the setting have to be measured for each setting. It is also difficult to alter the content of the game.
  • an object of the present invention is to improve the feeling of reality of a board game in addition to the interestingness thereof, and to facilitate understanding of the situation of proceeding of the game.
  • Another object of the present invention is, in comparison with the conventional games utilizing the MR technology, to facilitate installation of the setting and to enable relatively flexible alteration of the content of the game.
  • an image experiencing system for a game which proceeds by placing items on a game board, the system comprising:
  • player position and attitude determining means for obtaining position/attitude information of the view of a player;
  • generation means for generating computer graphics according to the items of the game board, corresponding to the position/attitude information of the view of the aforementioned player;
  • and a head mounted display capable of displaying the thus generated computer graphics in superposition with the image of a real world.
  • FIG. 1 is a view showing an example of the configuration of an image experiencing system in a first embodiment
  • FIG. 2 is a view showing a board and the appearance thereof in calculating the height of view point of a player
  • FIG. 3 is a view showing the internal configuration of a see-through HMD
  • FIG. 4 is a view showing an example of the configuration of an image experiencing system in a first embodiment
  • FIG. 5 is a view showing an example of the configuration of an image experiencing system in a second embodiment
  • FIG. 6 is a view showing an example of the configuration of an image experiencing system in a third embodiment
  • FIG. 7 is a UML activity chart showing the process of a most likely position and attitude determining unit constituting a component of the image experiencing system of the third embodiment
  • FIG. 8 is a chart indicating the difference between a value, measured with a position and attitude sensor constituting a component of the image experiencing system of a fourth embodiment, and a real value as a function of elapsing time;
  • FIG. 9 is a view showing an example of the configuration of an image experiencing system in a fourth embodiment
  • FIG. 10 is a UML activity chart showing the process of a position-and-attitude sensor information processing unit constituting a component of the image experiencing system of the fourth embodiment
  • FIG. 11 is a view showing an example of the configuration of an image experiencing system in a fifth embodiment
  • FIG. 12 is a UML activity chart showing the process of a most likely attitude determining unit constituting a component of the image experiencing system of the fifth embodiment
  • FIG. 13 is a view showing an example of the configuration of an image experiencing system in a sixth embodiment
  • FIG. 14 is a UML activity chart showing the process of an attitude sensor information processing unit constituting a component of the image experiencing system of the sixth embodiment
  • FIG. 15 is a view showing an example of the configuration of an image experiencing system in a seventh embodiment
  • FIG. 16 is a UML activity chart showing the process of a piece operation recognition unit constituting a component of the image experiencing system of the seventh embodiment
  • FIG. 17 is a view showing an example of the configuration of an image experiencing system in a tenth embodiment
  • FIG. 18 is a view showing an example of card patterns to be used for explaining the function of the image experiencing system of the tenth embodiment
  • FIG. 19 is a view showing recognition areas on a card, to be used for explaining the function of the image experiencing system of the tenth embodiment
  • FIG. 20 is a UML activity chart showing the process of a piece image recognition unit constituting a component of the image experiencing system of the tenth embodiment
  • FIG. 21 is a view showing an example of the configuration of an image experiencing system in an eleventh embodiment
  • FIG. 22 is a UML activity chart showing the process of an on-board piece image recognition unit constituting a component of the image experiencing system of the eleventh embodiment
  • FIG. 23 is a view showing an example of the configuration of an image experiencing system in a twelfth embodiment
  • FIGS. 24 and 25 are UML activity charts showing the process of a piece operation recognition unit constituting a component of the image experiencing system of the twelfth embodiment
  • FIG. 26 is a view showing an example of the configuration of an image experiencing system in a thirteenth embodiment
  • FIG. 27 is a view showing the difference in the output from the piece image recognition unit, for explaining a piece image recognition-guide display instruction unit constituting a component of the image experiencing system of the thirteenth embodiment
  • FIG. 28 is a UML activity chart showing the process of the piece image recognition-guide display instruction unit constituting a component of the image experiencing system of the thirteenth embodiment
  • FIG. 29 is a view showing an example of the guide to be displayed in the display unit of the HMD by the image experiencing system of the thirteenth embodiment
  • FIG. 30, composed of FIGS. 30A and 30B, and FIG. 31, composed of FIGS. 31A, 31B and 31C, are views showing examples of the configuration of an image experiencing system in a fourteenth embodiment
  • FIG. 32 is a view showing a fifteenth embodiment in a conceptual image
  • FIG. 33 is a schematic view showing the configuration of the fifteenth embodiment
  • FIG. 34 is a view showing markers
  • FIG. 35 is a view showing guides
  • FIG. 36 is a view showing card identification
  • FIG. 37 is a view showing a phase proceeding by voice
  • FIG. 38 is a flow chart of a card reading unit
  • FIG. 39 is a flow chart of a position and attitude grasp unit.
  • FIG. 40 is a flow chart of a game management unit.
  • each player wears a see-through HMD (head mounted display) for displaying CG (computer graphics) in superposition with the actual situation of the board game or card game, executed in a limited field of a game board.
  • the CG changes according to the proceeding of the game. For example, in a chess game, a knight piece is represented by CG of a knight on horseback, and at the displacement of the piece, the CG shows a running horse. Also, when a piece captures an opponent piece, there are displayed CG of the piece fighting and winning against the CG corresponding to the opponent piece.
  • Such image experiencing system provides, in addition to the interestingness of the actual board game itself, an improved feeling of reality and facilitates grasping the situation of proceeding of the game.
  • the image experiencing system of the present invention is easier in setting and allows to relatively easily accommodate alteration of the content of the game.
  • FIG. 1 is a view showing an example of the configuration of the image experiencing system of a first embodiment.
  • a game board 101 constituting the field of game, and players execute the game by placing, removing or moving pieces on the board 101 .
  • the player wears a see-through HMD 103 in playing the game.
  • a position and attitude sensor 104 is fixed to the HMD 103 and detects the position and attitude of the view of the player.
  • Position/attitude includes both the “position” and “attitude”.
  • position/attitude information means both the “position information” and “attitude information”.
  • “Position” means information indicating a point in a specified spatial coordinate system, and is represented, in case of an XYZ orthogonal coordinate system, by a set of three values (x, y, z). Also in case of representing an object on the earth, there can be employed a set of three values of a latitude, a longitude and a height (or a depth).
  • “Attitude” means a direction from the point represented by the “position”, and can be represented by the position of an arbitrary point on such direction, or, in case of the XYZ orthogonal coordinate system, by the angles of the viewing line with the axes of the coordinate system or by specifying a direction of the viewing line (for example -Z direction) and indicating the amounts of rotation from such specified direction about the axes of the coordinate system.
  • the “position” has three degrees of freedom
  • the “attitude” also has three degrees of freedom.
  • the position and attitude sensor is capable of measuring values of six degrees of freedom covering both the “position” and the “attitude”.
  • Such position and attitude sensor is commercially available in various forms, for example one utilizing a magnetic field, one based on image processing of a marker photographed by an external camera, or one based on the combination of a gyro sensor and an acceleration sensor.
  • in many cases the head position of the player scarcely changes during the play. It is therefore possible also to use the position information calibrated at the start of the game and to use an attitude sensor which measures the attitude information only. In such case, the measurement is made only on the attitude information of three degrees of freedom during the game, but the system can process the values of six degrees of freedom including the position information initially calibrated. In other words, the attitude sensor and the calibrated position data can be considered to constitute a position and attitude sensor.
  • FIG. 2 shows a method of calibration before the game.
  • a game board 101 is provided with markers 201 for identification on four corners.
  • the markers 201 are provided in a square arrangement, with a length of a side of unity (1).
  • the player is positioned in front of the board, with a view point 202 positioned at the center position of the board and at a distance d from the front side of the board.
  • the rear side of the board appears shorter than the front side.
  • the height h of the view point 202 from the plane of the board 101 can be determined as:
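A plausible form of this relation, assuming a pinhole camera model in which the apparent length of a side is inversely proportional to its distance from the view point 202, and writing $l_f$ and $l_r$ for the apparent lengths of the front and rear sides of the unit square formed by the markers (these symbols are introduced here for illustration, not taken from the original):

$$\rho = \frac{l_r}{l_f} = \sqrt{\frac{d^2 + h^2}{(d+1)^2 + h^2}}, \qquad h = \sqrt{\frac{\rho^2 (d+1)^2 - d^2}{1 - \rho^2}}.$$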
  • the input/output of the image to or from the HMD 103 and the position/attitude information from the position/attitude sensor 104 are processed by a game console or a PC 102 .
  • FIG. 3 is a view showing the internal configuration of the see-through HMD 103 , which includes a video see-through type and an optical see-through type.
  • in the video see-through type, the light from the external field does not directly reach the eyes of the player.
  • the light from the external field is deflected by a two-sided mirror 301 and enters an image pickup device 302 .
  • An image presented to the player is displayed on a display device (display unit) 303 and enters the eyes of the player via the two-sided mirror 301 . If the output image of the image pickup device (image pickup unit) 302 is directly supplied to the display device 303 , the HMD becomes a mere seeing glass, but such output image is processed by the game console or PC 102 in the course of such supply to display the generated CG in superposition.
  • in the optical see-through type, the light directly reaches the eyes of the player, and the separately generated CG are simultaneously displayed and appear in superposition to the eyes of the player.
  • the light from the external field passes through a half mirror and enters the eyes of the player.
  • an image displayed on a display device is reflected by the half mirror and enters the eyes of the player.
  • the image pickup device is unnecessary in this case, but it is required if an image at the view point of the player is used in image processing.
  • a separate camera for image processing may be fixed on the HMD 103 .
  • the game console or PC 102 manages the proceeding of the game as in the ordinary game.
  • the required position/attitude information is limited to the position/attitude relationship between the board 101 and the HMD 103 , and there is not required setting of the scene or stage properties at each installation or the calibration of the sensors to be mounted on the players.
  • the present embodiment requires only a more compact set, in comparison with the MR game, and is easy in the installation work, including the calibration.
  • the details of the MR game are described, for example, in “Design and Implementation for MR Amusement Systems”, session 22A: Mixed Reality, Papers of the Fourth Convention of the Japanese Virtual Reality Society.
  • the present embodiment can also improve the feeling of reality in comparison with the conventional board games.
  • FIG. 4 shows an example of the configuration of an image experiencing system in which the present embodiment is applied.
  • the game console or PC 102 manages the proceeding of the game by a game management unit 401 .
  • a CG generation unit 405 generates CG (computer graphics) corresponding to each scene. For generating a CG image seen from the view point of the player, the CG generation unit 405 acquires the position and attitude information of the view of the player from a player position and attitude determining means 402 , which includes, for example, a position and attitude sensor 104 , and a position/attitude sensor information processing unit 403 for analyzing such information thereby determining the position and attitude information of the view of the player.
  • the position and attitude sensor information processing unit 403 executes, for example, a format conversion of the data obtained from the position and attitude sensor 104 , a transformation into a coordinate system employed in the present system, and a correction for the difference between the mounting position of the position and attitude sensor and the view point of the HMD 103 .
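The corrections performed by the unit 403 might be sketched as follows, assuming poses are handled as 4x4 homogeneous matrices already converted into the system's world coordinates (the matrix value and all names are illustrative, not taken from the original):

```python
import numpy as np

# Fixed transform from the mounting position of the position and attitude
# sensor to the view point of the HMD, as a 4x4 homogeneous matrix in the
# sensor's local frame (identity here; the actual value is illustrative).
SENSOR_TO_VIEWPOINT = np.eye(4)

def process_sensor_reading(sensor_pose_world: np.ndarray) -> np.ndarray:
    """Given the sensor pose as a 4x4 matrix, already parsed from the device
    format and transformed into the system's world coordinate system, apply
    the mounting offset to obtain the pose of the player's view point."""
    return sensor_pose_world @ SENSOR_TO_VIEWPOINT
```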
  • the CG generated in the CG generation unit 405 , corresponding to an image seen from the view point of the player, are superimposed in an image composition unit 404 , in case of the HMD of video see-through type, with an image obtained from the image pickup unit 302 of the HMD 103 , for display on the image display unit 303 .
  • in case of the optical see-through type, the image pickup unit 302 and the image composition unit 404 can be dispensed with since the image synthesis is unnecessary, and the output of the CG generation unit 405 is directly displayed on the display unit 303 .
  • the game management unit 401 stores information relating to the game itself, or the rules of the game, and, in the course of a game, retains the current status or scene and determines and manages a next state to which the game is to proceed. Also for presenting a scene by CG to the player, it issues a drawing instruction for CG to the CG generation unit 405 .
  • the CG generation unit 405 places model data, which are an internal representation corresponding to each character, in a world, which is an internal representation of a virtual world in which the players are playing.
  • the model data and the world are internally represented by a method called scene graph, and, after the generation of the scene graph of the world, the scene graph is subjected to rendering.
  • the rendering is executed on a scene seen from the position and attitude, given from the player position and attitude determining means 402 .
  • the rendering may be executed on an unrepresented internal memory or on a display memory called a frame buffer.
  • the rendering is assumed to be executed on the unrepresented internal memory.
  • the image composition unit 404 superimposes the CG, generated by the CG generation unit 405 , with an image obtained by the image pickup unit 302 in the HMD 103 .
  • for such superposition, there can be utilized a method called alpha blending.
  • in case the image composition unit 404 has a pixel output format RGBA including an opacity A (alpha value; 0 ≤ A ≤ 1) in addition to the intensities of the three primary colors RGB, the image synthesis can be executed utilizing such opacity value A.
  • Such alpha blending process can also be executed in the CG generation unit 405 , but separate components are illustrated for ease of explanation of the functions.
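A minimal sketch of such alpha blending, assuming floating-point images in the RGBA pixel format described above; the function and array names are illustrative:

```python
import numpy as np

def alpha_blend(cg_rgba: np.ndarray, camera_rgb: np.ndarray) -> np.ndarray:
    """Superimpose a rendered CG image onto the camera image of the HMD.

    cg_rgba    -- float array of shape (H, W, 4): RGB plus opacity A in [0, 1]
    camera_rgb -- float array of shape (H, W, 3) from the image pickup unit 302
    """
    rgb = cg_rgba[..., :3]
    a = cg_rgba[..., 3:4]            # shape (H, W, 1), broadcast over channels
    return a * rgb + (1.0 - a) * camera_rgb
```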
  • the present invention is not limited to games but is also applicable to various fields such as education, presentation, simulation or visualization.
  • FIG. 5 shows an example of the configuration of the image experiencing system of a second embodiment, which is different from the embodiment shown in FIG. 4 only in the configuration of the player position and attitude determining means 402 .
  • the player position and attitude determining means 402 is composed of a camera 501 fixed to the HMD 103 and a board image recognition unit 502 .
  • the image of the board 101 in the image taken by the camera 501 , varies depending on the position of the view point 202 of the player. Therefore the image taken by the camera 501 is analyzed by the board image recognition unit 502 , to determine the position and the attitude of the view of the player.
  • the board image recognition unit recognizes the image of the board 101 . In case the markers 201 are attached to the game board 101 , the image of the markers obtained by the camera 501 appears distorted in perspective.
  • the position and attitude of the camera 501 can be determined based on such distortion. It is known that the position and attitude of the camera 501 can be determined if at least four marker points 201 are correlated. The thus determined position and attitude of the camera 501 are corrected based on the difference between the camera position and the position of the view point of the player to output the position and attitude of the view point 202 of the player.
  • the image pickup unit 302 may be used instead of the camera 501 for similar effects.
  • the player position and attitude determining means is composed of the camera fixed to the see-through HMD and the board image recognition unit, and the board image recognition unit determines the relative position and attitude between the board and the view point of the player based on the image of the board taken by the camera, so that the position and attitude information of the view point of the player can be determined without the position and attitude sensor, whereby the configuration can be simplified.
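One common way to realize such marker-based determination is a perspective-n-point computation; the sketch below uses OpenCV's solvePnP with the four corner markers of FIG. 2, assumes a calibrated camera, and all names are illustrative:

```python
import cv2
import numpy as np

# Known board coordinates of the four corner markers 201 (unit side length,
# z = 0 on the board surface), ordered consistently with the image detections.
BOARD_POINTS = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], np.float64)

def camera_pose_from_markers(image_points, camera_matrix, dist_coeffs):
    """Estimate the camera position/attitude relative to the board from the
    image coordinates of the four markers (array of shape (4, 2))."""
    ok, rvec, tvec = cv2.solvePnP(BOARD_POINTS,
                                  np.asarray(image_points, np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None                      # markers not usable in this frame
    R, _ = cv2.Rodrigues(rvec)           # rotation matrix (attitude)
    position = (-R.T @ tvec).ravel()     # camera position in board coordinates
    return position, R
```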
  • FIG. 6 shows an example of the configuration of the image experiencing system of a third embodiment, which is different from the embodiments shown in FIGS. 4 and 5 only in the configuration of the player position and attitude determining means 402 .
  • the player position and attitude determining means 402 is provided with both components of the means 402 shown in FIGS. 4 and 5, and additionally with a most likely position and attitude determining unit 601 .
  • the output of the position and attitude sensor 104 is susceptible to external perturbations and is rather unstable. Therefore, it is conceivable to use the position and attitude information from the board image recognition unit 502 as the output of the player position and attitude determining means, but the board 101 is not necessarily always included in the image taking range of the camera, and there may also be an element hindering the recognition, such as a hand of the player. Therefore, the reliability of the position and attitude information is deteriorated in such situation.
  • the information from the position and attitude sensor 104 is utilized only in such situation.
  • Such configuration allows to obtain the output of the position and attitude information without interruption, and to obtain more precise position and attitude information while the board 101 is recognized.
  • FIG. 7 is a UML activity chart showing the process of the most likely position and attitude determining unit.
  • the unit awaits the position and attitude information from the position and attitude sensor information processing unit 403 and from the board image recognition unit 502 .
  • when both data become available, there is discriminated whether the position and attitude information data from the board image recognition unit 502 are suitable. If suitable, the information from the board image recognition unit 502 is used as the output of the player position and attitude determining means 402 , but, if not suitable, there is used the information from the position and attitude sensor information processing unit 403 .
  • the present embodiment allows to obtain reliable position and attitude information even in case any of the position and attitude information determined by plural methods is unsuitable.
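A minimal sketch of the selection performed by the most likely position and attitude determining unit 601; the suitability test (for example, whether the board was actually found in the camera image) is left abstract, and all names are illustrative:

```python
def most_likely_position_attitude(sensor_info, image_info, is_suitable):
    """Prefer the measurement of the board image recognition unit while the
    board is recognized; otherwise fall back on the (less stable but
    uninterrupted) position and attitude sensor."""
    if image_info is not None and is_suitable(image_info):
        return image_info
    return sensor_info
```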
  • FIG. 8 is a chart showing the difference dV between the measured value and the true value as a function of elapsing time along the abscissa. There are experienced small fluctuations as indicated by a broken line and a large shift as indicated by a solid line; the present embodiment deals with the case of a large shift as indicated by the solid line.
  • FIG. 9 shows an example of the image experiencing system of the fourth embodiment.
  • the basic configuration is same as in FIG. 6, except that the output of the board image recognition unit 502 is entered into the position and attitude sensor information processing unit 403 in addition to the most likely position and attitude determining unit 601 , and the output of the position and attitude sensor information processing unit 403 is entered into the board image recognition unit 502 in addition to the most likely position and attitude determining unit 601 .
  • the input of the output of the position and attitude sensor information processing unit 403 into the board image recognition unit 502 is considered negligible.
  • FIG. 10 is a UML activity chart showing the process of the position and attitude sensor information processing unit 403 .
  • the information from the position and attitude sensor 104 is processed in the normal manner to calculate the position and attitude information, and its value is retained as a variable LastSensorPro.
  • the variable LastSensorPro is an object variable which is referred to also from another thread to be explained in the following.
  • the calculated position and attitude information is added with a correction value to obtain a return value, which is temporarily retained.
  • the correction value is also an object variable, of which value is set by another thread to be explained in the following.
  • the return value is a local variable, which is only temporarily used for an exclusive execution. Finally, the return value is returned as the output of the position and attitude sensor information processing unit 403 .
  • the aforementioned correction value which is from time to time renewed in response to an output from the board image recognition unit 502 , is calculated in the following manner.
  • the input from the position and attitude sensor information processing unit 403 to the board image recognition unit 502 is considered negligible, but the present invention is not limited to such case.
  • the correction value can be renewed also in the board image recognition unit 502 in a similar manner as in the position and attitude sensor information processing unit. Also the extent of correction on the respective values may be varied depending on the confidences on such values.
  • in case the difference between the two outputs is large, the correction value for the position and attitude determining means at the lower reliability side is so renewed as to substantially directly release the output of the higher reliability side, but, in case the difference is not so large, the correction value is so renewed as to execute corrections in small amounts on all the values.
  • the renewal of the correction value allows to constantly obtain the position and attitude information on the view point of the player, with a high reliability, in a continuous manner, thereby avoiding the unpleasant feeling resulting from a sudden shift in the CG drawing position.
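The bookkeeping described above might look as follows. This is a single-value sketch (the real quantities are six-degree-of-freedom poses); the names follow the text (LastSensorPro, the correction value), and the lock merely illustrates the exclusive execution mentioned for the return value:

```python
import threading

class PositionAttitudeSensorProcessing:
    """Sketch of the FIG. 10 process: the sensor thread outputs corrected
    values, while a second thread renews the correction whenever a reliable
    measurement arrives from the board image recognition unit 502."""

    def __init__(self):
        self._lock = threading.Lock()   # guards the shared object variables
        self.last_sensor = None         # LastSensorPro
        self.correction = 0.0           # renewed by the image-recognition thread

    def process(self, raw_value):
        value = raw_value               # format/coordinate conversion omitted
        with self._lock:
            self.last_sensor = value
            return value + self.correction      # local return value

    def on_image_recognition(self, image_value):
        with self._lock:
            if self.last_sensor is not None:
                # renew the correction so that the corrected sensor output
                # coincides with the image-based measurement
                self.correction = image_value - self.last_sensor
```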
  • the position and attitude sensor can be provided by the combination of a gyro sensor and an acceleration sensor, in which the gyro sensor detects the attitude information only.
  • such an attitude sensor can be utilized as the position and attitude sensor if the position is calibrated in advance. Such calibration can be dispensed with if there is simultaneously provided position and attitude determining means consisting of the camera 501 and the board image recognition unit 502 .
  • the position and attitude information is basically calculated by image processing, and, if such information obtained by the image processing is unsuitable, the attitude data alone can be compensated by the value supplied from the attitude sensor.
  • the change in the viewing field resulting from a change in the attitude of the HMD 103 is considered much larger than that resulting from a change in the position of the HMD 103 , so that the compensation of the attitude information alone can be considered significantly useful. For this reason, an attitude sensor is fixed, in addition to the camera 501 , to the HMD 103 .
  • the present embodiment is different only in the configuration of the player position and attitude determining means 402 . More specifically, the position and attitude sensor 104 is replaced by an attitude sensor 1101 , the position and attitude sensor information processing unit 403 is replaced by an attitude sensor information processing unit 1102 , and the most likely position and attitude determining unit 601 is replaced by a most likely attitude determining unit 1103 .
  • the basic process flow is same as in the third embodiment.
  • the output data of the attitude sensor 1101 are processed by the attitude sensor information processing unit 1102 to provide attitude information. It is to be noted that the position and attitude sensor information processing unit 403 outputs the position information, in addition to the attitude information.
  • FIG. 12 is a UML activity chart showing the process of the most likely attitude determining unit 1103 .
  • the unit awaits the attitude information from the attitude sensor information processing unit 1102 and the image recognition information from the board image recognition unit 502 . When both data become available, there is discriminated whether the image recognition information is suitable.
  • if the image recognition information is suitable, the position information alone therein is set as an object variable LastIPPos. Then the image recognition information is returned and the process is terminated.
  • if the image recognition information is not suitable, the attitude sensor information is used as the attitude information, and the variable LastIPPos set in the foregoing is used as the deficient position information. Then the position and attitude information, obtained by combining both data, is returned and the process is terminated.
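A minimal sketch of this FIG. 12 behaviour; the PositionAttitude container is introduced here purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class PositionAttitude:
    position: tuple    # (x, y, z)
    attitude: tuple    # e.g. rotations about the coordinate axes

class MostLikelyAttitudeDetermining:
    """While the image recognition is suitable, its full position/attitude is
    used and the position part is remembered as LastIPPos; otherwise the
    attitude sensor supplies the attitude and LastIPPos the missing position."""

    def __init__(self, initial_position):
        self.last_ip_pos = initial_position     # LastIPPos

    def determine(self, sensor_attitude, image_info, suitable):
        if suitable:
            self.last_ip_pos = image_info.position
            return image_info
        return PositionAttitude(position=self.last_ip_pos,
                                attitude=sensor_attitude)
```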
  • since the most likely attitude determining unit determines the most reliable attitude of the view point of the player based on the respective output values, utilizing the reliabilities thereof, there can be obtained highly reliable attitude information on the view point of the player, even in case any of the output values is unsuitable.
  • the present embodiment allows to dispense with the calibration of the position and attitude of the view point of the user, which is indispensable in case of employing the attitude sensor only, and to provide an inexpensive system because an attitude sensor can be employed instead of the position and attitude sensor.
  • the attitude information obtained from the attitude sensor information processing unit 1102 shows fluctuation.
  • the present embodiment resolves such drawback by a method similar to the fourth embodiment.
  • FIG. 13 shows an example of the configuration of the image experiencing system of a sixth embodiment, which is the same in configuration as that in FIG. 11 except that the output of the board image recognition unit 502 is supplied not only to the most likely attitude determining unit 1103 but also to the attitude sensor information processing unit 1102 . However, the attitude sensor information processing unit 1102 receives only the attitude information within the position and attitude information.
  • FIG. 14 is a UML activity chart showing the process of the attitude sensor information processing unit 1102 .
  • the information from the attitude sensor 1101 is processed in the normal manner to obtain the attitude information, of which the value is retained as a variable LastSensorDir.
  • the variable LastSensorDir is an object variable which is referred to also from another thread to be explained in the following.
  • the calculated attitude information is added with a correction value to obtain a return value, which is temporarily retained.
  • the correction value is also an object variable, of which value is set by another thread to be explained in the following.
  • the return value is a local variable, which is only temporarily used for an exclusive execution. Finally, the return value is returned as the output of the attitude sensor information processing unit 1102 .
  • the aforementioned correction value which is from time to time renewed in response to an output from the board image recognition unit 502 , is calculated in the following manner.
  • the input from the attitude sensor information processing unit 1102 to the board image recognition unit 502 is considered negligible, but the present invention is not limited to such case.
  • the correction value can be renewed also in the board image recognition unit 502 in a similar manner as in the attitude sensor information processing unit. Also the extent of correction on the respective values may be varied depending on the confidences of such values.
  • in case the difference between the two outputs is large, the correction value for the attitude determining means at the lower reliability side, or for a portion relating to the attitude determination in the position and attitude determining means, is so renewed as to substantially directly release the output of the higher reliability side, but, in case the difference is not so large, the correction value is so renewed as to execute corrections in small amounts on all the values.
  • the present embodiment allows to obtain the position and attitude information on the view point of the player, including the attitude information of a high reliability, in a continuous manner.
  • the board 101 constituting a field of the board game, includes certain areas and the players execute the game by placing, removing or moving pieces in, from or between these areas.
  • the game management unit 401 grasps the situation of the scene or proceeding of the game, to enable the CG generation unit 405 to generate CG matching such scene or proceeding of the game, whereby the game is felt more realistic to the players.
  • for this purpose, there is provided piece operation recognition means for recognizing “which piece” is “placed/removed” in or from “which area”.
  • FIG. 15 shows an example of the configuration of the image experiencing system of a seventh embodiment.
  • a piece operation recognition unit A 1501 is composed of a special mark such as a bar code attached to a piece, and a special mark recognition unit A 1502 such as a bar code reader for recognizing the special mark.
  • the special mark is to be attached on the piece and is therefore omitted from FIG. 15.
  • the special mark is used only for identifying the piece, and can not only be an ordinary printed mark but can also be based on a so-called RFID system utilizing an IC chip or the like.
  • the special mark recognition unit A 1502 may be provided in each area on the board 101 , or may be provided collectively for plural or all the areas on the board.
  • the data from the special mark recognition unit A 1502 are transferred to a special mark recognition unit B 1503 , and then to a piece operation recognition unit B 1504 .
  • the special mark recognition unit B 1503 analyzes the information from the special mark recognition unit A 1502 and converts it into a data format required by the piece operation recognition unit B 1504 .
  • in case the special mark recognition unit A 1502 is provided in each area, the information of “which area” can be identified in the special mark recognition unit B 1503 from the unit releasing the output, and need not be released explicitly.
  • in case a unit covers plural areas, the information indicating “which area” is outputted for the thus covered areas.
  • in case the input from the special mark recognition unit A 1502 is, for example, a number of 10 digits, such input is converted, for example by a conversion table, into information indicating “which area”.
  • the piece operation recognition unit B 1504 recognizes “which piece” is “placed/removed” in or from “which area”, and transfers the result of such recognition, as the result obtained by the piece operation recognition unit A 1501 , to the game management unit 401 .
  • the game management unit 401 causes the game to proceed, based on the result of recognition from the piece operation recognition unit A 1501 . In the actual proceeding of the game, there may be required information that “which piece” is moved from “which area” to “which area”. Such information is judged by the game management unit 401 , by combining information that “a piece is removed from an area j” and information that “a piece i is placed in an area k”. In this case, if the piece previously placed in the area j is the piece i, it is judged that “the piece i is moved from the area j to the area k”. The piece in the area j can be identified as the piece i because the game management unit manages and refers to the history.
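A sketch of how the game management unit might combine such "removed" and "placed" events with its managed history; the event names and data structures are illustrative:

```python
class GameManagement:
    """Infer a move by combining a "removed from area j" event with a
    "piece i placed in area k" event, using the history of which piece
    occupies which area."""

    def __init__(self):
        self.occupancy = {}           # area -> piece, maintained as history
        self.pending_removal = None   # area from which a piece was just removed

    def on_removed(self, area):
        self.pending_removal = area

    def on_placed(self, piece, area):
        source = self.pending_removal
        if source is not None and self.occupancy.get(source) == piece:
            del self.occupancy[source]
            print(f"piece {piece} is moved from area {source} to area {area}")
        else:
            print(f"piece {piece} is placed in area {area}")
        self.occupancy[area] = piece
        self.pending_removal = None
```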
  • FIG. 16 is a UML activity chart showing the process of the piece operation recognition unit.
  • in the present example, the special mark recognition unit A 1502 is provided in each area on the board 101 , whereby a special mark recognition unit i corresponds to the area i.
  • the special mark recognition unit B 1503 returns a special mark identifier j when the piece j is placed, and a particular special mark identifier None when the piece is removed.
  • the piece operation recognition unit A 1501 awaits the input from the special mark recognition unit, and outputs a result “a piece is removed from the area i” if the special mark identifier is None, or “a piece j is placed in the area i” otherwise.
  • the present embodiment allows the game to proceed, based on the actual operations of the players. Since the CG can be generated matching the scene or proceeding situation of the game, it is felt as a more realistic game to the players.
  • in an eighth embodiment, a bar code can be used as the special mark identifier, corresponding to claim 10 .
  • the bar code is widely utilized for example in the field of distribution of commodities, and has various features such as easy availability, high accuracy in recognition, stability in recognition, inexpensiveness etc. Particularly in case of a card game, the bar code can be printed simultaneously with the printing of the cards. Also an invisible bar code can be used for attaching the special mark without affecting the design of the cards.
  • in a ninth embodiment, an RFID system, or radio frequency identification technology, which is a non-contact automatic identification technology utilizing radio frequency, can be used as the special mark recognition means.
  • a device called tag or transponder is attached to an article, and an ID specific to the tag is read by a reader.
  • the tag is composed of a semiconductor circuitry including a control circuit, a memory etc. constructed as a single chip, and an antenna.
  • the reader emits an inquiring electric wave, which is also used as electric energy, so that the tag does not require a battery.
  • in response to the inquiring wave, the tag emits the ID stored in advance in the memory. The reader reads such ID, thereby identifying the article.
  • the RFID system is widely employed for example in the ID card or the like, and has features of easy availability, high accuracy in recognition, stability in recognition, inexpensiveness etc. If the tag is incorporated inside the piece, it can be recognized without affecting at all the external appearance of the piece. Also the piece and the board have a larger freedom in designing, since the surface of the piece need not be flat and a non-metallic obstacle may be present between the tag and the reader.
  • the “piece” can be recognized, even without the special mark recognition unit, by an image recognition process on the image obtained with a camera.
  • the piece recognition can be achieved by the pattern on the card surface in case of a card game, or by the shape of the piece in case of chess or the like, or by the shape of the piece and the pattern drawn thereon in other games.
  • FIG. 17 shows an example of the configuration of the image experiencing system of a tenth embodiment.
  • the present embodiment is different only in the configuration of the piece operation recognition means 1501 , wherein the special mark recognition unit 1502 corresponds to a piece recognition camera 1701 and the special mark recognition unit 1503 corresponds to a piece image recognition unit 1702 .
  • the piece operation recognition unit 1504 remains same.
  • FIG. 20 is a UML activity chart showing the process of the piece image recognition unit 1702 .
  • the recognition is executed in two stages, namely the detection of a frame, and then the detection of a pattern. In case the frame cannot be detected, it is judged that the card is not present, and a piece identifier None is returned.
  • the method of frame detection is not illustrated, but can be achieved, for example, by detecting straight lines by Hough transformation or the like and judging a frame from the positional relationship of such lines.
  • in case the frame is detected, there is then executed detection of the pattern. As shown in FIG. 19, the interior of the frame is divided into four areas, and the color is detected in each area. Various methods are available also for the color detection. For example, in case of detecting white and black only, there is utilized the luminosity information only: the average luminosity is determined in the object area, and the area is judged as black or white respectively if such average luminosity is lower or higher than a predetermined value T B .
  • the areas are numbered from 1 to 4 as shown in FIG. 19, and, if the combination of colors in the areas 1 to 4 are black-white-white-black or white-black-black-white, a piece identifier 1 is returned. If the colors are black-black-white-white, white-white-black-black, black-white-black-white or white-black-white-black, a piece identifier 2 is returned. Any other combination indicates an unexpected card or an erroneous recognition of the frame, and an identifier None is returned, indicating the absence of the card. In case the image recognition is repeated, a same result is outputted in succession.
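Assuming the frame has already been detected and its interior rectified to a grayscale image divided into the four areas of FIG. 19, the pattern detection might be sketched as follows (the quadrant ordering and the threshold value are illustrative):

```python
import numpy as np

T_B = 128   # luminosity threshold separating "black" from "white" areas

# colour combinations of areas 1-4 (True = black), mapped to piece identifiers
PATTERNS = {
    (True, False, False, True): 1, (False, True, True, False): 1,
    (True, True, False, False): 2, (False, False, True, True): 2,
    (True, False, True, False): 2, (False, True, False, True): 2,
}

def recognize_card(frame_interior: np.ndarray):
    """Return a piece identifier for the rectified frame interior, or None
    (absence of the card) for any unexpected colour combination."""
    h, w = frame_interior.shape
    areas = [frame_interior[:h//2, :w//2], frame_interior[:h//2, w//2:],
             frame_interior[h//2:, :w//2], frame_interior[h//2:, w//2:]]
    colours = tuple(a.mean() < T_B for a in areas)   # True where area is black
    return PATTERNS.get(colours)
```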
  • the camera 501 fixed to the HMD 103 can be used for the piece recognition.
  • the system can be simplified as the camera fixed to the HMD of the player is used for the piece operation recognition means.
  • since the board 101 can be recognized by the board image recognition unit 502 , the areas provided on the board 101 can be identified.
  • the piece operation recognition means can be constituted by recognizing the pieces in such areas.
  • FIG. 21 shows an example of the configuration of the image experiencing system of an eleventh embodiment.
  • Piece operation recognition means 1501 is composed of an on-board piece image recognition unit 2101 and a piece operation recognition unit 1504 , and the image data to be recognized are entered from the camera 501 while the recognition information of the board 101 is entered from the board image recognition unit 502 .
  • the output information of the board image recognition unit 502 indicates the position and attitude of the view point of the player, from which the position of the board on the image can be easily calculated.
  • FIG. 22 is a UML activity chart showing the process of the on-board piece image recognition unit.
  • the unit receives an image input from the camera 501 , and then the position and attitude of the view point of the HMD from the board image recognition unit 502 .
  • the position and attitude of the board 101 on the input image is calculated from the position and attitude of the view point of the HMD. Then the positions and attitudes of the areas on the input image are calculated from the position and attitude information of the predetermined areas on the board.
  • the image in such position is cut out and subjected to image recognition to recognize the piece in each area. Since the information of each area includes the attitude information, such information may also be utilized in the image recognition to improve the accuracy.
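One way to realize this cut-out step, sketched with OpenCV under the assumption that the view-point pose is available as a rotation/translation pair: the known board coordinates of an area are projected into the camera image, and the area is warped to an upright patch for recognition (all names and sizes are illustrative):

```python
import cv2
import numpy as np

def cut_out_area(camera_image, area_corners_board, camera_matrix,
                 rvec, tvec, out_size=64):
    """Project the four board-coordinate corners of one area (shape (4, 3))
    into the camera image using the view-point pose, then warp the area to
    an upright square patch for piece recognition."""
    corners_img, _ = cv2.projectPoints(np.asarray(area_corners_board, np.float64),
                                       rvec, tvec, camera_matrix, None)
    src = corners_img.reshape(4, 2).astype(np.float32)
    dst = np.float32([[0, 0], [out_size, 0],
                      [out_size, out_size], [0, out_size]])
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(camera_image, H, (out_size, out_size))
```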
  • in a twelfth embodiment, the piece is brought to a position at a distance of 30 cm in front of the camera.
  • the card is judged exposed to the camera when the frame arrives at a specified position on the image, and the recognition of the card is executed in such position.
  • the recognition rate can be improved by positioning the piece at a specified position with respect to a specified camera. Also it is possible to simplify the configuration of the recognition unit.
  • FIG. 23 shows an example of the configuration of the image experiencing system of a twelfth embodiment.
  • the present embodiment is different only in the configuration of the piece operation recognition means 1501 .
  • the on-board piece image recognition unit 2101 may be same as that in the eleventh embodiment, or may be further simplified since there is only required judgment that a piece is “placed” or “removed”.
  • the piece image recognition unit can be same as that shown in the tenth embodiment.
  • a piece operation recognition unit 2301 different from the piece operation recognition unit 1504 , receives the inputs from both the on-board piece image recognition unit 2101 and the piece image recognition unit 1702 .
  • FIGS. 24 and 25 are UML activity charts showing the process of the piece operation recognition unit 2301 .
  • FIG. 24 shows a state of receiving information “a piece j is recognized” from the piece image recognition unit 1702
  • FIG. 25 shows a state of receiving information “a piece is placed/removed in an area i” from the on-board piece image recognition unit 2101 .
  • the input to the piece image recognition unit 1702 may be executed, instead of the camera 501 fixed to the HMD 103 , by an exclusive camera such as a separately prepared document camera. Also a similar effect can be attained by replacing the combination of the exclusive camera and the piece image recognition unit 1702 by the special mark recognition unit 1502 and the special mark recognition unit 1503 , prepared separately.
  • the piece can be recognized by the exposure of such piece in front of the camera 501 , but such exposure position is not easily understandable for the player. Also if the image experiencing system is so designed as to improve the ease of use by the players, the spatial range for recognition inevitably becomes wider to result in complication of the recognition unit or in a loss in the recognition rate.
  • it is therefore desirable that the player can expose the piece, without doubt or hesitation, in the spatially limited recognition area. This can be achieved by displaying a guide on the display unit 303 of the HMD 103 , with the player exposing the piece so as to match the displayed guide.
  • FIG. 26 shows an example of the configuration of the image experiencing system of a thirteenth embodiment.
  • the configuration remains the same except that the piece image recognition unit 1702 is replaced by a piece image recognition/guide display instruction unit 2601 , from which information is outputted to the CG generation unit 405 .
  • the piece image recognition/guide display instruction unit 2601 is same in configuration as the piece image recognition unit 1702 , except that it outputs a guide display instruction in case the confidence on the result of recognition is less than a certain level.
  • FIG. 27 shows the difference of the outputs of the piece image recognition unit 1702 and the piece image recognition/guide display instruction unit 2601 .
  • while the recognition engine is similar to that of the piece image recognition unit 1702 , a recognized state is judged and the result of recognition is outputted only if the confidence on the recognition is at least equal to a certain value Th, which is selected higher than the threshold value of the piece image recognition unit 1702 .
  • a high recognition rate can be realized more easily as the threshold value Th is made higher.
  • conversely, the situation is judged not recognized if the confidence is below a threshold Tl, which corresponds to the recognition threshold of the piece image recognition unit 1702 but is selected lower in the piece image recognition/guide display instruction unit 2601 .
  • a guide display instruction is given in case the confidence indicates neither “recognized” nor “not recognized”.
  • FIG. 28 is a UML chart showing the process of the piece image recognition/guide display instruction unit 2601 . If the confidence is selected within a range of 0 to 1, there stands a relationship 0 < Tl < Th < 1. After the piece image recognition process, if the confidence on the result of recognition is lower than Tl, the situation is judged non-recognized and no action is executed. If the confidence is higher than Th, the situation is judged recognized and the result of recognition is transferred to the piece operation recognition unit 2301 . If the situation is neither of the foregoing, a guide display instruction is issued.
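A minimal sketch of this three-way decision; the threshold values and the unit interfaces are illustrative:

```python
TL, TH = 0.4, 0.8   # 0 < Tl < Th < 1; illustrative values

def handle_recognition(confidence, result, piece_operation_unit, cg_unit):
    """Below Tl: judged not recognized, no action. Above Th: judged
    recognized, result forwarded. In between: instruct the CG generation
    unit to display the guide so the player can re-expose the piece."""
    if confidence < TL:
        return
    if confidence >= TH:
        piece_operation_unit.notify(result)
    else:
        cg_unit.display_guide()
```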
  • the guide display can be, for example, as shown in FIG. 29.
  • since a guide for assisting such exposure is prepared by CG and is displayed in superposition in the HMD of the player, the player can easily place and expose the piece in a spatially appropriate position.
  • the event on the board 101 has to be shared by all the players. This can be achieved logically by sharing a game management unit 401 by all the players. Physically, such unit may consist of a single exclusive PC or a specified exclusive game console including other constituent components, or may be provided in plural game consoles or PCs as in the case of a dispersed database.
  • the game console or PC 102 assumes a configuration as shown in FIG. 4, and the game management unit 401 also reflects the result of operations executed by other players.
  • FIGS. 30A and 30B show examples of the configuration of the image experiencing system of a fourteenth embodiment.
  • a game console or PC 102 is assigned to each player, and such consoles or PCs are mutually connected by a network.
  • the information flowing in the network is utilized for synchronizing the contents of the game management units 401 in the game consoles or PCs.
  • the piece operation recognition means is provided in each game console or PC 102 , with each piece being recognized from plural view points. In such case, it is also possible to exchange the information of recognition through the network, and to utilize the result of recognition of a higher reliability.
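  • A minimal sketch of such synchronization, assuming a reliable broadcast channel between the consoles (the class and method names are illustrative, not from the patent):

```python
class GameManagementUnit:
    """One game management unit 401 per console, kept consistent by
    broadcasting every piece operation to the other consoles."""

    def __init__(self, network):
        self.board = {}            # area -> piece
        self.network = network     # in-memory stand-in for the real network
        network.append(self)

    def local_operation(self, area, piece):
        self.apply(area, piece)                # reflect the local player's operation
        for peer in self.network:              # broadcast it to every other console
            if peer is not self:
                peer.apply(area, piece)

    def apply(self, area, piece):
        self.board[area] = piece

network = []
console1, console2 = GameManagementUnit(network), GameManagementUnit(network)
console1.local_operation((3, 4), "knight")
assert console1.board == console2.board        # both units now hold the same state
```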
  • FIGS. 31A to 31C show other examples of the configuration of the image experiencing system of the fourteenth embodiment. Game contents are contained in a game server on the internet, and the game consoles or PCs 102 of the players are connected through the internet.
  • the game management unit 401 is composed of local game management units 3101 and a game server 3102 , the latter being provided in an independent apparatus.
  • the local game management unit 3101 deals with matters relating only to each player, and those requiring feedback to each local player without time delay. Also, data and programs relating to the individual game contents are downloaded from the game server 3102 through the internet, either at the start of the game or in the course of execution thereof.
  • the present embodiment allows plural players to play on a single board, and to display the result of complex operations by the plural players in each HMD based on the view point of each player, thereby enabling experience of a game played by plural players.
  • the present embodiment provides a system of executing a game by adapting MR (mixed reality) technology to a card game, thereby combining a real field and a virtual field by CG (computer graphics).
  • the card game is already known in various forms, such as poker or black jack utilizing playing cards. Recently in fashion are card games utilizing cards to which cartoon characters are assigned.
  • such a card game is played on a play sheet or a game board, by players each holding cards. Each card bears a cartoon character and its attributes or special skill. The players execute the game by using these cards, and the game is won or lost by the offensive power and method and the defensive power and method, which are determined by the combination of the cards.
  • FIG. 32 is a conceptual view of the game of the present embodiment.
  • the two players respectively wear the see-through HMDs, and are positioned across a board, which constitutes the battle space of the game.
  • the two players play the game by placing the respective cards on the board or moving the cards placed thereon.
  • CG matching the characteristics of each card are displayed on each card.
  • FIG. 33 schematically shows the configuration of the system of the present embodiment.
  • the player wears an HMD 3321 , which is provided with a camera 3320 and a three-dimensional position and attitude measuring means 3322 .
  • the HMD 3321 is connected to a game management unit 3325 while the camera 3320 and the three-dimensional position and attitude measuring means 3322 are connected to a position and attitude grasp unit 3329 , both through signal cables.
  • the camera 3320 is matched with the view of the player and photographs the objects observed by the player.
  • the obtained image data are transferred to the position and attitude grasp unit.
  • the image taken by the camera 3320 contains markers 3331 shown in FIG. 34.
  • the markers 3331 are provided in predetermined positions on the play board, and the positional information of such markers is inputted in advance in the position and attitude grasp unit 3329 . Therefore the area observed by the player can be estimated from the markers 3331 appearing in the image of the camera 3320 .
  • the three-dimensional position and attitude measuring means 3322 measures the position and attitude of the HMD worn by the player, and the measured position and attitude data are supplied to the position and attitude grasp unit 3329 .
  • the position and attitude grasp unit 3329 calculates the range observed by the player.
  • there is provided a roof 3327 , on which a card recognition camera 3328 is installed.
  • the card recognition camera 3328 may cover the entire area of the play board 3326 , but it is also possible to divide the play board into four areas and to place four card recognition cameras 3328 respectively corresponding to these divided areas, or to place as many card recognition cameras 3328 as there are areas in which the cards are to be placed.
  • the card recognition camera 3328 constantly watches the play board during the game, and the obtained image is transferred to a card reading unit 3324 , which identifies the card on the play board 3326 , based on the obtained image.
  • FIG. 34 is a plan view of the play board 3326 , on which the markers 3331 are provided.
  • Each marker is formed with a specified color and shape, and the information of such color, shape and position is registered in advance in the position and attitude grasp unit 3329 .
  • a position in the play board corresponding to the taken image can be identified. Then, based on the result of such identification, it is possible to estimate the area of the board observed by the player.
  • the play board 3326 is provided with guides 3341 in a 5 ⁇ 2 arrangement for the player at the front side and also in a 5 ⁇ 2 arrangement for the player at the rear side.
  • the guide 3341 defines an area in which a card is to be placed, and a card placed in any other area is irrelevant to the proceeding of the game.
  • the precision of card recognition can be improved since the card position is clearly determined by the guide 3341 , which can be composed, for example, of a recess or a ridge corresponding to the card size.
  • FIG. 38 is a flow chart of the card reading unit 3324 . After the process is started in a step S 701 , the sequence enters an image capturing phase in a step S 702 , in which the image of the play board from the camera 3328 is captured. Then the captured image is identified in a card identification phase of a step S 703 .
  • FIG. 36 is a plan view of the play board 3326 in a state where three cards are placed on the guides shown in FIG. 35.
  • FIG. 36 shows a state where a card ‘G’ is placed at (5, 1), a card ‘2’ at (4, 3) and a card ‘1’ at (3, 4).
  • a step S 703 analyzes the camera image, detects the cards placed on the board by the image recognition technology, and identifies the coordinate value and the kind of each card. The coordinate value and the kind, thus identified, of the cards are retained as card arrangement data.
  • a step S 704 compares the present card arrangement data with the prior data. If the comparison shows no change in the arrangement data, the sequence returns to the image capturing step S 702 . If the arrangement has changed, a step S 705 updates the card arrangement data and the sequence returns to the image capturing step S 702 .
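  • The loop of the steps S 701 to S 705 can be sketched as follows (illustrative only; capture_image and detect_cards stand in for the camera 3328 interface and the image recognition of the step S 703 ):

```python
import time

def card_reading_loop(capture_image, detect_cards, on_change, poll_interval=0.1):
    arrangement = {}                        # (column, row) -> card kind, e.g. {(5, 1): "G"}
    while True:
        image = capture_image()             # S 702: capture the play board image
        current = detect_cards(image)       # S 703: identify coordinates and kinds of cards
        if current != arrangement:          # S 704: compare with the prior arrangement data
            arrangement = current           # S 705: update the card arrangement data
            on_change(arrangement)          # make the new arrangement available
        time.sleep(poll_interval)
```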
  • FIG. 39 is a flow chart of the position and attitude grasp unit 3329 . After the process is started in a step S 801 , the sequence enters an image capturing phase in a step S 802 , in which the image from the camera 3320 attached to the HMD of the player is captured. Then a step S 803 fetches the attitude data from the three-dimensional position and attitude measuring means 3322 . A step S 804 identifies the markers 3331 in the image fetched in the step S 802 , thereby estimating the view point of the player. Then a step S 805 determines the more exact position and attitude of the view point of the player, based on the attitude information obtained in S 803 and the information estimated in S 804 . Thereafter the sequence returns to S 802 .
  • FIG. 40 is a flow chart of the game management unit. After the process is started, a step S 902 waits for any instruction from the player on the proceeding of the game. If an event arrives, the sequence proceeds to a step S 903 for identifying the kind of the event. If the event is a signal for advancing to a next phase, the sequence proceeds to a step S 904 ; otherwise, the sequence returns to S 902 to wait for a next event.
  • the signal for advancing to a next phase is generated by identifying an operation inducing a phase advancement.
  • Such operation inducing a phase advancement may be recognized by various methods.
  • the game advancement can be judged, for example, by recognition of a voice of the player by a voice recognition unit as shown in FIG. 37, or by image recognition of a card exposed by the player in a large image size to the camera 3320 attached to the HMD worn by the player, or by image recognition of a card placement in a specified position on the board in the image obtained from the card recognition camera 3328 .
  • a step S 904 reads the card arrangement data determined by the card reading unit 3324 . Then a step S 905 fetches, from the card arrangement data, the data of the cards relating to the current phase, and calculates the offensive character and the offensive characteristics (offensive power and method) of the offensive side, based on the arrangement and combination of the cards. Then a step S 906 fetches, from the card arrangement data as in the step S 905 , the data of the cards relating to the current phase, and calculates the defensive character and the defensive characteristics (defensive power and method) of the defensive side, based on the arrangement and combination of the cards. The calculation of the offensive and defensive characteristics in the steps S 905 and S 906 is executed according to the rules of the game.
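  • The concrete scoring rules are game-specific and not given here, but the flow of the steps S 904 to S 907 can be sketched as follows, assuming hypothetical per-card offensive and defensive values:

```python
CARD_STATS = {"G": (5, 2), "2": (3, 3), "1": (2, 4)}   # kind -> (offense, defense); assumed values

def side_power(arrangement, rows, index):
    # Sum one stat over the cards placed in one player's rows.
    return sum(CARD_STATS[kind][index]
               for (col, row), kind in arrangement.items() if row in rows)

def battle(arrangement):
    offense = side_power(arrangement, rows={1, 2}, index=0)   # S 905: offensive side
    defense = side_power(arrangement, rows={3, 4}, index=1)   # S 906: defensive side
    return "offense wins" if offense > defense else "defense holds"   # S 907

# Card arrangement of FIG. 36: 'G' at (5, 1), '2' at (4, 3), '1' at (3, 4).
print(battle({(5, 1): "G", (4, 3): "2", (3, 4): "1"}))   # defense holds
```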
  • a step S 907 calculates the result of battle according to the combination of characters of the offensive and defensive sides, and generates a battle scene matching such result.
  • a step S 908 acquires the view point of the player derived by the position and attitude grasp unit 3329 , then a step S 909 generates CG of the battle scene seen from the view point of the player, and a step S 910 synthesizes the image obtained from the camera 3320 with the CG generated in S 909 .
  • the real field and the virtual field of the CG are superimposed by the MR technology in the HMD worn by the player, thereby displaying a virtual CG character on the card.
  • the present invention allows the real world to be combined with the virtual CG world by displaying virtual CG corresponding to the view point, thereby realizing much greater excitement in a game utilizing a play board.
  • in case the card recognition camera 3328 cannot be installed in a satisfactory manner, it is possible, in order to improve the reading accuracy, to adopt a method in which the player exposes a card in front of the HMD 3321 worn by the player, so that the kind of the card is identified by the camera 3320 before such card is placed. Alternatively, also in order to improve the reading accuracy, the card recognition camera 3328 may be positioned in a specified position, and the kind of a card identified by such camera 3328 before such card is placed.
  • a card is employed as the item of the game, but there may also be employed other items such as a piece.
  • the board may have a three-dimensionally stepped structure and the character may be displayed in a position corresponding to the height of such stepped structure.
  • a model of the three-dimensionally stepped structure of the play board is registered in advance, and the position of the synthesized CG is controlled based on the three-dimensional structural information corresponding to the position of the view point.
  • the foregoing embodiment analyzes the image of the camera 3328 and recognizes the pattern on the card by the image recognition technology, but it is also possible to attach a bar code to the card and to identify the card with a bar code reader.
  • the present invention also includes a case of supplying a computer of an apparatus or a system, connected with various devices so as to operate such devices for realizing the functions of the aforementioned embodiments, with program codes of a software realizing the functions of the aforementioned embodiments and causing a computer (or CPU or MPU) of such apparatus or system to execute the program codes thereby operating such devices and realizing the functions of the aforementioned embodiments.
  • the program codes themselves of such software realize the functions of the aforementioned embodiments, and the program codes themselves, and means for supplying the computer with such program codes, for example a memory medium storing such program codes, constitute the present invention.
  • the memory medium for supplying such program codes can be, for example, a floppy disk, a hard disk, an optical disk, a magnetooptical disk, a CD-ROM, a magnetic tape, a non-volatile memory card or a ROM.
  • the present invention naturally includes not only a case where the functions of the aforementioned embodiments are realized by the execution of the supplied program codes by the computer but also a case where the functions of the aforementioned embodiments are realized by the cooperation of such program codes with an OS (operating system) or another application software or the like functioning on the computer.
  • the present invention further includes a case where the supplied program codes are once stored in a memory provided in a function expansion board of the computer or a function expansion unit connected to the computer and a CPU or the like provided on such function expansion board or function expansion unit executes all the processes or a part thereof under the instructions of such program codes.

Abstract

The invention intends to increase the feeling of reality in a board game in addition to the interestingness of the game itself and to facilitate understanding of the proceeding situation of the game.
The invention provides an image experiencing system for a game which proceeds by placing items on a game board, comprising a player position/attitude determining unit for determining the position/attitude information of the view of the player, a generation unit for generating computer graphics based on the items on the game board, according to the position/attitude information of the view of the player, and a head-mounted display for displaying the generated computer graphics in superposition with the image of the real field.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a mixed reality technology for providing a game which proceeds by positioning items on a game board. [0002]
  • 2. Related Background Art [0003]
  • Among various games, there is already known a board game, which utilizes a board including areas divided thereon and proceeds by placing, removing or displacing pieces on such areas. Among the games utilizing objects of certain three-dimensional shapes as such pieces, there are well known, for example, chess, checker, backgammon, igo, Japanese chess, Japanese backgammon etc. Also there are known games utilizing cards as such pieces, called card games. The games utilizing playing cards are known in numerous kinds, such as bridge, stud poker, draw poker, black jack etc. [0004]
  • Also among the card games, there is known so-called card battle, which utilizes cards specific to the game and in which each card is given a specific function. Also many games utilizing the playing cards do not use a particular board because the divided areas are quite simple, but, in such games, it can be considered that a board including invisible divided areas is present and its existence is recognized and shared by the players. [0005]
  • Such board game or card game itself often assumes a certain event, and the item (for example piece) often assumes a particular animal or a particular person. The board or piece has a shape determined in advance, and the pattern thereof does not change according to the proceeding of the game. [0006]
  • On the other hand, there is also known a game utilizing the MR (mixed reality) technology. In such game, the environment of the game is constructed with a real setting with scene settings and stage properties, and the players execute the game by actually entering such environment. In most cases, each player wears a see-through HMD (head mounted display), which displays a CG (computer graphics) image matching the proceeding of the game, in superposition with an image that can be seen when the HMD is not worn. [0007]
  • In the conventional board games mentioned above, the shape or pattern of the pieces does not change according to the situation of the game. For example, in a battle scene, it is not that an actual battle takes place in front of the player, or, in case the player draws a card indicating “the angel gives an instruction”, it is not that an angel actually speaks up. [0008]
  • Therefore, even though the game itself assumes a certain scene, the game lacks the feeling of reality because of the lack of corresponding display. Also for a similar reason, it is difficult to grasp the situation of proceeding of the game at a sight. [0009]
  • On the other hand, the conventional MR game mentioned in the foregoing provides sufficient feeling of reality but involves a very tedious setting of the game environment. There is often required a large-scale work for preparing the scene setting, and the positions of the objects in the setting have to be measured for each setting. It is also difficult to alter the content of the game. [0010]
  • SUMMARY OF THE INVENTION
  • In consideration of the foregoing, an object of the present invention is to improve the feeling of reality of a board game in addition to the interestingness thereof, and to facilitate understanding of the situation of proceeding of the game. [0011]
  • Another object of the present invention is, in comparison with the conventional games utilizing the MR technology, to facilitate installation of the setting and to enable relatively flexible alteration of the content of the game. [0012]
  • The above-mentioned objects can be attained, according to the present invention, by an image experiencing system for a game which proceeds by placing items on a game board, the system comprising: [0013]
  • player position and attitude determining means for obtaining position/attitude information of the view of a player; [0014]
  • generation means for generating computer graphics according to the items of the game board, corresponding to the position/attitude information of the view of the aforementioned player; and [0015]
  • a head mounted display capable of displaying thus generated computer graphics in superposition with the image of a real world.[0016]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a view showing an example of the configuration of an image experiencing system in a first embodiment; [0017]
  • FIG. 2 is a view showing a board and the appearance thereof in calculating the height of view point of a player; [0018]
  • FIG. 3 is a view showing the internal configuration of a see-through HMD; [0019]
  • FIG. 4 is a view showing an example of the configuration of an image experiencing system in a first embodiment; [0020]
  • FIG. 5 is a view showing an example of the configuration of an image experiencing system in a second embodiment; [0021]
  • FIG. 6 is a view showing an example of the configuration of an image experiencing system in a third embodiment; [0022]
  • FIG. 7 is a UML activity chart showing the process of a most likelihood position and attitude determining unit constituting a component of the image experiencing system of the third embodiment; [0023]
  • FIG. 8 is a chart indicating the difference between a value, measured with a position and attitude sensor constituting a component of the image experiencing system of a fourth embodiment, and a real value as a function of elapsing time; [0024]
  • FIG. 9 is a view showing an example of the configuration of an image experiencing system in a fourth embodiment; [0025]
  • FIG. 10 is a UML activity chart showing the process of a position-and-attitude sensor information processing unit constituting a component of the image experiencing system of the fourth embodiment; [0026]
  • FIG. 11 is a view showing an example of the configuration of an image experiencing system in a fifth embodiment; [0027]
  • FIG. 12 is a UML activity chart showing the process of a most likelihood position and attitude determining unit constituting a component of the image experiencing system of the fifth embodiment; [0028]
  • FIG. 13 is a view showing an example of the configuration of an image experiencing system in a sixth embodiment; [0029]
  • FIG. 14 is a UML activity chart showing the process of an attitude sensor information processing unit constituting a component of the image experiencing system of the sixth embodiment; [0030]
  • FIG. 15 is a view showing an example of the configuration of an image experiencing system in a seventh embodiment; [0031]
  • FIG. 16 is a UML activity chart showing the process of a piece operation recognition unit constituting a component of the image experiencing system of the seventh embodiment; [0032]
  • FIG. 17 is a view showing an example of the configuration of an image experiencing system in a tenth embodiment; [0033]
  • FIG. 18 is a view showing an example of card patterns to be used for explaining the function of the image experiencing system of the tenth embodiment; [0034]
  • FIG. 19 is a view showing recognition areas on a card, to be used for explaining the function of the image experiencing system of the tenth embodiment; [0035]
  • FIG. 20 is a UML activity chart showing the process of a piece image recognition unit constituting a component of the image experiencing system of the tenth embodiment; [0036]
  • FIG. 21 is a view showing an example of the configuration of an image experiencing system in an eleventh embodiment; [0037]
  • FIG. 22 is a UML activity chart showing the process of an on-board piece image recognition unit constituting a component of the image experiencing system of the eleventh embodiment; [0038]
  • FIG. 23 is a view showing an example of the configuration of an image experiencing system in a twelfth embodiment; [0039]
  • FIGS. 24 and 25 are UML activity charts showing the process of a piece operation recognition unit constituting a component of the image experiencing system of the twelfth embodiment; [0040]
  • FIG. 26 is a view showing an example of the configuration of an image experiencing system in a thirteenth embodiment; [0041]
  • FIG. 27 is a view showing the difference in the output from the piece image recognition unit, for explaining a piece image recognition-guide display instruction unit constituting a component of the image experiencing system of the thirteenth embodiment; [0042]
  • FIG. 28 is a UML activity chart showing the process of the piece image recognition-guide display instruction unit constituting a component of the image experiencing system of the thirteenth embodiment; [0043]
  • FIG. 29 is a view showing an example of the guide to be displayed in the display unit of the HMD by the image experiencing system of the thirteenth embodiment; [0044]
  • FIG. 30, composed of FIGS. 30A and 30B, and FIG. 31, composed of FIGS. 31A, 31B and 31C, are views showing examples of the configuration of an image experiencing system in a fourteenth embodiment; [0045]
  • FIG. 32 is a view showing a fifteenth embodiment in a conceptual image; [0046]
  • FIG. 33 is a schematic view showing the configuration of the fifteenth embodiment; [0047]
  • FIG. 34 is a view showing markers; [0048]
  • FIG. 35 is a view showing guides; [0049]
  • FIG. 36 is a view showing card identification; [0050]
  • FIG. 37 is a view showing a phase proceeding by voice; [0051]
  • FIG. 38 is a flow chart of a card reading unit; [0052]
  • FIG. 39 is a flow chart of a position and attitude grasp unit; and [0053]
  • FIG. 40 is a flow chart of a game management unit. [0054]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Now the present invention will be clarified in detail by the following description, which is to be taken in conjunction with the accompanying drawings, in which equivalent configurations are represented by same numbers. [0055]
  • In the image experiencing system to be explained in the following, each player wears a see-through HMD (head mounted display) for displaying CG (computer graphics) in superposition with the actual situation of the board game or card game, executed in a limited field of a game board. The CG change according to the proceeding of the game. For example, in a chess game, a knight piece is represented by CG of a knight on horseback, and at the displacement of the piece, the CG show a running horse. Also, when a piece captures an opponent piece, there are displayed CG of the capturing piece fighting and winning against CG corresponding to the opponent piece. [0056]
  • Such image experiencing system provides, in addition to the interestingness of the actual board game itself, an improved feeling of reality and facilitates grasping the situation of proceeding of the game. [0057]
  • Also in comparison with the conventional game utilizing the MR technology, the image experiencing system of the present invention is easier in setting and allows to relatively easily accommodate alteration of the content of the game. [0058]
  • (First Embodiment) [0059]
  • FIG. 1 is a view showing an example of the configuration of the image experiencing system of a first embodiment. [0060]
  • There is provided a game board 101 constituting the field of game, and players execute the game by placing, removing or moving pieces on the board 101. The player wears a see-through HMD 103 in playing the game. A position and attitude sensor 104 is fixed to the HMD 103 and detects the position and attitude of the view of the player. [0061]
  • In the following there will be given definitions for the terms “position/attitude (posture)”, “position” and “attitude” to be used in the present specification. “Position/attitude” includes both the “position” and “attitude”. Thus, “position/attitude information” means both the “position information” and “attitude information”. [0062]
  • “Position” means information indicating a point in a specified spatial coordinate system, and is represented, in case of an XYZ orthogonal coordinate system, by a set of three values (x, y, z). Also in case of representing an object on the earth, there can be employed a set of three values of a latitude, a longitude and a height (or a depth). “Attitude” means a direction from the point represented by the “position”, and can be represented by the position of an arbitrary point on such direction, or, in case of the XYZ orthogonal coordinate system, by the angles of the viewing line with the axes of the coordinate system or by specifying a direction of the viewing line (for example -Z direction) and indicating the amounts of rotation from such specified direction about the axes of the coordinate system. [0063]
  • In the absence of other limiting conditions, the “position” has 3 freedoms, and the “attitude” also has 3 freedoms. [0064]
  • The position and attitude sensor is capable of measuring values of 6 freedoms, on “position” and “attitude”. [0065]
  • Such position and attitude sensor is commercially available in various forms, for example one utilizing a magnetic field, one based on image processing of a marker photographed by an external camera, or one based on the combination of a gyro sensor and an acceleration sensor. [0066]
  • Because of the nature of the board game, the head position of the player scarcely changes during the play. It is therefore possible also to use the position information calibrated at the start of the game and to use an attitude sensor which measures the attitude information only. In such case, the measurement is made only on the attitude information of 3 freedoms during the game, but the system can process the values of 6 freedoms including the position information initially calibrated. In other words, the attitude sensor and the calibrated position data can be considered to constitute a position and attitude sensor. [0067]
  • FIG. 2 shows a method of calibration before the game. A game board 101 is provided with markers 201 for identification on four corners. For the purpose of simplicity, it is assumed that the markers 201 are provided in a square arrangement, with a side length of unity (1). The player is positioned in front of the board, with a view point 202 positioned centrally with respect to the board and at a distance d from the front side of the board. When the player observes the board in this state, the rear side of the board appears shorter than the front side. Based on an observed length m1 of the front side and an observed length m2 of the rear side, the height h of the view point 202 from the plane of the board 101, though dependent on the projection method, can be determined as: [0068]
  • h = ((m2^2 (d+1)^2 − m1^2 d^2) / (m1^2 − m2^2))^0.5
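  • A numerical check of this calibration formula, assuming a simple perspective model in which an observed length is inversely proportional to the distance from the view point:

```python
from math import sqrt

def viewpoint_height(m1, m2, d):
    """Height h of the view point 202 above the board plane, from the
    observed lengths m1 (front side) and m2 (rear side) of the unit square."""
    return sqrt((m2**2 * (d + 1)**2 - m1**2 * d**2) / (m1**2 - m2**2))

# Forward model: with true height h = 0.6 and distance d = 0.5, the front side
# is seen at distance sqrt(d^2 + h^2) and the rear side at sqrt((d+1)^2 + h^2).
h_true, d = 0.6, 0.5
m1 = 1 / sqrt(d**2 + h_true**2)
m2 = 1 / sqrt((d + 1)**2 + h_true**2)
print(viewpoint_height(m1, m2, d))   # ~0.6, recovering the assumed height
```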
  • The input/output of the image to or from the HMD 103 and the position/attitude information from the position/attitude sensor 104 are processed by a game console or a PC 102. [0069]
  • FIG. 3 is a view showing the internal configuration of the see-through HMD 103, of which there are a video see-through type and an optical see-through type. [0070]
  • In case of the video see-through type, the light from the external field does not directly reach the eyes of the player. The light from the external field is deflected by a two-sided mirror 301 and enters an image pickup device 302. An image presented to the player is displayed on a display device (display unit) 303 and enters the eyes of the player via the two-sided mirror 301. If the output image of the image pickup device (image pickup unit) 302 were directly supplied to the display device 303, the HMD would become a mere seeing glass, but such output image is processed by the game console or PC 102 in the course of such supply to display the generated CG in superposition. [0071]
  • In case of the optical see-through type, the light directly reaches the eyes of the player, and the separately generated CG are simultaneously displayed and appear in superposition to the eyes of the player. [0072]
  • The light from the external field passes through a half mirror and enters the eyes of the player. At the same time, an image displayed on a display device is reflected by the half mirror and enters the eyes of the player. The image pickup device is unnecessary in this case, but is required if an image at the view point of the player is used in image processing. Instead of the image pickup device 302, a separate camera for image processing may be fixed on the HMD 103. [0073]
  • The game console or PC 102 manages the proceeding of the game as in an ordinary game. [0074]
  • In case of a board game, the required position/attitude information is limited to the position/attitude relationship between the board 101 and the HMD 103, and there is not required setting of the scene or stage properties at each installation, or calibration of sensors to be mounted on the players. [0075]
  • The present embodiment requires only a more compact set, in comparison with the MR game, and is easy in the installation work, including the calibration. The details of the MR game are described, for example, in “Design and Implementation for MR Amusement Systems”, session 22A: Mixed Reality, in the papers of the fourth Convention of the Japanese Virtual Reality Society. [0076]
  • The present embodiment can also improve the feeling of reality in comparison with the conventional board games. [0077]
  • FIG. 4 shows an example of the configuration of an image experiencing system in which the present embodiment is applied. [0078]
  • The game console or PC 102 manages the proceeding of the game by a game management unit 401. A CG generation unit 405 generates CG (computer graphics) corresponding to each scene. For generating a CG image seen from the view point of the player, the CG generation unit 405 acquires the position and attitude information of the view of the player from a player position and attitude determining means 402, which includes, for example, a position and attitude sensor 104, and a position/attitude sensor information processing unit 403 for analyzing such information, thereby determining the position and attitude information of the view of the player. [0079]
  • The position and attitude sensor information processing unit 403 executes, for example, a format conversion of the data obtained from the position and attitude sensor 104, a transformation into a coordinate system employed in the present system, and a correction for the difference between the mounting position of the position and attitude sensor and the view point of the HMD 103. [0080]
  • The CG, generated in the CG generation unit 405 and corresponding to an image seen from the view point of the player, are superimposed in an image composition unit 404, in case of the HMD of video see-through type, with an image obtained from the image pickup unit 302 of the HMD 103, for display on the image display unit 303. [0081]
  • In case of the HMD of optical see-through type, the image pickup unit 302 and the image composition unit 404 can be dispensed with since the image synthesis is unnecessary, and the output of the CG generation unit 405 is directly displayed on the display unit 303. [0082]
  • The game management unit 401 stores information relating to the game itself, or the rules of the game, and, in the course of a game, retains the current status or scene and determines and manages a next state to which the game is to proceed. Also, for presenting a scene by CG to the player, it issues a drawing instruction for CG to the CG generation unit 405. [0083]
  • According to the instruction from the game management unit 401, the CG generation unit 405 places model data, which are an internal representation corresponding to each character, in a world, which is an internal representation of the virtual world in which the players are playing. The model data and the world are internally represented by a method called a scene graph, and, after the generation of the scene graph of the world, the scene graph is subjected to rendering. In this operation, the rendering is executed on a scene seen from the position and attitude given from the player position and attitude determining means 402. [0084]
  • The rendering may be executed on an internal memory (not shown) or on a display memory called a frame buffer. For the purpose of simplicity, the rendering is assumed to be executed on the internal memory (not shown). [0085]
  • The image composition unit 404 superimposes the CG, generated by the CG generation unit 405, with an image obtained by the image pickup unit 302 in the HMD 103. For superimposed display of images, there can be utilized a method called alpha blending. In case the image composition unit 404 has a pixel output format RGBA, including an opacity A (alpha value; 0 ≦ A ≦ 1) in addition to the intensities of the three primary colors RGB, the image synthesis can be executed utilizing such opacity value A. [0086]
  • As an example, let us consider a case where a pixel of the output from the image pickup unit 302 has RGB values (R1, G1, B1), while a corresponding pixel of the image composition unit 404 has RGB values (R2, G2, B2) and an opacity value A. [0087]
  • In such case, the corresponding pixel values outputted to the display unit 303 are given by: [0088]
  • (R1*(1−A) + R2*A, G1*(1−A) + G2*A, B1*(1−A) + B2*A)
  • Such alpha blending process can also be executed in the CG generation unit 405, but separate components are illustrated for ease of explanation of the functions. [0089]
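  • A per-pixel sketch of this alpha blending (illustrative only; a real image composition unit would operate on whole frames):

```python
def blend_pixel(camera_px, cg_px):
    """Combine a camera pixel (R1, G1, B1) with a CG pixel (R2, G2, B2, A)."""
    (r1, g1, b1), (r2, g2, b2, a) = camera_px, cg_px
    return (r1 * (1 - a) + r2 * a,
            g1 * (1 - a) + g2 * a,
            b1 * (1 - a) + b2 * a)

# Half-transparent blue CG over a grey camera pixel.
print(blend_pixel((200, 200, 200), (0, 0, 255, 0.5)))   # (100.0, 100.0, 227.5)
```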
  • The above-described configuration allows the player to feel a heightened feeling of reality in addition to the interestingness of the game itself. Also the player can easily grasp the proceeding situation of the game, as the superimposed CG are synchronized with the proceeding of the game. [0090]
  • As the position and attitude information, there is only required the relative relationship in position and attitude between the board 101 and the HMD 103, and there is not required the scene setting, the setting of stage properties, or the calibration of a sensor to be mounted on the player, at each installation of the game. [0091]
  • The present invention is not limited to games but is also applicable to various fields such as education, presentation, simulation or visualization. [0092]
  • (Second Embodiment) [0093]
  • FIG. 5 shows an example of the configuration of the image experiencing system of a second embodiment, which is different from the embodiment shown in FIG. 4 only in the configuration of the player position and attitude determining means 402. [0094]
  • The player position and attitude determining means 402 is composed of a camera 501 fixed to the HMD 103 and a board image recognition unit 502. The image of the board 101, in the image taken by the camera 501, varies depending on the position of the view point 202 of the player. Therefore the image taken by the camera 501 is analyzed by the board image recognition unit 502, to determine the position and the attitude of the view of the player. [0095]
  • The board image recognition unit recognizes the image of the board 101; when the markers 201 are attached to the game board 101, they appear distorted in the image obtained by the camera 501, and the position and attitude of the camera 501 can be determined based on such distortion. It is known that the position and attitude of the camera 501 can be determined if the markers 201 in at least four points are correlated. The position and attitude of the camera 501 thus determined are corrected, based on the difference between the camera position and the position of the view point of the player, to output the position and attitude of the view point 202 of the player, as sketched below. [0096]
  • In case the image pickup unit 302 is attached to the HMD 103, such image pickup unit 302 may be used instead of the camera 501 with similar effects. [0097]
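  • As a modern illustration of this four-point determination (not the patent's own algorithm), the camera pose can be recovered with OpenCV's solvePnP, assuming the camera intrinsics are known; the marker image coordinates below are made-up values:

```python
import numpy as np
import cv2

# Known 3D positions of the markers 201 on the unit-square board (board frame).
object_points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=np.float32)
# Marker positions detected in the camera image (illustrative pixel values).
image_points = np.array([[310, 420], [650, 430], [600, 250], [360, 245]], dtype=np.float32)
# Assumed camera intrinsics (focal length and principal point, in pixels).
K = np.array([[800, 0, 480], [0, 800, 360], [0, 0, 1]], dtype=np.float32)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
# rvec and tvec give the board-to-camera transform, i.e. the position and
# attitude of the camera 501 relative to the board 101; a fixed offset then
# converts this to the view point 202 of the player.
```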
  • Thus, in the present embodiment, the player position and attitude determining means is composed of the camera fixed to the see-through HMD and the board image recognition unit, and the board image recognition unit determines the relative position and attitude between the board and the view point of the player based on the image of the board taken by the camera, so that the position and attitude information of the view point of the player can be determined without the position and attitude sensor, whereby the configuration can be simplified. [0098]
  • (Third Embodiment) [0099]
  • FIG. 6 shows an example of the configuration of the image experiencing system of a third embodiment, which is different from the embodiments shown in FIGS. 4 and 5 only in the configuration of the player position and attitude determining means 402. [0100]
  • The player position and attitude determining means 402 is provided with both components of the means 402 shown in FIGS. 4 and 5, and additionally with a most likely position and attitude determining unit 601. [0101]
  • In general, the output of the position and attitude sensor 104 is susceptible to external perturbations and is rather unstable. It is therefore conceivable to use the position and attitude information from the board image recognition unit 502 as the output of the player position and attitude determining means, but the board 101 is not necessarily always included in the image taking range of the camera, and there may also be an element hindering the recognition, such as a hand of the player. The reliability of the position and attitude information deteriorates in such situation. [0102]
  • Therefore, the information from the position and attitude sensor 104 is utilized only in such situation. Such configuration allows the output of the position and attitude information to be obtained without interruption, and more precise position and attitude information to be obtained while the board 101 is recognized. [0103]
  • FIG. 7 is a UML activity chart showing the process of the most likely position and attitude determining unit. [0104]
  • At first, the unit awaits the position and attitude information from the position and attitude sensor information processing unit 403 and from the board image recognition unit 502. When both data become available, there is discriminated whether the position and attitude information from the board image recognition unit 502 is suitable. If suitable, the information from the board image recognition unit 502 is used as the output of the player position and attitude determining means 402; if not suitable, the information from the position and attitude sensor information processing unit 403 is used. [0105]
  • In the foregoing it is assumed that highly precise values are obtained from the board image recognition unit 502 during a proper recognition but unsuitable values are obtained otherwise; however, the present invention is not limited to such case. A certain image from the camera 501 may only provide values of low reliability, and in a certain position and attitude sensor, the reliability may be high only in a limited range and gradually decrease outside such range. Consequently the system has to be so designed as to provide the most suitable values based on the reliability of the output values of the position and attitude sensor information processing unit 403 and the board image recognition unit 502, as sketched below. [0106]
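  • A minimal sketch of this selection (illustrative names; each source is assumed to report a confidence along with its pose):

```python
def most_likely_pose(sensor_pose, image_pose, image_confidence, threshold=0.5):
    """Prefer the board-image result while it is reliable; otherwise fall
    back to the position and attitude sensor, which is always available."""
    if image_pose is not None and image_confidence >= threshold:
        return image_pose    # board visible and properly recognized
    return sensor_pose       # board occluded, out of frame, or badly recognized

# Board hidden by the player's hand: the sensor value is used instead.
print(most_likely_pose(sensor_pose=(0.0, 0.6, 0.5), image_pose=None, image_confidence=0.0))
```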
  • The present embodiment allows reliable position and attitude information to be obtained even in case any of the position and attitude information determined by the plural methods is unsuitable. [0107]
  • (Fourth Embodiment) [0108]
  • The position and attitude sensor is available in various types, but many such sensors suffer from a drawback of fluctuation in the obtained values. FIG. 8 is a chart showing the difference dV between the measured value and the true value, as a function of elapsing time in the abscissa. There are experienced a small fluctuation, as indicated by a broken line, and a large shift, as indicated by a solid line; the present embodiment deals with the case of a large shift as indicated by the solid line. [0109]
  • In case of a large shift, the difference between two consecutive samples of dV is small if the sampling is executed with a sufficiently high frequency. Therefore, in a state where the output value of the board image recognition unit 502 is suitable, dV is calculated assuming such value as the true value, and −dV is used as the correction value for the position and attitude sensor information. [0110]
  • In this manner, a large shift in the value starts from 0 when the output value of the board image recognition unit 502 becomes unsuitable, and the position and attitude information released by the player position and attitude determining means 402 in such state remains continuous, so that the player is relieved from the unpleasant feeling caused by a sudden shift in the position of the CG image. [0111]
  • Also in case the unsuitable period of the value from the board image recognition unit 502 is sufficiently short, the change in dV resulting from a large shift is small, so that the CG drawing position does not become discontinuous when the output value of the board image recognition unit 502 is again adopted as the value of the player position and attitude determining means 402. [0112]
  • FIG. 9 shows an example of the image experiencing system of the fourth embodiment. [0113]
  • The basic configuration is the same as in FIG. 6, except that the output of the board image recognition unit 502 is entered into the position and attitude sensor information processing unit 403 in addition to the most likely position and attitude determining unit 601, and the output of the position and attitude sensor information processing unit 403 is entered into the board image recognition unit 502 in addition to the most likely position and attitude determining unit 601. In the following description, however, the input of the output of the position and attitude sensor information processing unit 403 into the board image recognition unit 502 is considered negligible. [0114]
  • FIG. 10 is a UML activity chart showing the process of the position and attitude sensor information processing unit 403. [0115]
  • The information from the position and attitude sensor 104 is processed in the normal manner to calculate the position and attitude information, and its value is retained as a variable LastSensorPro. The variable LastSensorPro is an object variable which is referred to also from another thread to be explained in the following. [0116]
  • Subsequently, a correction value is added to the calculated position and attitude information to obtain a return value, which is temporarily retained. The correction value is also an object variable, of which the value is set by another thread to be explained in the following. The return value is a local variable, which is only temporarily used for an exclusive execution. Finally, the return value is returned as the output of the position and attitude sensor information processing unit 403. [0117]
  • The aforementioned correction value, which is renewed from time to time in response to an output from the board image recognition unit 502, is calculated in the following manner. [0118]
  • At first there is discriminated whether the image recognition information is suitable. If unsuitable, the renewing process is not executed. If suitable, the correction value is set by subtracting the variable LastSensorPro from the position and attitude information obtained by the image recognition. [0119]
  • In the foregoing, the input from the position and attitude sensor information processing unit 403 to the board image recognition unit 502 is considered negligible, but the present invention is not limited to such case. In case highly precise information can be obtained from the position and attitude sensor 104, the correction value can be renewed also in the board image recognition unit 502, in a similar manner as in the position and attitude sensor information processing unit. Also the extent of correction on the respective values may be varied depending on the confidences of such values. For example, in case the difference of the confidences is very large, the correction value for the position and attitude determining means at the lower side is so renewed as to substantially directly release the output at the higher side; but, in case the difference is not so large, the correction value is so renewed as to execute corrections in small amounts on all the values. [0120]
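  • The correction mechanism of FIG. 10 can be sketched as follows (illustrative; a scalar value stands in for the full position and attitude information, and thread synchronization is omitted):

```python
class SensorInfoProcessor:
    """Sensor output is continuously offset by a correction that is renewed
    whenever the board image recognition yields a suitable value."""

    def __init__(self):
        self.last_sensor = 0.0    # LastSensorPro (a scalar here, for brevity)
        self.correction = 0.0

    def process(self, sensor_value):
        self.last_sensor = sensor_value
        return sensor_value + self.correction      # corrected, continuous output

    def renew_correction(self, image_value, suitable):
        if suitable:                               # renewed only on a valid recognition
            self.correction = image_value - self.last_sensor

p = SensorInfoProcessor()
p.process(10.3)                       # the sensor has drifted; the true value is 10.0
p.renew_correction(10.0, suitable=True)
print(p.process(10.4))                # 10.1: the drift is largely cancelled
```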
  • According to the present embodiment, the renewal of the correction value allows the position and attitude information on the view point of the player to be obtained constantly, with a high reliability, in a continuous manner, thereby avoiding the unpleasant feeling resulting from a sudden shift in the CG drawing position. [0121]
  • (Fifth Embodiment) [0122]
  • It is explained in the foregoing that the position and attitude sensor can be provided by the combination of a gyro sensor and an acceleration sensor, in which case the gyro sensor detects the attitude information only. Such an attitude sensor can be utilized as a position and attitude sensor if the position is calibrated in advance. Such calibration can be dispensed with if there is simultaneously provided position and attitude determining means consisting of the camera 501 and the board image recognition unit 502. [0123]
  • The position and attitude information is basically calculated by image processing, and, if such information obtained by the image processing is unsuitable, the attitude data alone can be compensated by the value supplied from the attitude sensor. In case of a board game or a card game, the change in the viewing field resulting from a change in the attitude of the HMD 103 is considered much larger than that resulting from a change in the position of the HMD 103, so that compensation of the attitude information alone can be considered significantly useful. For this reason, an attitude sensor is fixed, in addition to the camera 501, to the HMD 103. [0124]
  • This constitutes the image experiencing system of a fifth embodiment, of which configuration is shown in FIG. 11. [0125]
  • In comparison with the embodiment shown in FIG. 6, the present embodiment is different only in the configuration of the player position and attitude determining means 402. More specifically, the position and attitude sensor 104 is replaced by an attitude sensor 1101, the position and attitude sensor information processing unit 403 is replaced by an attitude sensor information processing unit 1102, and the most likely position and attitude determining unit 601 is replaced by a most likely attitude determining unit 1103. [0126]
  • The basic process flow is the same as in the third embodiment. The output data of the attitude sensor 1101 are processed by the attitude sensor information processing unit 1102 to provide attitude information. It is to be noted that the position and attitude sensor information processing unit 403 outputs the position information in addition to the attitude information. [0127]
  • FIG. 12 is a UML activity chart showing the process of the most likely attitude determining unit 1103. [0128]
  • At first, the unit awaits the attitude information from the attitude sensor information processing unit 1102 and the image recognition information from the board image recognition unit 502. When both data become available, there is discriminated whether the image recognition information is suitable. [0129]
  • If suitable, the position information alone therein is set as an object variable LastIPPos. Then the image recognition information is returned and the process is terminated. [0130]
  • If the information is identified as not suitable, the attitude sensor information is used as the attitude information, and the variable LastIPPos set in the foregoing is used as the deficient position information. Then the position and attitude information, obtained by combining both data, is returned and the process is terminated, as sketched below. [0131]
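  • A sketch of this combination (illustrative; the image recognition information is assumed to carry both a position and an attitude when suitable, and None otherwise):

```python
last_ip_pos = None   # LastIPPos: last position obtained by image recognition

def most_likely_attitude(image_info, sensor_attitude):
    global last_ip_pos
    if image_info is not None:             # recognition judged suitable
        position, attitude = image_info
        last_ip_pos = position             # remember the position for later fallback
        return position, attitude
    return last_ip_pos, sensor_attitude   # fill the deficient position information

print(most_likely_attitude(((0.0, 0.6, 0.5), (0, -30, 0)), sensor_attitude=(0, -29, 0)))
print(most_likely_attitude(None, sensor_attitude=(0, -25, 0)))   # uses the remembered position
```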
  • In the foregoing it is assumed that the values of the board image recognition unit are more reliable than those of the attitude sensor information processing unit with respect to the attitude information, but the most likely attitude determining unit 1103 may calculate and determine the attitude information from both values, depending on the reliabilities thereof. [0132]
  • In the present embodiment, as the most likely attitude determining unit determines the most reliable attitude of the view point of the player based on the respective output values, utilizing the reliabilities thereof, there can be obtained highly reliable attitude information on the view point of the player, even in case any of the output values is unsuitable. [0133]
  • The present embodiment makes it possible to dispense with the calibration of the position and attitude of the view point of the user, which is indispensable in case of employing the attitude sensor only, and to provide an inexpensive system, because an attitude sensor can be employed instead of the position and attitude sensor. [0134]
  • (Sixth Embodiment) [0135]
  • As explained in the foregoing, the attitude information obtained from the attitude sensor information processing unit 1102 also shows fluctuation. The present embodiment resolves such drawback by a method similar to that of the fourth embodiment. [0136]
  • FIG. 13 shows an example of the configuration of the image experiencing system of a sixth embodiment, which is the same in configuration as in FIG. 11 except that the output of the board image recognition unit 502 is supplied not only to the most likely attitude determining unit 1103 but also to the attitude sensor information processing unit 1102. However, the attitude sensor information processing unit 1102 receives only the attitude information within the position and attitude information. [0137]
  • FIG. 14 is a UML activity chart showing the process of the attitude sensor information processing unit 1102. [0138]
  • The information from the attitude sensor 1101 is processed in the normal manner to obtain the attitude information, of which the value is retained as a variable LastSensorDir. The variable LastSensorDir is an object variable which is referred to also from another thread to be explained in the following. [0139]
  • Subsequently, a correction value is added to the calculated attitude information to obtain a return value, which is temporarily retained. The correction value is also an object variable, of which the value is set by another thread to be explained in the following. The return value is a local variable, which is only temporarily used for an exclusive execution. Finally, the return value is returned as the output of the attitude sensor information processing unit 1102. [0140]
  • The aforementioned correction value, which is renewed from time to time in response to an output from the board image recognition unit 502, is calculated in the following manner. [0141]
  • At first there is discriminated whether the image recognition information is suitable. If unsuitable, the renewing process is not executed. If suitable, the correction value is set by subtracting the variable LastSensorDir from the attitude information obtained by the image recognition. [0142]
  • In the foregoing, the input from the attitude sensor information processing unit 1102 to the board image recognition unit 502 is considered negligible, but the present invention is not limited to such case. In case highly precise information can be obtained from the attitude sensor 1101, the correction value can be renewed also in the board image recognition unit 502, in a similar manner as in the attitude sensor information processing unit. Also the extent of correction on the respective values may be varied depending on the confidences of such values. For example, in case the difference of the confidences is very large, the correction value for the attitude determining means at the lower side, or for a portion relating to the attitude determination in the position and attitude determining means, is so renewed as to substantially directly release the output at the higher side; but, in case the difference is not so large, the correction value is so renewed as to execute corrections in small amounts on all the values. [0143]
  • The present embodiment makes it possible to obtain the position and attitude information on the view point of the player, including attitude information of a high reliability, in a continuous manner. [0144]
  • (Seventh Embodiment) [0145]
  • It is already explained that the board 101, constituting a field of the board game, includes certain areas and the players execute the game by placing, removing or moving pieces in, from or between these areas. The game management unit 401 grasps the situation of the scene or proceeding of the game, to enable the CG generation unit 405 to generate CG matching such scene or proceeding of the game, whereby the game is felt more realistic to the players. [0146]
  • For this purpose, there is provided, for the piece to be operated by the player, piece operation recognition means for recognizing “which piece” is “placed/removed” in or from “which area”. [0147]
  • It is naturally possible also to employ another item in place of the piece, and to recognize the operation on such item. [0148]
  • FIG. 15 shows an example of the configuration of the image experiencing system of a seventh embodiment. [0149]
  • A piece operation recognition unit A 1501 is composed of a special mark, such as a bar code, attached to a piece, and a special mark recognition unit A 1502, such as a bar code reader, for recognizing the special mark. The special mark is to be attached on the piece and is therefore omitted from FIG. 15. [0150]
  • The special mark is used only for identifying the piece, and can not only be an ordinary printed mark but can also be based on a so-called RFID system utilizing an IC chip or the like. [0151]
  • The special mark [0152] recognition unit A 1502 may be provided in each area on the board 101, or may be provided collectively for plural or all the areas on the board.
  • The data from the special mark [0153] recognition unit A 1502 are transferred to a special mark recognition unit B 1503, and then to a piece operation recognition unit B 1504.
  • The special mark [0154] recognition unit B 1503 analyzes the information from the special mark recognition unit A 1502 and converts it into a data format required by the piece operation recognition unit B 1504. In case a special mark recognition unit B 1503 is provided in each area, the information of “which area” can be identified from the special mark recognition unit B 1503 releasing the output and need not be released. However, in case a single special mark recognition unit B 1503 covers plural areas, the information indicating “which area” is outputted for thus covered areas. Also, in case the input from the special mark recognition unit A 1502 is for example a number of 10 digits, such input is converted, for example by a conversion table, into information indicating “which area”.
  • The piece operation recognition unit B 1504 recognizes “which piece” is “placed in” or “removed from” “which area”, and transfers the result of such recognition, as the result obtained by the piece operation recognition unit A 1501, to the game management unit 401. [0155]
  • The game management unit 401 causes the game to proceed based on the result of recognition from the piece operation recognition unit A 1501. In the actual proceeding of the game, information on “which piece” is moved from “which area” to “which area” may be required. Such information is derived by the game management unit 401 by combining the information that a piece is “removed from an area j” with the information that “a piece i is placed in an area k”. In this case, if the piece that had been placed in the area j is the piece i, it is judged that “a piece i is moved from the area j to the area k”. The piece in the area j can be identified as the piece i because the game management unit manages and refers to the history. [0156]
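A possible shape of this move inference is sketched below in Python. The event names, the occupancy dictionary and the single pending-removal slot are assumptions made for illustration; the embodiment only specifies that removal and placement events are combined with the managed history:

```python
# Hedged sketch: combine "removed from area j" and "piece i placed in
# area k" into a single move, using a history of area occupancy.

class GameManagement:
    def __init__(self):
        self.occupancy = {}          # area -> piece currently placed there
        self.pending_removal = None  # (area, piece) awaiting a placement

    def on_removed(self, area):
        piece = self.occupancy.pop(area, None)  # history names the piece
        self.pending_removal = (area, piece)

    def on_placed(self, area, piece):
        if self.pending_removal and self.pending_removal[1] == piece:
            src, _ = self.pending_removal
            self.pending_removal = None
            self.occupancy[area] = piece
            return f"piece {piece} is moved from area {src} to area {area}"
        self.occupancy[area] = piece
        return f"piece {piece} is placed in area {area}"

gm = GameManagement()
gm.on_placed("j", "i")
gm.on_removed("j")
print(gm.on_placed("k", "i"))  # piece i is moved from area j to area k
```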
  • FIG. 16 is a UML activity chart showing the process of the piece operation recognition unit. [0157]
  • The special mark recognition unit A 1502 is provided in each area on the board 101, whereby a special mark recognition unit i corresponds to the area i. The special mark recognition unit B 1503 returns a special mark identifier j when a piece j is placed, and the particular special mark identifier Nothing when the piece is removed. [0158]
  • The piece operation recognition unit A 1501 awaits the input from the special mark recognition unit, and outputs the result “a piece is removed from the area i” if the special mark identifier j is Nothing, or “a piece j is placed in the area i” otherwise. [0159]
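This per-area rule is simple enough to state directly in code. The following Python sketch uses None to stand in for the particular identifier Nothing and invents the message strings, so it is illustrative only:

```python
NOTHING = None  # stands in for the particular special mark identifier "Nothing"

def recognize_operation(area_i, identifier_j):
    """Map a special mark identifier observed for area i to a piece operation."""
    if identifier_j is NOTHING:
        return f"a piece is removed from the area {area_i}"
    return f"a piece {identifier_j} is placed in the area {area_i}"

print(recognize_operation(3, NOTHING))  # removal from area 3
print(recognize_operation(3, 7))        # piece 7 placed in area 3
```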
  • The present embodiment allows the game to proceed based on the actual operations of the players. Since CG can be generated to match the scene or the proceeding situation of the game, the game feels more realistic to the players. [0160]
  • (Eighth Embodiment) [0161]
  • A bar code can be used as the special mark identifier, corresponding to claim 10. [0162]
  • The bar code is widely utilized for example in the field of distribution of commodities, and has various features such as easy availability, high accuracy in recognition, stability in recognition, inexpensiveness etc. Particularly in case of a card game, the bar code can be printed simultaneously with the printing of the cards. Also an invisible bar code can be used for attaching the special mark without affecting the design of the cards. [0163]
  • (Ninth Embodiment) [0164]
  • An RFID system (radio frequency identification, a non-contact automatic identification technology utilizing radio waves) can be used as the special mark recognition means. [0165]
  • A device called a tag or a transponder is attached to an article, and an ID specific to the tag is read by a reader. In general, the tag is composed of semiconductor circuitry, including a control circuit, a memory and the like constructed as a single chip, and an antenna. The reader emits an inquiring radio wave, which also serves as a source of electric energy, so that the tag does not require a battery. In response to the inquiring wave, the tag emits the ID stored in advance in the memory. The reader reads such ID, thereby identifying the article. [0166]
  • The RFID system is widely employed, for example in ID cards and the like, and has the features of easy availability, high accuracy in recognition, stability in recognition, inexpensiveness, etc. If the tag is incorporated inside the piece, it can be recognized without affecting the external appearance of the piece at all. The piece and the board also have greater freedom in design, since the surface of the piece need not be flat and a non-metallic obstacle may be present between the tag and the reader. [0167]
  • (Tenth Embodiment)
  • The “piece” can be recognized, even without the special mark recognition unit, by an image recognition process on an image obtained with a camera. The piece recognition can be achieved by the pattern on the card surface in the case of a card game, by the shape of the piece in the case of chess or the like, or by the shape of the piece and the pattern drawn thereon in other games. [0168]
  • In the following there will be explained an example of recognizing a pattern drawn on the surface of a rectangular card, but the present invention is also applicable to a case of recognizing the shape of the piece, or a case of recognizing the shape of the piece and the pattern thereon at the same time. [0169]
  • FIG. 17 shows an example of the configuration of the image experiencing system of a tenth embodiment. In comparison with the embodiment shown in FIG. 15, the present embodiment differs only in the configuration of the piece operation recognition means 1501, wherein the special mark recognition unit 1502 is replaced by a piece recognition camera 1701 and the special mark recognition unit 1503 by a piece image recognition unit 1702. The piece operation recognition unit 1504 remains the same. [0170]
  • In the following there will be explained an example of recognizing two patterns shown in FIG. 18, but it is also possible to recognize various complex patterns such as a cartoon or a photograph, by employing more complex processing. [0171]
  • FIG. 20 is a UML activity chart showing the process of the piece image recognition unit 1702. [0172]
  • The recognition is executed in two stages, namely the detection of a frame and then the detection of a pattern. In case the frame cannot be detected, it is judged that the card is not present, and the piece identifier Nothing is returned. The method of frame detection is not illustrated, but it can be achieved, for example, by detecting straight lines with the Hough transform or the like and judging a frame from the positional relationship of such lines. [0173]
  • In case the frame is detected, detection of the pattern is then executed. As shown in FIG. 19, the interior of the frame is divided into four areas, and the color is detected in each area. Various methods are available for the color detection. For example, in the case of detecting white and black only, only the luminosity information is utilized: the average luminosity is determined in the object area, and the area is judged as black or white according to whether such average luminosity is lower or higher than a predetermined value TB. [0174]
  • The areas are numbered from 1 to 4 as shown in FIG. 19. If the combination of colors in the areas 1 to 4 is black-white-white-black or white-black-black-white, a piece identifier 1 is returned. If the colors are black-black-white-white, white-white-black-black, black-white-black-white or white-black-white-black, a piece identifier 2 is returned. Any other combination indicates an unexpected card or an erroneous recognition of the frame, and the identifier Nothing is returned, indicating the absence of a card. In case the image recognition is repeated, the same result is outputted in succession. [0175]
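The color-combination rule above lends itself to a compact sketch. The Python below assumes the frame has already been found and that an average luminosity has been computed for each of the four areas; the threshold value and the input representation are assumptions for illustration:

```python
T_B = 128  # assumed luminosity threshold separating black from white

def classify_card(avg_luminosities):
    """avg_luminosities: average luminosity of the areas 1..4 in the frame."""
    colors = tuple('B' if v < T_B else 'W' for v in avg_luminosities)
    if colors in {('B', 'W', 'W', 'B'), ('W', 'B', 'B', 'W')}:
        return 1        # piece identifier 1
    if colors in {('B', 'B', 'W', 'W'), ('W', 'W', 'B', 'B'),
                  ('B', 'W', 'B', 'W'), ('W', 'B', 'W', 'B')}:
        return 2        # piece identifier 2
    return None         # Nothing: unexpected card or misdetected frame

print(classify_card([40, 200, 210, 30]))   # -> 1
print(classify_card([40, 50, 220, 230]))   # -> 2
print(classify_card([40, 40, 40, 220]))    # -> None
```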
  • In the course of a card placing or removing operation, the output may fluctuate at random between the placed state and the removed state. For a game that would be hindered by such a situation, a measure is required for suppressing successive identical outputs, or for providing an output only when the same state continues for a predetermined time; such a measure, however, will not be explained here. [0176]
  • (Eleventh Embodiment) [0177]
  • The camera 501 fixed to the HMD 103 can be used for the piece recognition. In such a case the system can be simplified, since the camera already fixed to the HMD of the player is used for the piece operation recognition means. [0178]
  • If the board 101 can be recognized by the board image recognition unit 502, the areas provided on the board 101 can be identified. The piece operation recognition means can then be constituted by recognizing the pieces in such areas. [0179]
  • FIG. 21 shows an example of the configuration of the image experiencing system of an eleventh embodiment. The piece operation recognition means 1501 is composed of an on-board piece image recognition unit 2101 and a piece operation recognition unit 1504; the image data to be recognized are entered from the camera 501, while the recognition information of the board 101 is entered from the board image recognition unit 502. The output information of the board image recognition unit 502 indicates the position and attitude of the view point of the player, from which the position of the board on the image can easily be calculated. [0180]
  • FIG. 22 is a UML activity chart showing the process of the on-board piece image recognition unit. The unit receives an image input from the camera 501, and then the position and attitude of the view point of the HMD from the board image recognition unit 502. [0181]
  • At first the position and attitude of the board 101 on the input image are calculated from the position and attitude of the view point of the HMD. Then the positions and attitudes of the areas on the input image are calculated from the position and attitude information of the predetermined areas on the board. [0182]
  • Once the position and attitude of each area are known, the image in such position is cut out and subjected to image recognition to recognize the piece in each area. Since the information of each area includes the attitude information, such information may also be utilized in the image recognition to improve the accuracy. [0183]
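The calculation of where an area falls on the input image is essentially a camera projection. The following Python sketch assumes a simple pinhole model and a viewpoint pose given as a rotation matrix R and a translation t; the focal length, principal point and pose format are all assumptions, not details from the embodiment:

```python
import numpy as np

def project_area_center(area_pos_board, R, t, focal=800.0, cx=320.0, cy=240.0):
    """Map a 3-D point on the board (board frame) into pixel coordinates."""
    p_cam = R @ area_pos_board + t         # board frame -> camera frame
    u = focal * p_cam[0] / p_cam[2] + cx   # perspective division
    v = focal * p_cam[1] / p_cam[2] + cy
    return float(u), float(v)

R = np.eye(3)                   # camera axes aligned with the board axes
t = np.array([0.0, 0.0, 0.5])   # board area 0.5 m in front of the camera
print(project_area_center(np.array([0.1, 0.0, 0.0]), R, t))  # (480.0, 240.0)
```

Once the pixel position (and, with the attitude, the perspective deformation) of each area is known, the corresponding image patch can be cut out and passed to the recognition stage.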
  • (Twelfth Embodiment) [0184]
  • In the case of recognizing a piece on the board with the camera 501, a situation may arise where the number of pixels occupied by the piece on the image becomes small, for example depending on the distance from the camera 501 to the piece or on the attitude of the piece relative to the camera 501, thereby rendering the recognition difficult or requiring a complex configuration of the image recognition unit to correct the deformation of the image. [0185]
  • Therefore, in recognizing “which piece”, the piece is brought to a predetermined specified position relative to the camera 501. [0186]
  • For example, the piece is brought to a position 30 cm in front of the camera. In the case of a card as shown in FIG. 18, for example, the card is judged as exposed to the camera when its frame arrives at a specified position on the image, and the recognition of the card is executed in that position. [0187]
  • Once the piece is recognized, “which area” can be identified by tracing the piece until it is placed on the board, or by recognizing the “placing” of any piece with a simplified version of the image recognition unit of the tenth embodiment. The “removing” can be recognized in a similar manner. [0188]
  • In recognizing “which piece”, the recognition rate can be improved by positioning the piece at a specified position with respect to a specified camera. It is also possible to simplify the configuration of the recognition unit. [0189]
  • FIG. 23 shows an example of the configuration of the image experiencing system of a twelfth embodiment. In comparison with the embodiment shown in FIG. 21, the present embodiment differs only in the configuration of the piece operation recognition means 1501. The on-board piece image recognition unit 2101 may be the same as that in the eleventh embodiment, or may be further simplified, since only the judgment that a piece is “placed” or “removed” is required. The piece image recognition unit can be the same as that shown in the tenth embodiment. A piece operation recognition unit 2301, different from the piece operation recognition unit 1504, receives the inputs from both the on-board piece image recognition unit 2101 and the piece image recognition unit 1702. [0190]
  • FIGS. 24 and 25 are UML activity charts showing the process of the piece operation recognition unit 2301. FIG. 24 shows the state of receiving the information “a piece j is recognized” from the piece image recognition unit 1702, and FIG. 25 shows the state of receiving the information “a piece is placed in/removed from an area i” from the on-board piece image recognition unit 2101. [0191]
  • In case the information “a piece j is recognized” is received from the piece image recognition unit 1702, the information “piece j” is recorded in an object variable, and is utilized as the “which piece” information when the information that a piece is placed in some area is received later. [0192]
  • In case the information “a piece is placed in an area i” is received from the on-board piece image recognition unit 2101, the information “piece j” recorded in the object variable is read out, and the result “a piece j is placed in an area i” is returned. [0193]
  • In case the information “a piece is removed from an area i” is received from the on-board piece image recognition unit 2101, the result “a piece is removed from an area i” is returned directly. [0194]
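The behavior of FIGS. 24 and 25 can be summarized in a few lines of Python. The event-handler names and the single object variable holding the last recognized piece are assumptions made for this sketch:

```python
class PieceOperationRecognition:
    """Sketch of unit 2301: combine 'piece recognized' with area events."""

    def __init__(self):
        self.last_piece = None  # object variable recording "piece j"

    def on_piece_recognized(self, piece_j):
        self.last_piece = piece_j

    def on_placed(self, area_i):
        return f"a piece {self.last_piece} is placed in an area {area_i}"

    def on_removed(self, area_i):
        return f"a piece is removed from an area {area_i}"

unit = PieceOperationRecognition()
unit.on_piece_recognized("j")
print(unit.on_placed("i"))    # a piece j is placed in an area i
print(unit.on_removed("i"))   # a piece is removed from an area i
```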
  • The input to the piece image recognition unit 1702 may be provided, instead of by the camera 501 fixed to the HMD 103, by a dedicated camera such as a separately prepared document camera. A similar effect can also be attained by replacing the combination of the dedicated camera and the piece image recognition unit 1702 with a separately prepared special mark recognition unit 1502 and special mark recognition unit 1503. [0195]
  • (Thirteenth Embodiment) [0196]
  • The piece can be recognized by exposing it in front of the camera 501, but the exposure position is not easily understandable for the player. Also, if the image experiencing system is designed to improve the ease of use for the players, the spatial range for recognition inevitably becomes wider, resulting in a more complicated recognition unit or in a loss in the recognition rate. [0197]
  • It is therefore desirable that the player can expose the piece, without doubt or hesitation, in the spatially limited recognition area. This can be achieved by displaying a guide on the display unit 303 of the HMD 103, the player exposing the piece so as to match the displayed guide. [0198]
  • FIG. 26 shows an example of the configuration of the image experiencing system of a thirteenth embodiment. In comparison with the embodiment shown in FIG. 23, the configuration remains the same except that the piece image recognition unit 1702 is replaced by a piece image recognition/guide display instruction unit 2601, from which information is outputted to the CG generation unit 405. [0199]
  • The piece image recognition/guide display instruction unit 2601 is the same in configuration as the piece image recognition unit 1702, except that it outputs a guide display instruction in case the confidence in the result of recognition is below a certain level. FIG. 27 shows the difference between the outputs of the piece image recognition unit 1702 and the piece image recognition/guide display instruction unit 2601. With a recognition engine similar to that of the piece image recognition unit 1702, a recognized state is judged and the result of recognition is outputted if the confidence in the recognition is at least equal to a certain value Th, which is higher than the threshold value of the piece image recognition unit 1702. A high recognition rate is realized more easily as the threshold value Th is made higher. [0200]
  • On the other hand, a non-recognized state, or a state in which no piece is exposed, is judged in case the confidence is lower than a certain value Tl. Such a value Tl coincides with the recognition threshold Th in the piece image recognition unit 1702, but is selected lower in the piece image recognition/guide display instruction unit 2601. A guide display instruction is given in case the confidence indicates neither “recognized” nor “not recognized”. [0201]
  • FIG. 28 is a UML chart showing the process of the piece image recognition/guide display instruction unit 2601. If the confidence takes values in the range from 0 to 1, the relationship 0 &lt; Tl &lt; Th &lt; 1 holds. After the piece image recognition process, if the confidence in the result of recognition is lower than Tl, the situation is judged as non-recognized and no action is executed. If the confidence is higher than Th, the situation is judged as recognized and the result of recognition is transferred to the piece operation recognition unit 2301. Otherwise, a guide display instruction is issued. The guide display can be, for example, as shown in FIG. 29. [0202]
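The two-threshold decision of FIG. 28 reduces to a three-way branch; a minimal Python sketch follows, with the numeric values of Tl and Th chosen arbitrarily for illustration:

```python
T_L, T_H = 0.3, 0.8  # assumed thresholds with 0 < T_L < T_H < 1

def decide(confidence):
    if confidence < T_L:
        return "not recognized"   # no action is executed
    if confidence > T_H:
        return "recognized"       # result passed to recognition unit 2301
    return "display guide"        # ask the player to re-expose the piece

for c in (0.1, 0.5, 0.9):
    print(c, "->", decide(c))
```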
  • In the present embodiment, in the case of exposing a piece at a specified position of a specified camera, a guide assisting such exposure is prepared by CG and displayed in superposition in the HMD of the player, so that the player can easily place and expose the piece in a spatially appropriate position. [0203]
  • (Fourteenth Embodiment) [0204]
  • In a game played by plural players, the events on the board 101, including the display, have to be shared by all the players. This can be achieved logically by sharing one game management unit 401 among all the players. Physically, such a unit may consist of a single dedicated PC, or of a specified dedicated game console also including other constituent components, or it may be distributed over plural game consoles or PCs, as in the case of a distributed database. [0205]
  • Stated differently, for each player, the game console or PC 102 assumes the configuration shown in FIG. 4, and the game management unit 401 also reflects the results of the operations executed by the other players. [0206]
  • FIGS. 30A and 30B show examples of the configuration of the image experiencing system of a fourteenth embodiment. [0207]
  • A game console or PC 102 is assigned to each player, and such consoles or PCs are mutually connected by a network. The information flowing in the network is utilized for synchronizing the contents of the game management units 401 in the game consoles or PCs. [0208]
  • The piece operation recognition means is provided in each game console or PC 102, with each piece being recognized from plural view points. In such a case, it is also possible to exchange the recognition information through the network, and to utilize the recognition result of the highest reliability. [0209]
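Selecting the most reliable result among the consoles can be as simple as the following Python sketch; the tuple format (piece, area, confidence) is an assumption, since the embodiment does not prescribe how reliability is encoded:

```python
def merge_recognitions(results):
    """results: list of (piece, area, confidence) from each console/PC."""
    return max(results, key=lambda r: r[2]) if results else None

local = ("piece-3", "area-k", 0.6)
peers = [("piece-3", "area-k", 0.9), ("piece-5", "area-k", 0.4)]
print(merge_recognitions([local] + peers))  # highest-confidence result wins
```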
  • FIGS. 31A to 31C show other examples of the configuration of the image experiencing system of the fourteenth embodiment. The game contents are held in a game server on the internet, and the game consoles or PCs 102 of the players are connected through the internet. [0210]
  • The game management unit 401 is composed of local game management units 3101 and a game server 3102, the latter being provided in an independent equipment. The local game management unit 3101 deals with matters relating only to each player, and with those requiring feedback to the local player without delay. Also, data and programs relating to the individual game contents are downloaded from the game server 3102 through the internet, either at the start of the game or in the course of its execution. [0211]
  • The present embodiment allows plural players to play on a single board, and displays the result of the complex operations by the plural players in each HMD, based on the view point of each player, thereby enabling the experience of a game played by plural players. [0212]
  • (Fifteenth Embodiment) [0213]
  • The present embodiment provides a system for executing a game by applying MR (mixed reality) technology to a card game, thereby combining a real field and a virtual field created by CG (computer graphics). [0214]
  • The card game is already known in various forms, such as poker or blackjack utilizing playing cards. Card games utilizing cards to which cartoon characters are assigned have recently come into fashion. [0215]
  • Such a card game is played on a play sheet or a game board, by players each holding cards. Each card records a cartoon character and its attributes or its special skill. The players execute the game by using these cards, and the game is won or lost by the offensive method or power and the defensive method or power, which are determined by the combination of the cards. [0216]
  • In the following, the system of the present embodiment will be explained as applied to a battle-type card game played on a board by two players holding cards. [0217]
  • FIG. 32 is a conceptual view of the game of the present embodiment. The two players respectively wear the see-through HMDs, and are positioned across a board, which constitutes the battle space of the game. The two players play the game by placing the respective cards on the board or moving the cards placed thereon. In the see-through HMD of each player, CG matching the characteristics of each card are displayed on each card. [0218]
  • FIG. 33 schematically shows the configuration of the system of the present embodiment. [0219]
  • The player wears an HMD 3321, which is provided with a camera 3320 and a three-dimensional position and attitude measuring means 3322. The HMD 3321 is connected to a game management unit 3325, while the camera 3320 and the three-dimensional position and attitude measuring means 3322 are connected to a position and attitude grasp unit 3329, both through signal cables. [0220]
  • The camera 3320 is matched with the view of the player and photographs the objects observed by the player. The obtained image data are transferred to the position and attitude grasp unit. [0221]
  • While the player observes a play board 3326, the image taken by the camera 3320 contains the markers 3331 shown in FIG. 34. The markers 3331 are provided in predetermined positions on the play board, and the positional information of such markers is inputted in advance into the position and attitude grasp unit 3329. Therefore the area observed by the player can be estimated from the markers 3331 appearing in the image of the camera 3320. [0222]
  • The three-dimensional position and attitude measuring means 3322 measures the position and attitude of the HMD worn by the player, and the measured position and attitude data are supplied to the position and attitude grasp unit 3329. [0223]
  • Based on the positions of the markers 3331 taken by the camera 3320 and on the position and attitude data from the three-dimensional position and attitude measuring means, the position and attitude grasp unit 3329 calculates the range observed by the player. [0224]
  • Above the play board 3326, a roof 3327 is provided, on which a card recognition camera 3328 is installed. The card recognition camera 3328 may cover the entire area of the play board 3326, but it is also possible to divide the play board into four areas and to place four card recognition cameras 3328 respectively corresponding to these divided areas, or to place card recognition cameras 3328 in a number corresponding to that of the areas in which the cards are to be placed. The card recognition camera 3328 constantly watches the play board during the game, and the obtained image is transferred to a card reading unit 3324, which identifies the cards on the play board 3326 based on the obtained image. [0225]
  • In the following, an explanation will be given on the markers provided on the play board. FIG. 34 is a plan view of the play board 3326, on which the markers 3331 are provided. Each marker is formed with a specified color and shape, and the information on such color, shape and position is registered in advance in the position and attitude grasp unit 3329. By detecting the markers in the image taken by the camera 3320 and identifying their color and shape, the position on the play board corresponding to the taken image can be identified. Then, based on the result of such identification, it is possible to estimate the area of the board observed by the player. [0226]
  • In the following, an explanation will be given on the guides. As shown in the plan view of the play board 3326 in FIG. 35, the play board 3326 is provided with guides 3341 in a 5×2 arrangement for the player at the front side, and also in a 5×2 arrangement for the player at the rear side. The guide 3341 defines an area in which a card is to be placed, and a card placed in any other area will be irrelevant to the proceeding of the game. The precision of the card recognition can be improved since the card position is clearly determined by the guide 3341, which can be composed, for example, of a recess or a ridge corresponding to the card size. [0227]
  • Now the flow of the game will be explained with reference to flow charts. [0228]
  • At first, reference is made to FIG. 38 for explaining the card reading unit 3324. [0229]
  • After the process is started in a step S701, the sequence enters an image capturing phase in a step S702, in which the image of the play board from the camera 3328 is captured. Then the captured image is identified in a card identification phase of a step S703. [0230]
  • The card identification will be explained with reference to FIG. 36, which is a plan view of the play board 3326 in a state where three cards are placed on the guides shown in FIG. 35. With a coordinate description in which the upper left corner is defined as (0, 0) and the lower right corner as (5, 4), FIG. 36 shows a state where a card ‘G’ is placed at (5, 1), a card ‘2’ at (4, 3) and a card ‘1’ at (3, 4). [0231]
  • A step S703 analyzes the camera image, detects the cards placed on the board by image recognition technology, and identifies the coordinate value and the kind of each card. The coordinate values and kinds thus identified are retained as card arrangement data. [0232]
  • A step S704 compares the present card arrangement data with the previous data. If the comparison shows no change in the arrangement data, the sequence returns to the image capturing step S702. If the arrangement has changed, a step S705 updates the card arrangement data, and the sequence returns to the image capturing step S702. [0233]
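Steps S703 to S705 amount to a detect-compare-update loop over the arrangement data. The Python sketch below represents the arrangement as a dictionary keyed by the board coordinates of FIG. 36; this representation is an assumption for illustration:

```python
def update_arrangement(previous, detected):
    """previous, detected: dict mapping (x, y) coordinates -> card kind."""
    if detected == previous:
        return previous, False   # S704: no change, capture the next image
    return dict(detected), True  # S705: store the changed arrangement

prev = {(5, 1): 'G', (4, 3): '2'}
now = {(5, 1): 'G', (4, 3): '2', (3, 4): '1'}  # card '1' newly placed
arrangement, changed = update_arrangement(prev, now)
print(changed, arrangement)
```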
  • In the following, reference is made to FIG. 39 for explaining the position and attitude grasp unit 3329. [0234]
  • After the process is started in a step S801, the sequence enters an image capturing phase in a step S802, in which the image from the camera 3320 attached to the HMD of the player is captured. Then a step S803 fetches the attitude data from the three-dimensional position and attitude measuring means 3322. A step S804 identifies the markers 3331 in the image fetched in the step S802, thereby estimating the view point of the player. Then a step S805 determines the more exact position and attitude of the view point of the player, based on the attitude information obtained in S803 and the estimate obtained in S804. Thereafter the sequence returns to S802. [0235]
  • In the following, reference is made to FIG. 40 for explaining the game management unit 3325. [0236]
  • After the process is started in a step S901, a step S902 waits for an instruction from the player on the proceeding of the game. If an event arrives, the sequence proceeds to a step S903 for identifying the kind of the event. If the event is a signal for advancing to the next phase, the sequence proceeds to a step S904; otherwise, the sequence returns to S902 to wait for the next event. [0237]
  • The signal for advancing to the next phase is generated by identifying an operation that induces a phase advancement. Such an operation may be recognized by various methods. [0238]
  • The game advancement can be judged, for example, by recognition of the voice of the player by a voice recognition unit as shown in FIG. 37, by image recognition of a card exposed by the player, in a large image size, to the camera 3320 attached to the HMD worn by the player, or by image recognition, in the image obtained from the card recognition camera 3328, of a card placed in a specified position on the board. [0239]
  • A step S904 reads the card arrangement data determined by the card reading unit 3324. Then a step S905 fetches, from the card arrangement data, the data of the cards relating to the current phase, and calculates the offensive character and the offensive characteristics (offensive power and method) of the offensive side, based on the arrangement and combination of the cards. Then a step S906 fetches, from the card arrangement data as in the step S905, the data of the cards relating to the current phase, and calculates the defensive character and the defensive characteristics (defensive power and method) of the defensive side, based on the arrangement and combination of the cards. The calculation of the offensive and defensive characteristics in the steps S905 and S906 is executed according to the rules of the game. [0240]
  • Then a step S907 calculates the result of the battle according to the combination of the characters of the offensive and defensive sides, and generates a battle scene matching such result. Then a step S908 acquires the view point of the player derived by the position and attitude grasp unit 3329, a step S909 generates CG of the battle scene seen from the view point of the player, and a step S910 synthesizes the image obtained from the camera 3320 with the CG generated in S909. The real field and the virtual CG field are thus superimposed by the MR technology in the HMD worn by the player, thereby displaying a virtual CG character on the card. [0241]
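Since the actual rules are game-specific, steps S905 to S907 can only be illustrated with invented rules. In the purely hypothetical Python sketch below, each card kind carries a power value and a combination of two or more cards earns a bonus; none of these values come from the embodiment:

```python
CARD_POWER = {'G': 5, '1': 1, '2': 2}  # hypothetical per-card power values

def side_power(cards):
    """cards: list of (kind, (x, y)) tuples for one side."""
    power = sum(CARD_POWER.get(kind, 0) for kind, _pos in cards)
    if len(cards) >= 2:
        power += 1  # hypothetical bonus for a card combination
    return power

def battle_result(offense_cards, defense_cards):  # corresponds to S907
    off, dfn = side_power(offense_cards), side_power(defense_cards)
    return "offense wins" if off > dfn else "defense wins or draw"

print(battle_result([('G', (5, 1)), ('2', (4, 3))], [('1', (3, 4))]))
```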
  • However, such image synthesis is only required in case of a video see-through HMD, and not required in case of an optical see-through HMD. [0242]
  • As explained in the foregoing, the present invention allows the real world to be combined with the virtual CG world by displaying virtual CG corresponding to the viewing point, thereby realizing much higher excitement in a game utilizing a play board. [0243]
  • In case the card recognition camera 3328 cannot be installed in a satisfactory manner, it is possible, in order to improve the reading accuracy, to adopt a method in which the player exposes a card in front of the HMD 3321 worn by the player, so that the kind of the card is identified by the camera 3320 before the card is placed. Also, in order to improve the reading accuracy of the card recognition camera 3328, it is possible to position such a camera 3328 in a specified position and to identify the kind of a card with that camera 3328 before the card is placed. [0244]
  • In the foregoing embodiment, a card is employed as the item of the game, but other items, such as a piece, may also be employed. [0245]
  • Also the board may have a three-dimensionally stepped structure and the character may be displayed in a position corresponding to the height of such stepped structure. In such case, a model of the three-dimensionally stepped structure of the play board is registered in advance, and the position of the synthesized CG is controlled based on the three-dimensional structural information corresponding to the position of the view point. [0246]
  • Also, for identifying the card on the play board, the foregoing embodiment analyzes the image of the camera 3328 and recognizes the pattern on the card by image recognition technology, but it is also possible to attach a bar code to the card and to identify the card with a bar code reader. [0247]
  • (Other Embodiments) [0248]
  • The present invention also includes a case in which a computer of an apparatus or a system, connected with various devices so as to operate those devices for realizing the functions of the aforementioned embodiments, is supplied with the program codes of software realizing the functions of the aforementioned embodiments, and the computer (or a CPU or an MPU) of such apparatus or system is caused to execute the program codes, thereby operating those devices and realizing the functions of the aforementioned embodiments. [0249]
  • In such a case, the program codes themselves of the software realize the functions of the aforementioned embodiments, and the program codes themselves, as well as the means for supplying the computer with such program codes, for example a memory medium storing such program codes, constitute the present invention. [0250]
  • The memory medium for supplying such program codes can be, for example, a floppy disk, a hard disk, an optical disk, a magnetooptical disk, a CD-ROM, a magnetic tape, a non-volatile memory card or a ROM. [0251]
  • The present invention naturally includes not only the case where the functions of the aforementioned embodiments are realized by the execution of the supplied program codes by the computer, but also the case where the functions of the aforementioned embodiments are realized by the cooperation of such program codes with an OS (operating system) or other application software functioning on the computer. [0252]
  • The present invention further includes a case where the supplied program codes are first stored in a memory provided in a function expansion board of the computer or in a function expansion unit connected to the computer, and a CPU or the like provided on such function expansion board or unit then executes all or a part of the processes under the instructions of such program codes. [0253]

Claims (29)

What is claimed is:
1. An image experiencing system for a game which proceeds by placing an item on a game board, the system comprising:
player position and attitude determining means for determining the position and attitude information of a view point of a player;
generation means for generating computer graphics corresponding to the item on the game board, based on the position and attitude information of the view point of said player; and
a head mounted display for displaying said generated computer graphics in superposition with an image of the real world.
2. A system according to claim 1, wherein the position and attitude information of said player is information indicating a relative position of the view point of said player relative to said game board.
3. A system according to claim 1, further comprising:
measurement means for measuring attitude information of said player;
wherein said player position and attitude determining means determines the position and attitude information of the view point of said player from the attitude information and pre-calibrated position information of said player.
4. A system according to claim 1, further comprising:
a camera fixed to said head mounted display;
wherein said player position and attitude determining means analyzes the image of said camera and executes image recognition thereon, thereby obtaining the position and attitude information of the view point of said player.
5. A system according to claim 1, further comprising:
a position and attitude sensor for measuring the position and attitude of the player; and
a camera fixed to said head mounted display;
wherein said player position and attitude determining means includes a first position and attitude determining unit for determining the position and attitude information of the view point of the player from the output of said position and attitude sensor, and a second position and attitude determining unit for determining the position and attitude information of the view point of the player from an image taken by said camera, and the position and attitude information of the view point of said player is determined according to the reliability of the respective output values of said first and second position and attitude determining units.
6. A system according to claim 1, further comprising:
an attitude sensor for measuring the attitude of the player; and
a camera fixed to said head mounted display;
wherein said player position and attitude determining means includes a first position and attitude determining unit for determining the position and attitude information of the view point of the player from the output of said attitude sensor, and a second position and attitude determining unit for determining the position and attitude information of the view point of the player from an image taken by said camera, and the position and attitude information of the view point of said player is determined according to the reliability of the respective output values of said first and second position and attitude determining units.
7. A system according to claim 5 or 6, wherein a correction value is determined from the output value of said first or second position and attitude determining unit, and said correction value is used for correcting the output value of said first or second position and attitude determining unit.
8. A system according to claim 1, further comprising item operation recognition means for recognizing a change in the item on said game board.
9. A system according to claim 8, wherein said item operation recognition means recognizes a special mark identifier attached to the item.
10. A system according to claim 9, wherein a visible or invisible bar code is used as the special mark identifier.
11. A system according to claim 9, wherein an RFID transponder is used as the special mark identifier.
12. A system according to claim 8, wherein said item operation recognition means recognizes a shape of the item and/or a pattern on the item by image recognition.
13. A system according to claim 12, wherein an image taken by a camera fixed to said head mounted display is used.
14. A system according to claim 12, wherein said item operation recognition means recognizes a placement of an item in a specified position of a specified camera, thereby recognizing an item to be changed.
15. A system according to claim 14, wherein, in placing an item at said specified position of said specified camera, a guide facilitating the placement of the item is prepared by computer graphics and is displayed in the head mounted display of said player.
16. A system according to claim 1, wherein plural players play on a single game board, and the result of complex operations by the plural players is displayed in each head mounted display in the view point of each player.
17. An information processing method for a game which proceeds by placing an item on a game board, the method comprising steps of:
entering position and attitude information of a view point of a player;
generating computer graphics corresponding to the item on the game board, based on the position and attitude information of the view point of said player; and
displaying said generated computer graphics in superposition with an image of the real world in a head mounted display worn by the player.
18. A method according to claim 17, wherein the position and attitude information of said player is information indicating a relative position of the view point of said player relative to said game board.
19. A program for controlling an information processing apparatus thereby executing the information processing according to claim 17.
20. A recording medium storing the program according to claim 19.
21. An image experiencing system for a game which proceeds by placing an item on a game board bearing plural marks, the system comprising:
a position and attitude grasp unit for recognizing the kind of plural items placed on said game board and the position of said items;
game management means for managing the proceeding of the game, based on the kinds of said plural items and the positions thereof; and
generation means for generating computer graphics of a game scene corresponding to the kinds of said plural items and the positions thereof.
22. A system according to claim 21, wherein:
said game is a battle type game;
said system further comprises determination means for determining the characteristics of a character of an own side based on the combination of the items of the own side; and
said generation means generates said computer graphics based on said determined characteristics of the character of the own side.
23. A system according to claim 21, wherein said game management means manages the proceeding of the game by a combination of the characteristics of the characters of the own side and the opponent side.
24. A system according to claim 21, wherein said game board has a three-dimensional structure, and said computer graphics are displayed in the head mounted display worn by the player, based on a model of the three-dimensional structure of said game board.
25. An information processing method for a game which proceeds by placing an item on a game board bearing plural marks, the method comprising steps of:
entering an image from an image pickup unit in a head mounted display worn by a player;
detecting a mark in said image thereby determining the position and attitude of a view point of said player on said game board;
identifying the item on said game board and generating computer graphics indicating a game scene based on the result of said identification; and
displaying said computer graphics, based on the position and attitude of said view point in said head mounted display.
26. A method according to claim 25, further comprising a step of:
entering position and attitude information of the player from a three-dimensional position and attitude measuring unit for measuring the position and attitude of said player;
wherein the position and attitude of said view point is obtained from said position and attitude information and from said detected mark.
27. A method according to claim 25, wherein the item on said game board is identified from an image from an image pickup unit for taking the image on said game board.
28. A method according to claim 25, further comprising a step of recognizing an instruction from the player for advancing the game.
29. A program for realizing an information processing method for a game which proceeds by placing an item on a game board bearing plural marks, the method comprising steps of:
entering an image from an image pickup unit in a head mounted display worn by a player;
detecting a mark in said image thereby determining the position and attitude of a view point of said player on said game board;
identifying the item on said game board and generating computer graphics indicating a game scene based on the result of said identification; and
displaying said computer graphics, based on the position and attitude of said view point in said head mounted display.
US10/254,789 2001-09-28 2002-09-26 Image experiencing system and information processing method Abandoned US20030062675A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2001-300545 2001-09-28
JP2001300544A JP3584229B2 (en) 2001-09-28 2001-09-28 Video experience system and information processing method
JP2001-300544 2001-09-28
JP2001300545A JP3584230B2 (en) 2001-09-28 2001-09-28 Video experience system, information processing method and program

Publications (1)

Publication Number Publication Date
US20030062675A1 true US20030062675A1 (en) 2003-04-03

Family

ID=26623268

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/254,789 Abandoned US20030062675A1 (en) 2001-09-28 2002-09-26 Image experiencing system and information processing method

Country Status (1)

Country Link
US (1) US20030062675A1 (en)

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040109009A1 (en) * 2002-10-16 2004-06-10 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20050049022A1 (en) * 2003-09-02 2005-03-03 Mullen Jeffrey D. Systems and methods for location based games and employment of the same on location enabled devices
US20050231532A1 (en) * 2004-03-31 2005-10-20 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US20050285878A1 (en) * 2004-05-28 2005-12-29 Siddharth Singh Mobile platform
US20050289590A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Marketing platform
US20050288078A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Game
US20060073892A1 (en) * 2004-02-18 2006-04-06 Yusuke Watanabe Image display system, information processing system, image processing system, and video game system
US20060079324A1 (en) * 2004-02-18 2006-04-13 Yusuke Watanabe Image display system, image processing system, and video game system
US20060105838A1 (en) * 2004-11-16 2006-05-18 Mullen Jeffrey D Location-based games and augmented reality systems
US20070077987A1 (en) * 2005-05-03 2007-04-05 Tangam Gaming Technology Inc. Gaming object recognition
US20080096665A1 (en) * 2006-10-18 2008-04-24 Ariel Cohen System and a method for a reality role playing game genre
US20080100620A1 (en) * 2004-09-01 2008-05-01 Sony Computer Entertainment Inc. Image Processor, Game Machine and Image Processing Method
US20080188277A1 (en) * 2007-02-01 2008-08-07 Ritter Janice E Electronic Game Device And Method Of Using The Same
US20080192300A1 (en) * 2005-08-03 2008-08-14 Grid Ip Pte. Ltd Information Output Device, Medium, and Information Input/Output Device
US20080200285A1 (en) * 2004-03-19 2008-08-21 Sports Innovation As Mat For Sport and Games
US7474318B2 (en) 2004-05-28 2009-01-06 National University Of Singapore Interactive system and method
US20090108532A1 (en) * 2007-10-31 2009-04-30 Richard Darling Changeable gaming table
US20090195656A1 (en) * 2007-11-02 2009-08-06 Zhou Steven Zhi Ying Interactive transcription system and method
WO2010029553A1 (en) * 2008-09-11 2010-03-18 Netanel Hagbi Method and system for compositing an augmented reality scene
US20110055049A1 (en) * 2009-08-28 2011-03-03 Home Depot U.S.A., Inc. Method and system for creating an augmented reality experience in connection with a stored value token
US20110049234A1 (en) * 2007-06-21 2011-03-03 Kenji Yoshida Card surface reading/instruction executing method
US20110115157A1 (en) * 2009-11-17 2011-05-19 Filo Andrew S Game tower
US20110216060A1 (en) * 2010-03-05 2011-09-08 Sony Computer Entertainment America Llc Maintaining Multiple Views on a Shared Stable Virtual Space
US20110319166A1 (en) * 2010-06-23 2011-12-29 Microsoft Corporation Coordinating Device Interaction To Enhance User Experience
EP2491989A3 (en) * 2011-02-25 2013-08-07 Nintendo Co., Ltd. Information processing system, information processing method, information processing device and information processing program
US20130271452A1 (en) * 2011-09-30 2013-10-17 Arvind Kumar Mechanism for facilitating context-aware model-based image composition and rendering at computing devices
US20130321312A1 (en) * 2012-05-29 2013-12-05 Haruomi HIGASHI Information processing apparatus, information display system and information display method
EP2679290A1 (en) * 2012-06-28 2014-01-01 Alcatel-Lucent Method for supporting a joint activity of a plurality of remote communication devices
EP2394713A3 (en) * 2010-06-11 2014-05-14 Nintendo Co., Ltd. Image processing system, program, apparatus and method for video games
US20140135087A1 (en) * 2010-06-21 2014-05-15 Alexander Luchinskiy Brain-teaser
US8884987B2 (en) 2010-09-22 2014-11-11 Nintendo Co., Ltd. Storage medium having stored thereon display control program, display control apparatus, display control system, and display control method for setting and controlling display of a virtual object using a real world image
US9064335B2 (en) 2011-02-25 2015-06-23 Nintendo Co., Ltd. System, method, device and computer-readable medium recording information processing program for superimposing information
EP2886172A1 (en) * 2013-12-18 2015-06-24 Microsoft Technology Licensing, LLC Mixed-reality arena
US9155967B2 (en) 2011-09-14 2015-10-13 Bandai Namco Games Inc. Method for implementing game, storage medium, game device, and computer
US20150348324A1 (en) * 2014-06-03 2015-12-03 Robert L. Vaughn Projecting a virtual image at a physical surface
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US20160048203A1 (en) * 2014-08-18 2016-02-18 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US9278281B2 (en) 2010-09-27 2016-03-08 Nintendo Co., Ltd. Computer-readable storage medium, information processing apparatus, information processing system, and information processing method
US9282319B2 (en) 2010-06-02 2016-03-08 Nintendo Co., Ltd. Image display system, image display apparatus, and image display method
US20160327799A1 (en) * 2008-09-30 2016-11-10 Apple Inc. Head-Mounted Display Apparatus for Retaining a Portable Electronic Device with Display
US9597592B2 (en) 2010-04-13 2017-03-21 Samsung Electronics Co., Ltd. Method and apparatus for processing virtual world
US9639989B2 (en) 2012-06-29 2017-05-02 Sony Corporation Video processing device, video processing method, and video processing system
US20170185160A1 (en) * 2015-12-24 2017-06-29 Samsung Electronics Co., Ltd. Electronic device and method of controlling the same
US9703369B1 (en) * 2007-10-11 2017-07-11 Jeffrey David Mullen Augmented reality video game systems
US9958934B1 (en) 2006-05-01 2018-05-01 Jeffrey D. Mullen Home and portable augmented reality and virtual reality video game consoles
US10015473B2 (en) 2010-06-11 2018-07-03 Nintendo Co., Ltd. Computer-readable storage medium, image display apparatus, image display system, and image display method
CN108351690A (en) * 2015-11-02 2018-07-31 索尼互动娱乐股份有限公司 Information processing unit, information processing system and information processing method
CN108550169A (en) * 2018-04-24 2018-09-18 中北大学 The computational methods of the determination of pieces of chess position and its height in three dimensions
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US10532271B2 (en) * 2015-10-23 2020-01-14 Chol Whan OH Data processing method for reactive augmented reality card game and reactive augmented reality card game play device, by checking collision between virtual objects
US20200023269A1 (en) * 2016-10-14 2020-01-23 Lego A/S Game system
US10831436B2 (en) 2016-08-31 2020-11-10 Casio Computer Co., Ltd. Object display system, user communication device, and object display method and program
US11016557B2 (en) 2009-08-28 2021-05-25 Home Depot Product Authority, Llc Method and system for creating a personalized experience in connection with a stored value token
US11452935B2 (en) 2016-11-11 2022-09-27 Lego A/S Virtual card game system
US11504622B1 (en) * 2021-08-17 2022-11-22 BlueOwl, LLC Systems and methods for generating virtual encounters in virtual games
US11593539B2 (en) 2018-11-30 2023-02-28 BlueOwl, LLC Systems and methods for facilitating virtual vehicle operation based on real-world vehicle operation data
US11684848B2 (en) * 2021-09-28 2023-06-27 Sony Group Corporation Method to improve user understanding of XR spaces based in part on mesh analysis of physical surfaces
US11691084B2 (en) 2020-01-20 2023-07-04 BlueOwl, LLC Systems and methods for training and applying virtual occurrences to a virtual character using telematics data of one or more real trips
US11697069B1 (en) 2021-08-17 2023-07-11 BlueOwl, LLC Systems and methods for presenting shared in-game objectives in virtual games
US11896903B2 (en) 2021-08-17 2024-02-13 BlueOwl, LLC Systems and methods for generating virtual experiences for a virtual game

Citations (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4048653A (en) * 1974-10-16 1977-09-13 Redifon Limited Visual display apparatus
US4315241A (en) * 1979-01-11 1982-02-09 Redifon Simulation Limited Visual display apparatus
US4315240A (en) * 1979-01-11 1982-02-09 Redifon Simulation Ltd. Visual display apparatus
US4340878A (en) * 1979-01-11 1982-07-20 Redifon Simulation Limited Visual display apparatus
US4347507A (en) * 1978-12-21 1982-08-31 Redifon Simulation Limited Visual display apparatus
US4347508A (en) * 1978-12-21 1982-08-31 Redifon Simulation Limited Visual display apparatus
US4349815A (en) * 1979-01-11 1982-09-14 Redifon Simulation Limited Head-movable frame-scanner for head-coupled display
US4398799A (en) * 1980-03-04 1983-08-16 Pilkington P.E. Limited Head-up displays
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
US5072218A (en) * 1988-02-24 1991-12-10 Spero Robert E Contact-analog headup display method and apparatus
US5353042A (en) * 1992-12-02 1994-10-04 Klapman Matthew H Method for determining an orientation of an object
US5368309A (en) * 1993-05-14 1994-11-29 The Walt Disney Company Method and apparatus for a virtual video game
US5388990A (en) * 1993-04-23 1995-02-14 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Virtual reality flight control display with six-degree-of-freedom controller and spherical orientation overlay
US5423554A (en) * 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
US5495576A (en) * 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5581271A (en) * 1994-12-05 1996-12-03 Hughes Aircraft Company Head mounted visual display
US5616078A (en) * 1993-12-28 1997-04-01 Konami Co., Ltd. Motion-controlled video entertainment system
Patent Citations (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4048653A (en) * 1974-10-16 1977-09-13 Redifon Limited Visual display apparatus
US4347507A (en) * 1978-12-21 1982-08-31 Redifon Simulation Limited Visual display apparatus
US4347508A (en) * 1978-12-21 1982-08-31 Redifon Simulation Limited Visual display apparatus
US4349815A (en) * 1979-01-11 1982-09-14 Redifon Simulation Limited Head-movable frame-scanner for head-coupled display
US4340878A (en) * 1979-01-11 1982-07-20 Redifon Simulation Limited Visual display apparatus
US4315240A (en) * 1979-01-11 1982-02-09 Redifon Simulation Ltd. Visual display apparatus
US4315241A (en) * 1979-01-11 1982-02-09 Redifon Simulation Limited Visual display apparatus
US4398799A (en) * 1980-03-04 1983-08-16 Pilkington P.E. Limited Head-up displays
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
US5072218A (en) * 1988-02-24 1991-12-10 Spero Robert E Contact-analog headup display method and apparatus
US6069594A (en) * 1991-07-29 2000-05-30 Logitech, Inc. Computer input device with multiple switches using single line
US5353042A (en) * 1992-12-02 1994-10-04 Klapman Matthew H Method for determining an orientation of an object
US5495576A (en) * 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5388990A (en) * 1993-04-23 1995-02-14 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Virtual reality flight control display with six-degree-of-freedom controller and spherical orientation overlay
US5368309A (en) * 1993-05-14 1994-11-29 The Walt Disney Company Method and apparatus for a virtual video game
US6411266B1 (en) * 1993-08-23 2002-06-25 Francis J. Maguire, Jr. Apparatus and method for providing images of real and virtual objects in a head mounted display
US6359601B1 (en) * 1993-09-14 2002-03-19 Francis J. Maguire, Jr. Method and apparatus for eye tracking
US5423554A (en) * 1993-09-24 1995-06-13 Metamedia Ventures, Inc. Virtual reality game method and apparatus
US5616078A (en) * 1993-12-28 1997-04-01 Konami Co., Ltd. Motion-controlled video entertainment system
US5662332A (en) * 1994-06-22 1997-09-02 Wizards Of The Coast, Inc. Trading card game method of play
US5853327A (en) * 1994-07-28 1998-12-29 Super Dimension, Inc. Computerized game board
US5581271A (en) * 1994-12-05 1996-12-03 Hughes Aircraft Company Head mounted visual display
US5991085A (en) * 1995-04-21 1999-11-23 I-O Display Systems Llc Head-mounted personal visual display apparatus with image generator and holder
US5734421A (en) * 1995-05-30 1998-03-31 Maguire, Jr.; Francis J. Apparatus for inducing attitudinal head movements for passive virtual reality
US6181371B1 (en) * 1995-05-30 2001-01-30 Francis J. Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US6278418B1 (en) * 1995-12-29 2001-08-21 Kabushiki Kaisha Sega Enterprises Three-dimensional imaging system, game device, method for same and recording medium
US5702305A (en) * 1996-02-15 1997-12-30 Motorola Electronic game system
US6380732B1 (en) * 1997-02-13 2002-04-30 Super Dimension Ltd. Six-degree of freedom tracking system having a passive transponder on the object being tracked
US6124862A (en) * 1997-06-13 2000-09-26 Anivision, Inc. Method and apparatus for generating virtual views of sporting events
US6014117A (en) * 1997-07-03 2000-01-11 Monterey Technologies, Inc. Ambient vision display apparatus and method
US6522312B2 (en) * 1997-09-01 2003-02-18 Canon Kabushiki Kaisha Apparatus for presenting mixed reality shared among operators
US5986660A (en) * 1997-12-31 1999-11-16 Autodesk, Inc. Motion capture data system and display
US6175343B1 (en) * 1998-02-24 2001-01-16 Anivision, Inc. Method and apparatus for operating the overlay of computer-generated effects onto a live image
US6473717B1 (en) * 1998-03-07 2002-10-29 Claus-Frenz Claussen Method and apparatus for evaluating a movement pattern
US6442293B1 (en) * 1998-06-11 2002-08-27 Kabushiki Kaisha Topcon Image forming apparatus, image forming method and computer-readable storage medium having an image forming program
US20020167500A1 (en) * 1998-09-11 2002-11-14 Visible Techknowledgy, Llc Smart electronic label employing electronic ink
US6580563B1 (en) * 1999-03-16 2003-06-17 Keller Limited Image detection system
US6561513B1 (en) * 1999-11-18 2003-05-13 Degeorge Andrew Role and war game playing system
US6795041B2 (en) * 2000-03-31 2004-09-21 Hitachi Zosen Corporation Mixed reality realizing system
US6690156B1 (en) * 2000-07-28 2004-02-10 N-Trig Ltd. Physical object location apparatus and method and a graphic display device using the same
US20020075286A1 (en) * 2000-11-17 2002-06-20 Hiroki Yonezawa Image generating system and method and storage medium
US6633304B2 (en) * 2000-11-24 2003-10-14 Canon Kabushiki Kaisha Mixed reality presentation apparatus and control method thereof
US20020095265A1 (en) * 2000-11-30 2002-07-18 Kiyohide Satoh Information processing apparatus, mixed reality presentation apparatus, method thereof, and storage medium
US20020103617A1 (en) * 2000-11-30 2002-08-01 Shinji Uchiyama Position and orientation determining method and apparatus and storage medium
US20020126895A1 (en) * 2001-03-06 2002-09-12 Kiyohide Satoh Specific point detecting method and device
US20020163576A1 (en) * 2001-03-22 2002-11-07 Nikon Corporation Position detector and attitude detector
US20030063115A1 (en) * 2001-09-10 2003-04-03 Namco Ltd. Image generation method, program, and information storage medium
US6798345B2 (en) * 2001-09-13 2004-09-28 Allied Telesis K.K. Administrative system

Cited By (131)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040109009A1 (en) * 2002-10-16 2004-06-10 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US7427996B2 (en) * 2002-10-16 2008-09-23 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US11103785B2 (en) * 2003-09-02 2021-08-31 Jeffrey D Mullen Systems and methods for location based games and employment of the same on location enabled devices
US9662582B2 (en) 2003-09-02 2017-05-30 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US10974151B2 (en) 2003-09-02 2021-04-13 Jeffrey D Mullen Systems and methods for location based games and employment of the same on location enabled devices
US20050049022A1 (en) * 2003-09-02 2005-03-03 Mullen Jeffrey D. Systems and methods for location based games and employment of the same on location enabled devices
US10967270B2 (en) 2003-09-02 2021-04-06 Jeffrey David Mullen Systems and methods for location based games and employment of the same on location enabled devices
US11033821B2 (en) * 2003-09-02 2021-06-15 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US20060284789A1 (en) * 2003-09-02 2006-12-21 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
US20060258420A1 (en) * 2003-09-02 2006-11-16 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
US20060079324A1 (en) * 2004-02-18 2006-04-13 Yusuke Watanabe Image display system, image processing system, and video game system
US20060073892A1 (en) * 2004-02-18 2006-04-06 Yusuke Watanabe Image display system, information processing system, image processing system, and video game system
US7690975B2 (en) 2004-02-18 2010-04-06 Sony Computer Entertainment Inc. Image display system, image processing system, and video game system
US8152637B2 (en) * 2004-02-18 2012-04-10 Sony Computer Entertainment Inc. Image display system, information processing system, image processing system, and video game system
US8096900B2 (en) * 2004-03-19 2012-01-17 Sports Innovation As Mat for sport and games
US20080200285A1 (en) * 2004-03-19 2008-08-21 Sports Innovation As Mat For Sport and Games
US20050231532A1 (en) * 2004-03-31 2005-10-20 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US7728852B2 (en) 2004-03-31 2010-06-01 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US7474318B2 (en) 2004-05-28 2009-01-06 National University Of Singapore Interactive system and method
US20050285878A1 (en) * 2004-05-28 2005-12-29 Siddharth Singh Mobile platform
US20050289590A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Marketing platform
US20050288078A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Game
US20080100620A1 (en) * 2004-09-01 2008-05-01 Sony Computer Entertainment Inc. Image Processor, Game Machine and Image Processing Method
US7991220B2 (en) * 2004-09-01 2011-08-02 Sony Computer Entertainment Inc. Augmented reality game system using identification information to display a virtual object in association with a position of a real object
US10828559B2 (en) 2004-11-16 2020-11-10 Jeffrey David Mullen Location-based games and augmented reality systems
US10179277B2 (en) 2004-11-16 2019-01-15 Jeffrey David Mullen Location-based games and augmented reality systems
US9352216B2 (en) 2004-11-16 2016-05-31 Jeffrey D Mullen Location-based games and augmented reality systems
US20060105838A1 (en) * 2004-11-16 2006-05-18 Mullen Jeffrey D Location-based games and augmented reality systems
US9744448B2 (en) 2004-11-16 2017-08-29 Jeffrey David Mullen Location-based games and augmented reality systems
US8585476B2 (en) * 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
US20070077987A1 (en) * 2005-05-03 2007-04-05 Tangam Gaming Technology Inc. Gaming object recognition
US20080192300A1 (en) * 2005-08-03 2008-08-14 Grid IP Pte. Ltd. Information Output Device, Medium, and Information Input/Output Device
US8248666B2 (en) * 2005-08-03 2012-08-21 Yoshida Kenji Information input/output device including a stage surface on which a reflecting medium including a printed dot pattern is disposed
US9740948B2 (en) 2005-08-03 2017-08-22 Kenji Yoshida Information input/output device, and medium, using dot patterns with at least two kinds of inks with different reaction characteristics
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US9958934B1 (en) 2006-05-01 2018-05-01 Jeffrey D. Mullen Home and portable augmented reality and virtual reality video game consoles
US10838485B2 (en) 2006-05-01 2020-11-17 Jeffrey D. Mullen Home and portable augmented reality and virtual reality game consoles
US20080096665A1 (en) * 2006-10-18 2008-04-24 Ariel Cohen System and a method for a reality role playing game genre
US20080188277A1 (en) * 2007-02-01 2008-08-07 Ritter Janice E Electronic Game Device And Method Of Using The Same
US20100311485A1 (en) * 2007-02-01 2010-12-09 Mattel, Inc. Electronic Game Device and Method of Using the Same
US8651953B2 (en) 2007-02-01 2014-02-18 Mattel, Inc. Electronic game device and method of using the same
US20110049234A1 (en) * 2007-06-21 2011-03-03 Kenji Yoshida Card surface reading/instruction executing method
US20180260021A1 (en) * 2007-10-11 2018-09-13 Jeffrey David Mullen Augmented reality video game systems
US10509461B2 (en) * 2007-10-11 2019-12-17 Jeffrey David Mullen Augmented reality video game systems
US20220129061A1 (en) * 2007-10-11 2022-04-28 Jeffrey David Mullen Augmented reality video game systems
US20200081521A1 (en) * 2007-10-11 2020-03-12 Jeffrey David Mullen Augmented reality video game systems
US9703369B1 (en) * 2007-10-11 2017-07-11 Jeffrey David Mullen Augmented reality video game systems
US11243605B2 (en) * 2007-10-11 2022-02-08 Jeffrey David Mullen Augmented reality video game systems
US20090108532A1 (en) * 2007-10-31 2009-04-30 Richard Darling Changeable gaming table
US8358320B2 (en) 2007-11-02 2013-01-22 National University Of Singapore Interactive transcription system and method
US20090195656A1 (en) * 2007-11-02 2009-08-06 Zhou Steven Zhi Ying Interactive transcription system and method
US10565796B2 (en) 2008-09-11 2020-02-18 Apple Inc. Method and system for compositing an augmented reality scene
US9824495B2 (en) 2008-09-11 2017-11-21 Apple Inc. Method and system for compositing an augmented reality scene
WO2010029553A1 (en) * 2008-09-11 2010-03-18 Netanel Hagbi Method and system for compositing an augmented reality scene
US10530914B2 (en) 2008-09-30 2020-01-07 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US9749451B2 (en) 2008-09-30 2017-08-29 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US11716412B2 (en) 2008-09-30 2023-08-01 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US10686922B2 (en) 2008-09-30 2020-06-16 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US10530915B2 (en) 2008-09-30 2020-01-07 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US10897528B2 (en) 2008-09-30 2021-01-19 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US10306036B2 (en) 2008-09-30 2019-05-28 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US10306037B2 (en) 2008-09-30 2019-05-28 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US20160327799A1 (en) * 2008-09-30 2016-11-10 Apple Inc. Head-Mounted Display Apparatus for Retaining a Portable Electronic Device with Display
US10306038B2 (en) 2008-09-30 2019-05-28 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US20170011716A1 (en) * 2008-09-30 2017-01-12 Apple Inc. Head-Mounted Display Apparatus for Retaining a Portable Electronic Device with Display
US11089144B2 (en) 2008-09-30 2021-08-10 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US9595237B2 (en) * 2008-09-30 2017-03-14 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US11258891B2 (en) 2008-09-30 2022-02-22 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US9646574B2 (en) * 2008-09-30 2017-05-09 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US9646573B2 (en) 2008-09-30 2017-05-09 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US11016557B2 (en) 2009-08-28 2021-05-25 Home Depot Product Authority, Llc Method and system for creating a personalized experience in connection with a stored value token
US8645220B2 (en) 2009-08-28 2014-02-04 Homer TLC, Inc. Method and system for creating an augmented reality experience in connection with a stored value token
US20110055049A1 (en) * 2009-08-28 2011-03-03 Home Depot U.S.A., Inc. Method and system for creating an augmented reality experience in connection with a stored value token
US20110115157A1 (en) * 2009-11-17 2011-05-19 Filo Andrew S Game tower
US8328613B2 (en) 2009-11-17 2012-12-11 Hasbro, Inc. Game tower
US9513700B2 (en) 2009-12-24 2016-12-06 Sony Interactive Entertainment America Llc Calibration of portable devices in a shared virtual space
US9310883B2 (en) 2010-03-05 2016-04-12 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US20110216060A1 (en) * 2010-03-05 2011-09-08 Sony Computer Entertainment America Llc Maintaining Multiple Views on a Shared Stable Virtual Space
US8730156B2 (en) * 2010-03-05 2014-05-20 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US9597592B2 (en) 2010-04-13 2017-03-21 Samsung Electronics Co., Ltd. Method and apparatus for processing virtual world
US9282319B2 (en) 2010-06-02 2016-03-08 Nintendo Co., Ltd. Image display system, image display apparatus, and image display method
EP2394713A3 (en) * 2010-06-11 2014-05-14 Nintendo Co., Ltd. Image processing system, program, apparatus and method for video games
US10015473B2 (en) 2010-06-11 2018-07-03 Nintendo Co., Ltd. Computer-readable storage medium, image display apparatus, image display system, and image display method
US9058790B2 (en) 2010-06-11 2015-06-16 Nintendo Co., Ltd. Image processing system, storage medium storing image processing program, image processing apparatus and image processing method
US20140135087A1 (en) * 2010-06-21 2014-05-15 Alexander Luchinskiy Brain-teaser
US20110319166A1 (en) * 2010-06-23 2011-12-29 Microsoft Corporation Coordinating Device Interaction To Enhance User Experience
US9586147B2 (en) * 2010-06-23 2017-03-07 Microsoft Technology Licensing, Llc Coordinating device interaction to enhance user experience
US8884987B2 (en) 2010-09-22 2014-11-11 Nintendo Co., Ltd. Storage medium having stored thereon display control program, display control apparatus, display control system, and display control method for setting and controlling display of a virtual object using a real world image
US9278281B2 (en) 2010-09-27 2016-03-08 Nintendo Co., Ltd. Computer-readable storage medium, information processing apparatus, information processing system, and information processing method
US9064335B2 (en) 2011-02-25 2015-06-23 Nintendo Co., Ltd. System, method, device and computer-readable medium recording information processing program for superimposing information
EP2491989A3 (en) * 2011-02-25 2013-08-07 Nintendo Co., Ltd. Information processing system, information processing method, information processing device and information processing program
US8970623B2 (en) 2011-02-25 2015-03-03 Nintendo Co., Ltd. Information processing system, information processing method, information processing device and tangible recoding medium recording information processing program
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US9155967B2 (en) 2011-09-14 2015-10-13 Bandai Namco Games Inc. Method for implementing game, storage medium, game device, and computer
US20130271452A1 (en) * 2011-09-30 2013-10-17 Arvind Kumar Mechanism for facilitating context-aware model-based image composition and rendering at computing devices
US9285906B2 (en) * 2012-05-29 2016-03-15 Ricoh Company, Limited Information processing apparatus, information display system and information display method
US20130321312A1 (en) * 2012-05-29 2013-12-05 Haruomi HIGASHI Information processing apparatus, information display system and information display method
WO2014001239A1 (en) * 2012-06-28 2014-01-03 Alcatel Lucent Method for supporting a joint activity of a plurality of remote communication devices
EP2679290A1 (en) * 2012-06-28 2014-01-01 Alcatel-Lucent Method for supporting a joint activity of a plurality of remote communication devices
US9639989B2 (en) 2012-06-29 2017-05-02 Sony Corporation Video processing device, video processing method, and video processing system
EP2886172A1 (en) * 2013-12-18 2015-06-24 Microsoft Technology Licensing, LLC Mixed-reality arena
US9972131B2 (en) * 2014-06-03 2018-05-15 Intel Corporation Projecting a virtual image at a physical surface
US20150348324A1 (en) * 2014-06-03 2015-12-03 Robert L. Vaughn Projecting a virtual image at a physical surface
US9690375B2 (en) * 2014-08-18 2017-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US10606348B2 (en) * 2014-08-18 2020-03-31 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US11586277B2 (en) 2014-08-18 2023-02-21 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
EP4145255A1 (en) * 2014-08-18 2023-03-08 Universal City Studios LLC Systems and methods for enhancing visual content in an amusement park
US20190196581A1 (en) * 2014-08-18 2019-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US10241568B2 (en) * 2014-08-18 2019-03-26 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
RU2668532C2 (en) * 2014-08-18 2018-10-01 ЮНИВЕРСАЛ СИТИ СТЬЮДИОС ЭлЭлСи Systems and methods for generating augmented and virtual reality images
WO2016028531A1 (en) * 2014-08-18 2016-02-25 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US20160048203A1 (en) * 2014-08-18 2016-02-18 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US10532271B2 (en) * 2015-10-23 2020-01-14 Chol Whan OH Data processing method for reactive augmented reality card game and reactive augmented reality card game play device, by checking collision between virtual objects
CN108351690A (en) * 2015-11-02 2018-07-31 索尼互动娱乐股份有限公司 Information processing apparatus, information processing system, and information processing method
US10416758B2 (en) 2015-11-02 2019-09-17 Sony Interactive Entertainment Inc. Information processing apparatus, information processing system, and information processing method
US20170185160A1 (en) * 2015-12-24 2017-06-29 Samsung Electronics Co., Ltd. Electronic device and method of controlling the same
US10338688B2 (en) * 2015-12-24 2019-07-02 Samsung Electronics Co., Ltd. Electronic device and method of controlling the same
US10831436B2 (en) 2016-08-31 2020-11-10 Casio Computer Co., Ltd. Object display system, user communication device, and object display method and program
US20200023269A1 (en) * 2016-10-14 2020-01-23 Lego A/S Game system
US11389718B2 (en) * 2016-10-14 2022-07-19 Lego A/S Game system
US11452935B2 (en) 2016-11-11 2022-09-27 Lego A/S Virtual card game system
CN108550169A (en) * 2018-04-24 2018-09-18 中北大学 Method for determining the positions of chess pieces and calculating their heights in three-dimensional space
US11593539B2 (en) 2018-11-30 2023-02-28 BlueOwl, LLC Systems and methods for facilitating virtual vehicle operation based on real-world vehicle operation data
US11691084B2 (en) 2020-01-20 2023-07-04 BlueOwl, LLC Systems and methods for training and applying virtual occurrences to a virtual character using telematics data of one or more real trips
US11707683B2 (en) 2020-01-20 2023-07-25 BlueOwl, LLC Systems and methods for training and applying virtual occurrences and granting in-game resources to a virtual character using telematics data of one or more real trips
US11857866B2 (en) 2020-01-20 2024-01-02 BlueOwl, LLC Systems and methods for training and applying virtual occurrences with modifiable outcomes to a virtual character using telematics data of one or more real trips
US11504622B1 (en) * 2021-08-17 2022-11-22 BlueOwl, LLC Systems and methods for generating virtual encounters in virtual games
US11697069B1 (en) 2021-08-17 2023-07-11 BlueOwl, LLC Systems and methods for presenting shared in-game objectives in virtual games
US11896903B2 (en) 2021-08-17 2024-02-13 BlueOwl, LLC Systems and methods for generating virtual experiences for a virtual game
US11918913B2 (en) 2021-08-17 2024-03-05 BlueOwl, LLC Systems and methods for generating virtual encounters in virtual games
US11684848B2 (en) * 2021-09-28 2023-06-27 Sony Group Corporation Method to improve user understanding of XR spaces based in part on mesh analysis of physical surfaces

Similar Documents

Publication Publication Date Title
US20030062675A1 (en) Image experiencing system and information processing method
JP3584229B2 (en) Video experience system and information processing method
JP3584230B2 (en) Video experience system, information processing method and program
US6853935B2 (en) Information processing apparatus, mixed reality presentation apparatus, method thereof, and storage medium
US8226011B2 (en) Method of executing an application in a mobile device
US8854356B2 (en) Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
EP1708139B1 (en) Calibration method and apparatus
US7728852B2 (en) Image processing method and image processing apparatus
US8956227B2 (en) Storage medium recording image processing program, image processing device, image processing system and image processing method
US7084887B1 (en) Marker layout method, mixed reality apparatus, and mixed reality space image generation method
JP3530772B2 (en) Mixed reality device and mixed reality space image generation method
US7817167B2 (en) Method and apparatus for processing information
US20150130790A1 (en) Visually Convincing Depiction of Object Interactions in Augmented Reality Images
US8417384B2 (en) Information processing system, robot apparatus, and control method therefor
CN110402415A (en) Technique for recording augmented reality data
US20120218300A1 (en) Image processing system, method and apparatus, and computer-readable medium recording image processing program
US20120236179A1 (en) Image processing apparatus and image processing method
US20120108334A1 (en) Game device, control method for a game device, and a non-transitory information storage medium
JP2006301924A (en) Image processing method and image processing apparatus
JP2005143657A (en) Information presentation system, information presentation device, medium for information presentation device, information presentation method, and information presentation program
JP4367926B2 (en) Image composition system, image composition method, and image composition apparatus
US11127156B2 (en) Method of device tracking, terminal device, and storage medium
CN109844600A (en) Information processing apparatus, information processing method, and program
US20140241586A1 (en) Information retaining medium and information processing system
JP5735861B2 (en) Image display program, image display apparatus, image display method, image display system, marker

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NORO, HIDEO;SATO, HIROAKI;MATSUI, TAICHI;REEL/FRAME:013344/0317;SIGNING DATES FROM 20020919 TO 20020920

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION