WO2011082405A1 - Interactive game environment - Google Patents

Interactive game environment

Info

Publication number
WO2011082405A1
Authority
WO
WIPO (PCT)
Prior art keywords
game
input
accordance
player
image
Application number
PCT/US2011/020058
Other languages
French (fr)
Inventor
Mark L. Davis
Timothy Alan Tabor
Roger H. Hoole
John M. Black
Jeffrey Taylor
Kirby Bistline
Original Assignee
MEP Games Inc.
Priority claimed from US12/651,947 (published as US20110165923A1)
Application filed by MEP Games Inc.
Publication of WO2011082405A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/65 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor, automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 9/00 Games not otherwise provided for
    • A63F 9/04 Dice; Dice-boxes; Mechanical dice-throwing devices
    • A63F 9/0468 Electronic dice; electronic dice simulators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 1/00 Card games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 1/00 Card games
    • A63F 2001/008 Card games adapted for being playable on a screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 9/00 Games not otherwise provided for
    • A63F 9/24 Electric games; Games using electronic circuits not otherwise provided for
    • A63F 2009/2401 Detail of input, input devices
    • A63F 2009/2411 Input form cards, tapes, discs
    • A63F 2009/2419 Optical
    • A63F 2009/2425 Scanners, e.g. for scanning regular characters
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 9/00 Games not otherwise provided for
    • A63F 9/24 Electric games; Games using electronic circuits not otherwise provided for
    • A63F 2009/2448 Output devices
    • A63F 2009/245 Output devices visual
    • A63F 2009/2461 Projection of a two-dimensional real image
    • A63F 2009/2463 Projection of a two-dimensional real image on a screen, e.g. using a video projector
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2250/00 Miscellaneous game characteristics
    • A63F 2250/30 Miscellaneous game characteristics with a three-dimensional image
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1087 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/6045 Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/69 Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 9/00 Games not otherwise provided for
    • A63F 9/04 Dice; Dice-boxes; Mechanical dice-throwing devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • Computer displays, for example, display images that can be affected by user input to a keyboard, mouse, controller, or another input device.
  • In some cases, the computer display itself acts as an input device, using touch or proximity sensing on the display.
  • Some displays are multi-touch functional and can receive user input from multiple touches simultaneously.
  • Circle games involve the players manipulating objects on or proximate the play area. For example, many circle games require the player to roll dice, start a timer, spin a spinner, play cards, move pieces, and so forth, depending on the game. Circle games have existed for thousands of years across diverse cultures. New circle games arise to meet the social needs and interests of the community, while old circle games go out of use as society loses interest. Many believe that circle games provide significantly more opportunity for social development than other types of conventional video games that are strong in popularity in modern times. The contribution of circle games to society should not be ignored, but often is.
  • Circle games can provide an impetus for bringing families, friends, and other significant social groups together and fostering important human relationships. Children wait with great eagerness to engage with others in circle games. The types of circle games that individuals enjoy may change as one grows older, and may differ between population segments. Nevertheless, circle games draw human beings together with the immediate hope of engaging others in a test of skill, while the horizontal play area provides a subtle and significant side-benefit in permitting channels of communication to be opened, as players are positioned to face each other. Many have experienced that the conversation migrates to topics beyond the scope of the game itself, often resulting in a level of conversation that is greater than particular individuals might be inclined to engage in without the circle game. The benefit to society in encouraging individuals to come together in circle games is often underestimated and not fully recognized in a society in which people choose more and more to absorb themselves into fictional worlds.
  • Some embodiments described herein relate to the projection of an interactive game environment image on a surface.
  • The interactive image may be a three-dimensional image, or may be two-dimensional.
  • Data is received that represents virtual objects that are spatially positioned in a virtual game environment space.
  • A game environment image is then projected on a surface; the image includes a visual representation of all or a portion of the virtual space, including one or more of the virtual objects.
  • The system may then detect user interaction with the projected visualized representation of the virtual game environment space and, in response, change the projected visualized representation. That interaction may be via an input device, or even more directly via physical interaction with the interactive game environment image.
  • The user might interact with a virtual object within the game environment image, or with a physical object (such as a game piece or a game board) that is within the space of the projected game environment image.
  • A user may thus interact with visualized representations of virtual space, enabling complex and interesting interactivity scenarios and applications.
  • The game input mechanism includes a light-emitting mechanism that defines multiple input regions for a game in which there are multiple players. Each of the input regions is a portion of the playing surface in which a corresponding player subset is to provide physical input (such as rolling dice, playing cards, or placing game pieces) to affect game state.
  • A scanning mechanism scans objects placed within the input regions, while a communication mechanism communicates information regarding the scanned objects. The information might, for example, be communicated to affect an electronic game state maintained in another device or distributed across multiple devices.
  • Figure 1 abstractly illustrates a distributed electronic game system;
  • Figure 2 abstractly illustrates an interactive image projection system that represents an example of the interactive image projection system of Figure 1;
  • Figure 3 illustrates an example embodiment of a virtual space that includes virtual objects;
  • Figure 4 abstractly illustrates an image generation system with which the interactive image projection system may operate;
  • Figure 5 abstractly illustrates a player console that represents an example of an input device of Figure 1;
  • Figure 6 illustrates a concrete example of a player console;
  • Figure 7 illustrates another concrete example of a player console in the form of a game master player console;
  • Figure 8 illustrates a flowchart of a method for projecting an interactive image on a surface;
  • Figure 9 illustrates a concrete example of the interactive image projection system in which multiple projectors are operating and that does not use intervening optics in the projection or scanning operations;
  • Figure 10 illustrates another concrete example of the interactive image projection system in which a single projector is operating, and which does use intervening optics in the projection operation;
  • Figure 11 illustrates a computing system architecture in which the principles described herein may be employed in at least some embodiments;
  • Figure 12 abstractly illustrates a distributed electronic game system that includes a central display;
  • Figure 13 illustrates a more concrete example of the central display of Figure 12;
  • Figure 14 abstractly illustrates an orientation-sensing game input device that may be an example of a game input device of Figure 12;
  • Figure 15 illustrates a specific concrete example of an orientation-sensing game input device in the form of an orientation-sensing die;
  • Figure 16 schematically illustrates components of a scanning game input device;
  • Figure 17 illustrates one embodiment of the scanning game input device of Figure 16 in which multiple game input regions are simultaneously defined;
  • Figure 18A illustrates another embodiment of the scanning game input device of Figure 16 in which one game input region at a time is defined according to whose turn it is to provide physical game input;
  • Figure 18B illustrates the scanning game input device of Figure 18A, after the scanning mechanism has rotated to capture physical game input from another game input region;
  • Figure 19 illustrates an example system with a central display and surrounding player consoles in which each of the central display and the surrounding player consoles has an integrated scanning device;
  • Figure 20 illustrates an example system with a central display and with surrounding player consoles that each have an integrated scanning device, with game state responding to physical game input in the form of a die roll;
  • Figure 21 illustrates a player console with an integrated scanning device that represents a closer view of the player consoles illustrated in Figures 1 and 20; and
  • Figure 22 illustrates a player console with an integrated scanning device that scans a game input region in the form of a window defined on the private display area of the player console itself.
  • The interactive game environment image may be two-dimensional, or may even include three-dimensional image information, such that the image may be viewed as a three-dimensional image with appropriate aids.
  • Data is received that represents virtual objects that are spatially positioned in virtual space.
  • The game environment image is then projected on the surface and includes a visual representation of all or a portion of the virtual game environment space, including one or more of the virtual objects.
  • The interactive image projection system may then detect user interaction with the projected visualized representation of the virtual game environment space and, in response, change the projected visualized representation, and perhaps cause a permanent change to game state.
  • Other embodiments relate to a game input mechanism that includes a light-emitting mechanism defining multiple input regions for a game in which there are multiple players.
  • Each of the input regions is a portion of the playing surface in which a corresponding player subset is to provide physical input (such as rolling dice, playing cards, or placing game pieces) to affect game state.
  • A scanning mechanism scans objects placed within the input regions, while a communication mechanism communicates information regarding the scanned objects. The information might, for example, be communicated to affect an electronic game state maintained in another device or distributed across multiple devices.
  • Figure 1 abstractly illustrates a distributed electronic game system 100.
  • The system 100 includes an interactive image projection system 101.
  • The interactive image projection system 101 projects a game environment image 111 onto a surface.
  • The projected game environment image 111 is made to be interactive.
  • In some embodiments, the surface is a substantially horizontal surface, in which case the game environment image 111 is projected downward onto the surface.
  • The substantially horizontal surface may be a table top, a counter top, a floor, a game board, or any other substantially horizontal surface.
  • A “substantially horizontal” surface may be any surface that is within 30 degrees of horizontal.
  • A “more precisely horizontal” surface may be any surface that is within 5 degrees of horizontal.
  • Alternatively, the surface may be a more complex surface.
  • For instance, the surface on which the interactive game environment image 111 may be projected may include a combination of a substantially horizontal surface and a substantially vertical surface.
  • A “substantially vertical” surface may be any surface that is within 30 degrees of vertical.
  • A “more precisely vertical” surface may be any surface that is within 5 degrees of vertical.
  • The complex surface might include a floor or table area (or a game board) as the substantially horizontal surface, and a wall as the substantially vertical surface.
  • The substantially vertical surface might also be a translucent material (such as glass).
  • Other examples of complex surfaces include textured surfaces, as well as surfaces with a topology.
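  • The angular definitions above are straightforward to operationalize. The helper below is a hypothetical sketch (the function name and return labels are illustrative, not taken from this publication) that classifies a surface by its tilt from horizontal using the quoted 30-degree and 5-degree thresholds.

```python
def classify_surface(tilt_deg: float) -> str:
    """Classify a surface by its tilt from horizontal, in degrees.

    Uses the thresholds quoted in the text: within 30 degrees of
    horizontal/vertical is "substantially" so; within 5 degrees is
    "more precisely" so.
    """
    tilt = abs(tilt_deg) % 180.0
    from_horizontal = min(tilt, 180.0 - tilt)
    from_vertical = abs(90.0 - tilt)
    if from_horizontal <= 5:
        return "more precisely horizontal"
    if from_horizontal <= 30:
        return "substantially horizontal"
    if from_vertical <= 5:
        return "more precisely vertical"
    if from_vertical <= 30:
        return "substantially vertical"
    return "oblique"

print(classify_surface(3))   # more precisely horizontal
print(classify_surface(88))  # more precisely vertical
print(classify_surface(45))  # oblique
```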
  • The interactive game environment image 111 represents an interactive game environment area in which one or more players may interact, either through a player console or directly via the image itself.
  • The interactive image 111 might also be a collaboration area, a work area, or any other type of interactive area.
  • The system 100 is often described herein using the particular example of a game. In that case, each user would be a player, and the interactive area 111 would be an interactive play area.
  • However, the principles described herein may apply to any environment in which one or more users interact with a projected image on a surface.
  • Because Figure 1 is abstract, the interactive image projection system 101 and the interactive game environment image 111 are only abstract representations. Subsequent figures will illustrate more concrete representations of examples of the interactive image projection system 101 and the interactive game environment image 111.
  • The system 100 also includes surrounding control devices (also called herein "input devices").
  • Eight input devices 102A through 102H are illustrated in Figure 1, although the ellipses 102I represent that there may be fewer or more than eight control devices.
  • The input devices 102 are each represented abstractly as rectangles, although each will have a particular concrete form depending on its function and design. Example forms are described further below. In the context of a game, for example, the input devices 102 may be player consoles. However, the input devices 102 are optional.
  • The users may instead provide input through direct contact with the interactive game environment image 111 using, for example, a finger, by manipulating physical game pieces positioned within the interactive game environment image, or perhaps by rolling dice or playing cards within the interactive image.
  • The interactive image projection system 101 is capable of responding to multiple simultaneous instances of users interacting with the interactive game environment image 111.
  • Thus, input into the system 100 may be achieved using one or more of the input devices 102 and/or by direct interaction with the interactive game environment image 111.
  • The users may affect game state in this manner.
  • In some embodiments, one, some, or even all of the input devices 102 are wireless.
  • A wireless input device may communicate wirelessly with the interactive image projection system 101.
  • One or even some of the input devices 102 may be remotely located from the interactive image 111.
  • Such remotely located game input device(s) may communicate with the interactive image projection system 101 over a Wide Area Network (WAN) such as the Internet. That would enable a user to participate in the interactive image 111 even if that player is located in a completely different part of the globe.
  • For example, a father or mother stationed overseas might play a child's favorite board game with their child before going to bed.
  • In other embodiments, the game input devices 102 may be local (e.g., in the same room) to the interactive image projection system 101. In yet another embodiment, there are no game input devices 102. Regardless of whether there are input devices 102 or not, the user might directly interact with the interactive game environment image 111.
  • Figure 2 abstractly illustrates an interactive image projection system 200 that represents an example of the interactive image projection system 101 of Figure 1.
  • The system 200 is illustrated as including an output channel 210 that projects an image (such as the interactive game environment image 111) onto a surface.
  • The output channel 210 includes several functions, including image preparation and projection. Image preparation is performed by an image preparation mechanism 211, and projection of the image is performed by projector(s) 212, which include at least one projector 212A, with the ellipses 212B representing that there may be more than one projector in the output channel 210 of the interactive image projection system 200.
  • The image preparation mechanism 211 receives an input image 201 and supplies an output image 202 in response.
  • The input image 201 may be provided by any image generator.
  • For instance, the input image 201 might be provided by a video game console, a rendering program (whether two-dimensional or three-dimensional), or any other module, component, or software capable of generating an image.
  • The input image 201 represents one or more virtual objects that are spatially positioned in a virtual game environment space.
  • For example, the virtual space may represent a battleground with specific terrain.
  • The battleground is represented in a computer, and need not represent any actual battleground.
  • Other examples of virtual space might include a three-dimensional representation of the surface of the moon, a representation of a helium atom, a representation of a crater of a fictional planet, a fictional spacecraft, outer space, a fictional subterranean cave network, and so forth.
  • The virtual game environment space is created by a computer programmer, either directly or indirectly through the creation of logic that creates the virtual space.
  • Virtual objects are placed in the virtual game environment space, also by a computer programmer (or indirectly by logic created by the programmer), and may represent any object, real or imagined.
  • For instance, a virtual object might represent a soldier, a tank, a building, a fictional anti-gravity machine, or any other possible object, real or imagined.
  • Figure 3 illustrates an example of a virtual game environment space 300.
  • The virtual game environment space 300 includes objects 301, 302, 303, and 304.
  • The virtual game environment space 300 is three-dimensional, such that the objects 301, 302, 303, and 304 are each represented as three-dimensional objects having a specific shape and position within the virtual three-dimensional space.
  • This virtual space 300 may be used to formulate an image representation of a certain portion and/or perspective of the virtual game environment space.
  • The output image 202, as projected, includes a visual representation of at least part of the virtual space, the visualized representation including a visualized representation of at least some of the virtual objects.
  • For example, if the virtual space is a crater of a fictional planet, the projected image may show a visual representation of a portion of that crater, with virtual objects that might include several crater monsters, soldiers who are members of the same team, weapons strewn about and ready to be picked up, and so forth.
  • As another example, if the virtual space is a city, the visualized representation might be a portion of the city and include vehicles, buildings, people, and so forth.
  • The image preparation mechanism 211 may perform any processing on the input image 201 to generate the output image that is ultimately projected by the one or more projectors 212.
  • In the simplest case, the image preparation mechanism 211 may simply pass through the input image 201, such that the output image 202 is identical to the input image 201.
  • The image preparation mechanism might also change the format of the image, change its resolution, compress the image, decrypt the image, select only a portion of the image, and the like.
  • If there are multiple projectors, the image preparation mechanism 211 may select which portion (also referred to as a "subimage") of the input image is to be projected by each projector, such that when the subimages are projected, the collective whole of all of the projected images appears as a single image on the surface. This is referred to herein as "stitching," sketched below.
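  • As an illustration only, the following hypothetical sketch divides one output frame into per-projector strips; the function name is an assumption, and the seam overlap and blending a production system would typically add are omitted for brevity.

```python
import numpy as np

def split_for_projectors(image: np.ndarray, n_projectors: int) -> list:
    """Split an H x W x 3 frame into vertical strips, one per projector.

    Projected side by side, the strips tile back into a single image
    on the surface (the "stitching" described above).
    """
    height, width, _ = image.shape
    bounds = np.linspace(0, width, n_projectors + 1, dtype=int)
    return [image[:, bounds[i]:bounds[i + 1]] for i in range(n_projectors)]

# Five projector modules, as in the example of Figure 9.
frame = np.zeros((768, 5 * 1024, 3), dtype=np.uint8)
subimages = split_for_projectors(frame, 5)
assert sum(s.shape[1] for s in subimages) == frame.shape[1]
```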
  • The image preparation might also take into consideration appropriate adjustments given the surface on which the output image 202 is to be projected, or any intervening optics. For instance, if the surface is a complex surface, the image preparation mechanism 211 may adjust the image so that it appears properly on the surface. The user might configure the image preparation mechanism 211 with information regarding the surface. Alternatively or in addition, the system 200 may be configured to enter a discovery phase upon physical positioning that identifies the characteristics of the surface in relation to the projection mechanism. As an example, if the surface is a combination of horizontal and vertical surfaces, the image preparation may take into consideration the distances and angles of the surface to make sure that the image appears proportioned as intended on each surface.
  • The image preparation mechanism 211 may make appropriate geometrical adjustments to the image so that the image appears properly on the surface.
  • Other examples of complex surfaces include spherical surfaces, surfaces that represent a topology (as in complex terrain with various peaks and valleys), cylindrical surfaces, surfaces that include convex portions, and/or surfaces that include concave portions.
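  • One common way to make an image appear properly on a tilted or angled surface is a projective (homography) pre-warp. The sketch below is a minimal, hypothetical illustration using OpenCV; it assumes the four corner points where the image should land were measured during a discovery phase, and it is not the publication's own method.

```python
import cv2
import numpy as np

def prewarp_for_surface(image: np.ndarray, landing_corners) -> np.ndarray:
    """Pre-warp a frame so it appears undistorted on an angled surface.

    landing_corners: four (x, y) points in projector pixel coordinates
    where the image's corners should land (top-left, top-right,
    bottom-right, bottom-left), e.g. measured during calibration.
    """
    h, w = image.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(landing_corners)
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, homography, (w, h))
```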
  • If optics (such as mirrors or lenses) intervene between the projector(s) and the surface, the image preparation mechanism 211 may consider the presence of such optics.
  • The system 200 may also output various signals. For instance, the system 200 may output audio, such as perhaps the audio of the video game console that provides the input image 201.
  • The system 200 may output wired or wireless signals to the input devices 102, perhaps causing some private state to be altered at the input devices 102.
  • The system 200 may also dispatch information in a wired or wireless fashion to the central display, if one is present.
  • User input may be provided through interaction with an input device (such as one of the input devices 102 of Figure 1) and/or through direct interaction of a real object (such as a human finger, a game piece, a game board, a central display, or the like) within the area of the interactive game environment image 111.
  • The interactive image projection system 200 may also include an input channel 220.
  • The input channel 220 includes a scanning mechanism 221 configured to scan the area covered by the projected game environment image to determine one or more positions of a real interactivity input object.
  • In some embodiments, the output game environment image 202 of Figure 2 includes just two-dimensional information.
  • The projector(s) 212 project the image frame by frame.
  • Between frames, the scanning mechanism may scan the area in which the last frame was projected. This projection and scanning process is then repeated for the next frame image, and the next, and so on. Even though projection and scanning do not happen at the same time (with scanning happening between image frame projections), they happen at such a high frequency that the projected image seems to have continuous motion.
  • The period of time during which the projected image is not present is so short, and recurs at such a frequency, that it gives the human observer the illusion that the projected image is always present.
  • Thus, real objects have the appearance of occupying the same space as the projected image.
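  • The alternation just described (project a frame, scan during the blanking interval, repeat) can be outlined as a simple loop. Everything in the following sketch, in particular the projector, scanner, and input-preparation interfaces, is hypothetical scaffolding rather than an API from this publication.

```python
import time

FRAME_RATE_HZ = 60
FRAME_PERIOD = 1.0 / FRAME_RATE_HZ

def run_interactive_loop(projector, scanner, input_prep, image_source):
    """Alternate projection and scanning at a rate high enough that the
    projected image appears continuously present to a human observer."""
    while True:
        started = time.monotonic()
        projector.show(image_source.next_frame())  # project one frame
        projector.blank()                          # brief dark interval
        detections = scanner.scan()                # scan while blanked
        input_prep.process(detections)             # scans become input
        # Sleep out whatever remains of the frame period.
        remaining = FRAME_PERIOD - (time.monotonic() - started)
        if remaining > 0:
            time.sleep(remaining)
```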
  • Alternatively, the output image 202 of Figure 2 may represent three-dimensional information.
  • In that case, the projector(s) 212 may project a left-eye image intended for the left eye, and a right-eye image intended for the right eye.
  • With an appropriate aid, the image can be perceived by the human mind as being truly three-dimensional.
  • 3-D glasses are an appropriate aid for enabling this kind of eye-specific light channeling, but the principles of the present invention are not limited to the type of aid used to allow a human observer to conceptualize three-dimensional image information.
  • The projections of the left-eye image and the right-eye image are interlaced, each being displayed at a frequency at which continuous motion is perceived by a human observer.
  • Typically, 44 frames per second is the threshold above which an average human observer cannot distinguish discrete changes between frames, but instead perceives continuous motion.
  • Between frames, the scanning mechanism 221 may scan for real objects within the scope of the projected image.
  • For example, with frames projected at 120 Hz, the scanning may occur between every frame (at 120 Hz), between every other frame (at 60 Hz), or at some other interval. That said, the principles described herein are not limited to any particular frame rate for projection or sampling rate for scanning.
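  • To make the timing concrete: interlacing left-eye and right-eye frames at 120 Hz still gives each eye 60 frames per second, comfortably above the roughly 44 frames per second cited above, with scan slots interleaved between frames. The schedule generator below is a hypothetical illustration, not part of this publication.

```python
def frame_schedule(n_frames: int, scan_every: int = 2):
    """Yield an interlaced left/right projection schedule, with a scan
    slot after every `scan_every` frames (every other frame by default,
    i.e. scanning at 60 Hz when frames go out at 120 Hz)."""
    for i in range(n_frames):
        yield "left" if i % 2 == 0 else "right"
        if (i + 1) % scan_every == 0:
            yield "scan"

print(list(frame_schedule(6)))
# ['left', 'right', 'scan', 'left', 'right', 'scan', 'left', 'right', 'scan']
```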
  • The input channel 220 of the interactive image projection system 200 may also include an input preparation function provided by, for example, an input preparation mechanism 222.
  • This mechanism 222 may take the input provided through the scanning process and provide it in another form recognizable by the system that generates the input image 201 (such as, perhaps, a conventional video game system).
  • The input preparation mechanism 222 may receive information from the scanning mechanism 221 that allows it to recognize gestures and interactions with the visualized virtual objects.
  • For instance, the input mechanism might recognize a gesture and correlate that gesture to a particular input.
  • In doing so, the input preparation mechanism 222 may consider the surface configuration, as well as any optics (such as mirrors or lenses) that may intervene between the surface and the scanning mechanism 221.
  • Suppose, for example, that the projected image is of a game board, with pieces placed on the game board.
  • The user might reach into the projected image, touch a projected game piece with a finger (or, more accurately, "simulate touching," since the projected game piece is just a projection), and move that game piece from one location on the projected game board to another, thereby advancing the game state, perhaps permanently.
  • The movement may occur over the course of dozens or even hundreds of frames, which pass in but a small moment from the user's perspective.
  • The input preparation mechanism 222 recognizes that a human finger has reached into the space occupied by the projected image, and has intersected the space occupied by the visualization of the game piece.
  • To do so, the input preparation would monitor the position of the user's finger in three-dimensional space, and maintain a concept of the three-dimensional position of the virtual game piece.
  • Of course, the game piece is just a projected portion of the image, and thus the user would not feel a game piece. Nevertheless, the input preparation mechanism 222 recognizes that the user has now indicated an intent to perform some action on the projected game piece.
  • The input preparation mechanism 222 then recognizes slight incremental movements of the finger, each representing an intent to move the game piece in the same direction and by the same magnitude as the finger moved.
  • The input preparation mechanism knows what commands to issue to cause the image generator to move that projected game piece in the virtual game environment space. The changes can be observed almost immediately in the projected image. This occurs for each frame until the user indicates an intent to stop moving the game piece (perhaps by tapping the surface at the location at which the user wishes to deposit the projected game piece).
  • The appearance to the player would be as though the player had literally contacted the game piece and caused it to move, even though the game piece is but a projection. Accordingly, the system may move projected objects.
  • Other actions might include resizing, re-orienting, changing the form, or changing the appearance of the virtual object that the user interacted with.
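  • The interaction just walked through amounts to a per-frame hit test followed by incremental move commands. The following sketch is a hypothetical rendering of that logic; the GamePiece type and the command tuples are illustrative assumptions, not structures from this publication.

```python
from dataclasses import dataclass

@dataclass
class GamePiece:
    piece_id: str
    x: float
    y: float
    radius: float  # touch tolerance around the projected piece

def finger_hits(piece: GamePiece, fx: float, fy: float) -> bool:
    """True if a scanned finger position intersects the space the
    projected piece occupies."""
    return (fx - piece.x) ** 2 + (fy - piece.y) ** 2 <= piece.radius ** 2

def drag_commands(piece: GamePiece, finger_path):
    """Translate incremental finger movement into per-frame move
    commands with the same direction and magnitude as the finger."""
    commands = []
    prev_x, prev_y = finger_path[0]
    for fx, fy in finger_path[1:]:
        commands.append(("move", piece.piece_id, fx - prev_x, fy - prev_y))
        prev_x, prev_y = fx, fy
    return commands
```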
  • In this manner, the interactive image projection system 200 may interface with a conventional image generation system to enable the appearance of an interactive projected image. After all, the system 200 receives the image generated by the external system (although additional processing of the image may occur within the image preparation mechanism 211), which is then projected. The external image generation system just generates the image in the same manner as if the image were to be displayed on a conventional display. Furthermore, the external image generation system receives commands as it is accustomed to receive them, thereby effecting a permanent change to the game state and advancing progress through the game. The external image generation system acts the same no matter how complex the systems used to generate the commands: whether the input was generated by a conventional hand-held controller or through the complexity of the input channel 220, the external image generation system will act the same.
  • The input channel 220 may also provide information to other surrounding devices, such as any one or more of the input devices, or perhaps the central display, thereby altering the state of any of these devices and allowing them to participate in the game state alterations caused by the player interacting with the projected image.
  • As previously mentioned, the user may also interact with physical objects within the area of the projected game environment image. These physical objects are not virtual, but real, and thus can be felt by the player as the player interacts with them.
  • For instance, the physical object may be an actual physical game board.
  • The input channel 220 may recognize the configuration of the game board and interpret player gestures (such as the movement of a physical game piece, or interaction with a virtual object) with reference to the physical game board.
  • For example, a physical MONOPOLY board may be placed within a projected image that might include virtual objects such as virtual Chance and Community Chest cards, virtual houses and hotels, and perhaps a combination of real and virtual game pieces (according to player preferences configured at the beginning of a game).
  • A player might tap on a property owned by that player, which the input channel may interpret as an intent to build a house on the property.
  • The input channel 220 might then coordinate with any external image generation system and the output channel 210 to cause an additional virtual house to appear on the property (with perhaps some animation).
  • The input channel 220 may also coordinate to debit the account of that player by the cost of a house.
  • Information may be transmitted to the user's personal input device 102 to allow it to update with the new account balance.
  • As another example, the player might roll dice at the beginning of the player's turn.
  • The input channel 220 may recognize what was rolled and cause the projected image to highlight the position to which the player's game piece should move. If the player has a virtual game piece, the system might automatically move the virtual game piece (with perhaps some animation), or have the player move it through interaction with the virtual game piece (perhaps configured by the user to suit his or her preference). In response, the system might transmit a prompt to the user's input device, asking whether the user desires to purchase the property, or notifying the user of rent owed.
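  • For a standard 40-square MONOPOLY board, recognizing the roll and highlighting the destination reduces to simple modular arithmetic, as in this hypothetical sketch:

```python
BOARD_SQUARES = 40  # a standard MONOPOLY board has 40 squares

def destination(current_square: int, die_one: int, die_two: int) -> int:
    """Return the board square to highlight after a scanned die roll."""
    return (current_square + die_one + die_two) % BOARD_SQUARES

# A piece on square 38 rolling 3 + 4 wraps past GO to square 5.
assert destination(38, 3, 4) == 5
```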
  • In this example, the output channel 210 not only projects images but also responds to the external game system to provide appropriate output to appropriate devices. For instance, the output channel 210 might recognize that the external game system is prompting the current player as to whether to purchase the property.
  • The output channel 210, in addition to projecting the appropriate game environment image, may also transmit an appropriate prompt to the player's input device 102.
  • If a central display is used, the central display may provide a displayed image and be positioned within the projected image of the image projection system 101.
  • In other words, a projected image may be superimposed upon an image displayed by the central display.
  • Figure 4 abstractly illustrates an image generation system 400, which may be used to generate the input image 201 of Figure 2.
  • The image generation system 400 may be a conventional video game system that outputs an image that might, for example, change as a player progresses through the video game.
  • Alternatively, the functions described as being included within the image generation system 400 may instead be performed within the interactive image projection system 101.
  • The image generation system 400 includes logic 411, an image generation mechanism 412, and an input interface 413.
  • The logic 411 and/or the image generation mechanism 412 have a concept of the virtual space in which the logic 411 operates.
  • The image generation mechanism 412 generates an image that is appropriate given a current state 414 of the logic 411.
  • The input interface 413 receives commands that may alter the state 414 of the logic 411, thereby potentially also affecting the image generated by the image generation mechanism 412.
  • The game state may even be permanently altered from one stage to the next as the players advance through the game. In such systems, images can be generated at such a rate that continuous motion is perceived.
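  • Abstractly, the system of Figure 4 pairs state-holding game logic with an input interface that mutates that state and an image generation step that renders whatever the state currently is. The skeleton below is a minimal, hypothetical sketch of that structure, not an interface taken from this publication.

```python
class ImageGenerationSystem:
    """Skeleton of the system of Figure 4: game logic holding a state
    (414), an input interface (413) that alters it, and an image
    generation step (412) that renders the current state."""

    def __init__(self, initial_state):
        self.state = initial_state

    def receive_command(self, command):
        # Input interface: commands permanently alter the game state,
        # whether they came from a hand-held controller or from the
        # projection system's input channel.
        self.state = self.apply(self.state, command)

    def apply(self, state, command):
        raise NotImplementedError  # game-specific logic (411)

    def next_frame(self):
        return self.render(self.state)

    def render(self, state):
        raise NotImplementedError  # game-specific image generation (412)
```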
  • The bi-directional channel between the systems may be wired or wireless, or perhaps wired in one direction and wireless in the other. Input commands are typically less data-intensive than images, and thus the communication channel from the interactive image projection system 200 to the image generation system 400 may be wireless.
  • The channel from the image generation system 400 to the interactive image projection system 200 may also be wireless, provided that the bandwidth of the channel in that direction is sufficient.
  • The image projection system 101 and/or any of the surrounding game input devices 102 may have built-in microphones to allow sound data (such as the player's voice) to be input into the image generation system 400 to affect the state 414. Voice recognition capability may also be incorporated into the interactive image projection system 101 and/or the surrounding game input devices 102 to permit such sound data to be converted to a more usable form. Speakers, headset ports, and earpieces may also be incorporated into the surrounding input devices 102.
  • Figure 5 abstractly illustrates a player console 500.
  • The input devices 102 of Figure 1 may be player consoles in the context in which the system 100 is a game environment.
  • Figure 5 is an abstract illustration of a player console showing functional components of the player console 500.
  • Figure 5 is abstract. Accordingly, the various components illustrated as being included within the player console 500 should not be construed as implying any particular shape, orientation, positioning or size of the corresponding component.
  • Figure 6 will illustrate a more concrete representation of an example of the player console 500.
  • Each player, or perhaps each player team, may have an associated player console.
  • The player console 500 includes a private display area 501 and game logic 502 capable of rendering at least a private portion of the game state 503 associated with the player (or team).
  • The player or team may use an input mechanism 504 to enter control input into the player console.
  • A transmission mechanism, illustrated in the form of a transceiver 505, transmits that control information to the interactive image projection system 200 and/or to the image generation system 400, where the control information is used to alter the state 414 of the logic 411 used to generate the image.
  • Figure 6 illustrates a concrete example of a player console 600.
  • The private display area 601 displays the player's private information (in this case, several playing cards).
  • The player console 600 also includes a barrier 602 to prevent other players from seeing the private game state displayed on the private display area 601.
  • The private viewing area 601 may be touch-sensitive, allowing the player to interact through physical gestures on the private viewing area 601, thereby causing control information to update the rendering on the private display area and the game state on the player console 600, as well as on the central display.
  • The private display area 601 also, in this example, displays video images 603A, 603B, and 603C of other players.
  • As illustrated in Figure 7, the player console might be a game master console 700, in which case the game master may interface with the private viewing area to control game state.
  • The game master may use physical gestures on the touch-sensitive display 701 of the game master console 700 to affect what is displayed within the interactive game environment image 111.
  • For example, the game master might control what portions of a map are viewable within the interactive game environment image 111.
  • The game master might also control what effect another player's actions have on the operation of the game logic.
  • The game master might also create a scenario and setting of a game using the game master console 700.
  • Figure 8 illustrates a flowchart of a method 800 for projecting an interactive game environment image on a surface.
  • The system receives data (act 801) representing virtual objects that are spatially positioned in a virtual space.
  • An example of such data is an image in which such virtual objects are represented.
  • The image is then projected (act 802) on a surface in response to the received data.
  • The projected image includes a visual representation of at least part of the virtual space.
  • The system detects user interaction (act 803) with the visualized representation. In response to that user interaction, the projected image is then altered (act 804).
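  • Acts 801 through 804 compose naturally into a loop. The outline below is a hypothetical sketch of method 800; the component interfaces are assumptions made for illustration only.

```python
def method_800(image_source, projector, scanner):
    """Outline of method 800: receive data (801), project (802),
    detect user interaction (803), and alter the image (804)."""
    data = image_source.receive()                         # act 801
    while True:
        projector.project(data)                           # act 802
        interaction = scanner.detect_interaction()        # act 803
        if interaction is not None:
            data = image_source.alter(data, interaction)  # act 804
```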
  • Figure 9 illustrates a concrete example 900 of the interactive image projection system 101 in which multiple modules 902A through 902E are mounted to a stand 901.
  • Each module 902A through 902E includes a projector and a corresponding camera (not shown) in its lower surface.
  • The projectors project images downward toward the floor on which the stand 901 is situated. Each projector projects a corresponding subimage, each processed such that the projected subimages are stitched together to appear as a single image on the floor.
  • The cameras scan the area of the projected image for user interaction.
  • The example of Figure 9 does not use intervening optics in the projection or scanning operations.
  • Figure 10 illustrates another concrete example 1000 of the interactive image projection system 101, in which a single projector is operating and which does use intervening optics in the projection operation.
  • The interactive image projection system 1000 includes a housing with a rigid base 1001 that is situated on a substantially horizontal surface.
  • A projector mechanism 1011 projects a single image upward through a lens to be reflected off of a curved mirror 1012, through windows 1013, and downward onto the substantially horizontal surface on which the base 1001 is placed.
  • The images are prepared to account for the intervening lenses and mirrors used to direct the image onto the horizontal surface.
  • Four cameras (of which three, 1021A through 1021C, are visible in Figure 10) are positioned around the upper circumference of the system 1000. These cameras scan the area of the projected image.
  • An interactive game environment image projection system has just been described. Having described the embodiments in some detail, as a side note, the various operations and structures described herein may, but need not, be implemented by way of a physical computing system. Accordingly, to conclude this description, an example computing system will be described with respect to Figure 11.
  • The computing system 1100 may be incorporated within the interactive image projection system 101, within one or more of the input devices 102, and/or within the image generation system 400.
  • Figure 11 illustrates a computing system 1100.
  • Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered computing systems.
  • The term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one processor, and a memory capable of having thereon computer-executable instructions that may be executed by the processor.
  • The memory may take any physical form and may depend on the nature and form of the computing system.
  • A computing system may be distributed over a network environment and may include multiple constituent computing systems.
  • A computing system 1100 typically includes at least one processing unit 1102 and memory 1104.
  • The memory 1104 is a physical system memory, which may be volatile, non-volatile, or some combination of the two.
  • The term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory, and/or storage capability may be distributed as well.
  • The term “module” or “component” can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
  • Herein, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, the one or more processors of the associated computing system that performs the acts direct the operation of the computing system in response to having executed computer-executable instructions.
  • An example of such an operation involves the manipulation of data.
  • The computer-executable instructions (and the manipulated data) may be stored in the memory 1104 of the computing system 1100.
  • Embodiments within the scope of the present invention also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon.
  • Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer.
  • Such computer-readable media can comprise physical storage and/or memory media such as RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other physical medium which can be used to carry or store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • The components of the computing system 1100 may, for example, be used to provide functionality to game logic, store or remember game state, configure and communicate between devices, and operate the logic of game incorporation.
  • Each of the player consoles may also have a computing system such as computing system 1100 guiding its processing needs.
  • Figure 12 abstractly illustrates a distributed electronic game system 1200 that may be used in the light-emitting boundary embodiment, and which may also be used in the interactive projection embodiments in the case where a central display is used.
  • The system 1200 includes a flat multi-touch functional central display 1201.
  • The central display 1201 may be laid horizontally on a table or other surface and may be used as a horizontal central playing surface. For instance, the central display 1201 may behave as an electronic board of a digital board game.
  • The display 1201 may be movable, or perhaps may be fixed, perhaps being built into a furniture item. Since Figure 12 is abstract, the various components illustrated as being included within the central display 1201 should not be construed as implying any particular shape, orientation, positioning, or size of the corresponding component. Subsequent figures will illustrate a more concrete representation of an example of the central display 1201.
  • The system 1200 also includes surrounding game control devices (also called herein "input devices").
  • Such input devices may, for example, be the same as the input devices 102 described with respect to Figure 1.
  • the player consoles 500, 600 and 700 may also be examples of the input devices 102 of Figure 12, as well as being examples of the input devices 102 of Figure 1.
  • the game input devices 102 may be orientation-sensitive game input devices, player consoles, or a combination thereof.
  • the central display is preferably a flat multi-touch functional central display 1201 that is capable of detecting and responding to multiple simultaneous instances of players touching the display 1201, and of affecting game state in response to each touch instance. Such multi-touch capability may be employed to effectively assist in games in which multiple players may be touching the screen simultaneously, although not all games require simultaneous input.
  • the central display 1201 may also have a scratch resistant coating to prevent scratching that might otherwise be caused by players touching the central display 1201.
  • the central display 1201 may also receive signals from the surrounding game input devices 102, interpret control actions from the signals, and affect game state in response to the control actions.
  • the central display 1201 includes a public display area 1211. Note that the public display area 1211 is only abstractly represented in Figure 12, and is thus not drawn to scale. In a preferred embodiment, the public display area 1211 would actually occupy a substantial majority of the viewable surface of the central display 1201 when the display 1201 is laid horizontally, and thus emulate a board-like play area.
  • the public display area displays game information that should be viewable by all of the players and is thus deemed "public". There is no required form for the central display 1201.
  • the central display 1201 might have any size or configuration.
  • the central display 1201 also includes game logic 1212 that is capable of rendering all or at least a portion of the public game state 1213 on the public display area, and is capable of formulating or determining game state based on game input.
  • a communication mechanism in the form of wireless transceiver 1214 receives control information from surrounding game input devices 102, and in some cases, may transmit information to the surrounding game input devices.
  • a game incorporation mechanism 1215 identifies the control information received from the game input devices 102 and alters a game state based on the control information.
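As a rough illustration of the flow just described, the following Python sketch shows how a game incorporation mechanism such as 1215 might alter game state based on control information received over the transceiver. This is a hypothetical sketch only; the message fields ("device_id", "action", and so forth) and the dispatch logic are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the game incorporation mechanism 1215: control
# information arrives from a surrounding input device, is interpreted as
# a control action, and the game state is altered accordingly. The
# message fields and actions are assumptions for illustration.

def incorporate(game_state: dict, message: dict) -> dict:
    """Alter game state based on control information from an input device."""
    device = message["device_id"]
    action = message["action"]            # e.g. "roll", "move", "play_card"
    if action == "roll":
        game_state.setdefault("rolls", {})[device] = message["value"]
    elif action == "move":
        game_state.setdefault("positions", {})[device] = message["to"]
    return game_state

state = {}
incorporate(state, {"device_id": "console-1", "action": "roll", "value": 6})
print(state)  # -> {'rolls': {'console-1': 6}}
```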
  • the central display 1201 incorporates functionality of a general-purpose computing system with a hard drive 1221, memory 1222, general-purpose processor(s) 1223, speakers 1224 (and/or headset ports with headsets or earpieces), a video driver 1225, a wireless transceiver 1226 (such as a BLUETOOTH® transceiver), and so forth (see ellipses 1227).
  • the game logic 1212, portions of the transceiver mechanism 1214 stack, and the game incorporation mechanism 1215 may be software-implemented.
  • the game state 1213 may be represented as data within the hard drive 1221 and/or the memory 1222.
  • the central display 1201, and/or any of the surrounding game input devices 102, may have built-in microphones to allow sound data (such as the player's voice) to be input into the system to affect game configuration or game state. There may also be voice recognition capability incorporated into the central display 1201 and/or the surrounding game input devices 102 to permit such sound data to be converted to a more usable form. Speakers, headset ports, and earpieces may also be incorporated into the surrounding game input devices.
  • Although the system 1200 is described as being an electronic game system, the principles described herein are not limited to the use of system 1200 for games.
  • the central display 1201 may be used to display any public state, and the input devices 102 need not necessarily be used to provide input for a game.
  • the game logic 1212 may be any logic.
  • the term "player" described herein may more broadly include any participant in a system in which there is a public viewing area for displaying public state associated with any process, and a private viewing area for displaying private state associated with the process.
  • Figure 13 illustrates a more concrete example 1300 of the display 1201 of Figure 12.
  • the display 1300 includes the public display area 1311 that represents an example of the public display area 1211 of Figure 12.
  • the displayed public game state may be associated with any type of game, and may render game state in response to instructions provided by the video driver 1225.
  • the video driver 1225 may, in response to commands from the game logic, display cinematic game introductions and/or scene transitions to help entice the players into a richer playing experience.
  • the video driver 1225 may also display a cinematic conclusion that may depend on a result of the game.
  • In the display 1300 there are a number of built-in input devices 1312A through 1312H (referred to collectively as "input devices 1312"). In this case, there are eight illustrated built-in input devices (two on each of the four sides of the display 1300), although the display 1300 may have any number of built-in input devices.
  • each of the built-in input devices may be a camera capable of capturing a still or video image, and may be adjustable. Thus, for example, in a game with eight local players, each camera may be adjusted to capture the video of a corresponding player.
  • the display 1300 may include logic that renders the captured video, or portions thereof, on the public display area 1311 of the display 1300.
  • the logic might also cause all or portions of that video to be transmitted to game input devices (such as player consoles) so that the video may also be displayed at the various game input devices.
  • the built-in input devices may fold into the edge of the display 1300.
  • the built-in input devices 1312A, 1312B, 1312E and 1312G are illustrated in a contracted, collapsed (inactive) position within the display 1300, whereas the input devices 1312C, 1312D, 1312F and 1312H are illustrated in an extended position, ready to capture video.
  • each of the built-in input devices 1312 may be a scanner, capable of detecting physical game input provided by a player (such as a roll of the dice, the playing of a card, or the positioning of a game piece).
  • the scanner may include a light-emitting boundary definition mechanism that defines the boundary of an input region using emitted light.
  • the emitted light may be emitted along the perimeter of the input region and/or across the area of the input region.
  • the player may then visualize where the physical game input is to be provided. Once that input is provided, the scanner scans the physical input so that the game input represented by that physical input may be incorporated into the game by game incorporation mechanism 1215.
  • the scanner might be, for example, a three-dimensional image scanner such as those conventionally available on the market.
  • the scanner may be integrated with the camera to form a built-in input device, or they may be separate from each other to allow for independent adjustment of the camera direction and input region positioning.
  • Figure 14 abstractly illustrates an orientation-sensing game input device 1400.
  • the surrounding game input devices 102 of Figures 1 and 12 may be orientation-sensing game input devices, player consoles, game master consoles or a combination thereof.
  • Figure 14 is an example of such an orientation-sensing game input device.
  • Figure 14 is abstract. Accordingly, the various components illustrated as being included within the orientation-sensing device 1400 should not be construed as implying any particular shape, orientation, positioning or size of the corresponding component. Subsequent figures will illustrate a more concrete representation of an example of the orientation-sensing game input device 1400.
  • the orientation-sensing game input device 1400 includes an orientation sensor 1401 that, when active, outputs a spatial orientation signal representing a spatial orientation of the game input device.
  • the orientation sensor 1401 is rigidly attached to the game input device 1400.
  • the orientation sensor 1401 is able to detect how the game input device 1400 is oriented with respect to vertical, and/or how the game input device is oriented with respect to north.
  • the orientation sensor 1401 is an accelerometer.
  • the orientation sensor 1401 may be a compass that generates a direction signal indicating a geographical orientation.
  • the orientation-sensing device may also potentially have a Global Positioning System (GPS) that allows the orientation-sensing device 1400 to detect a global position of the orientation-sensing device 1400 in global coordinates.
  • a transmission mechanism 1402 is communicatively coupled to the orientation sensor 1401 so as to receive the spatial orientation signal from the orientation-sensor 1401 and transmit spatial orientation information present in the spatial orientation signal to the flat multi-touch functional display 1201.
  • the transmission mechanism 1402 may accomplish this using acoustics, but preferably accomplishes this using wireless electro-magnetic radiation.
  • a suitable protocol for transmission of the spatial orientation information is BLUETOOTH®.
  • if the orientation-sensing device 1400 is a multi-sided die, and the orientation sensor 1401 is a tri-axial accelerometer, the spatial orientation signal may indicate or at least include enough information to infer which side of the die is facing up.
  • if the orientation-sensing device 1400 is a playing card or a coin, the spatial orientation signal may indicate or at least include enough information to infer whether the playing card is face up or face down, or which side of the coin is facing up.
  • if the orientation-sensing device 1400 is a domino tile, and the orientation sensor 1401 is an accelerometer, the spatial orientation signal may convey whether the domino tile is face up or face down.
  • if the orientation sensor 1401 also includes a compass, the spatial orientation signal may convey which direction the domino is oriented on the table.
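To make the accelerometer-based inference concrete, the following Python sketch shows one way a tri-axial accelerometer reading might be mapped to the upward-facing side of a six-sided die: the sensing axis most closely aligned with gravity identifies the top/bottom face pair, and its sign picks the face. The face-to-axis assignment is an assumption for illustration; the disclosure does not prescribe any particular algorithm.

```python
# Hypothetical sketch: infer which face of a six-sided die points up
# from a tri-axial accelerometer reading. Assumes each sensing axis is
# perpendicular to a pair of opposite faces, and that the die is at
# rest so the only acceleration measured is gravity (~1 g).

def die_face_up(ax: float, ay: float, az: float) -> int:
    """Return the face (1-6) inferred to be pointing up.

    Face numbering here is an assumption: +z up -> 1, -z up -> 6,
    +x up -> 2, -x up -> 5, +y up -> 3, -y up -> 4 (opposite faces
    of a standard die sum to 7).
    """
    readings = {
        (1, 6): az,   # +z up gives az ~ +1 g; -z up gives az ~ -1 g
        (2, 5): ax,
        (3, 4): ay,
    }
    # The axis with the largest magnitude is the one aligned with
    # gravity, hence perpendicular to the top and bottom faces.
    (face_pos, face_neg), value = max(
        readings.items(), key=lambda item: abs(item[1]))
    return face_pos if value > 0 else face_neg

# Example: the die at rest with its +z axis pointing up.
print(die_face_up(0.02, -0.05, 0.98))  # -> 1
```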
  • the transmission mechanism 1402 may also transmit other useful information.
  • the transmission mechanism may transmit a locally-unique and perhaps globally-unique identifier. This may be especially useful in a case where there are multiple orientation-sensing devices 1400 being used in a game. For instance, if the orientation-sensing devices 1400 were each six-sided dice, the central device could confirm which die was rolled, and the associated rolled value of that specific die, even if multiple dice were rolled.
  • the orientation-sensing device 1400 might also transmit other information identifying characteristics of the device 1400. For instance, if the device 1400 were a coin, the device 1400 might transmit a device type identifier that identifies the device as a coin, and so forth for other types of devices. The device 1400 might also transmit information from which the central device might infer other characteristics of the device as well, such as color, size, shape, which might be helpful where such characteristics have an impact on game state.
  • the device 1400 might transmit information that helps the central display interpret the impact on the game of the orientation of the device 1400. For instance, one die might have a quality of 36, in which case the actual value input by the roll result is 36 times the number rolled. Such quality information may be included with the transmission.
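A hypothetical sketch of what such a transmission might contain follows: a locally (or globally) unique identifier, a device type, the sensed orientation, and quality information such as the multiplier just mentioned. The field names and JSON encoding are assumptions; the disclosure does not define a wire format.

```python
# Hypothetical message an orientation-sensing die might transmit to the
# central display. All field names are illustrative assumptions.

import json

def make_roll_message(device_id: str, face_up: int,
                      multiplier: int = 1) -> bytes:
    message = {
        "device_id": device_id,     # locally (or globally) unique
        "device_type": "die",       # lets the central device interpret it
        "face_up": face_up,         # inferred from the accelerometer
        "multiplier": multiplier,   # "quality" affecting the input value
    }
    return json.dumps(message).encode("utf-8")

# The central device can then compute the effective input value, e.g.
# a die with quality 36 that rolled a 2 contributes 72.
msg = json.loads(make_roll_message("die-A", 2, multiplier=36))
print(msg["face_up"] * msg["multiplier"])  # -> 72
```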
  • the transmission mechanism 1402 includes a reliable transmission mechanism in which transmissions are acknowledged by the central display, or else the information is retransmitted according to a particular protocol.
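A minimal sketch of such a reliable transmission loop follows, assuming a simple acknowledge-or-retransmit protocol; the transport callables, retry count, and timeout are illustrative assumptions, since the disclosure does not name a particular protocol.

```python
# Hypothetical acknowledge-or-retransmit loop: the die retransmits the
# payload until the central display acknowledges it, up to a retry
# limit. The transport functions and timing values are assumptions.

import time

def send_reliably(transmit, await_ack, payload: bytes,
                  retries: int = 5, timeout_s: float = 0.2) -> bool:
    """Transmit payload, retransmitting until acknowledged."""
    for _attempt in range(retries):
        transmit(payload)
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if await_ack():          # True once the ack frame arrives
                return True
        # No ack within the timeout: fall through and retransmit.
    return False
```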
  • Figure 15 illustrates an orientation-sensing die 1500.
  • the orientation sensing die 1500 is a six-sided die.
  • the principles described herein may be applied to any die, regardless of the number of sides. For instance, some dice have as few as four sides, while some commercially available dice have as many as 100 sides.
  • the die includes a multi-sided body 1501 having at least four flat sides (in the illustrated example, six sides).
  • the image on each itself is not illustrated such that some of the internal-embedded components may be more easily seen. That said, the various components are not necessarily drawn to size since the precise size and positioning of the components is not critical, so long as the components fit within the boundaries of the die. Furthermore, if the die is desired to be kept random, the components should be distributed appropriately to keep the center of gravity in the middle of the cube.
  • An orientation sensor 1511 (such as a tri-axial accelerometer) is embedded within the multi-sided body 1501 and is structured to, when active, output a spatial orientation signal representing a spatial orientation of the game input device.
  • a transmission mechanism 1512 is also embedded within the multi-sided body 1501 and communicatively coupled to the orientation sensor 1511 so as to receive the spatial orientation signal and transmit spatial orientation information present in the spatial orientation signal to locations external to the multi-sided body.
  • the orientation sensor 1511 and the transmission mechanism 1512 are a single integrated BLUETOOTH®-enabled tri-axial accelerometer.
  • An electronic power source 1513 is also embedded within the multi-sided body 1501 and is coupled to the orientation sensor 1511 and the transmission mechanism 1512 so as to electronically power the orientation sensor 1511 and the transmission mechanism 1512.
  • the electronic power source 1513 includes a rechargeable battery.
  • the electronic power source 1513 may also be an insertable and removable battery and may even perhaps be disposable.
  • the electronic power source 1513 is a non-rechargeable disposable battery that is not removable from the die.
  • the entire die may be considered disposable, or at least converts to a normal non-transmitting die after the battery fails.
  • the die may have a cavity that fits the battery, and that is accessed by removing a cover that snaps into place.
  • a status indicator 1516 may also be included and may be visible from external to the multi-sided body 1501.
  • the status indicator 1516 may be on the surface of the die 1500. If the multi-sided body 1501 is composed of translucent material, the status indicator 1516 may also be embedded within the multi-sided body 1501 itself. If necessary or desired, a counterweight 1517 may also be positioned rigidly within the multi-sided body 1501 so as to further center a center of gravity of the wireless die.
  • Figure 16 illustrates abstractly a game input mechanism in the form of a scanning device 1600.
  • the scanning device is an example of the game input devices 102 of Figure 12.
  • the scanning device 1600 is drawn abstractly so once again the various components of the scanning device are not limited to any particular size, position, orientation, or form.
  • the scanning game input device 1600 includes a light-emitting boundary definition mechanism 1601, a scanning mechanism 1602, a communication mechanism 1603, and a mechanical support mechanism 1604.
  • the scanning game input mechanism may also have processor(s) 1605 and memory 1606, thus enabling the scanning game input mechanism to at least partially process information captured by the scanning mechanism 1602, control the light-emitting boundary definition mechanism 1601, and/or communicate via the communication mechanism 1603.
  • the light-emitting boundary definition mechanism 1601 defines multiple input regions for a game in which multiple players are engaged. Each of the so-defined input regions is a region on a playing surface in which a corresponding player subset is to provide physical game input. A player subset may be multiple players in a team-oriented game, or may be a single player in a game that does not involve teams. Examples of physical input include 1) the rolling of a die or dice, 2) the playing of one or more cards, 3) the positioning of one or more game pieces, 4) the spinning of a spinner or top, 5) a human hand, and so forth. For instance, in an electronic version of rock, paper, scissors, a human hand might be used to provide game input within the game input region.
  • the light-emitting boundary definition mechanism 1601 may selectively define only one or perhaps a subset of the multiple regions that the mechanism 1601 is capable of defining. For example, in a turn-oriented game in which it is a turn of one or more, but less than all, of the player subsets, the corresponding game input regions for only those player subset(s) whose turn it is might be made visible.
  • Game state transmitted by the central display 1201 and/or the other game input devices 102 might give the scanning game input device information sufficient to derive the identity of whose turn it is, to thereby prompt the scanning game input device 1600 to light the appropriate region corresponding to whose turn it is, while deemphasizing or even not lighting at all the game input region corresponding to player subset(s) whose turn it is not, as sketched below.
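The following sketch illustrates that selective lighting: given the player subsets whose turn it is, only the corresponding regions receive full visual emphasis. The region identifiers and the lighting interface are assumptions for illustration.

```python
# Hypothetical sketch: light only the game input regions belonging to
# the player subsets whose turn it is (derived from transmitted game
# state), and de-emphasize the rest.

def update_region_lighting(regions, active_players, set_intensity):
    """regions: dict mapping region id -> player subset id.
    active_players: set of player subset ids whose turn it is.
    set_intensity: callable(region_id, level) driving the light emitter.
    """
    for region_id, player_subset in regions.items():
        if player_subset in active_players:
            set_intensity(region_id, 1.0)   # full visual emphasis
        else:
            set_intensity(region_id, 0.0)   # de-emphasized or off

# Example: four regions, one per player; it is player 2's turn.
update_region_lighting(
    {"A": 1, "B": 2, "C": 3, "D": 4}, {2},
    lambda r, lvl: print(f"region {r}: intensity {lvl}"))
```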
  • a scanning mechanism 1602 is configured to scan at least some objects placed within any of the plurality of input regions.
  • the game input regions may be non-overlapping or they may be overlapping depending on the design of a game.
  • a communication mechanism 1603 communicates information regarding scanned objects scanned by the scanning mechanism 1602.
  • the scanning game input device 1600 is wireless, in which case the communication mechanism 1603 communicates wirelessly with, for example, the central display 1201 and/or one or more other game input devices 102.
  • the communication mechanism 1603 might simply send image information (e.g., a collection of images of a die) to the central display 1201, and have the central display 1201 extrapolate the three-dimensional rendering of the viewable surfaces, and then calculate the game input.
  • the processor(s) 1605 might take on a greater processing role by extrapolating the three-dimensional rendering of the scanned image, and then the communication mechanism 1603 communicates that three-dimensional rendering to the central display, which then calculates the game input.
  • the processor(s) 1605 might take on all processing required to determine the game input from a scanning operation. For example, the processor(s) 1605 might determine that the player subset rolled two dice, resulting in a roll of a six and a four. These divisions of labor are sketched below.
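The three divisions of labor just described might be summarized as follows. This is an illustrative sketch only; the mode names and helper callables are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the three divisions of labor: the scanner can
# ship raw images, an extrapolated 3-D rendering, or the fully
# interpreted game input; the central display does whatever remains.

from enum import Enum, auto

class ProcessingMode(Enum):
    RAW_IMAGES = auto()    # central display does all interpretation
    RENDERING = auto()     # scanner extrapolates the 3-D rendering
    GAME_INPUT = auto()    # scanner determines the final game input

def scanner_output(images, mode, extrapolate_3d, interpret_input):
    if mode is ProcessingMode.RAW_IMAGES:
        return {"images": images}
    rendering = extrapolate_3d(images)
    if mode is ProcessingMode.RENDERING:
        return {"rendering": rendering}
    return {"game_input": interpret_input(rendering)}  # e.g. "6 and 4"
```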
  • the communication mechanism 1603 might also communicate with player consoles to thereby affect the private game state of the player consoles.
  • the communication mechanism 1603 might additionally communicate with other devices such as, for example, a surrounding computing system (such as a laptop computer), to convey information and/or may receive information from the surrounding computing system (such as configuration information) or from the central display 1201 or other game input devices 102.
  • the mechanical support mechanism 1604 positions the light-emitting boundary definition mechanism 1601 and the scanning mechanism 1602 with respect to a playing surface.
  • the mechanical support mechanism couples the scanning game input device 1600 to the central display 1201, or perhaps couples the game input device 1600 to one of the player consoles.
  • the scanning game input device may not be rigidly coupled to the central display 1201 or the player consoles, but may be free standing.
  • the mechanical support mechanism 1604 may have a different form depending on the configuration of the scanning input system. For instance, if the scanning device 1600 scans from below (which could be done with a translucent playing surface), the mechanical support mechanism 1604 would be configured so that the light-emitting boundary definition mechanism 1601 may light the surface from below, and the scanning mechanism 1602 may scan from below. If the scanning device hangs from the ceiling, or is supported by a wall, the appropriate configuration of the mechanical support mechanism 1604 may be provided. Accordingly, the specific example configurations of Figures 17, 18A and 18B are only a few of the many ways to configure the scanning system in the context of a game. Mirrors or lenses may even be used to direct the flow of light for the light-emitting boundary definition mechanism and/or for the scanning mechanism.
  • the scanning device 1600 may be incorporated into the central display 1201 (if present) or any of the surrounding game input devices 102 without restriction.
  • the scanning device 1600 may even be incorporated into a pair of glasses, a hat, an eyepiece or another mechanism that sits on the player's head. In that case, no light-emitting boundary definition mechanism 1601 may be needed, although it still might be helpful. Rather, the player would know that the scanning mechanism 1602 is scanning a region that is relative to the player's field of view.
  • the light-emitting boundary definition mechanism 1601 might still be helpful though to help the player see the area that is to be scanned since the positioning of the glasses or other headgear, the orientation of the eyeball, and so forth might affect whether the game input region is directly in the player's field of view.
  • the processor(s) 1605 and the memory 1606 may collaborate to determine, at any given point, which player's turn it is.
  • the processor(s) and the memory 1606 may then cause the light-emitting boundary definition mechanism 1601 to provide visual emphasis to the game input region in which physical game input is expected. For instance, the boundaries of the region may be turned green when physical game input is expected.
  • Figure 17 illustrates one embodiment 1700 of the scanning game input device 1600 of Figure 16.
  • the scanning game input device has a light-emitting boundary definition mechanism that defines four game input regions 1702A, 1702B, 1702C and 1702D.
  • each game input region is defined by a dedicated light-emitting boundary definition mechanism positioned within an upper portion 1703 that is supported by base 1701.
  • the boundary definition mechanism may be, for example, a Light Emitting Diode (LED), or any other device capable of defining the game input region by providing visual emphasis to the boundaries of the game input region, and/or by providing visual emphasis over the area of the game input region.
  • While the light-emitting boundary definition mechanism 1601 may define fixed-size boundaries, the light-emitting boundary definition mechanism 1601 may also perhaps be adjustable.
  • the light-emitting boundary definition mechanism may be an array of LEDs. The size and shape of the boundary may be adjusted by turning some of the LEDs off, and keeping some on. Each of the LEDs may be mapped to a particular memory location that turns the LED on or off or toggles it between two discrete intensity levels (in the case of being mapped to a single bit), or that provides more refined adjustable intensity (in the case of being mapped to multiple bits), as sketched below.
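A minimal sketch of such a memory-mapped LED array follows, assuming one byte of intensity per LED and a rectangular array; the array geometry and the perimeter-drawing helper are illustrative assumptions.

```python
# Hypothetical memory-mapped LED array: each LED maps to a location
# whose value is its intensity. A single bit gives on/off (two discrete
# levels); multiple bits (here, a byte) give finer intensity control.

WIDTH, HEIGHT = 16, 16
led_memory = bytearray(WIDTH * HEIGHT)   # one byte (256 levels) per LED

def set_led(x: int, y: int, intensity: int) -> None:
    led_memory[y * WIDTH + x] = intensity  # 0 = off, 255 = brightest

def draw_boundary(x0: int, y0: int, x1: int, y1: int,
                  intensity: int = 255) -> None:
    """Light only the perimeter LEDs of a rectangular input region;
    changing the corner coordinates resizes or reshapes the region."""
    for x in range(x0, x1 + 1):
        set_led(x, y0, intensity)
        set_led(x, y1, intensity)
    for y in range(y0, y1 + 1):
        set_led(x0, y, intensity)
        set_led(x1, y, intensity)

draw_boundary(2, 2, 13, 13)         # a large input region...
draw_boundary(2, 2, 13, 13, 0)      # ...turned off, then
draw_boundary(4, 4, 11, 11)         # ...redrawn smaller
```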
  • the boundaries may be overlapping if desired. Such overlapping may also be a reward for a winning player, and a detriment for a losing player, with the winning player perhaps capturing some benefit by the physical game input of the losing player.
  • the boundary size might be configurable by a user. For instance, a player may choose to have a smaller or larger game input region depending on the player's preference. For instance, a younger player in a dice game might choose to have a larger roll area to accommodate a more aggressive and less controlled roll. An older player might require less of a roll area.
  • the boundary size might also be adjusted by the game state itself. For instance, as a player is losing a game, the player may have a more and more reduced size of a boundary in which to provide physical game input, or perhaps the boundaries may take a particular form that serves to taunt the player that is moving towards a loss. If the player is winning a game, the boundaries may perhaps expand, and/or take a more congratulatory form.
  • the LEDs may be of different colors such that the boundaries take a different color depending on game state. For instance, greener game input regions might designate that the player is winning, whereas redder game input regions might designate that the player is losing. Thus, the players can quickly ascertain and have feedback on how they are doing.
  • Changing of colors of the game input regions may be accomplished by adjusting the proportion of LEDs of particular colors that are turned on and off, and their respective intensity levels.
  • the color of the game input regions may also indicate whose turn it is. For instance, if the color is green, that may mean it is that player's turn; if red or off, it may mean it is not that player's turn.
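The size and color feedback described in the last several paragraphs might be combined as in the following sketch, where a player's standing selects the boundary color (greener when winning, redder when losing) and scales the input region, and the region is lit on the player's turn. The thresholds and weighting are assumptions for illustration.

```python
# Hypothetical mapping from game state to region appearance.

def region_appearance(score: float, best_score: float,
                      is_players_turn: bool):
    """Return (rgb_color, size_scale, lit) for a player's input region.

    score / best_score gives a rough 0..1 standing; winning players get
    green and a larger region, losing players red and a smaller one.
    """
    standing = max(0.0, min(1.0, score / best_score)) if best_score else 0.5
    color = (int(255 * (1 - standing)), int(255 * standing), 0)  # red->green
    size_scale = 0.5 + 0.5 * standing        # 50%..100% of full size
    return color, size_scale, is_players_turn

print(region_appearance(80, 100, True))   # mostly green, ~90% size, lit
print(region_appearance(10, 100, False))  # mostly red, ~55% size, unlit
```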
  • the light-emitting boundary definition mechanism of the scanning device 1600 may be an LED array that directly displays the game input region.
  • the light-emitting boundary definition mechanism 1601 may essentially be a portion of the public display area 1311 of the display 1300 of the central display 1201.
  • the light-emitting boundary definition mechanism 1601 may be all or a portion of the private display area 601 of the player console 600.
  • the boundary definition mechanism 1601 may also be a laser that defines a sharp boundary for the game input region.
  • a window might pop up on a portion of the public display area 1311 that is closer to the player.
  • a scanning device might be positioned in a predetermined location (e.g., integrated with the display 1300) with respect to that window such that the scanning mechanism 1602 may capture the window.
  • the window may include boundaries that make it easier for the scanning mechanism 1602 to recognize the boundaries of the game input region.
  • the content of the window may display a color that contrasts well with the color of the game input so as to optimize scanning accuracy (e.g., if the dice are white, then the window may have darker content).
  • the player provides physical game input directly on the public display area (e.g., rolls the dice onto the public display area 1311) such that the physical game input that occurs within the window is captured by the scanning mechanism.
  • the shape or size of the window may be adjusted in response to game state.
  • a window might pop up on a portion of the private display area 601 corresponding to the player console 600 that belongs to the player whose turn it is.
  • a scanner might be positioned in a predetermined location (e.g., integrated with the player console 600) with respect to that game input region such that the scanning mechanism may capture the window. Then, the player provides physical game input directly on the private display area 601 (e.g., rolls the dice onto the private display area 601) such that the physical game input that occurs within the window is captured by the scanning mechanism associated with the player console.
  • the scanning device 1600 includes a light- emitting boundary definition mechanism that is a portion of a display itself.
  • the term "light-emitting boundary definition mechanism" should be interpreted broadly in the claims.
  • Figure 22 illustrates a player console 2200 with an integrated scanning device 2210 that scans a game input region in the form of a window defined on the private display area of the player console itself.
  • window 2201 might define a game input region in which the player is to enter physical input (e.g., the roll of a die 2204).
  • the window size might change, becoming for example window 2202, depending on game state.
  • the window may be, for example, a window displayed by an operating system on the private display.
  • the scanning device 2210 may be capable of scanning area 2203, although the system may ignore material scanned outside of the window that defines the game input region.
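A sketch of that clipping behavior follows: the scanner may see the larger area 2203, but detections outside the window defining the game input region are discarded. The coordinate and detection formats are assumptions for illustration.

```python
# Hypothetical sketch: ignore material scanned outside the window that
# defines the game input region.

def clip_to_window(detections, window):
    """detections: list of (x, y, label) points from the scanner.
    window: (x0, y0, x1, y1) bounds of the game input region.
    Returns only the detections that fall inside the window."""
    x0, y0, x1, y1 = window
    return [(x, y, label) for (x, y, label) in detections
            if x0 <= x <= x1 and y0 <= y <= y1]

scanned = [(5, 5, "die:4"), (40, 12, "hand")]   # the hand is outside
print(clip_to_window(scanned, (0, 0, 20, 20)))  # -> [(5, 5, 'die:4')]
```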
  • the window may be displayed to have a clear and distinct boundary to make it easier for the scanning device 2210 (or the system that interprets the scanned information) to detect the game input region.
  • Each game input region also has a 3-D scanner associated therewith for scanning the region within the corresponding boundary.
  • Figures 18A and 18B collectively illustrate another alternative embodiment of the scanning game input device 1600 of Figure 16.
  • the scanning game input device of Figures 18A and 18B appears the same as the scanning game input device 1700 of Figure 17.
  • the upper portion 1703 is rotatably mounted to the base 1701.
  • the upper portion 1703 may have as few as a single light-emitting boundary definition mechanism and a single scanning mechanism affixed therein.
  • the scanning game input device of Figures 18A and 18B rotates the upper portion 1703' when transitioning turns. This might be done according to some predetermined pattern, with the players situating themselves to be proximate their corresponding desired game input region.
  • the scanning game input device may first determine whose turn it is next, which may not be according to a predetermined pattern. The scanning game input device may determine this autonomously, or may determine this in communication with the central display and/or one or more of the player consoles.
  • Figure 18A illustrates the scanning game input device with the rotatable upper portion 1703' rotatably mounted on the base 1701', and with the boundary definition mechanism and scanning mechanism rotated to form game input region 1702A'.
  • the upper portion 1703' is rotated to form game input region 1702B'.
  • there may be multiple fixed light-emitting boundary definition mechanisms whereas a rotatable portion includes the scanning mechanism, which rotates to whichever game input region corresponds to the player set whose turn it is.
  • in that case, it may be the boundary color or intensity level that gives visual emphasis to the boundaries or area corresponding to the game input region of the player subset whose turn it is.
  • the scanning mechanism may rotate not to any fixed position, but rather toward wherever it senses the player whose turn it presently is.
  • the scanning game input device may detect the position of the player's player console, and rotate the game input region accordingly by rotating the light-emitting boundary definition mechanism and scanning mechanism.
  • the position of the player console may be determined in a number of ways. For instance, the player console may emit ultrasonic or subsonic acoustic signals that the scanning game input device may acoustically sense. Should GPS coordinate systems become more accurate, the player console may transmit GPS information to the scanning game input device.
  • the position of the player may also be calculated based on the orientation of a camera built into the central display. Thus, if a player moves during the course of the game, the position of their corresponding game input region changes accordingly.
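As a geometric sketch of the rotation just described, the bearing from the scanning device to the sensed position of the player console might be computed as follows; the coordinate conventions are assumptions for illustration.

```python
# Hypothetical sketch: rotate the scanning mechanism toward the player
# console of the player whose turn it is, given a position estimate
# (from acoustic sensing, GPS, or camera orientation, as described
# above).

import math

def rotation_toward(scanner_xy, console_xy) -> float:
    """Return the bearing (degrees, counterclockwise from +x) that the
    rotatable upper portion should turn to so that the game input
    region lands in front of the console."""
    dx = console_xy[0] - scanner_xy[0]
    dy = console_xy[1] - scanner_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Console detected one meter to the scanner's left:
print(rotation_toward((0.0, 0.0), (-1.0, 0.0)))  # -> 180.0
```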
  • the scanning device 1600 might scan any number of physical game input types. For instance, the scanning device 1600 might scan dice, playing pieces, playing cards, spinners, or any other object, even the player himself or herself. For instance, the scanning device 1600 might scan a human hand. This might allow the game state to reflect that the player played a "rock", a "paper", or a "scissors", or even an "ambiguous" gesture. The scanning device 1600 might also scan the hand to identify a number of fingers, or whether the hand is facing up or down, and so forth. The scanning device 1600 might use a hand as input to allow people proficient in sign language to enter letters or words into the game system.
  • the scanning device 1600 might also scan a human face, perhaps to analyze the configuration of the face. For instance, the scanning device 1600 may detect whether the face is smiling, or seems confident, angry, frustrated, or nervous, for purposes of making an inference about the player's emotions. Such emotional feedback may impact game state. For instance, if the player looks nervous, the player may be more subject to attack by computerized players, or may have a reduced size of game input region.
  • a game input device 102 may sense other biometrics of a player such as, for example, oxygen levels in the blood, blink rate, perspiration levels, heart rate, breathing rate, chemical content of exhaled breath, blood pressure, and so forth using any appropriate mechanism, whether through a scanning device or by some other mechanism. Any one or more of the measured biometrics, either singly, or in combination, may be used to calculate an effect on game state.
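One hypothetical way to fold several measured biometrics into a single effect on game state is sketched below; the particular biometrics, their normalization ranges, and the weights are all assumptions, since the disclosure leaves the calculation open.

```python
# Hypothetical combination of biometrics into one game-state modifier.

def nervousness_effect(heart_rate_bpm: float, blink_rate_hz: float,
                       perspiration: float) -> float:
    """Return a 0..1 'nervousness' factor; a game might shrink the
    player's input region or raise attack odds as this grows."""
    hr = min(1.0, max(0.0, (heart_rate_bpm - 60) / 60))   # 60-120 bpm
    blink = min(1.0, blink_rate_hz / 1.0)                 # 0-1 blinks/s
    sweat = min(1.0, max(0.0, perspiration))              # normalized
    return 0.5 * hr + 0.3 * blink + 0.2 * sweat           # weighted mix

print(nervousness_effect(100, 0.6, 0.2))  # -> ~0.55
```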
  • a game input device might also be a scanner positioned to view all or a portion of a playing board (e.g., the central display 1201 or even a non-electronic playing board or surface), and recognize the position and orientation of a game piece with respect to the playing board, or even the type of game piece. Such information may be used to affect the game state.
  • Such a scanner may also be able to detect which game piece or game input device belongs to which player. For instance, dice for one player may have a certain marker, such as an indented piece with a certain color. Playing cards may have a miniature bar code distinguishing whom the cards belong to. A bar code or other marker might also represent other information regarding a playing piece, such as the type of playing piece, the significance of the playing piece, and so forth.
  • Such scanners need not necessarily have the light-emitting boundary definition mechanism if the players intuitively understand where to play the game pieces in a more common area (e.g., on the central display 1201).
  • the game input device 1300 might emit images or other visual cues on the playing surface in response to game input. For instance, if the player were to roll a six, then the game input device may emit on the playing surface a cue telling the user where to move, or what the options are for moving.
  • the scanning device 1600 has been described as potentially having a scanning mechanism 1602 that uses light as the scanning signal. However, the scanning mechanism 1602 might instead use any signal for scanning, such as acoustic signals or safe frequencies of electro-magnetic radiation. Examples of electro-magnetic radiation include visible light, ultraviolet light, infrared light, long-wave and short-wave radio, and so forth.
  • the scanning device 1600 may use combinations of the above to formulate a more complete scanned image of the game input token.
  • the scanning device 1600 might also have any number of different image capture mechanisms. Examples of image capture mechanisms include a CCD camera, a bar code scanner, and a 3-D imaging camera.
  • Figure 19 illustrates an example system 1900 in which there is a central display 1901 (representing an example of the central display 1201 of Figure 12), and four surrounding player consoles 1911, 1912, 1913 and 1914 (each representing an example of the player console 500 of Figure 5).
  • the central display 1901 has a rotating camera 1921 that may turn toward the player whose turn it is, and capture the player's image for display on the central display 1901 and/or one or more or all of the player consoles 1911 through 1914.
  • Each player console 1911 through 1914 is shown equipped with an integrated scanning device 1931 through 1934, respectively.
  • the scanning device represents an example of the scanning device 1600 of Figure 16.
  • a light-emitting boundary definition mechanism associated with the scanning device 1931 is emitting light to define a game input region 1941. In this case, dice have been rolled into the game input region 1941.
  • the scanning device 1931 captures the 3-D image of the dice, and transmits information to the central display 1901 where the roll is incorporated into the game state.
  • One of the player consoles 1914 is shown having a privacy screen 1942, which may be removably attached to the player console 1914, or perhaps may be removably attached to any of the player consoles to provide appropriate privacy.
  • the scanning device 1931 might be turned to focus on the display of the player console 1911.
  • a software-driven window pops up on the display of the player console 1911 showing the player where the player should roll. The player thus would roll the dice directly on the display of the player console 1911, whereupon the scanning device 1931 would capture the physical game input for incorporation into the game state.
  • Figure 20 illustrates another example system 2000 in which there is a central display 2001 (representing an example of the central display 1201 of Figure 12), and three surrounding player consoles 2011, 2012 and 2013 (each representing an example of the player console 500 of Figure 5).
  • the physical game input captured by the scanning device is incorporated into the game state, which in turn gives the player a visual cue 2020 of the available movement options.
  • Figure 21 illustrates a player console 2100 that is similar to the player consoles 1911, 1912, 1913, 1914, 2011, 2012 and 2013, except shown closer up.
  • a scanning device 2102 is shown extended, along with a recess 2103 into which the scanning device might contract, perhaps before or after the game.
  • the scanning device 2102 might automatically extend and contract depending on the game state. For instance, if it is the player's turn, then the player console 2100 may extend the scanning device 2102 in preparation for the player providing game input.
  • the scanning device 2102 remains extended for the duration of the game, and may be manually extendable and contractable.
  • a data and/or power cable 2101 (such as a USB cable) is also shown demonstrating that the player console may integrate with existing data cables and power cables.
  • the distributed game system described herein thus allows circle games to be played electronically.
  • the wireless distributed game system appeals to a teenager's keenness for a sense of technology, which has the potential to pull teenagers back into the family circle games, potentially enriching family relationships and maintaining important lines of communication.
  • the central display 1201 has an Internet connection (represented generally by the ellipses 1227 in Figure 12).
  • the central display may be configured to navigate to a predetermined set of one or more web sites, and may have a predetermined set of circle games installed already. The player might use the central display to navigate to a central web site that may be used to download software necessary to engage in other circle games.
  • the central device may inform the surrounding player consoles of the game that is about to begin and, if necessary, provide the appropriate software to the player consoles as well.
  • the player consoles are general-purpose computing devices with one or more processors, a memory, and potentially a hard disk. Accordingly, a flexible game system has just been described. Having described the embodiments in some detail, as a side-note, the various operations and structures described herein may, but need not, be implemented by way of a physical computing system, such as the computing system 1100 of Figure 11.

Abstract

The projection of an interactive game environment image on one or more surfaces. The interactive game environment image may be a three dimensional image, or may be two dimensional. Data is received that represents virtual objects that are spatially positioned in virtual space. An image is then projected on the substantially horizontal surface that includes a visual representation of all or a portion of the virtual space including one or more of the virtual objects. The system may then detect user interaction with the projected visualized representation of the virtual space, and in response thereto, change the projected visualized representation. Also, a scanning game input mechanism that includes a light-emitting mechanism that defines multiple input regions for a game in which there are multiple players.

Description

INTERACTIVE GAME ENVIRONMENT
CROSS REFERENCE TO RELATED APPLICATIONS
This application is a continuation-in-part of commonly assigned, co-pending application serial number 12/651,947, filed January 4, 2010, entitled Electronic Circle Game System, which application is incorporated herein by reference in its entirety. This application is also a continuation-in-part of commonly assigned, co-pending application serial number 12/411,289, filed March 25, 2009, entitled Wirelessly Distributed Electronic Circle Gaming, which application is also incorporated herein by reference in its entirety.
BACKGROUND
There are a variety of conventional displays that offer an interactive experience. Computer displays, for example, display images, which can be affected by user input to a keyboard, mouse, controller, or another input device. In some cases, the computer display itself acts as an input device using touch or proximity sensing on a display. There are even now multi-touch functional displays that can receive user input from multiple touches simultaneously.
Sometimes, however, the use of such displays tends to discourage some types of conventional social interaction. For instance, games have provided a social context in which people can interact and have fun. One type of game that is particularly engaging socially is "circle" games, where players will gather around a central horizontal play area that is visible to all players, and interact with the central horizontal play area and with each other. Such players are often as few as two (as is the case with chess or checkers), but may be as many as a dozen or more. Board games are circle games in which the board serves as the central horizontal play area. However, there are other circle games that have a central play area that is not a board. For instance, many card games can be played directly on the surface of a table or other flat surface. Many circle games involve the players manipulating objects on or proximate the play area. For example, many circle games require the player to roll dice, start a timer, spin a spinner, play cards, move pieces, and so forth, depending on the game. Circle games have existed for thousands of years across diverse cultures. New circle games arise to meet the social needs and interests of the community while old circle games go out of use as society loses interest. Many believe that circle games provide significantly more opportunity for social development than other types of conventional video games that are strong in popularity in modern times. The contribution of circle games to society should not be ignored, but often is.
Circle games can provide an impetus for bringing families, friends, and other significant social groups together and fostering important human relationships. Children wait with great eagerness to engage with others in circle games. The types of circle games that individuals enjoy may change as one grows older, and may differ between population segments. Nevertheless, circle games draw human beings together with the immediate hope of engaging others in a test of skill, while the horizontal play area provides a subtle and significant side-benefit in permitting channels of communication to be opened, as players are positioned to face each other. Many have experienced that the conversation migrates to topics beyond the scope of the game itself, often resulting in a level of conversation that is greater than particular individuals might be inclined to engage in without the circle game. The benefit to society in encouraging individuals to come together in circle games is often underestimated and not fully recognized in a society in which people choose more and more to absorb themselves into fictional worlds.
BRIEF SUMMARY
Some embodiments described herein relate to the projection of an interactive game environment image on a surface. The interactive image may be a three dimensional image, or may be two dimensional. Data is received that represents virtual objects that are spatially positioned in virtual game environment space. A game environment image is then projected on a surface that includes a visual representation of all or a portion of the virtual space including one or more of the virtual objects. The system may then detect user interaction with the projected visualized representation of the virtual game environment space, and in response thereto, change the projected visualized representation. That interaction may be via an input device, or even more directly via physical interaction with the interactive game environment image. In the case of direct interaction, the user might interact with a virtual object within the game environment image, or with a physical object (such as a game piece or a game board) that is within the space of the projected game environment image. Thus, a user may interact with visualized representations of virtual space enabling complex and interesting interactivity scenarios and applications.
Other embodiments described herein relate to a game input mechanism. The game input mechanism includes a light-emitting mechanism that defines multiple input regions for a game in which there are multiple players. Each of the input regions is a portion of the playing surface in which a corresponding player subset is to provide physical input (such as rolling dice, playing cards, placing game pieces, and so forth) to affect game state. A scanning mechanism scans objects placed within the input regions, while a communication mechanism communicates information regarding the scanned object. The information might, for example, be communicated to affect an electronic game state maintained in another device or distributed across multiple devices.
This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to describe the manner in which the above-recited and other advantages and features can be obtained, a more particular description of various embodiments will be rendered by reference to the appended drawings. Understanding that these drawings depict only sample embodiments and are not therefore to be considered to be limiting of the scope of the invention, the embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Figure 1 abstractly illustrates a distributed electronic game system;
Figure 2 abstractly illustrates an interactive image projection system that represents an example of the interactive image projection system of Figure 1;
Figure 3 illustrates an example embodiment of a virtual space that includes virtual objects; Figure 4 abstractly illustrates an image generation system with which the interactive image projection system may operate;
Figure 5 abstractly illustrates a player console that represents an example of an input device of Figure 1;
Figure 6 illustrates a concrete example of a player console;
Figure 7 illustrates another concrete example of a player console in the form of a game master player console;
Figure 8 illustrates a flowchart of a method for projecting an interactive image on a surface;
Figure 9 illustrates a concrete example of the interactive image projection system in which multiple projectors are operating and that does not use intervening optics in the projection or scanning operations;
Figure 10 illustrates another concrete example of the interactive image projection system in which a single projector is operating, and which does use intervening optics in the projection operation;
Figure 11 illustrates a computing system architecture in which the principles described herein may be employed in at least some embodiments;
Figure 12 abstractly illustrates a distributed electronic game system that includes a central display;
Figure 13 illustrates a more concrete example of the central display of Figure 12;
Figure 14 abstractly illustrates an orientation-sensing game input device that may be an example of a game input device of Figure 12;
Figure 15 illustrates a specific concrete example of an orientation-sensing game input device in the form of an orientation-sensing die;
Figure 16 schematically illustrates components of a scanning game input device;
Figure 17 illustrates one embodiment of the scanning game input device of Figure 16 in which multiple game input regions are simultaneously defined;
Figure 18A illustrates another embodiment of the scanning game input device of Figure 16 in which one game input region at a time is defined according to whose turn it is to provide physical game input; Figure 18B illustrates the scanning game input device of Figure 18A, after having rotated the scanning mechanism to capture physical game input from another game input region;
Figure 19 illustrates an example system with a central display and surrounding player consoles in which each of the central display and the surrounding player consoles has an integrated scanning device;
Figure 20 illustrates an example system with a central display and with surrounding player consoles that each have an integrated scanning device, and with game state responding to physical game input in the form of a die roll;
Figure 21 illustrates a player console with integrated scanning device that represents a closer view of the player consoles illustrated in Figures 19 and 20; and
Figure 22 illustrates a player console with an integrated scanning device that scans a game input region in the form of a window defined on the private display area of the player console itself.
DETAILED DESCRIPTION
Some embodiments described herein (referred to herein as the "interactive projection" embodiments) relate to the projection of an interactive game environment image on a surface. The interactive game environment image may be two dimensional, or may even include three-dimensional image information, such that the image may be viewed as a three-dimensional image with appropriate aids. Data is received that represents virtual objects that are spatially positioned in virtual space. The game environment image is then projected on the surface that includes a visual representation of all or a portion of the virtual game environment space including one or more of the virtual objects. The interactive image projection system may then detect user interaction with the projected visualized representation of the virtual game environment space, and in response thereto, change the projected visualized representation, and perhaps cause a permanent change to game state.
Other embodiments described herein (referred to herein as the "light emitting boundary" embodiments) relate to a game input mechanism that includes a light-emitting mechanism that defines multiple input regions for a game in which there are multiple players. Each of the input regions is a portion of the playing surface in which a corresponding player subset is to provide physical input (such as rolling dice, playing cards, or placing game pieces, and so forth) to affect game state. A scanning mechanism scans objects placed within the input regions, while a communication mechanism communicates information regarding the scanned object. The information might, for example, be communicated to affect an electronic game state maintained in another device or distributed across multiple devices.
First, the interactive projection embodiments will be described with respect to Figures 1 through 11. Then, the light emitting boundary embodiments will be described with respect to Figure 12 through 22.
Although not required, the interactive projection embodiments may be especially useful in an electronic game system. Figure 1 abstractly illustrates a distributed electronic game system 100. The system 100 includes an interactive image projection system 101. The interactive image projection system 101 projects a game environment image 111 onto a surface. Through unique features of the system 100 described hereinafter, the projected game environment image 111 is made to be interactive.
In one embodiment, the surface is a substantially horizontal surface, in which case the game environment image 111 is projected downwards on the surface. As an example, the substantially horizontal surface may be a table top, counter top, a floor, a game board, or any other substantially horizontal surface. In this description and in the claims, a "substantially horizontal" surface may be any surface that is within 30 degrees of horizontal. In this description and in the claims, a "more precisely horizontal" surface may be any surface that is within 5 degrees of horizontal.
In another embodiment, the surface may be a more complex surface. For instance, the surface on which the interactive game environment image 111 may be projected may include a combination of a substantially horizontal surface and a substantially vertical surface. In this description and in the claims, a "substantially vertical" surface may be any surface that is within 30 degrees of vertical. In this description and in the claims, a "more precisely vertical" surface may be any surface that is within 5 degrees of vertical. As an example, the complex surface might include a floor or table area (or a game board) as a substantially horizontal surface, and a wall as a substantially vertical surface. The substantially vertical surface might also be a translucent material (such as glass). Other examples of complex surfaces may include textured surfaces, as well as surfaces with a topology.
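A minimal sketch of the angular definitions above (within 30 degrees for "substantially", within 5 degrees for "more precisely", with tilt measured from the horizontal plane) follows; the classification helper itself is an illustrative assumption.

```python
# Minimal sketch of the "substantially"/"more precisely" horizontal and
# vertical definitions given in the text. Tilt is measured in degrees
# from the horizontal plane (0 = flat, 90 = upright).

def classify_surface(tilt_degrees: float) -> str:
    t = abs(tilt_degrees) % 180
    t = min(t, 180 - t)                 # fold to 0..90 from horizontal
    if t <= 5:
        return "more precisely horizontal"
    if t <= 30:
        return "substantially horizontal"
    if t >= 85:
        return "more precisely vertical"
    if t >= 60:
        return "substantially vertical"
    return "neither"

print(classify_surface(3))    # -> more precisely horizontal
print(classify_surface(25))   # -> substantially horizontal
print(classify_surface(88))   # -> more precisely vertical
```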
The interactive game environment image 111, as projected by the interactive image projection system 101 onto the surface, represents an interactive game environment area in which one or more players may interact either through a player console, or directly via the image itself. However, the interactive image 111 might also be a collaboration area, a work area, or any other type of interactive area. In the remainder of this description, the system 100 is often described as being a game in a particular example. In that case, the user(s) would each be a player, and the interactive area 111 would be an interactive play area. The principles described herein may apply to any environment in which one or more users interact with a projected image on a surface.
Since Figure 1 is abstract only, the interactive image projection system 101 and the interactive game environment image 111 are only abstract representations. Subsequent figures will illustrate a more concrete representation of an example of the interactive image projection system 101 and the interactive game environment image 111.
Optionally, the system 100 also includes surrounding control devices (also called herein "input devices"). There are eight such input devices 102A through 102H illustrated in Figure 1, although the ellipses 102I represent that there may be fewer than or more than eight control devices. The input devices 102 are each represented abstractly as rectangles although they will each have a particular concrete form depending on their function and design. Example forms are described further below. In the context of a game, for example, the input devices 102 may be player consoles. However, the input devices 102 are optional only. Instead of providing input through the input devices 102, the users may instead provide input through direct contact with the interactive game environment image 111 using, for example, a finger, or manipulating physical game pieces positioned within the interactive game environment image, or perhaps rolling dice or playing cards within the interactive image. The interactive image projection system 101 is capable of responding to multiple simultaneous instances of users interacting with the interactive game environment image 111. Thus, input into the system 100 may be achieved using either one or more of the input devices 102 and/or by direct interaction with the interactive game environment image 111. In this manner, the users may affect game state.
In one embodiment, one, some, or even all of the input devices 102 are wireless. In the case of a wireless input device, the wireless input device may communicate wirelessly with the interactive image projection system 101. One or even some of the input devices 102 may be remotely located from the interactive image 111. Such remotely located game input device(s) may perhaps communicate with the interactive image projection system 101 over a Wide Area Network (WAN) such as the Internet. That would enable a user to participate in the interactive image 111 even if that player is located in a completely different part of the globe. Thus, for example, a father or mother stationed overseas might play a child's favorite board game with their child before going to bed. Or perhaps former strangers and new friends from different cultures around the globe might engage in a game, potentially fostering cross-cultural ties while having fun. That said, perhaps all of the game input devices 102 may be local (e.g., in the same room) to the interactive image projection system 101. In yet another embodiment, there are no game input devices 102. Regardless of whether there are input devices 102 or not, the user might directly interact with the interactive game environment image 111.
Figure 2 abstractly illustrates an interactive image projection system 200 that represents an example of the interactive image projection system 101 of Figure 1. The system 200 is illustrated as including an output channel 210 that projects an image (such as interactive game environment image 111) onto a surface. The output channel 210 includes several functions, including image preparation and projection. Image preparation is performed by an image preparation mechanism 211, and projection of the image is performed by projector(s) 212, which include at least one projector 212A, with the ellipses 212B representing that there may be more than one projector in the output channel 210 of the interactive image projection system 200.
The image preparation mechanism 211 receives an input image 201 and supplies an output image 202 in response to receiving the input image. The input image 201 may be provided by any image generator. As an example, the input image 201 might be provided by a video game console, a rendering program (whether two-dimensional or three-dimensional), or any other module, component, or software capable of generating an image.
The input image 201 represents one or more virtual objects that are spatially positioned in a virtual game environment space. As an example, the virtual space may represent a battleground with specific terrain. The battleground is represented in a computer, and need not represent any actual battleground. Other examples of virtual space might include a three-dimensional representation of the surface of the moon, a representation of a helium atom, a representation of a crater of a fictional planet, a fictional spacecraft, outer space, a fictional subterranean cave network, and so forth. Whether representing something real or imagined, the virtual game environment space is created by a computer programmer, either directly or indirectly through logic that generates the virtual space.
Virtual objects are placed in the virtual game environment space also by a computer programmer (or indirectly by logic created by the programmer), and may represent any object, real or imagined. For instance, a virtual object might represent a soldier, a tank, a building, a fictional anti-gravity machine, or any other possible object, real or imagined.
Figure 3 illustrates an example of a virtual game environment space 300. In this example, the virtual game environment space 300 includes objects 301, 302, 303 and 304. In this case, the virtual game environment space 300 is three-dimensional, such that the objects 301, 302, 303 and 304 are each represented as three-dimensional objects having a specific shape and positioning within the virtual three-dimensional space. This virtual space 300 may be used to formulate an image representation of a certain portion and/or perspective of the virtual game environment space. The output image 202, as projected, includes a visual representation of at least part of the virtual space, including a visual representation of at least some of the virtual objects. For instance, if the virtual space included terrain for the inside of a crater, the projected image may show a visual representation of a portion of that crater, with virtual objects that might include several crater monsters, soldiers that are members of the same team, weapons that are strewn about and ready to be picked up, and so forth. If the virtual space were a city, the visualized representation might be a portion of the city and include vehicles, buildings, people, and so forth.
The image preparation mechanism 211 may perform any processing on the input image 201 to generate the output image that is ultimately projected by the one or more projectors 212. As an example, the image preparation mechanism 211 may simply pass through the input image 201 such that the output image 202 is identical to the input image 201. The image preparation mechanism might also change the format of the image, change the resolution of the image, compress the image, decrypt the image, select only a portion of the image, and the like. If multiple projectors are being used, the image preparation mechanism 211 may select which portion (also referred to as a "subimage") of the input image is to be projected by each projector, such that when the images are projected by each projector, the collective whole of all of the projected images appears as a single image on the surface where the images are projected. This is referred to herein as stitching.
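A minimal sketch of the stitching selection described above, assuming a simple side-by-side projector layout with no overlap at the seams (a real system would likely also blend overlapping edges); all names are illustrative:

def split_for_projectors(image, num_projectors):
    """Split an image (a list of pixel rows) into vertical strips, one per projector.

    Assumes projectors are arranged side by side with no overlap; a real
    system would also account for blending regions at the seams.
    """
    width = len(image[0])
    strip = width // num_projectors
    subimages = []
    for p in range(num_projectors):
        left = p * strip
        right = width if p == num_projectors - 1 else left + strip
        subimages.append([row[left:right] for row in image])
    return subimages

# A 4x8 dummy image split across two projectors yields two 4x4 subimages.
image = [[(r, c) for c in range(8)] for r in range(4)]
left_half, right_half = split_for_projectors(image, 2)
assert len(left_half[0]) == len(right_half[0]) == 4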
The image preparation might also take into consideration appropriate adjustments given the surface on which the output image 202 is to be projected, or any intervening optics. For instance, if the surface is a complex surface, the image preparation mechanism 211 may adjust the image such that the image appears properly on the surface. The user might configure the image preparation mechanism 211 with information regarding the surface. Alternatively or in addition, the system 200 may be configured to enter a discovery phase upon physical positioning that identifies the characteristics of the surface in relation to the projection mechanism. As an example, if the surface is a combination of horizontal and vertical surfaces, the image preparation may take into consideration the distances and the angles of the surface to make sure that the image appears proportional, as intended, on each surface. Thus, the image preparation mechanism 211 may make appropriate geometrical adjustments to the image so that the image appears properly on the surface. Other examples of complex surfaces include spherical surfaces, surfaces that represent a topology (as in a complex terrain with various peaks and valleys), cylindrical surfaces, and surfaces that include convex and/or concave portions. In the case in which the image is to pass through optics such as lenses and mirrors, the image preparation mechanism 211 may consider the presence of such optics.
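The geometrical adjustment described above amounts to a projective warp of the frame before projection. As a hedged sketch only (the embodiments do not prescribe any particular library), the widely available OpenCV routines can pre-distort a frame given four corner correspondences; the corner coordinates below are hypothetical values that a discovery phase might produce:

import numpy as np
import cv2  # OpenCV; one possible tool for the projective warp

# Corners of the frame as rendered (source), and where those corners must
# land in projector space so the image appears undistorted on the surface.
# These eight points are hypothetical calibration measurements.
src = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])
dst = np.float32([[40, 10], [1240, 0], [1280, 720], [0, 700]])

H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography

frame = np.zeros((720, 1280, 3), dtype=np.uint8)        # stand-in for the input image
prepared = cv2.warpPerspective(frame, H, (1280, 720))   # pre-distorted output image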
In addition to image preparation and projection, the system 200 may also output various signals. For instance, the system 200 may output audio, such as perhaps the audio of the video game console that provides the input image 201. The system 200 may output wired or wireless signals to the input devices 102, perhaps causing some private state to be altered at the input devices 102. In addition, if there is a central display that displays a game environment image (such as the interactive central display of the light emitting boundary embodiments described with respect to Figure 12 and subsequent figures) (hereinafter referred to simply as the "central display"), the system 200 may dispatch information in a wired or wireless fashion to the central display.
As described above, user input may be provided through interaction with an input device (such as one of the input devices 102 of Figure 1) and/or through direct interaction of a real object (such as a human finger, a game piece, a game board, a central display, or the like) within the area of the interactive game environment image 111. If there is to be direct interaction to provide input, the interactive image projection system 200 may also include an input channel 220.
The input channel 220 includes a scanning mechanism 221 configured to scan the area covered by the projected game environment image to determine one or more positions of a real interactivity input object.
As an example, suppose that the output game environment image 202 of Figure 2 includes just two-dimensional information. In that case, for each image frame, the projector(s) 212 projects the image. Then, after that frame is projected, during a short period in which the image is not projected, the scanning mechanism may scan the area in which the last frame was projected. This projection and scanning process is then repeated for the next frame image, and for the next, and so on. Even though projection and scanning do not happen at the same time (with scanning happening between image frame projections), they happen at such a high frequency that the projected image seems to have continuous motion. Furthermore, even though the projected image is not always present, the period of time that the projected image is not present is so short, and occurs at such a frequency, that it gives the illusion to the human observer that the projected image is always present. Thus, real objects have the appearance of occupying the same space as the projected image.
As another example, the output image 202 of Figure 2 may represent three-dimensional information. In that case, for each image frame, the projector(s) 212 may project a left eye image intended for the left eye, and a right eye image intended for the right eye. When appropriate aids are present that allow the left eye of a human observer to receive the left eye image (but not the right eye image), and that allow the right eye of that same human observer to receive the right eye image (but not the left eye image), the image can be observed by the human mind as being truly three-dimensional. 3-D glasses are an appropriate aid for enabling this kind of eye-specific light channeling, but the principles of the present invention are not limited to the type of aid used to allow a human observer to conceptualize three-dimensional image information.
In one example, the projection of the left eye image and the right eye image are interlaced, with each being displayed at a frequency at which continuous motion is perceived by a human observer. Typically, 44 frames per second is the threshold above which an average human observer cannot distinguish discrete changes between frames, but instead perceives continuous motion. Thus, a system that operates at 120 Hz, and which interlaces a left eye image and a right eye image, each at 60 Hz, will suffice to formulate the appearance of continuous three-dimensional motion. At periodic times between frames, the scanning mechanism 221 may scan for real objects in the scope of the projected image. In a projection system that operates at 120 Hz, for example, the scanning may occur between every frame at 120 Hz, or perhaps between every other frame at 60 Hz, or perhaps at some other interval. That said, the principles described herein are not limited to any particular frame rate for projection or sampling rate for scanning.
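The timing relationship described above can be sketched as a simple schedule. In the illustrative Python below, even slots carry the left eye image and odd slots the right eye image, with a scan window assumed to occupy the tail of each slot; the 80/20 split of the slot is an assumption of the sketch, not something the embodiments require:

import itertools

REFRESH_HZ = 120             # slot rate: one projected frame per slot
SLOT = 1.0 / REFRESH_HZ      # roughly 8.3 ms per slot

def slot_schedule():
    """Yield (time_s, action) pairs for an interlaced stereo system.

    Even slots carry the left eye image and odd slots the right eye image,
    so each eye sees 60 Hz; a brief scan window occupies the tail of every
    slot, so scanning also occurs at 120 Hz, between frame projections.
    """
    for n in itertools.count():
        t = n * SLOT
        yield (t, "project_" + ("left" if n % 2 == 0 else "right"))
        yield (t + 0.8 * SLOT, "scan")   # scanning in the gap between frames

for time_s, action in itertools.islice(slot_schedule(), 6):
    print(f"{time_s * 1000:6.2f} ms  {action}")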
The input channel 220 of the interactive image projection system 200 may also include an input preparation function provided by, for example, an input preparation mechanism 222. This mechanism 222 may take the input provided through the scanning process and provide it in another form recognizable by a system that generates the input image 201 (such as perhaps a conventional video game system). For instance, the input preparation mechanism 222 may receive information from the scanning mechanism 221 that allows the input preparation mechanism 222 to recognize gestures and interaction with virtual objects that are visualized. The input preparation mechanism might recognize the gesture and correlate that gesture to a particular input. The input preparation mechanism 222 may consider the surface configuration, as well as any optics (such as mirrors or lenses) that may intervene between the surface and the scanning mechanism 221.
As an example, suppose that the projected image is of a game board, with pieces placed on the game board. The user might reach into the projected image, touch a projected game piece with a finger (or, more accurately stated, "simulate touching," since the projected game piece is just a projection), and move that game piece from one location of the projected game board to another, thereby advancing the game state of the game, perhaps permanently. In that case, the movement may occur over the course of dozens or even hundreds of frames, which pass in but a small moment from the user's perspective. The input preparation mechanism 222 recognizes that a human finger has reached into the space that is occupied by the projected image, and has intersected the space that is occupied by the visualization of the game piece. If the image were a three-dimensional image, the input preparation mechanism would monitor the position of the user's finger in three-dimensional space, and have a concept for the three-dimensional position of the virtual game piece. The game piece is just a projected portion of the image, and thus the user would not feel a game piece. Nevertheless, the input preparation mechanism 222 recognizes that the user has now indicated an intent to perform some action on the projected game piece.
In subsequent frames, the input preparation mechanism 222 recognizes slight incremental movement of the finger, which represents intent to move the game piece in the same direction and magnitude as the finger moved. The input preparation mechanism knows what commands to issue to cause the image generator to move the projected game piece in the virtual game environment space. The changes can be almost immediately observed in the projected image. This occurs for each frame until the user indicates an intent to no longer move the game piece (perhaps by tapping the surface on which the image is projected at the location at which the user wishes to deposit the projected game piece). The appearance to the player would be as though the player had literally contacted the game piece and caused the game piece to move, even though the game piece is but a projection. Accordingly, the system may move projected objects. Other actions might include resizing, re-orienting, changing the form, or changing the appearance of the virtual object that the user interacted with.
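A toy sketch of the per-frame grab-and-drag logic just described, with hypothetical names, coordinates, and hit radius (none of which are taken from the embodiments):

class VirtualPiece:
    def __init__(self, x, y):
        self.x, self.y = x, y

def hit_test(piece, finger, radius=0.5):
    """True if the finger position intersects the piece's projected footprint."""
    return (piece.x - finger[0]) ** 2 + (piece.y - finger[1]) ** 2 <= radius ** 2

def drag_step(piece, prev_finger, cur_finger):
    """Apply one frame's incremental finger motion to the grabbed piece."""
    piece.x += cur_finger[0] - prev_finger[0]
    piece.y += cur_finger[1] - prev_finger[1]

piece = VirtualPiece(2.0, 3.0)
finger_path = [(2.1, 3.1), (2.4, 3.3), (2.9, 3.6)]   # per-frame scanned positions
if hit_test(piece, finger_path[0]):                   # "grab" on first contact
    for prev, cur in zip(finger_path, finger_path[1:]):
        drag_step(piece, prev, cur)                   # one move command per frame
print((piece.x, piece.y))   # the piece followed the finger: approximately (2.8, 3.5)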
The interactive image projection system 200 may interface with a conventional image generation system to enable the appearance of an interactive projected image. After all, the system 200 receives the image generated by the external system, performs any additional processing within the image preparation mechanism 211, and then projects the result. The external image generation system simply generates the image in the same manner as if the image were to be displayed on a conventional display. Furthermore, the external image generation system receives commands as it is accustomed to receiving them, to thereby effect a permanent change to the game state and advance progress through the game. The external image generation system acts the same no matter how complex the systems used to generate the commands. Whether the input was generated by a conventional hand-held controller, or through the complexity of the input channel 220, the external image generation system will act the same.
In addition to preparing input information for the external image generation system, the input channel 220 may also provide information for other surrounding devices, such as, for example, any one or more of the input devices, or perhaps the central display, thereby altering the state of any of these devices, and allowing these devices to participate in the game state alterations caused by the player interacting with the projected image.
As a further example, the user may interact with physical objects within the area of the projected game environment image. These physical objects are not virtual, but are real, and thus can be felt by the player as they interact with the physical object.
For instance, the physical object may be an actual physical game board. The input channel 220 may recognize the configuration of the game board and interpret player gestures (such as the movement of a physical game piece, or the interaction with a virtual object) with reference to the physical game board. For instance, in a game of MONOPOLY, a physical MONOPOLY board may be placed within a projected image that might include virtual objects such as, for example, virtual chance and community chest cards, virtual houses and hotels, and perhaps a combination of real and virtual game pieces (according to player preference configured at the beginning of a game). A player might tap on a property owned by that player, which the input channel may interpret as an intent to build a house on the property. The input channel 220 might then coordinate with any external image generation system and the output channel 210 to cause an additional virtual house to appear on the property (with perhaps some animation). In addition, the input channel 220 may coordinate to debit the account of that player by the cost of a house. In addition, information may be transmitted to the user's personal input device 102 to allow the personal input device 102 to update with a new account balance.
As another MONOPOLY example, the player might roll dice at the beginning of the player's turn. The input channel 220 may recognize what was rolled and cause the projected image to highlight the position that the player's game piece should move to. If the player has a virtual game piece, then the system might automatically move (with perhaps some animation) the virtual game piece, or perhaps have the player move the virtual game piece through direct interaction (perhaps configured by the player to suit his/her preference). In response, the system might transmit a prompt to the user's input device, requesting whether the user desires to purchase the property, or notifying the user of rent owed. In one embodiment, the output channel 210 not only projects images, but also responds to an external game system to provide appropriate output to appropriate devices. For instance, the output channel 210 might recognize that the external game system is prompting the current player as to whether to purchase the property. The output channel 210, in addition to projecting the appropriate game environment image, may also transmit an appropriate prompt to the player's input device 102.
In yet a further example, the central display may provide a displayed image and be positioned within the projected image of the image projection system 101. Thus, a projected image may be superimposed upon an image displayed by the central display.
Thus, the principles described herein may take a conventional system and allow for a unique interaction with a projected image. That said, the principles described herein are not limited to operation with a conventional game environment image generation system. Figure 4 abstractly illustrates an image generation system 400, which may be used to generate the input image 201 of Figure 2. In one embodiment, the image generation system 400 may be a conventional video game system which outputs an image that might, for example, change as a player progresses through the video game. However, one, some, or perhaps even all of the functions described as being included within the image generation system 400 may be performed instead within the interactive image projection system 101.
The image generation system 400 includes logic 411, an image generation mechanism 412, and an input interface 413. The logic 411 and/or the image generation mechanism 412 have a concept for the virtual space in which the logic 411 operates. The image generation mechanism 412 generates an image that is appropriate given a current state 414 of the logic 411. The input interface 413 receives commands that may alter the state 414 of the logic 411, thereby potentially also affecting the image generated by the image generation mechanism 412. The game state may even be permanently altered from one stage to the next as the players advance through the game. In such systems, images can be generated at such a rate that continuous motion is perceived. There may be a bi-directional channel of communication between the image generation system 400 and the interactive image projection system 200. The bi-directional channel may be wired or wireless, or perhaps wired in one direction and wireless in the other. Input commands are typically less data-intensive than images, and thus the communication channel from the interactive image projection system 200 to the image generation system 400 may be wireless. The channel from the image generation system 400 to the interactive image projection system 200 may also be wireless, provided that the bandwidth of the channel in that direction is sufficient.
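The division of labor in Figure 4 can be caricatured in a few lines of illustrative Python, in which the dictionary, the command handler, and the stub renderer loosely play the roles of state 414, input interface 413, and image generation mechanism 412 (all names are invented for illustration, not taken from the embodiments):

class ImageGenerationSystem:
    """Toy analogue of Figure 4: logic with state, an input interface,
    and an image generator that renders whatever the current state is."""

    def __init__(self):
        self.state = {"pieces": {"token": (0, 0)}}   # stands in for state 414

    def input_interface(self, command):
        # Commands alter the state, exactly as a hand-held controller would.
        if command[0] == "move":
            _, name, dx, dy = command
            x, y = self.state["pieces"][name]
            self.state["pieces"][name] = (x + dx, y + dy)

    def generate_image(self):
        # Stand-in for rendering: return a description of the frame.
        return f"frame showing {self.state['pieces']}"

system = ImageGenerationSystem()
system.input_interface(("move", "token", 1, 2))   # could come from input channel 220
print(system.generate_image())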
The image projection system 101, and/or any of the surrounding game input devices 102, may have built-in microphones to allow sound data (such as the player's voice) to be input into the image generation system 400 to affect the state 414. There may also be voice recognition capability incorporated into the interactive image projection system 101 and/or the surrounding game input devices 102 to permit such sound data to be converted to a more usable form. Speakers, headset ports, and earpieces may also be incorporated into the surrounding input devices 102.
Figure 5 abstractly illustrates a player console 500. As previously mentioned, the input devices 102 of Figure 1 may be player consoles in the context in which the system 100 is a game environment. Figure 5 shows functional components of the player console 500. Once again, Figure 5 is abstract. Accordingly, the various components illustrated as being included within the player console 500 should not be construed as implying any particular shape, orientation, positioning or size of the corresponding component. Figure 6 will illustrate a more concrete representation of an example of the player console 500.
Each player, or perhaps each player team, may have an associated player console. The player console 500 includes a private display area 501 and game logic 502 capable of rendering at least a private portion of game state 503 associated with the player (or team). The player or team may use an input mechanism 504 to enter control input into the player console. A transmission mechanism, illustrated in the form of a transceiver 505, transmits that control information to the interactive image projection system 200 and/or to the image generation system 400, where the control information is used to alter the state 414 of the logic 411 used to generate the image.
Figure 6 illustrates a concrete example of a player console 600. Here, the private display area 601 displays the player's private information (in this case, several playing cards). The player console 600 also includes a barrier 602 to prevent other players from seeing the private game state displayed on the private display area 601. The private display area 601 may be touch-sensitive, allowing the player to interact with physical gestures on the private display area 601, thereby causing control information to update the rendering on the private display area and the game states on the player console 600, as well as on the central display 101. The private display area 601 also, in this example, displays video images 603A, 603B and 603C of other players.
In one embodiment, at least one of the player consoles is different from the remaining player consoles. Figure 7 illustrates such a player console 700. In this case, the player console might be a game master console 700, in which the game master may interface with the private viewing area to perhaps control game state. For instance, the game master may use physical gestures on the touch-sensitive display 701 of the game master console 700 to affect what is displayed within the interactive game environment image 111. For instance, the game master might control what portions of the map are viewable within the interactive game environment image 111. The game master might also control what effect another player's actions might have on the operation of the game logic. The game master might also create a scenario and setting of a game using the game master console 700.
Figure 8 illustrates a flowchart of a method 800 for projecting an interactive game environment image on a surface. The system receives data (act 801) representing virtual objects that are spatially positioned in a virtual space. An example of such data is an image in which such virtual objects are represented. The image is then projected (act 802) on a surface in response to the received data. The projected image includes a visual representation of at least part of the virtual space. The system then detects user interaction (act 803) with the visualized representation. In response to that user interaction, the projected image is then altered (act 804).
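Expressed as an illustrative skeleton (every callable below is a stand-in for the mechanisms already described, not an implementation of them), method 800 reduces to a receive/project/detect/alter loop:

def run_interactive_projection(frames, project, detect_interaction, alter_image):
    """Skeleton of method 800, one pass per frame.

    frames yields data representing virtual objects positioned in a
    virtual space (act 801); project renders it on the surface (act 802);
    detect_interaction scans for user input (act 803); alter_image applies
    the interaction so the projection reflects it (act 804).
    """
    for image in frames:
        project(image)                          # act 802
        interaction = detect_interaction()      # act 803
        if interaction is not None:
            project(alter_image(image, interaction))  # act 804

# Demo with stub functions: one frame, no interaction detected.
run_interactive_projection(
    frames=[{"objects": ["crater", "crater monster"]}],
    project=print,                      # "projection" is just printing here
    detect_interaction=lambda: None,
    alter_image=lambda img, evt: img,
)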
Figure 9 illustrates a concrete example 900 of the interactive image projection system 101 in which multiple modules 902A through 902E are mounted to a stand 901. Each module 902A through 902E includes a projector and a corresponding camera (not shown), which would be in the lower surface of each module 902A through 902E. The projectors project the images downward towards a floor on which the stand 901 is situated. These projectors would each project a corresponding subimage, each processed such that the projected image is stitched together to appear as a single image on the floor. The cameras scan the area of the projected image for user interaction in the area of the projected image. The example of Figure 9 does not use intervening optics in the projection or scanning operations.
Figure 10 illustrates another concrete example 1000 of the interactive image projection system 101 in which a single projector is operating, and which does use intervening optics in the projection operation. The interactive image projection system 1000 includes a housing that includes a rigid base 1001 that is situated on a substantially horizontal surface. A projector mechanism 1011 projects a single image upward through a lens to be reflected off of a curved mirror 1012, through windows 1013, and downward onto the substantially horizontal surface on which the base 1001 is placed. The images are prepared to account for the intervening lenses and mirrors used to direct the image onto the horizontal surface. Four cameras (of which three, 1021A through 1021C, are visible in Figure 10) are positioned around the upper circumference of the system 1000. Such cameras scan the area of the projected image.
Accordingly, an interactive game environment image projection system has just been described. Having described the embodiments in some detail, as a side-note, the various operations and structures described herein may, but need not, be implemented by way of a physical computing system. Accordingly, to conclude this description, an example computing system will be described with respect to Figure 11. The computing system 1100 may be incorporated within the interactive image projection system 101, within one or more of the input devices 102, and/or within the image generation system 400.
Figure 11 illustrates a computing system 1100. Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered a computing system. In this description and in the claims, the term "computing system" is defined broadly as including any device or system (or combination thereof) that includes at least one processor, and a memory capable of having thereon computer-executable instructions that may be executed by the processor. The memory may take any physical form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems.
As illustrated in Figure 11, in its most basic configuration, a computing system 1100 typically includes at least one processing unit 1102 and memory 1104. The memory 1104 is physical system memory, which may be volatile, non-volatile, or some combination of the two. The term "memory" may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well. As used herein, the term "module" or "component" can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
In the description above, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated computing system that performs the act direct the operation of the computing system in response to having executed computer-executable instructions. An example of such an operation involves the manipulation of data. The computer-executable instructions (and the manipulated data) may be stored in the memory 1104 of the computing system 1100.
Embodiments within the scope of the present invention also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such computer-readable media can comprise physical storage and/or memory media such as RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other physical medium which can be used to carry or store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described herein. Rather, the specific features and acts described herein are disclosed as example forms of implementing the claims.
The components of the computing system 1100 may, for example, be used to provide functionality to game logic, store or remember game state, configure and communicate between devices, and operate the logic of game incorporation. Each of the player consoles may also have a computing system such as computing system 1100 guiding its processing needs.
Having described the interactive projection embodiments, the light emitting boundary embodiments will now be described with respect to Figures 12 through 22. Figure 12 abstractly illustrates a distributed electronic game system 1200 that may be used in the light emitting boundary embodiments, and which may also be used in the interactive projection embodiments in the case where a central display is used.
The system 1200 includes a flat multi-touch functional central display 1201. The central display 1201 may be laid horizontally on a table or other surface and may be used as a horizontal central playing surface. For instance, the central display 1201 may behave as an electronic board of a digital board game. The display 1201 may be movable, or perhaps fixed, perhaps being built into a furniture item. Since Figure 12 is abstract, the various components illustrated as being included within the central display 1201 should not be construed as implying any particular shape, orientation, positioning or size of the corresponding component. Subsequent figures will illustrate a more concrete representation of an example of the central display 1201.
The system 1200 also includes surrounding game control devices (also called herein "input devices"). There are eight such game input devices 102A through 102H illustrated in Figure 12, although the ellipses 102I represent that there may be fewer than or more than eight game control devices. Such input devices may, for example, be the same as the input devices 102 described with respect to Figure 1. For instance, the player consoles 500, 600 and 700 may be examples of the input devices 102 of Figure 12, as well as being examples of the input devices 102 of Figure 1. The game input devices 102 may be orientation-sensitive game input devices, player consoles, or a combination thereof.
Although not required, the central display 1201 is preferably a flat multi-touch functional display capable of detecting and responding to multiple simultaneous instances of players touching the display 1201, and affecting game state in response to each touch instance. Such capability may be employed to effectively assist in games in which multiple players may be touching the screen simultaneously, although not all games require simultaneous input. The central display 1201 may also have a scratch-resistant coating to prevent scratching that might otherwise be caused by players touching the central display 1201. The central display 1201 may also receive signals from the surrounding game input devices 102, interpret control actions from the signals, and affect game state in response to the control actions.
The central display 1201 includes a public display area 1211. Note that the public display area 1211 is only abstractly represented in Figure 12, and is thus not drawn to scale. In a preferred embodiment, the public display area 1211 would actually occupy a substantial majority of the viewable surface of the central display 1201 when the display 1201 is laid horizontally, and thus emulate a board-like play area. The public display area displays game information that should be viewable by all of the players and is thus deemed "public". There is no required form for the central display 1201. The central display 1201 might have any size or configuration.
The central display 1201 also includes game logic 1212 that is capable of rendering all or at least a portion of the public game state 1213 on the public display area, and is capable of formulating or determining game state based on game input. A communication mechanism in the form of a wireless transceiver 1214 receives control information from the surrounding game input devices 102, and in some cases, may transmit information to the surrounding game input devices. A game incorporation mechanism 1215 identifies the control information received from the game input devices 102 and alters a game state based on the control information.
In one embodiment, the central display 1201 incorporates functionality of a general-purpose computing system with a hard drive 1221, memory 1222, general-purpose processor(s) 1223, speakers 1224 (and/or headset ports with headsets or earpieces), a video driver 1225, a wireless transceiver 1226 (such as a BLUETOOTH® transceiver), and so forth (see ellipses 1227). In that case, the game logic 1212, portions of the transceiver mechanism 1214 stack, and the game incorporation mechanism 1215 may be software-implemented. The game state 1213 may be represented as data within the hard drive 1221, memory 1222 and/or video driver 1225. The wireless transceiver 1226 is capable of receiving multiple signals simultaneously. The central display 1201, and/or any of the surrounding game input devices 102, may have built-in microphones to allow sound data (such as the player's voice) to be input into the system to affect game configuration or game state. There may also be voice recognition capability incorporated into the central display 1201 and/or the surrounding game input devices 102 to permit such sound data to be converted to a more usable form. Speakers, headset ports, and earpieces may also be incorporated into the surrounding game input devices.
Although the system 1200 is described as being an electronic game system, the principles described herein are not limited to the use of system 1200 for games. For instance, the central display 1201 may be used to display any public state, whereas input devices 102 may not necessarily be used to provide input for a game. The game logic 1212 may be any logic. Accordingly, the term "player" described herein may more broadly include any participant in a system in which there is a public viewing area for displaying public state associated with any process, and a private viewing area for displaying private state associated with the process.
Figure 13 illustrates a more concrete example 1300 of the display 1201 of Figure 12. The display 1300 includes the public display area 1311 that represents an example of the public display area 1211 of Figure 12. The displayed public game state may be associated with any type of game, and may render game state in response to instructions provided by the video driver 1225. In one embodiment, the video driver 1225 may, in response to commands from the game logic, display cinematic game introductions and/or scene transitions to help entice the players into a richer playing experience. The video driver 1225 may also display a cinematic conclusion that may depend on a result of the game.
In the display 1300, there are a number of built-in input devices 1312A through 1312H (referred to collectively as "input devices 1312"). In this case, there are eight illustrated built-in input devices (two on each of the four sides of the display 1300), although the display 1300 may have any number of built-in input devices. Each built-in input device may be a camera capable of capturing a still or video image, and may be adjustable. Thus, for example, in a game with eight local players, each camera may be adjusted to capture the video of a corresponding player. The display 1300 may include logic that renders the captured video, or portions thereof, on the public display area 1311 of the display 1300. The logic might also cause all or portions of that video to be transmitted to game input devices (such as player consoles) so that the video may also be displayed at the various game input devices. In one embodiment, the built-in input devices may fold into the edge of the display 1300. For instance, in Figure 13, the built-in input devices 1312A, 1312B, 1312E and 1312G are illustrated in a collapsed (inactive) position within the display 1300, whereas the input devices 1312C, 1312D, 1312F and 1312H are illustrated in an extended position ready to capture video.
Alternatively or in addition, the built-in input devices 1312 may be a scanner capable of detecting physical game input provided by a player (such as a roll of the dice, the playing of a card, or the positioning of a game piece). For instance, the scanner may include a light-emitting boundary definition mechanism that defines the boundary of an input region using emitted light. For example, the emitted light may be emitted along the perimeter of the input region and/or across the area of the input region. The player may then visualize where the physical game input is to be provided. Once that input is provided, the scanner scans the physical input so that the game input represented by that physical input may be incorporated into the game by the game incorporation mechanism 1215. The scanner might be, for example, a three-dimensional image scanner such as those conventionally available on the market. The scanner may be integrated with the camera to form a built-in input device, or they may be separate from each other to allow for independent adjustment of the camera direction and input region positioning.
Figure 14 abstractly illustrates an orientation-sensing game input device 1400. As mentioned above, the surrounding game input devices 102 of Figures 1 and 12 may be orientation-sensing game input devices, player consoles, game master consoles, or a combination thereof. Figure 14 is an example of such an orientation-sensing game input device. Once again, Figure 14 is abstract. Accordingly, the various components illustrated as being included within the orientation-sensing device 1400 should not be construed as implying any particular shape, orientation, positioning or size of the corresponding component. Subsequent figures will illustrate a more concrete representation of an example of the orientation-sensing game input device 1400. The orientation-sensing game input device 1400 includes an orientation sensor 1401 that, when active, outputs a spatial orientation signal representing a spatial orientation of the game input device. The orientation sensor 1401 is rigidly attached to the game input device 1400. The orientation sensor 1401 is able to detect how the game input device 1400 is oriented with respect to vertical, and/or how the game input device is oriented with respect to north. In one embodiment, the orientation sensor 1401 is an accelerometer. Alternatively or in addition, the orientation sensor 1401 may be a compass that generates a direction signal indicating a geographical orientation. The orientation-sensing device may also potentially have a Global Positioning System (GPS) receiver that allows the orientation-sensing device 1400 to detect a global position of the orientation-sensing device 1400 in global coordinates.
A transmission mechanism 1402 is communicatively coupled to the orientation sensor 1401 so as to receive the spatial orientation signal from the orientation sensor 1401 and transmit spatial orientation information present in the spatial orientation signal to the flat multi-touch functional display 1201. In one embodiment, the transmission mechanism 1402 may accomplish this using acoustics, but preferably accomplishes this using wireless electro-magnetic radiation. A suitable protocol for transmission of the spatial orientation information is BLUETOOTH®. As an example, if the orientation-sensing device 1400 is a multi-sided die, and if the orientation sensor 1401 is a tri-axial accelerometer, the spatial orientation signal may indicate, or at least include enough information to infer, which side of the die is facing up. As another example, if the orientation-sensing device is a playing card or a coin, and if the orientation sensor is a uni-axial accelerometer, the spatial orientation signal may indicate, or at least include enough information to infer, whether the playing card is face up or face down, or which side of the coin is facing up. As a final example, if the orientation-sensing device 1400 is a domino tile, and the orientation sensor 1401 is an accelerometer, the spatial orientation signal may convey whether the domino tile is face up or face down. Furthermore, if the orientation sensor 1401 also includes a compass, the spatial orientation signal may convey which direction the domino is oriented on the table.
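As a hedged illustration of the die example, a tri-axial accelerometer sample at rest points "up" in the die's own frame, so the upward face is the one whose outward normal best aligns with the sample. The face labeling below follows the common convention that opposite faces of a six-sided die sum to seven, which is an assumption of the sketch rather than a requirement of the embodiments:

# Outward normals of the six faces in the die's own coordinate frame,
# labeled so that opposite faces sum to seven (a common convention).
FACE_NORMALS = {
    1: (0, 0, 1), 6: (0, 0, -1),
    2: (0, 1, 0), 5: (0, -1, 0),
    3: (1, 0, 0), 4: (-1, 0, 0),
}

def face_up(accel):
    """Infer which face points up from a tri-axial accelerometer sample.

    At rest the accelerometer measures the reaction to gravity, i.e. a
    vector pointing up in the die's frame; the face whose normal is best
    aligned with that vector is the face showing.
    """
    return max(FACE_NORMALS,
               key=lambda f: sum(n * a for n, a in zip(FACE_NORMALS[f], accel)))

print(face_up((0.05, -0.02, 9.78)))   # about 1 g along +z: the "1" face is up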
The transmission mechanism 1402 may also transmit other useful information. For instance, the transmission mechanism may transmit a locally-unique, and perhaps globally-unique, identifier. This may be especially useful in a case where there are multiple orientation-sensing devices 1400 being used in a game. For instance, if the orientation-sensing devices 1400 were each a six-sided die, the central device could confirm which die was rolled, and the associated rolled value of that specific die, even if multiple dice were rolled.
The orientation-sensing device 1400 might also transmit other information identifying characteristics of the device 1400. For instance, if the device 1400 were a coin, the device 1400 might transmit a device type identifier that identifies the device as a coin, and so forth for other types of devices. The device 1400 might also transmit information from which the central device might infer other characteristics of the device as well, such as color, size, or shape, which might be helpful where such characteristics have an impact on game state.
In one embodiment, the device 1400 might transmit information that helps the central display interpret the impact on the game of the orientation of the device 1400. For instance, one die might have a quality of 36, in which case the actual value input by the roll result is to be 36 times the number rolled. Such quality information may be included with the transmission. In one embodiment, the transmission mechanism 1402 includes a reliable transmission mechanism in which transmissions are acknowledged by the central display, or else the information is retransmitted according to a particular protocol.
There are many example game input devices that may incorporate orientation-sensing capability with suitable modification in accordance with the broad scope of the principles described herein. Several examples have already been given, including a multi-sided die, a playing card, a coin, and a domino tile. Other examples include, but are by no means limited to, the following:
1) a game piece miniature;
2) bottle caps;
3) plastic bone pieces;
4) cans;
5) tokens;
6) blocks;
7) house or hotel pieces;
8) marbles;
9) jewels;
10) treasure chest lid;
11) jelly beans;
12) checker pieces;
13) any type of wood game piece;
14) any type of plastic game piece;
15) any type of metallic game piece;
16) and many more.
The presentation of this list is not intended to provide an exhaustive enumeration of the types of orientation-sensing game input devices that may be used consistent with the principles herein. The principles described herein may be applied in any game input device whose orientation has some impact on a game state. Since the types of games are limitless, and subject only to the limits of the human imagination, the types of orientation-sensing game input devices that may be altered to incorporate the features described herein are likewise limitless.
A specific concrete example of an orientation-sensing game input device will now be described with respect to Figure 15, which illustrates an orientation-sensing die 1500. In the illustrated case, the orientation-sensing die 1500 is a six-sided die. However, the principles described herein may be applied to any die, regardless of the number of sides. For instance, some dice have as few as four sides. Some commercially available dice have as many as 100 sides.
Referring to Figure 15, the die includes a multi-sided body 1501 having at least four flat sides (six sides in the illustrated example). For clarity, the image on each side (often, but not always, a certain number of distributed dots) is not illustrated, so that some of the internally-embedded components may be more easily seen. That said, the various components are not necessarily drawn to size, since the precise size and positioning of the components is not critical, so long as the components fit within the boundaries of the die. Furthermore, if the die is desired to be kept random, the components should be distributed appropriately to keep the center of gravity in the middle of the cube. An orientation sensor 1511 (such as a tri-axial accelerometer) is embedded within the multi-sided body 1501 and is structured to, when active, output a spatial orientation signal representing a spatial orientation of the game input device. A transmission mechanism 1512 is also embedded within the multi-sided body 1501 and communicatively coupled to the orientation sensor 1511 so as to receive the spatial orientation signal and transmit spatial orientation information present in the spatial orientation signal to locations external to the multi-sided body. In one embodiment, the orientation sensor 1511 and the transmission mechanism 1512 are a single integrated BLUETOOTH®-enabled tri-axial accelerometer.
An electronic power source 1513 is also embedded within the multi-sided body 1501 and is coupled to the orientation sensor 1511 and the transmission mechanism 1512 so as to electronically power the orientation sensor 1511 and the transmission mechanism 1512. In one embodiment, the electronic power source 1513 includes a rechargeable battery. There may be a plurality of electrical contacts 1514A and 1514B accessible from the outside of the multi-sided body 1501, each establishing a corresponding electrical path 1515A and 1515B from the outside of the multi-sided body to the rechargeable battery. The electronic power source 1513 may also be an insertable and removable battery, and may even perhaps be disposable. In one embodiment, the electronic power source 1513 is a non-rechargeable disposable battery that is not removable from the die. In that case, the entire die may be considered disposable, or at least converts to a normal non-transmitting die after the battery fails. In the case of a non-rechargeable battery, there would be no need for the electrical paths 1515A and 1515B. In the case of a removable battery, the die may have a cavity that fits the battery, and that is accessed by removing a cover that snaps into place.
A status indicator 1516 may also be included and may be visible from external to the multi-sided body 1501. For instance, the status indicator 1516 may be on the surface of the die 1500. If the multi-sided body 1501 is composed of translucent material, the status indicator 1516 may also be embedded within the multi-sided body 1501 itself. If necessary or desired, a counterweight 1517 may also be positioned rigidly within the multi-sided body 1501 so as to further center the center of gravity of the wireless die.

Figure 16 abstractly illustrates a game input mechanism in the form of a scanning device 1600. The scanning device is an example of the game input devices 102 of Figure 12. The scanning device 1600 is drawn abstractly, so once again the various components of the scanning device are not limited to any particular size, position, orientation, or form. The scanning game input device 1600 includes a light-emitting boundary definition mechanism 1601, a scanning mechanism 1602, a communication mechanism 1603, and a mechanical support mechanism 1604. The scanning game input device may also have processor(s) 1605 and memory 1606, thus enabling the scanning game input device to at least partially process information captured by the scanning mechanism 1602, control the light-emitting boundary definition mechanism 1601, and/or communicate with the communication mechanism 1603. After a discussion of the function of the various components 1601 through 1606 of the scanning game input device 1600, various concrete examples will be described with respect to Figures 17, 18A and 18B.
The light-emitting boundary definition mechanism 1601 defines multiple input regions for a game in which multiple players are engaged. Each of the so-defined input regions is a region on a playing surface in which a corresponding player subset is to provide physical game input. A player subset may be multiple players in a team-oriented game, or may be a single player in a game that does not involve teams. Examples of physical input include 1) the rolling of a die or dice, 2) the playing of one or more cards, 3) the positioning of one or more game pieces, 4) the spinning of a spinner or top, 5) a human hand, and so forth. For instance, in an electronic version of rock, paper, scissors, a human hand might be used to provide game input within the game input region.
In one embodiment, the light-emitting boundary definition mechanism 1601 may selectively define only one, or perhaps a subset, of the multiple regions that the mechanism 1601 is capable of defining. For example, in a turn-oriented game in which it is the turn of one or more, but less than all, of the player subsets, the corresponding game input regions might be made visible for only those player subset(s) whose turn it is. Game state transmitted by the central display 1201 and/or the other game input devices 102 might give the scanning game input device information sufficient to derive the identity of whose turn it is, to thereby prompt the scanning game input device 1600 to light the appropriate region corresponding to whose turn it is, while deemphasizing, or even not lighting at all, the game input region(s) corresponding to player subset(s) whose turn it is not.
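A sketch of this selective lighting, with hypothetical region and turn bookkeeping (a real device would drive LEDs rather than return strings):

def update_region_lighting(regions, active_subsets):
    """Light only the input regions whose player subset currently has the turn.

    regions maps a region id to the player subset assigned to it;
    active_subsets is derived from game state transmitted by the central
    display. Returns a map of region id -> 'lit' or 'dim'. All names here
    are illustrative.
    """
    return {region: ("lit" if subset in active_subsets else "dim")
            for region, subset in regions.items()}

regions = {"north": "team_red", "south": "team_blue",
           "east": "player_3", "west": "player_4"}
print(update_region_lighting(regions, {"team_red"}))
# {'north': 'lit', 'south': 'dim', 'east': 'dim', 'west': 'dim'}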
A scanning mechanism 1602 is configured to scan at least some objects placed within any of the plurality of input regions. As an example, there may be a single scanner that rotates or otherwise moves so as to be able to perform a three-dimensional scan on whichever region physical game input is configured to be captured in. In another embodiment, there might be a specific three-dimensional scanner allotted to each game input region. The corresponding scanner then operates when physical game input is expected for the corresponding game input region. The game input regions may be non-overlapping or they may be overlapping, depending on the design of a game.
A communication mechanism 1603 communicates information regarding objects scanned by the scanning mechanism 1602. In one embodiment, the scanning game input device 1600 is wireless, in which case the communication mechanism 1603 communicates wirelessly with, for example, the central display 1201 and/or one or more other game input devices 102. The communication mechanism 1603 communicates, for example, information regarding the scanned objects, and information regarding which input region the scanned object was scanned in.
For instance, the communication mechanism 1603 might simply send image information (e.g., a collection of images of a die) to the central display 1201, and have the central display 1201 extrapolate the three-dimensional rendering of the viewable surfaces and then calculate the game input. Alternatively, the processor(s) 1605 might take on a greater processing role by extrapolating the three-dimensional rendering of the scanned image, after which the communication mechanism 1603 communicates that three-dimensional rendering to the central display, which then calculates the game input. As another alternative, the processor(s) 1605 might take on all processing required to determine the game input from a scanning operation. For example, the processor(s) 1605 might determine that the player subset rolled two dice, resulting in a roll of a six and a four. The communication mechanism 1603 might also communicate with player consoles to thereby affect the private game state of the private consoles.
The communication mechanism 1603 might additionally communicate with other devices such as, for example, a surrounding computing system (such as a laptop computer), to convey information, and/or may receive information from the surrounding computing system (such as configuration information) or from the central display 1201 or other game input devices 102.
The mechanical support mechanism 1604 positions the light-emitting boundary definition mechanism 1601 and the scanning mechanism 1602 with respect to a playing surface. In one embodiment, the mechanical support mechanism couples the scanning game input device 1600 to the central display 1201, or perhaps couples the game input device 1600 to one of the player consoles. Alternatively, the scanning game input device may not be rigidly coupled to the central display 1201 or the player consoles, but may be free-standing.
The mechanical support mechanism 1604 may have a different form depending on the configuration of the scanning input system. For instance, if the scanning device 1600 scans from below (e.g., through a translucent playing surface), the mechanical support mechanism 1604 would be configured so that the light-emitting boundary definition mechanism 1601 may light the surface from below, and the scanning mechanism 1602 may scan from below. If the scanning device hangs from the ceiling, or is supported by a wall, an appropriate configuration of the mechanical support mechanism 1604 may be provided. Accordingly, the specific example configurations of Figures 17, 18A and 18B are only a few of an infinite variety of ways to configure the scanning system in the context of a game. Mirrors or lenses may even be used to direct the flow of light for the light-emitting boundary definition mechanism and/or for the scanning mechanism.
As previously mentioned, the scanning device 1600 may be incorporated into any of the central display 1201 (if present) or any of the surrounding game input devices 102 without restriction. The scanning device 1600 may even be incorporated into a pair of glasses, a hat, an eyepiece or another mechanism that sits on the player's head. In that case, no light-emitting boundary definition mechanism 1601 may be needed; rather, the player would know that the scanning mechanism 1602 is scanning a region relative to the player's field of view. The light-emitting boundary definition mechanism 1601 might still be helpful, though, in helping the player see the area that is to be scanned, since the positioning of the glasses or other headgear, the orientation of the eyeball, and so forth might affect whether the game input region is directly in the player's field of view.
In one embodiment, the processor(s) 1605 and the memory 1606 may collaborate to determine, at any given point, which player's turn it is. The processor(s) 1605 and the memory 1606 may then cause the light-emitting boundary definition mechanism 1601 to provide visual emphasis to the game input region in which physical game input is expected. For instance, the boundaries of the region may be turned green when physical game input is expected.
Figure 17 illustrates one embodiment 1700 of the scanning game input device 1600 of Figure 16. In this embodiment, the scanning game input device has a light-emitting boundary definition mechanism that defines four game input regions 1702A, 1702B, 1702C and 1702D. In this embodiment, each game input region is defined by a dedicated light-emitting boundary definition mechanism positioned within an upper portion 1703 that is supported by base 1701. The boundary definition mechanism may be, for example, a Light Emitting Diode (LED), or any other device capable of defining the game input region by providing visual emphasis to the boundaries of the game input region, and/or by providing visual emphasis over the area of the game input region.
While the light-emitting boundary definition mechanism 1601 may define fixed-sized boundaries, the light-emitting boundary definition mechanism 1601 may also perhaps be adjustable. For example, the light-emitting boundary definition mechanism may be an array of LEDs. The size and shape of the boundary may be adjusted by turning some of the LEDs off and keeping others on. Each of the LEDs may be mapped to a particular memory location that turns the LED on or off or toggles it between two discrete intensity levels (in the case of being mapped to a single bit), or that provides more refined adjustable intensity (in the case of being mapped to multiple bits). As previously mentioned, the boundaries may be overlapping if desired. Such overlapping may also be a reward for a winning player, and a detriment for a losing player, with the winning player perhaps capturing some benefit from the physical game input of the losing player.
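One possible realization of such a memory-mapped LED array, offered only as a sketch (the register layout and the 4-bit intensity field width are assumptions):

```python
# Minimal sketch: each LED maps to one bit of a control word, so the boundary's
# size and shape are adjusted by writing a mask; multi-bit fields per LED give
# graded intensity instead of simple on/off.
def set_led_mask(control_word: int, led_index: int, on: bool) -> int:
    """Set or clear the single bit controlling one LED."""
    if on:
        return control_word | (1 << led_index)
    return control_word & ~(1 << led_index)

def set_led_intensity(register: int, led_index: int, level: int, bits: int = 4) -> int:
    """Write a multi-bit intensity level into one LED's field of the register."""
    mask = (1 << bits) - 1
    shift = led_index * bits
    return (register & ~(mask << shift)) | ((level & mask) << shift)

word = 0
for i in range(8):  # light the first eight LEDs to shrink/reshape the boundary
    word = set_led_mask(word, i, True)
print(bin(word))     # 0b11111111
```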
The boundary size might be configurable by a user. For instance, a player may choose to have a smaller or larger game input region depending on the player's preference. For instance, a younger player in a dice game might choose to have a larger roll area to accommodate a more aggressive and less controlled roll. An older player might require less of a roll area.
The boundary size might also be adjusted by the game state itself. For instance, as a player is losing a game, the player may have a progressively reduced boundary in which to provide physical game input, or perhaps the boundaries may take a particular form that serves to taunt the player that is moving towards a loss. If the player is winning a game, the boundaries may perhaps expand, and/or take a more congratulatory form. The LEDs may be of different colors such that the boundaries take on a different color depending on game state. For instance, greener game input regions might designate that the player is winning, whereas redder game input regions might designate that the player is losing. Thus, each player can quickly ascertain, and have feedback on, how he or she is doing. Changing the colors of the game input regions may be accomplished by adjusting the proportion of LEDs of particular colors that are turned on and off, and their respective intensity levels. The color of a game input region may also define whose turn it is. For instance, if the color is green, that may mean it is that player's turn; if red or off, it may mean it is not that player's turn.
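A minimal sketch of such game-state-driven region appearance follows; the linear mapping from a player's standing to color proportion and boundary radius is an illustrative assumption, not something the embodiments prescribe:

```python
# Minimal sketch: map a player's standing (0.0 = losing badly, 1.0 = winning)
# to the proportion of red vs. green LEDs and to the boundary radius.
def region_appearance(standing: float, min_radius=10.0, max_radius=30.0):
    standing = max(0.0, min(1.0, standing))   # clamp to [0, 1]
    green_fraction = standing                 # winning -> greener region
    red_fraction = 1.0 - standing             # losing  -> redder region
    radius = min_radius + standing * (max_radius - min_radius)
    return {"green": green_fraction, "red": red_fraction, "radius_cm": radius}

print(region_appearance(0.8))  # mostly green, near-maximum roll area
```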
Additionally, the light-emitting boundary definition mechanism 1601 may be an LED array that directly displays the game input region. For instance, the light-emitting boundary definition mechanism 1601 may essentially be a portion of the public display area 1311 of the display 1300 of the central display 1201. Alternatively or in addition, the light-emitting boundary definition mechanism 1601 may be all or a portion of the private display area 601 of the player console 600. The boundary definition mechanism 1601 may also be a laser that defines a sharp boundary for the game input region.
For instance, when it is the player's turn, a window might pop up on a portion of the public display area 1311 that is closer to the player. A scanning device might be positioned in a predetermined location (e.g., integrated with the display 1300) with respect to that window such that the scanning mechanism 1602 may capture the window. The window may include boundaries that make it easier for the scanning mechanism 1602 to recognize the boundaries of the game input region. The content of the window may display a color that contrasts well with the color of the game input so as to optimize scanning accuracy (e.g., if the dice are white, then the window may have darker content). Then the player provides physical game input directly on the public display area (e.g., rolls the dice onto the public display area 1311) such that the physical game input that occurs within the window is captured by the scanning mechanism. The shape or size of the window may be adjusted in response to game state.
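One simple way the contrasting window content might be chosen, sketched with an assumed RGB color model and an assumed luminance threshold:

```python
# Minimal sketch: pick a window background that contrasts with the expected
# token (e.g., dark content under white dice) based on the token's luminance.
def contrasting_background(token_rgb):
    r, g, b = token_rgb
    luminance = 0.299 * r + 0.587 * g + 0.114 * b  # standard perceptual weighting
    return (0, 0, 0) if luminance > 127 else (255, 255, 255)

print(contrasting_background((255, 255, 255)))  # white dice -> dark window content
```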
Alternatively or in addition, when it is the player's turn, a window might pop up on a portion of the private display area 601 corresponding to the player console 600 that belongs to the player whose turn it is. Alternatively, there might just be some indicator on the private display area 601 that instructs the player that the private display area 601 is now acting as a game input region. A scanner might be positioned in a predetermined location (e.g., integrated with the player console 600) with respect to that game input region such that the scanning mechanism may capture the window. Then, the player provides physical game input directly on the private display area 601 (e.g., rolls the dice onto the private display area 601) such that the physical game input that occurs within the window is captured by the scanning mechanism associated with the player console. In one embodiment, should the player's game input region reduce in size, a different color may be used to represent the game input region itself, as compared to the portions that would have been in the game input region had the player done better. Thus, in this case, the scanning device 1600 includes a light-emitting boundary definition mechanism that is a portion of a display itself. Accordingly, the term "light-emitting boundary definition mechanism" should be interpreted broadly in the claims.
Figure 22 illustrates a player console 2200 with an integrated scanning device 2210 that scans a game input region in the form of a window defined on the private display area of the player console itself. For instance, window 2201 might define a game input region in which the player is to enter physical input (e.g., the roll of a die 2204). The window size might change to be, for example, window 2202, depending on game state. The window may be, for example, a window displayed by an operating system on the private display. The scanning device 2210 may be capable of scanning area 2203, although the system may ignore material scanned outside of the window that defines the game input region. The window may be displayed with a clear and distinct boundary to make it easier for the scanning device 2210 (or the system that interprets the scanned information) to detect the game input region.
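A minimal sketch of discarding scanned material that falls outside the window, assuming scanner and display coordinates share one frame of reference (a calibration detail the embodiments leave open):

```python
# Minimal sketch: the scanner covers area 2203, but only points falling inside
# the currently displayed window (the game input region) are kept.
def filter_to_window(scanned_points, window):
    """window = (x, y, width, height); points are (x, y) pairs."""
    x0, y0, w, h = window
    return [(px, py) for (px, py) in scanned_points
            if x0 <= px <= x0 + w and y0 <= py <= y0 + h]

points = [(5, 5), (50, 50), (120, 40)]
print(filter_to_window(points, window=(0, 0, 60, 60)))  # drops (120, 40)
```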
Each game input region also has a 3-D scanner associated therewith for scanning the region within the corresponding boundary. Thus, there may be four light-emitting boundary definition mechanisms and four 3-D scanners present within the scanning game input device. In one embodiment, there may be more of each, but with pairs of light-emitting boundary definition mechanisms and corresponding scanning mechanisms being selectively turned off.
Figures 18A and 18B collectively illustrate another alternative embodiment of the scanning game input device 1600 of Figure 16. The scanning game input device of Figures 18A and 18B appears the same as the scanning game input device 1700 of Figure 17. However, in this embodiment, the upper portion 1703 is rotatably mounted to the base 1701. The upper portion 1703 may have as few as a single light-emitting boundary definition mechanism and a single scanning mechanism affixed therein.
The scanning game input device of Figures 18A and 18B rotates the upper portion 1703' when transitioning turns. This might be done according to some predetermined pattern, with the players situating themselves to be proximate their corresponding desired game input region. On the other hand, rather than being in accordance with a predetermined pattern, the scanning game input device may first determine whose turn it is next, which may not be according to a predetermined pattern. The scanning game input device may determine this autonomously, or may determine this in communication with the central display and/or one or more of the player consoles.
Figure 18A illustrates the scanning game input device with the rotatable upper portion 1703' rotatably mounted on the base 1701', and with the boundary definition mechanism and scanning mechanism rotated to form game input region 1702A'. In Figure 18B, the upper portion 1703' is rotated to form game input region 1702B'. In an alternative embodiment, there may be multiple fixed light-emitting boundary definition mechanisms, whereas a rotatable portion includes the scanning mechanism, which rotates to whichever game input region corresponds to the player set whose turn it is. In that embodiment, perhaps there is some visual distinction (e.g., boundary color or intensity level) that gives visual emphasis to the boundaries or area corresponding to the game input region whose turn it is.
In one embodiment, the scanning mechanism rotates not to any fixed position, but senses where the player is whose turn it is presently. For instance, the scanning game input device may detect the position of the player's player console, and rotate the game input region accordingly by rotating the light-emitting boundary definition mechanism and scanning mechanism. The position of the player console may be determined in a number of ways. For instance, the player console may emit ultrasonic or subsonic acoustic signals that the scanning game input device may acoustically sense. Should GPS coordinate systems become more accurate, the player console may transmit GPS information to the scanning game input device. The position of the player may also be calculated based on the orientation of a camera built into the central display. Thus, if a player moves during the course of the game, the position of their corresponding game input region changes accordingly.
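By way of example only, the rotation toward a sensed console position might be computed as follows; the (x, y) offset, the incremental stepping, and the function names are assumptions layered on top of the position-sensing options just described:

```python
# Minimal sketch: rotate the upper portion toward the sensed position of the
# active player's console. Position sensing (acoustic, GPS, camera) is
# abstracted into an (x, y) offset from the device; atan2 gives the bearing.
import math

def bearing_to_console(console_x: float, console_y: float) -> float:
    """Angle in degrees, measured counterclockwise from the device's x-axis."""
    return math.degrees(math.atan2(console_y, console_x))

def rotation_step(current_deg: float, target_deg: float, max_step_deg: float = 5.0):
    """Advance toward the target along the shortest arc, one bounded step at a time."""
    delta = (target_deg - current_deg + 180.0) % 360.0 - 180.0
    return current_deg + max(-max_step_deg, min(max_step_deg, delta))

target = bearing_to_console(1.0, 1.0)        # console sensed at 45 degrees
print(round(rotation_step(0.0, target), 1))  # 5.0 -- first incremental step
```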
The scanning device 1600 might scan any number of physical game input types. For instance, the scanning device 1600 might scan dice, playing pieces, playing cards, spinners, or any other object, even the player himself or herself. For instance, the scanning device 1600 might scan a human hand. This might allow the game state to reflect that the player played a "rock", or a "paper", or a "scissors", or even "ambiguous". The scanning device 1600 might also scan the hand to identify a number of fingers, or whether the hand is facing up or down, and so forth. The scanning device 1600 might use a hand as input to allow people proficient in sign language to enter letters or words into the game system.
The scanning device 1600 might also scan a human face, perhaps to analyze the configuration of the face. For instance, the scanning device 1600 may detect whether the face is smiling, or seems confident, angry, frustrated, or nervous, for purposes of making an inference about the player's emotions. Such emotional feedback may impact game state. For instance, if the player looks nervous, the player may be more subject to attack by computerized players, or may have a reduced size of game input region.
In one embodiment, a game input device 102 may sense other biometrics of a player such as, for example, blood oxygen level, blink rate, perspiration level, heart rate, breathing rate, chemical content of exhaled breath, blood pressure, and so forth, using any appropriate mechanism, whether a scanning device or some other mechanism. Any one or more of the measured biometrics, either singly or in combination, may be used to calculate an effect on game state.
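A non-limiting sketch of combining normalized biometric readings into a single score that could feed back into game state; the particular signals and the uniform weights shown are assumptions:

```python
# Minimal sketch: combine biometric signals, each normalized to [0, 1], into a
# weighted "stress" score that a game could use to adjust the game state.
def stress_score(readings, weights=None):
    """readings: dict of signal name -> value normalized to [0, 1]."""
    weights = weights or {name: 1.0 for name in readings}
    total = sum(weights[name] for name in readings)
    return sum(readings[name] * weights[name] for name in readings) / total

score = stress_score({"heart_rate": 0.7, "blink_rate": 0.4, "perspiration": 0.6})
print(round(score, 2))  # 0.57 -- a nervous player might face a smaller roll area
```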
A game input device might also be a scanner positioned to view all or a portion of a playing board (e.g., the central display 1201 or even a non-electronic playing board or surface), and recognize the position and orientation of a game piece with respect to the playing board, or even the type of game piece. Such information may be used to affect the game state.
Such a scanner may also be able to detect which game piece or game input device belongs to which player. For instance, dice for one player may have a certain marker, such as an indented piece with a certain color. Playing cards may have a miniature bar code distinguishing whom the cards belong to. A bar code or other marker might also represent other information regarding a playing piece, such as the type of playing piece, the significance of the playing piece, and so forth. Such scanners need not necessarily have the light-emitting boundary definition mechanism if the players intuitively understand where to play the game pieces in a more common area (e.g., on the central display 1201).
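A minimal sketch of associating a scanned marker with a player and piece type; the marker identifiers and registry contents are purely illustrative:

```python
# Minimal sketch: a scanned marker (color dot or miniature bar code) is decoded
# into an identifier and looked up in a registry that associates it with a
# player and a piece type.
MARKER_REGISTRY = {
    "BC-1001": {"player": "Alice", "piece": "die"},
    "BC-1002": {"player": "Bob", "piece": "card"},
}

def identify_piece(marker_id: str):
    entry = MARKER_REGISTRY.get(marker_id)
    if entry is None:
        return {"player": None, "piece": "unknown"}  # unregistered marker
    return entry

print(identify_piece("BC-1001"))  # {'player': 'Alice', 'piece': 'die'}
```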
Alternatively or in addition, the game input device 1300 might emit images or other visual cues on the playing surface in response to game input. For instance, if the player were to roll a six, then the game input device may emit on the playing surface a cue telling the user where to move, or what the options are for moving.

The scanning device 1600 has been described as potentially having a scanning mechanism 1602 that uses light as the scanning signal. However, the scanning mechanism 1602 might instead use any signal for scanning, such as acoustic signals or safe frequencies of electro-magnetic radiation. Examples of electro-magnetic radiation include visible light, ultraviolet light, infrared light, long-wave and short-wave radio, and so forth. The scanning device 1600 may use combinations of the above to formulate a more complete scanned image of the game input token. The scanning device 1600 might also have any number of different image capture mechanisms. Examples of image capture mechanisms include a CCD camera, a bar code scanner, or a 3-D imaging camera.
Figure 19 illustrates an example system 1900 in which there is a central display 1901 (representing an example of the central display 1201 of Figure 12), and four surrounding player consoles 1911, 1912, 1913 and 1914 (each representing an example of the player console 500 of Figure 5). The central display 1901 has a rotating camera 1921 that may turn toward whichever player's turn it is, and capture that player's image for display on the central display 1901 and/or one or more or all of the player consoles 1911 through 1914.
Each player console 1911 through 1914 is shown equipped with an integrated scanning device 1931 through 1934, respectively. Each scanning device represents an example of the scanning device 1600 of Figure 16. A light-emitting boundary definition mechanism associated with the scanning device 1931 is emitting light to define a game input region 1941. In this case, dice have been rolled into the game input region 1941. The scanning device 1931 captures the 3-D image of the dice, and transmits information to the central display 1901, where the roll is incorporated into the game state. One of the player consoles 1914 is shown having a privacy screen 1942, which may be removably attached to the player console 1914, or perhaps may be removably attached to any of the player consoles to provide appropriate privacy.
As an alternative embodiment, the scanning device 1931 might be turned to focus on the display of the player console 1911. When it is the player's turn, perhaps a software-driven window pops up on the display of the player console 1911 showing the player where the player should roll. The player thus would roll the dice directly on the display of the player console 1911, whereupon the scanning device 1931 would capture the physical game input for incorporation into the game state.
Figure 20 illustrates another example system 2000 in which there is a central display 2001 (representing an example of the central display 1201 of Figure 12), and three surrounding player consoles 2011, 2012 and 2013 (each representing an example of the player console 500 of Figure 5). Here, the physical game input captured by the scanning device is incorporated into the game state to give the player a visual cue 2020 of the available movement options.
Figure 21 illustrates a player console 2100 that is similar to the player consoles 1911, 1912, 1913, 1914, 2011, 2012 and 2013, but shown closer up. Here, a scanning device 2102 is shown extended, with a recess 2103 into which the scanning device might retract, perhaps before or after the game. In one embodiment, the scanning device 2102 might automatically extend and retract depending on the game state. For instance, if it is the player's turn, then the scanning device 2102 may extend in preparation for the player providing game input. In an alternative embodiment, the scanning device 2102 remains extended for the duration of the game, and may be manually extendable and retractable. A data and/or power cable 2101 (such as a USB cable) is also shown, demonstrating that the player console may integrate with existing data cables and power cables.
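The automatic extend/retract behavior might be driven by turn changes, as in this sketch; the actuator call is a hypothetical stand-in for whatever motor control the hardware actually provides:

```python
# Minimal sketch: the scanner head extends when it is the console owner's turn
# and retracts otherwise.
class ScannerHead:
    def __init__(self):
        self.extended = False

    def _actuate(self, extend: bool):
        self.extended = extend  # stand-in for a real motor command

    def on_game_state(self, active_player: str, owner: str):
        self._actuate(extend=(active_player == owner))

head = ScannerHead()
head.on_game_state(active_player="Alice", owner="Alice")
print(head.extended)  # True -- extended in preparation for game input
```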
Thus, a sophisticated mechanism is described for inputting physical input into game state. The distributed game system described herein thus allows circle games to be played electronically. Traditionally, it is often teenagers that lose interest in circle games. The wireless distributed game system appeals to a teenager's keenness for a sense of technology, which has the potential to pull teenagers back into the family circle games, potentially enriching family relationships and maintaining important lines of communication.
In one embodiment, the central display 1201 has an Internet connection (represented generally by the ellipses 1227 in Figure 12). During initial power-up of the central display, the central display may be configured to navigate to a predetermined set of one or more web sites, and may have a predetermined set of circle games installed already. The player might use the central display to navigate to a central web site that may be used to download software necessary to engage in other circle games. When a circle game is begun, the central device may inform the surrounding player consoles of the game that is about to begin and, if necessary, provide the appropriate software to the player consoles as well. In one embodiment, the player consoles are general-purpose computing devices with one or more processors, a memory, and potentially a hard disk.

Accordingly, a flexible game system has just been described. Having described the embodiments in some detail, as a side-note, the various operations and structures described herein may, but need not, be implemented by way of a physical computing system, such as the computing system 1100 of Figure 11.
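A minimal sketch of the game-start handshake just described, with assumed message names and an in-memory stand-in for the wireless transport:

```python
# Minimal sketch: when a game starts, the central device announces it to each
# console and pushes the game package only to consoles that lack it.
class Console:
    def __init__(self, installed=()):
        self.installed_games = set(installed)
        self.inbox = []

    def send(self, message):
        self.inbox.append(message)  # stand-in for a wireless transport

def begin_game(game_id, package, consoles):
    for console in consoles:
        console.send({"type": "game_starting", "game": game_id})
        if game_id not in console.installed_games:
            console.send({"type": "install", "game": game_id, "package": package})
            console.installed_games.add(game_id)

consoles = [Console(), Console(installed={"dice-race"})]
begin_game("dice-race", b"...", consoles)
print(len(consoles[0].inbox), len(consoles[1].inbox))  # 2 1
```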
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims

What is claimed is:
1. A method for projecting an interactive game environment image on at least one surface, the method comprising:
an act of receiving data representing a plurality of virtual objects that are spatially positioned in a virtual game environment space;
an act of projecting a game environment image on a surface in response to the received data, the projected game environment image including a visual representation of at least part of the virtual game environment space, the visualized representation including a visualized representation of at least some of the virtual objects;
an act of detecting user interaction with the visualized representation; and
an act of changing the projected visualized representation in response to the user interaction with the visualized representation.
2. The method in accordance with Claim 1, wherein the user interaction is a physical manipulation of a physical object within the space of the projected visualized representation.
3. The method in accordance with Claim 2, wherein the physical object is a game piece.
4. The method in accordance with Claim 3, wherein the game piece is a die.
5. The method in accordance with Claim 3, wherein the game piece is a playing card.
6. The method in accordance with Claim 1, wherein the user interaction is a physical user interaction with one of the visualized objects.
7. The method in accordance with Claim 6, wherein the act of changing the projected visualized representation comprises an act of affecting the visualized object with which the user interacted.
8. The method in accordance with Claim 7, wherein the act of affecting the visualized object comprises an act of moving the visualized object in the visualized representation.
9. The method in accordance with Claim 7, wherein the act of affecting the visualized object comprises an act of changing a form of the visualized object.
10. The method in accordance with Claim 1, wherein the act of changing the projected visualized representation comprises an act of creating a new visualized object in the visualized representation.
11. The method in accordance with Claim 1, wherein the surface includes a substantially horizontal surface.
12. The method in accordance with Claim 11, wherein the act of projecting a game environment image also projects a game environment image on a substantially vertical surface in response to the received data.
13. The method in accordance with Claim 12, wherein the game environment image projected on the substantially horizontal surface and the game environment image projected on the substantially vertical surface are stitched to form a single projected game environment.
14. The method in accordance with Claim 1, wherein the surface includes a spherical surface.
15. The method in accordance with Claim 1, wherein the surface includes a concave surface.
16. The method in accordance with Claim 1, wherein the surface includes a convex surface.
17. The method in accordance with Claim 1, wherein the surface includes a cylindrical surface.
18. The method in accordance with Claim 1, wherein the surface includes a topology.
19. The method in accordance with Claim 1, wherein the game environment image includes three dimensional information that may be viewed as a three dimensional image by a user.
20. The method in accordance with Claim 19, wherein the game environment image includes a representation of terrain.
21. The method in accordance with Claim 1, wherein the game environment image includes an image of a game board.
22. The method in accordance with Claim 1,
wherein the act of projecting a game environment image on a substantially horizontal surface in response to the received data comprises an act of a plurality of projectors projecting a plurality of sub-images on the surface, the method further comprising: an act of formulating data representing the plurality of sub-images prior to projecting the plurality of sub-images.
23. The method in accordance with Claim 22, wherein each sub-image represents a distinct portion of the virtual game environment space, resulting in the game environment image representing the virtual game environment space in a spatially continuous manner.
24. A computer program product comprising one or more computer-readable media having thereon computer-executable instructions that, when executed by one or more processors of a computing system, cause the computing system to perform the following:
an act of detecting a user interaction with a game environment image that is projected on a substantially horizontal surface, the projected game environment image including a visual representation of at least part of a virtual game environment space in which a plurality of virtual objects are spatially positioned; and
an act of changing the projected visualized representation in response to the user interaction with the visualized representation.
25. A surface-top game environment projection mechanism comprising:
a projector mechanism that includes an image supply mechanism and at least one projector, the image preparation mechanism configured to supply a game environment image to project in response to received data that represents a plurality of virtual objects that are spatially positioned in a virtual game environment space, the at least one projector configured to project the game environment image supplied by the image supply mechanism, the projected game environment image including a visual representation of at least part of the virtual game environment space, the visualized representation including a visualized representation of at least some of the virtual objects; and
a scanning mechanism configured to scan the area projected by the projected game environment image to determine one or more positions of a real interactivity input object.
26. The surface-top projection mechanism in accordance with Claim 25, wherein the at least one projector comprises a plurality of projectors, wherein the image preparation mechanism is configured to assign a subimage of the game environment image to each of the plurality of projectors.
27. The surface-top projection mechanism in accordance with Claim 25, wherein the surface-top projection mechanism is portable.
28. The surface-top projection mechanism in accordance with Claim 25, wherein the surface-top projection mechanism takes the form of a tower that can be placed on the substantially horizontal surface.
29. A game input mechanism comprising:
a light-emitting boundary definition mechanism configured to define a plurality of input regions for a game in which a plurality of players are engaged, each of the plurality of input regions being a region on a playing surface in which a corresponding player subset of one or more players are to provide physical input to affect game state;
a scanning mechanism configured to scan at least some objects placed within any of the plurality of input regions; and
a communication mechanism for communicating information regarding scanned objects scanned by the scanning mechanism.
30. The game input mechanism in accordance with Claim 29, further comprising:
a game input identification mechanism configured to identify game input represented by a scanned object scanned by the scanning mechanism, and associate the identified game input with the player subset corresponding to the input region in which the scanned object was scanned.
31. The game input mechanism in accordance with Claim 29, wherein each of the input regions corresponds to a player subset that has just a single player.
32. The game input mechanism in accordance with Claim 29, further comprising:
a mechanical support mechanism for positioning the light-emitting boundary definition mechanism and the scanning mechanism with respect to the playing surface.
33. The game input mechanism in accordance with Claim 32, wherein the mechanical support mechanism mechanically couples the game input mechanism to a central horizontal display area that has a plurality of player consoles wirelessly coupled to the central horizontal display area.
34. The game input mechanism in accordance with Claim 32, wherein the mechanical support mechanism mechanically couples the game input mechanism to a player console that is one of several player consoles that are wirelessly coupled to a central horizontal display area.
35. The game input mechanism in accordance with Claim 29, further comprising:
a player association mechanism configured to associate any scanned object with the player subset corresponding to the input region in which the object was scanned.
36. The game input mechanism in accordance with Claim 29, wherein the physical input includes a rolling of at least one die.
37. The game input mechanism in accordance with Claim 29, wherein the physical input includes a playing of at least one card.
38. The game input mechanism in accordance with Claim 29, wherein the physical input includes a positioning of at least one game piece.
39. The game input mechanism in accordance with Claim 29, wherein the game is a turn-oriented game in which one or more, but less than all, of the player subsets are to provide input into the corresponding input regions only when the game has a particular game state.
40. The game input mechanism in accordance with Claim 39, wherein the light-emitting boundary definition mechanism is configured to use emitted light to provide visual emphasis to whichever input region corresponds to the one or more player subsets whose turn it is, as compared to the other one or more input regions corresponding to one or more player subsets whose turn it is not.
41. The game input mechanism in accordance with Claim 40, wherein the light-emitting boundary definition mechanism provides visual emphasis by rotating a light emitter towards whichever input region corresponds to the one or more player subsets whose turn it is, wherein the light-emitting boundary definition mechanism rotates from turn to turn.
42. The game input mechanism in accordance with Claim 41, wherein the light-emitting boundary definition mechanism does not provide boundary defining light to the other one or more input regions corresponding to one or more player subsets whose turn it is not.
43. The game input mechanism in accordance with Claim 40, wherein the light-emitting boundary definition mechanism provides visual emphasis by rotating a light emitter towards whichever input region corresponds to the one or more player subsets whose turn it is, wherein the light-emitting boundary definition mechanism rotates from turn to turn.
44. The game input mechanism in accordance with Claim 29, wherein the light-emitting boundary definition mechanism is configured to define at least one of the input regions by emitting light over the entire input region.
45. The game input mechanism in accordance with Claim 44, wherein the light-emitting boundary definition mechanism is further configured to define the at least one of the input regions by emitting a laser along a perimeter of the input region.
46. The game input mechanism in accordance with Claim 29, wherein the light-emitting boundary definition mechanism is further configured to define the at least one of the input regions by emitting a laser along a perimeter of the input region.
47. The game input mechanism in accordance with Claim 29, wherein the communication mechanism communicates wirelessly.
48. The game input mechanism in accordance with Claim 29, wherein the communication mechanism communicates with a horizontal display upon which at least a portion of public game state is displayed.
49. The game input mechanism in accordance with Claim 48, wherein the communication mechanism communicates with a plurality of player consoles.
50. The game input mechanism in accordance with Claim 29, wherein a location of an input region corresponding to one or more players is defined in accordance with a position of a player console associated with the one or more players, wherein if the player console moves during the course of the game, the location of the input region also moves accordingly.
51. A game input mechanism comprising:
a light-emitting boundary definition mechanism configured to selectively define a plurality of input regions in which a plurality of players are engaged, each of the plurality of input regions being a region on a playing surface in which a corresponding player subset of one or more players are to provide physical input to affect game state;
a scanning mechanism configured to selectively scan at least some objects placed within any of the plurality of input regions, wherein for at least some parts of the game, the scanning mechanism only scans one or more, but less than all, of the plurality of regions; and
a communication mechanism for communicating information regarding scanned objects scanned by the scanning mechanism.
52. The game input device in accordance with Claim 51, wherein at least two of the plurality of input regions are overlapping.
53. A game input mechanism comprising:
a light-emitting boundary definition mechanism configured to define a plurality of input regions in which a plurality of players are engaged, each of the plurality of input regions being a region on a playing surface in which a corresponding player subset of one or more players are to provide physical input to affect game state;
a scanning mechanism configured to selectively scan at least some objects placed within any of the plurality of input regions; and
a communication mechanism for communicating information regarding scanned objects scanned by the scanning mechanism, and information regarding which input region the scanned object was scanned in.
PCT/US2011/020058 2010-01-04 2011-01-03 Interactive game environment WO2011082405A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US12/651,947 2010-01-04
US12/651,947 US20110165923A1 (en) 2010-01-04 2010-01-04 Electronic circle game system
US12/855,604 2010-08-12
US12/855,604 US20110256927A1 (en) 2009-03-25 2010-08-12 Projection of interactive game environment

Publications (1)

Publication Number Publication Date
WO2011082405A1 true WO2011082405A1 (en) 2011-07-07

Family

ID=43778508

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/020058 WO2011082405A1 (en) 2010-01-04 2011-01-03 Interactive game environment

Country Status (2)

Country Link
US (3) US20110256927A1 (en)
WO (1) WO2011082405A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2522408A1 (en) * 2011-05-13 2012-11-14 GIC Sp. z o.o. Sp. k. Method and device for electromechanical selection of the element from the plurality of elements
GB2510811A (en) * 2012-12-18 2014-08-20 Optricks Media Ltd Augmented reality systems
EP3575234A1 (en) * 2018-05-30 2019-12-04 Radoslaw Oryl Crown cap game device
US11331563B2 (en) 2018-05-30 2022-05-17 Caps Apps Spolka Z Ograniczona Odpowiedialnoscia Crown cap game device
USD976106S1 (en) 2020-11-01 2023-01-24 CAPS APPS Spółka z o.o. Crown cap

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8144148B2 (en) * 2007-02-08 2012-03-27 Edge 3 Technologies Llc Method and system for vision-based interaction in a virtual environment
EP2328662A4 (en) 2008-06-03 2013-05-29 Tweedletech Llc An intelligent game system for putting intelligence into board and tabletop games including miniatures
WO2012033862A2 (en) * 2010-09-09 2012-03-15 Tweedletech, Llc A multi-dimensional game comprising interactive physical and virtual components
US9649551B2 (en) 2008-06-03 2017-05-16 Tweedletech, Llc Furniture and building structures comprising sensors for determining the position of one or more objects
US8974295B2 (en) 2008-06-03 2015-03-10 Tweedletech, Llc Intelligent game system including intelligent foldable three-dimensional terrain
US9849369B2 (en) 2008-06-03 2017-12-26 Tweedletech, Llc Board game with dynamic characteristic tracking
US8602857B2 (en) 2008-06-03 2013-12-10 Tweedletech, Llc Intelligent board game system with visual marker based game object tracking and identification
US20110165923A1 (en) 2010-01-04 2011-07-07 Davis Mark L Electronic circle game system
US9971458B2 (en) * 2009-03-25 2018-05-15 Mep Tech, Inc. Projection of interactive environment
US20110256927A1 (en) 2009-03-25 2011-10-20 MEP Games Inc. Projection of interactive game environment
US20120157204A1 (en) * 2010-12-20 2012-06-21 Lai Games Australia Pty Ltd. User-controlled projector-based games
US9408540B2 (en) * 2012-02-27 2016-08-09 Ovio Technologies, Inc. Rotatable imaging system
US10171734B2 (en) * 2012-02-27 2019-01-01 Ovio Technologies, Inc. Rotatable imaging system
US9317109B2 (en) 2012-07-12 2016-04-19 Mep Tech, Inc. Interactive image projection accessory
US9465484B1 (en) * 2013-03-11 2016-10-11 Amazon Technologies, Inc. Forward and backward looking vision system
US9778546B2 (en) 2013-08-15 2017-10-03 Mep Tech, Inc. Projector for projecting visible and non-visible images
JP6340414B2 (en) * 2014-04-16 2018-06-06 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus, information processing system, and information processing method
JP6420077B2 (en) * 2014-06-30 2018-11-07 株式会社バンダイナムコエンターテインメント Game system
EP3284039A4 (en) * 2015-04-17 2019-07-24 Tulip Interfaces Inc. Containerized communications gateway
US10304234B2 (en) 2016-12-01 2019-05-28 Disney Enterprises, Inc. Virtual environment rendering
JP6884592B2 (en) * 2017-02-20 2021-06-09 株式会社タイトー Game device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040102247A1 (en) * 2002-11-05 2004-05-27 Smoot Lanny Starkes Video actuated interactive environment
US20050245302A1 (en) * 2004-04-29 2005-11-03 Microsoft Corporation Interaction between objects and a virtual environment display
US20070046625A1 (en) * 2005-08-31 2007-03-01 Microsoft Corporation Input method for surface of interactive display
US20070178955A1 (en) * 2005-07-15 2007-08-02 Maurice Mills Land-based, on-line poker system
WO2007107874A2 (en) * 2006-03-22 2007-09-27 Home Focus Development Ltd Interactive playmat
US20080122805A1 (en) * 2000-10-11 2008-05-29 Peter Smith Books, papers, and downloaded information to facilitate human interaction with computers
US20090264196A1 (en) * 2008-04-16 2009-10-22 Aruze Corp. Gaming device
WO2009149112A1 (en) * 2008-06-03 2009-12-10 Tweedletech, Llc An intelligent game system for putting intelligence into board and tabletop games including miniatures

Family Cites Families (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NZ291950A (en) 1994-07-28 1998-06-26 Super Dimension Inc Computerised game board: location of toy figure sensed to actuate audio/visual display sequence
US6281878B1 (en) 1994-11-01 2001-08-28 Stephen V. R. Montellese Apparatus and method for inputing data
US5844985A (en) 1995-09-22 1998-12-01 Qualcomm Incorporated Vertically correcting antenna for portable telephone handsets
IL121666A (en) 1997-08-31 2001-03-19 Bronfeld Joshua Electronic dice
US6614422B1 (en) 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6710770B2 (en) 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US6611252B1 (en) 2000-05-17 2003-08-26 Dufaux Douglas P. Virtual data input device
US6650318B1 (en) 2000-10-13 2003-11-18 Vkb Inc. Data input device
US6832954B2 (en) 2000-05-30 2004-12-21 Namco Ltd. Photographing game machine, photographing game processing method and information storage medium
FI113094B (en) 2000-12-15 2004-02-27 Nokia Corp An improved method and arrangement for providing a function in an electronic device and an electronic device
US7103236B2 (en) 2001-08-28 2006-09-05 Adobe Systems Incorporated Methods and apparatus for shifting perspective in a composite image
US6997803B2 (en) 2002-03-12 2006-02-14 Igt Virtual gaming peripherals for a gaming machine
US7334791B2 (en) 2002-08-24 2008-02-26 Blinky Bones, Inc. Electronic die
US7884804B2 (en) 2003-04-30 2011-02-08 Microsoft Corporation Keyboard with input-sensitive display device
PT1663419E (en) 2003-09-05 2008-05-23 Bally Gaming Int Inc Systems, methods, and devices for monitoring card games, such as baccarat
US6955297B2 (en) 2004-02-12 2005-10-18 Grant Isaac W Coordinate designation interface
JP3904562B2 (en) * 2004-02-18 2007-04-11 株式会社ソニー・コンピュータエンタテインメント Image display system, recording medium, and program
US7204428B2 (en) * 2004-03-31 2007-04-17 Microsoft Corporation Identification of object on interactive display surface by identifying coded pattern
US7095033B2 (en) 2004-04-27 2006-08-22 Nicholas Sorge Multi-sided die with authenticating characteristics and method for authenticating same
US7399086B2 (en) 2004-09-09 2008-07-15 Jan Huewel Image processing method and image processing device
JP4489555B2 (en) 2004-10-15 2010-06-23 ビーエルデーオリエンタル株式会社 Bowling game machine
ATE395120T1 (en) 2004-10-25 2008-05-15 Koninkl Philips Electronics Nv AUTONOMOUS WIRELESS CUBE
US7599561B2 (en) * 2006-02-28 2009-10-06 Microsoft Corporation Compact interactive tabletop with projection-vision
US8666366B2 (en) 2007-06-22 2014-03-04 Apple Inc. Device activation and access
US20080280682A1 (en) 2007-05-08 2008-11-13 Brunner Kevin P Gaming system having a set of modular game units
US20080278894A1 (en) 2007-05-11 2008-11-13 Miradia Inc. Docking station for projection display applications
US20090020947A1 (en) 2007-07-17 2009-01-22 Albers John H Eight piece dissection puzzle
US20090029754A1 (en) 2007-07-23 2009-01-29 Cybersports, Inc Tracking and Interactive Simulation of Real Sports Equipment
US20090124382A1 (en) * 2007-11-13 2009-05-14 David Lachance Interactive image projection system and method
US8007110B2 (en) * 2007-12-28 2011-08-30 Motorola Mobility, Inc. Projector system employing depth perception to detect speaker position and gestures
US8267524B2 (en) 2008-01-18 2012-09-18 Seiko Epson Corporation Projection system and projector with widened projection of light for projection onto a close object
US7967451B2 (en) * 2008-06-27 2011-06-28 Microsoft Corporation Multi-directional image displaying device
JP5338166B2 (en) 2008-07-16 2013-11-13 ソニー株式会社 Transmitting apparatus, stereoscopic image data transmitting method, receiving apparatus, and stereoscopic image data receiving method
US9218116B2 (en) * 2008-07-25 2015-12-22 Hrvoje Benko Touch interaction with a curved display
US20100035684A1 (en) 2008-08-08 2010-02-11 Bay Tek Games, Inc. System and method for controlling movement of a plurality of game objects along a playfield
US8226476B2 (en) 2008-11-04 2012-07-24 Quado Media Inc. Multi-player, multi-screens, electronic gaming platform and system
JP5282617B2 (en) 2009-03-23 2013-09-04 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
US20110165923A1 (en) 2010-01-04 2011-07-07 Davis Mark L Electronic circle game system
US20110256927A1 (en) 2009-03-25 2011-10-20 MEP Games Inc. Projection of interactive game environment
US8246467B2 (en) 2009-04-29 2012-08-21 Apple Inc. Interactive gaming with co-located, networked direction and location aware devices
US20100285881A1 (en) 2009-05-07 2010-11-11 Microsoft Corporation Touch gesturing on multi-player game space
JP5273478B2 (en) 2009-07-07 2013-08-28 ソニー株式会社 Video display device and video display system
US8421634B2 (en) 2009-12-04 2013-04-16 Microsoft Corporation Sensing mechanical energy to appropriate the body for data input
CN101776836B (en) 2009-12-28 2013-08-07 武汉全真光电科技有限公司 Projection display system and desktop computer
US8491135B2 (en) 2010-01-04 2013-07-23 Microvision, Inc. Interactive projection with gesture recognition
US8751049B2 (en) 2010-05-24 2014-06-10 Massachusetts Institute Of Technology Kinetic input/output
US8388146B2 (en) 2010-08-01 2013-03-05 T-Mobile Usa, Inc. Anamorphic projection device
US20120223885A1 (en) * 2011-03-02 2012-09-06 Microsoft Corporation Immersive display experience
US20130113975A1 (en) 2011-11-04 2013-05-09 Peter Gabris Projector Image Correction Method and System
US9316889B2 (en) 2012-08-07 2016-04-19 Nook Digital, Llc Front projection eReader system
US8933974B1 (en) 2012-09-25 2015-01-13 Rawles Llc Dynamic accommodation of display medium tilt


Also Published As

Publication number Publication date
US8808089B2 (en) 2014-08-19
US20110256927A1 (en) 2011-10-20
US20140354603A1 (en) 2014-12-04
US20130123013A1 (en) 2013-05-16
US9550124B2 (en) 2017-01-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11701566

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11701566

Country of ref document: EP

Kind code of ref document: A1