US20110151955A1 - Multi-player augmented reality combat - Google Patents
- Publication number: US20110151955A1 (application US 12/956,646)
- Authority: US (United States)
- Prior art keywords: mobile communication, player, communication device, virtual, user
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- All classifications fall under A—Human necessities › A63—Sports; games; amusements › A63F—Card, board, or roulette games; indoor games using small moving playing bodies; video games; games not otherwise provided for:
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/215—Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
- A63F13/655—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
- A63F13/79—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
- A63F13/795—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
- A63F13/837—Shooting of targets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- A63F2300/204—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
- A63F2300/205—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform for detecting the geographical location of the game platform
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
- A63F2300/8023—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game the game being played by multiple players at a common site, e.g. in an arena, theatre, shopping mall using a large public display
- A63F2300/8076—Shooting
Definitions
- the present invention relates to techniques for performing multi-player augmented reality combat.
- Augmented reality is a representation of a physical real-world environment that is combined with (i.e., augmented by) virtual (i.e., computer-generated) imagery.
- the augmentation of the physical real-world environment is usually performed in real-time.
- the augmented reality may be displayed to a user via a live-video stream.
- the user may view the live-video stream using any suitable type of display, such as a head-mounted display, a handheld display, a virtual retinal display, etc.
- Augmented reality systems commonly include hand-held devices each of which has network capabilities, a camera, and a display.
- the cameras capture the physical real-world environment, and the displays display the physical real-world environment in combination with virtual objects.
- Virtual objects may include text, images, or any other computer-generated object.
- Augmented reality may be used in a variety of applications.
- augmented reality may be used to create a virtual object in a museum, an exhibition, or a theme park attraction.
- labels or text (e.g., operating instructions) may be superimposed on physical real-world objects.
- virtual imagery of a digital mock-up may be compared side-by-side to a physical mock-up to determine discrepancies therebetween.
- each user's hand-held device includes a camera and a display.
- a user's camera captures images of physical real-world objects using predefined patterns.
- the user's display displays the images of the physical real-world objects in combination with virtual objects.
- the virtual objects may be superimposed on an image of the user's physical real-world environment.
- the user is often able to interact with the virtual objects.
- the user may use a virtual weapon to fire virtual bullets at the virtual objects.
- the virtual weapon may be associated with a virtual targeting axis, for example, that points to a location at which the virtual bullets are to be fired.
- the display may display the virtual targeting axis in the augmented reality.
- the virtual objects with which the user interacts typically do not correspond to physical objects in the physical real-world environment. Accordingly, users of conventional augmented reality games traditionally are not able to perform augmented reality combat with another person.
- Multi-player augmented reality combat is combat that is performed between multiple players using augmented reality.
- Each player wears (or is otherwise associated with) an indicator (e.g., an object that has a designated pattern, a visual tag, etc.) that identifies the player or a team in which the player is included.
- Each player has a mobile communication device, which executes software that enables the mobile communication device to identify the indicators of the players. For example, a user of a mobile communication device may point a camera of the mobile communication device at another player (e.g., an opponent). The image that is captured by the camera includes the indicator that is associated with the opponent.
- the user may choose to fire a virtual bullet (e.g., spear, slug, cannon ball, dart, flames, buckshot, etc.) at the opponent by providing an audio and/or tactile command to the user's mobile communication device.
- the mobile communication device determines an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the opponent based on the distance between the user and the opponent. For example, the distance between the player and the opponent can be calculated according to their positions, which may be determined by respective location modules.
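The travel-time estimate described above reduces to distance divided by an assumed virtual bullet speed. A minimal sketch of the calculation follows; the speed constant and function name are illustrative assumptions, not values or names from the patent:

```python
# Hypothetical sketch: estimating how long a virtual bullet takes to
# reach an opponent, assuming a fixed virtual muzzle velocity.
VIRTUAL_BULLET_SPEED_M_PER_S = 50.0  # assumed game constant

def estimate_travel_time(distance_m: float) -> float:
    """Return the estimated flight duration in seconds for a virtual
    bullet travelling distance_m metres at the assumed speed."""
    if distance_m < 0:
        raise ValueError("distance cannot be negative")
    return distance_m / VIRTUAL_BULLET_SPEED_M_PER_S

print(estimate_travel_time(100.0))  # 2.0 seconds at 50 m/s
```

In practice the game could scale the assumed speed per weapon type, which would change only the constant, not the structure of the calculation.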
- the mobile communication device includes a camera, an image recognition module, a display, and an outgoing attack module.
- the camera is configured to capture an image.
- the image recognition module is configured to identify a player indicator in the image, the player indicator corresponding to an opponent player.
- the display is configured to display the image.
- the outgoing attack module is configured to transmit an outgoing attack indicator in accordance with a mobile communication protocol in response to a user-initiated attack command that is received in response to identification of the player indicator.
- the outgoing attack indicator specifies that a virtual bullet is fired at the opponent player.
- the example mobile communication device may further include a location module, a distance determination module, and a time determination module.
- the location module is configured to determine a location of the mobile communication device. For example, the location of the mobile communication device may be associated with a player indicator of the user of the mobile communication device. The location of the mobile communication device is available to mobile communication devices of other players for distance calculation via a central server, P2P communication, or any other communication technique.
- the distance determination module is configured to determine a distance between the location of the user's mobile communication device and a location of the opponent player whose player indicator is identified by the image recognition module.
- the time determination module is configured to determine an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player based on the determined distance.
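The outgoing attack module, distance determination module, and time determination module described above can be composed as in the following sketch. All class and field names are assumptions for illustration; the patent does not specify an implementation:

```python
# Illustrative composition of the described device modules; an in-memory
# outbox stands in for transmission over a mobile communication protocol.
from dataclasses import dataclass

@dataclass
class OutgoingAttackIndicator:
    shooter_id: str
    target_id: str
    travel_time_s: float

class MobileCommunicationDevice:
    def __init__(self, player_id: str, bullet_speed_m_per_s: float = 50.0):
        self.player_id = player_id
        self.bullet_speed = bullet_speed_m_per_s
        self.outbox = []  # stands in for the mobile network transport

    def fire_at(self, target_id: str, distance_m: float) -> OutgoingAttackIndicator:
        """Build and 'transmit' an outgoing attack indicator, including the
        estimated bullet travel time derived from the determined distance."""
        travel_time = distance_m / self.bullet_speed
        indicator = OutgoingAttackIndicator(self.player_id, target_id, travel_time)
        self.outbox.append(indicator)
        return indicator

device = MobileCommunicationDevice("player-1")
ind = device.fire_at("player-2", distance_m=25.0)
```

A real device would replace the outbox with an actual transmission path and obtain the distance from the distance determination module rather than as an argument.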
- An example method for performing multi-player augmented reality combat with respect to a user of a mobile communication device is also described.
- an image is captured and displayed.
- a player indicator is identified in the image.
- the player indicator corresponds to an opponent player.
- a user-initiated attack command is received in response to identifying the player indicator.
- An attack indicator is transmitted in accordance with a mobile communication protocol in response to receiving the user-initiated attack command.
- the attack indicator specifies that a virtual bullet is fired at the opponent player.
- a location of the mobile communication device is determined. A player location indicator that specifies a location of the opponent player is received. A distance between the location of the mobile communication device and the location of the opponent player is determined. An estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the opponent player is determined based on the determined distance.
- the computer program product includes a computer-readable medium having computer program logic recorded thereon for enabling a processor-based system to perform multi-player augmented reality combat with respect to a user of a mobile communication device.
- the computer program logic includes first, second, and third program logic modules.
- the first program logic module is for enabling the processor-based system to capture an image.
- the second program logic module is for enabling the processor-based system to identify a player indicator in the image.
- the player indicator corresponds to a player.
- the third program logic module is for enabling the processor-based system to transmit an outgoing attack indicator in accordance with a mobile communication protocol in response to a user-initiated attack command that is received in response to identification of the player indicator.
- the outgoing attack indicator specifies that a virtual bullet is fired at the player.
- the computer program logic may further include fourth, fifth, and sixth program logic modules.
- the fourth program logic module is for enabling the processor-based system to determine a location of the mobile communication device.
- the fifth program logic module is for enabling the processor-based system to determine a distance between the location of the mobile communication device and a location of the player.
- the sixth program logic module is for enabling the processor-based system to determine an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player based on the determined distance.
- FIGS. 1 and 2 are block diagrams of example augmented reality combat systems in accordance with embodiments described herein.
- FIGS. 3 , 9 , and 11 depict flowcharts of methods for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with embodiments described herein.
- FIGS. 4 , 10 , and 12 are block diagrams of example implementations of a mobile communication device shown in FIG. 1 in accordance with embodiments described herein.
- FIGS. 5-8 show mobile communication devices that display example views of an augmented reality environment in accordance with embodiments described herein.
- FIG. 13 is a block diagram of a computer in which embodiments may be implemented.
- FIG. 14 illustrates a technique for determining a difference between an altitude of a mobile communication device and an altitude of a player in accordance with an embodiment described herein.
- references in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” or the like, indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to implement such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- Example embodiments are capable of performing multi-player augmented reality combat.
- Multi-player augmented reality combat is combat that is performed between multiple players using augmented reality.
- each player wears (or is otherwise associated with) an indicator (e.g., an object that has a designated pattern, a visual tag, etc.) that identifies the player or a team in which the player is included.
- Each player has a mobile communication device that is capable of identifying the indicators of the players. For example, a user of a mobile communication device may point a camera of the mobile communication device at another player (e.g., an opponent). The image that is captured by the camera includes the indicator that is associated with the opponent.
- the user may choose to fire a virtual bullet (e.g., spear, slug, cannon ball, dart, flames, buckshot, etc.) at the opponent when the camera is pointed at the opponent by providing an audio and/or tactile command to the user's mobile communication device.
- the mobile communication device determines an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the opponent based on the distance between the user and the opponent.
- locations of the players are determined using location signals, such as global positioning system (GPS) signals, without the need for each player to be associated with an indicator.
- a user's mobile communication device may be capable of determining that a camera of the mobile communication device is pointed at an opponent based on a location of the user, a location of the opponent, and an orientation of the mobile communication device.
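The pointing determination described above can be sketched as a bearing comparison: the device is pointed at the opponent if the bearing from the user's location to the opponent's location falls within the camera's field of view around the device's heading. This is a simplified planar sketch under assumed names and a assumed field-of-view width, not the patent's method:

```python
import math

def bearing_deg(from_xy, to_xy):
    """Compass-style bearing in degrees from one planar position to another
    (0 = +y axis, increasing clockwise)."""
    dx = to_xy[0] - from_xy[0]
    dy = to_xy[1] - from_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def camera_points_at(user_xy, opponent_xy, camera_heading_deg, fov_deg=30.0):
    """True if the opponent lies within the camera's horizontal field of view."""
    diff = abs(bearing_deg(user_xy, opponent_xy) - camera_heading_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # shortest angular separation
    return diff <= fov_deg / 2.0
```

A production version would use geodetic coordinates and the device's compass/accelerometer orientation rather than planar positions.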
- Techniques described herein for performing multi-player augmented reality combat have a variety of benefits as compared to conventional augmented reality gaming techniques.
- the techniques described herein enable players to interact with other players who exist in the physical real-world environment.
- Virtual environmental conditions may be introduced into the augmented reality that is perceived by the players. Such environmental conditions may affect the time that a virtual bullet takes to travel from a user's mobile communication device to an opponent whose indicator is identified by the user's mobile communication device.
- the environmental conditions may be controlled with respect to a particular user's augmented reality based on environmental control commands that are initiated by the user.
- the speed of a virtual bullet that is directed at a user may be controlled based on a speed control command that is initiated by the user.
- FIG. 1 is a block diagram of an example augmented reality combat system 100 in accordance with an embodiment described herein.
- augmented reality combat system 100 operates to perform multi-player augmented reality combat with respect to users of mobile communication devices based on commands that the users provide via the mobile communication devices.
- augmented reality combat system 100 operates to determine the time that each virtual bullet takes to reach the respective targeted player.
- augmented reality combat system 100 includes a plurality of mobile communication devices 102 A- 102 N, a network 104 , a device location system 106 , and server(s) 108 .
- Device location system 106 provides location signals to mobile communication devices 102 A- 102 N via respective links 112 A- 112 N in accordance with a wireless protocol, such as a mobile communication protocol, a global positioning system (GPS) protocol, or any other suitable protocol over a wireless network.
- Communications among mobile communication devices 102 A- 102 N and server(s) 108 are carried out over network 104 using well-known network communication protocols.
- Network 104 may be a wide-area network (e.g., the Internet), a local area network (LAN), another type of network, or a combination thereof. Communications between network 104 and mobile communication devices 102 A- 102 N are provided wirelessly via respective wireless links 110 A- 110 N.
- Mobile communication devices 102 A- 102 N are processing systems that are capable of communicating with server(s) 108 .
- An example of a processing system is a system that includes at least one processor that is capable of manipulating data in accordance with a set of instructions.
- a processing system may be a computer, a personal digital assistant, etc.
- Mobile communication devices 102 A- 102 N process data that is received from server(s) 108 via network 104 to display an augmented representation of the physical real-world environment to users of the mobile communication devices 102 A- 102 N.
- the augmented representation of the physical real-world environment is referred to herein as an augmented reality environment.
- the mobile communication devices 102 A- 102 N may combine virtual imagery, text, and/or any other type of data with images of the physical real-world environment to generate the augmented reality environment.
- augmented reality combat system 100 users register their identities (e.g., user names), player identifiers, device addresses, etc. using mobile communication devices 102 A- 102 N, so that their identifiers may be associated with their identities.
- Mobile communication devices 102 A- 102 N are capable of interpreting signals that are received from device location system 106 to determine their respective locations in the augmented reality environment.
- Each mobile communication device 102 A- 102 N may be configured to provide information regarding its location to server(s) 108 via network 104 in response to the mobile communication device being moved and/or periodically in accordance with a designated schedule.
- Mobile communication devices 102 A- 102 N are capable of interpreting commands that are received from users of the mobile communication devices 102 A- 102 N to perform virtual actions in the augmented reality environment. For instance, mobile communication devices 102 A- 102 N are capable of firing virtual bullets at users of other mobile communication devices in the augmented reality environment in response to user-initiated attack commands. Mobile communication devices 102 A- 102 N are capable of identifying the players based on player identifiers that correspond to the players. Multiple players can use the same identity and/or player identifier (e.g., to indicate that they belong to the same group). For example, each of mobile communication devices 102 A- 102 N may store a list that cross-references the players and the player indicators.
- the list is stored on (or otherwise accessible to) server(s) 108 , and each of mobile communication devices 102 A- 102 N is configured to access the list from server(s) 108 .
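The cross-reference list described above is essentially a lookup from recognized player indicators to registered identities. A minimal sketch follows; the tag IDs, identities, and team names are entirely hypothetical:

```python
# Hypothetical cross-reference list mapping player indicators (e.g. visual
# tag IDs recognized in camera images) to registered player identities.
ROSTER = {
    "tag-7A3F": {"identity": "alice", "team": "red"},
    "tag-9C21": {"identity": "bob", "team": "blue"},
    "tag-5E10": {"identity": "carol", "team": "blue"},
}

def identify_player(indicator_id):
    """Resolve a recognized indicator to a player record, or None."""
    return ROSTER.get(indicator_id)

def same_team(indicator_a, indicator_b):
    """True if both indicators resolve to players on the same team,
    supporting shared identifiers for group membership."""
    a, b = ROSTER.get(indicator_a), ROSTER.get(indicator_b)
    return a is not None and b is not None and a["team"] == b["team"]
```

Whether this table lives on each device or on the server(s) changes only where the lookup runs, not its shape.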
- Server(s) 108 is a processing system that is capable of communicating with mobile communication devices 102 A- 102 N.
- Server(s) 108 provides data to mobile communication devices 102 A- 102 N that are to be combined with images of the physical real-world environment to provide an augmented reality environment.
- Server(s) 108 processes commands that are received from mobile communication devices 102 A- 102 N, specifying actions to be taken with respect to objects in the augmented reality environment. For example, if a user of a first communication device 102 A provides a command to fire a virtual bullet at a user of a second communication device 102 B, server(s) may provide an indicator to second communication device 102 B that specifies that a virtual bullet is directed at the user of the second mobile communication device 102 B.
- server(s) 108 may update the virtual imagery of the augmented reality environment to show the virtual bullet travelling toward the user of the second communication device 102 B. For instance, when users aim cameras of their mobile communication devices at an area where the virtual bullet virtually exists, the mobile communication devices may draw the virtual bullet so that the users can see the virtual bullet on displays of their mobile communication devices.
- server(s) 108 is capable of modifying the augmented reality environment to include virtual environmental conditions.
- server(s) may control such environmental conditions with respect to one or more of the users in response to receiving user-initiated environmental control commands.
- server(s) 108 may introduce or eliminate a virtual environmental condition or reduce or increase the intensity of the virtual environmental condition with respect to a user upon receiving a user-initiated environmental control command regarding the environmental condition from a mobile communication device of the user.
- server(s) 108 may introduce or eliminate the virtual environmental condition or reduce or increase the intensity of the virtual environmental condition with respect to users other than the user who initiated the environmental control command upon receiving the environmental control command.
- mobile communication devices 102 A- 102 N are capable of modifying the augmented reality environment to include environmental conditions.
- server(s) 108 may store attributes of users that include environmental control capabilities. Each user's mobile communication device may store a copy of that user's attributes for permitting the user to utilize environmental control commands that are associated with the user's environmental control capabilities.
- the user may initiate an environmental control command using a button or touch screen of the user's mobile communication device, an audible command, or any other suitable technique.
- the user's mobile communication device may change virtual environmental parameters, such that the virtual environmental conditions that are associated with the parameters are incorporated into the versions of the augmented reality environment that are displayed to the other users.
- the environmental condition parameters may affect virtual objects, virtual attributes (e.g., health, visibility, etc.) of the user and/or other players, virtual bullet speed and/or direction, a hit effect that is associated with a virtual bullet, firing accuracy, range of explosion, or any other virtual characteristic of the augmented reality combat.
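One way to model environmental parameters affecting virtual bullet speed is as multiplicative factors applied per active condition. The condition names and multipliers below are illustrative assumptions, not values from the patent:

```python
# Sketch of virtual environmental conditions scaling bullet behaviour;
# all condition names and factors are hypothetical.
BASE_BULLET_SPEED = 50.0  # metres per second

CONDITION_SPEED_FACTORS = {
    "headwind": 0.8,   # slows bullets fired into the wind
    "tailwind": 1.2,   # speeds them up
    "rain": 0.9,       # dampens overall speed
}

def effective_bullet_speed(active_conditions):
    """Apply each active condition's multiplier to the base speed;
    unknown conditions leave the speed unchanged."""
    speed = BASE_BULLET_SPEED
    for condition in active_conditions:
        speed *= CONDITION_SPEED_FACTORS.get(condition, 1.0)
    return speed
```

The same pattern extends to other affected characteristics (hit effect, firing accuracy, explosion range) by keeping a factor table per characteristic.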
- Device location system 106 is a processing system that is configured to provide location signals to mobile communication devices 102 A- 102 N via respective links 112 A- 112 N.
- links 112 A- 112 N may be wireless links, GPS links, or any other suitable type of links.
- the location signals may specify the locations of the respective mobile communication devices 102 A- 102 N.
- the location signals may include information that may be used by the mobile communication devices 102 A- 102 N to determine their respective locations.
- Links 112 A- 112 N are shown in FIG. 1 to be unidirectional for illustrative purposes and are not intended to be limiting. It will be recognized that links 112 A- 112 N may be bidirectional.
- device location system 106 may provide ping signals to mobile communication devices 102 A- 102 N for determining locations of the respective mobile communication devices 102 A- 102 N.
- device location system 106 may receive response signals from the respective mobile communication devices 102 A- 102 N in response to the respective ping signals.
- Device location system 106 may determine a location of each mobile communication device 102 A- 102 N based on the time that elapses between providing the respective ping signal and receiving the respective response signal.
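The ping-based ranging described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name is hypothetical, and device processing delay is assumed to be negligible.

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # radio signals travel at roughly c

def distance_from_ping(ping_sent_s: float, response_received_s: float) -> float:
    """Estimate the distance to a mobile communication device from a ping round trip.

    The elapsed time covers the signal's trip to the device and back, so the
    one-way travel time is half the round trip. Any processing delay on the
    responding device is ignored here for simplicity.
    """
    round_trip_s = response_received_s - ping_sent_s
    one_way_s = round_trip_s / 2
    return one_way_s * SPEED_OF_LIGHT_M_S
```

For example, a round trip of two microseconds corresponds to a distance of roughly 300 meters.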
- each of the mobile communication devices 102 A- 102 N reports its location to server 108 via network 104 and accesses server 108 to determine the locations of the other mobile communication devices.
- a mobile communication device may compare its location to a location of another mobile communication device to calculate a distance therebetween.
- the locations of the respective mobile communication devices 102 A- 102 N, as indicated by the location signals that are provided by device location system 106 may be estimated locations. Accordingly, the calculated distances between the mobile communication devices 102 A- 102 N may be estimated distances.
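A mobile communication device comparing its (estimated) GPS coordinates to another device's coordinates might compute the distance between them with the haversine formula, as in this illustrative sketch (the function name is an assumption, not from the patent):

```python
import math

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Estimated great-circle distance in meters between two devices, given
    their (estimated) GPS coordinates in degrees, via the haversine formula."""
    r = 6_371_000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

Because the input coordinates are themselves estimates, the computed distance is an estimated distance, consistent with the preceding paragraph.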
- Device location system 106 may be capable of providing a positioning accuracy that is greater than the positioning accuracy that is allowed by government regulations and/or laws. For instance, the full positioning accuracy capabilities of device location system 106 may be reserved for military applications. If restrictions regarding the positioning accuracy of device location system 106 are not imposed, and/or GPS (or other positioning technique) allows for accurate positioning of approximately one meter or less, device location system 106 may provide substantially greater positioning accuracy. For example, the location of each user may be determined using merely the capabilities of device location system 106 , without using image recognition techniques.
- FIG. 2 is a block diagram of another example augmented reality combat system 200 in accordance with an embodiment described herein.
- Augmented reality combat system 200 is similar to the augmented reality combat system 100 shown in FIG. 1 , except that augmented reality combat system 200 does not include network 104 or server(s) 108 , and mobile communication devices 202 A- 202 N are processing systems that are capable of communicating with each other. Thus, communications between mobile communication devices 202 A- 202 N are provided wirelessly using well-known communication protocols that do not require a server.
- first mobile communication device 202 A and second mobile communication device 202 B communicate via wireless link 204 ; second mobile communication device 202 B and nth mobile communication device 202 N communicate via wireless link 206 ; nth mobile communication device 202 N and first mobile communication device 202 A communicate via wireless link 208 , and so on.
- a mobile communication device that performs an action with respect to the augmented reality environment may provide an indicator that specifies the action to each of the other mobile communication devices.
- Each mobile communication device 202 A- 202 N may provide information regarding its location to the other communication devices in response to that mobile communication device being moved and/or periodically in accordance with a designated schedule.
- the mobile communication devices that receive such indicators and/or information may update their respective displays of the augmented reality environment based on the indicators and/or the information.
- in accordance with another embodiment, augmented reality combat system 200 includes a network and a server.
- attributes that are associated with the players may be stored on (or otherwise accessible to) the server. Examples of attributes include but are not limited to identities of the players, network addresses of the mobile communication devices of the players, etc.
- Each mobile communication device may access the attributes that are stored on (or otherwise accessible to) the server via the network.
- a connection is established directly between mobile communication devices for communication therebetween.
- each mobile communication device communicates with the server, and the server is responsible for transferring communications from the originating mobile communication devices to the recipient mobile communication devices.
- Augmented reality combat systems 100 and 200 are provided for illustrative purposes and are not intended to be limiting. It will be recognized that any of the mobile communication devices described herein may communicate with each other directly and/or via server(s).
- FIG. 3 depicts a flowchart 300 of a method for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with an embodiment described herein.
- Flowchart 300 is described from the perspective of a mobile communication device.
- Flowchart 300 may be performed by any of mobile communication devices 102 A- 102 N of augmented reality combat system 100 shown in FIG. 1 , for example.
- flowchart 300 is described with respect to a mobile communication device 400 shown in FIG. 4 , which is an example of a mobile communication device 102 , according to an embodiment.
- mobile communication device 400 includes a camera 402 , an image recognition module 404 , a command receipt module 406 , an outgoing attack module 408 , a location module 410 , a location indicator receipt module 412 , a distance determination module 414 , a time determination module 416 , an environment module 418 , an environment control module 420 , an identification module 422 , a display module 424 , and an orientation determination module 426 .
- Flowchart 300 is described as follows.
- At step 302, an image of a physical real-world environment is captured.
- camera 402 captures the image.
- display module 424 may render the image for viewing by a user of mobile communication device 400 .
- the image may be augmented to include virtual objects, virtual environmental conditions, etc. before it is rendered, though the scope of the example embodiments is not limited in this respect.
- a player indicator is identified in the image.
- the player indicator corresponds to a player in the physical real-world environment.
- the player indicator may be a designated pattern that is provided on an article of the player's clothing or on another object that is associated with the player.
- the player indicator may be a visual tag that is associated with the player.
- the player indicator may be identified in substantially real-time as the image of the physical real-world environment is captured, though the scope of the example embodiments is not limited in this respect.
- image recognition module 404 identifies the player indicator.
- a user-initiated attack command is received in response to identifying the player indicator.
- the user of the mobile communication device may identify another player in the real world by looking at that player and aiming the device's camera at the player, whereupon the image recognition module identifies the second player.
- the player aims the center of the camera at the identified second player and initiates the attack command by saying a word or phrase that is associated with the attack command, pressing a button on the mobile communication device that is associated with the attack command, touching a touch screen of the mobile communication device at the position on the screen where the targeted player is located in a manner that is associated with the attack command (e.g., moving the user's finger up, down, right, left, or diagonally on the touch screen; touching the touch screen in a designated location that is associated with the attack command; etc.), shaking the mobile communication device, or using any other suitable technique.
- command receipt module 406 receives the user-initiated attack command.
- a user may initiate an attack command at any time, not only in response to identifying a player indicator.
- a user may initiate an attack command to fire a virtual bullet at a virtual object.
- a user may use a virtual weapon (e.g., a cannon) that has a substantially wide area of hit to fire at players without the need for identifying the players.
- the players may be fired upon even if the players are hiding (i.e., not in view of the user).
- location indicators that are associated with the players may be relied upon for determining the location of those players. Accordingly, a user-initiated attack command may be received at any time.
- an attack indicator is transmitted in response to receiving the user-initiated attack command.
- the attack indicator specifies that a virtual bullet is fired at the player.
- the attack indicator may be wirelessly transmitted in accordance with a mobile communication protocol.
- outgoing attack module 408 transmits the attack indicator to the central server or directly to the targeted player device.
- a location of the mobile communication device is determined.
- location module 410 determines the location of the mobile communication device.
- the location of the mobile communication device is determined based on location signals (e.g., global positioning system (GPS) signals, wireless signals received from a base station, etc.).
- each location signal may include a location indicator that specifies a location of its source and a time indicator that specifies a time at which the location signal was transmitted by its source.
- Location module 410 may combine the time indicators that are included in the respective location signals and the times at which the mobile communication device received the respective location signals to determine transmit times of the respective location signals.
- a transmit time is a duration of time for a location signal to travel from its source to the mobile communication device.
- Location module 410 may determine distances between the mobile communication device and the sources of the respective location signals based on the transmit times of the respective location signals.
- Location module 410 may combine the distances between the mobile communication device and the sources of respective location signals with the locations of the respective sources to determine the location of the mobile communication device. For instance, location module 410 may use a trilateration technique to combine the distances with the sources' locations. Trilateration is a technique for determining intersections of the surfaces of three spheres based on the centers and radii of the spheres. In accordance with this example embodiment, the centers of the spheres correspond to the locations of the respective sources, and the radii correspond to the distances between the mobile communication device and the respective sources.
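The trilateration step can be illustrated in two dimensions, where each distance defines a circle around its source and the device sits at the common intersection. This is a simplified sketch of the sphere-based technique described above (the 3-D case adds a third coordinate); the function name is illustrative.

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Locate a device from three source positions (x, y) and the
    distances r to each source, by solving the linearized system of
    the three circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2)
    e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y
```

Note that the three sources must not be collinear, or the denominators above vanish and the device's position is not uniquely determined.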
- the location of the mobile communication device is determined based on a location indicator that specifies the location of the mobile communication device.
- the location indicator may be received from one or more servers (e.g., server(s) 108).
- location module 410 receives a location indicator from the server(s) that specifies the location of the mobile communication device. Location module 410 interprets the location indicator to determine the location of the mobile communication device.
- sources may provide request signals to the mobile communication device.
- the mobile communication device may send response signals to the respective sources in response to the request signals.
- Each source may determine a distance between the mobile communication device and the source based on a duration of a time period between transmission of the respective request signal and receipt of the corresponding response signal.
- the wireless communication system may combine the distances between the mobile communication device and the respective sources with the locations of the respective sources to determine the location of the mobile communication device.
- the wireless communication system may then provide a location indicator that specifies the location of the mobile communication device to the mobile communication device, enabling the mobile communication device to determine its location based on the location indicator.
- the location of the mobile communication device is determined based on the strengths of signals that the mobile communication device receives from respective sources. For instance, a trilateration technique may be used to determine the location of the mobile communication device based on the signal strengths.
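One common way to convert received signal strengths into distances (before applying the trilateration technique) is the log-distance path loss model. The patent text does not specify a model, so this sketch and its calibration constants are assumptions.

```python
def distance_from_rssi(rssi_dbm: float,
                       rssi_at_1m_dbm: float = -40.0,
                       path_loss_exponent: float = 2.0) -> float:
    """Rough distance estimate in meters from received signal strength,
    using the log-distance path loss model:

        rssi = rssi_at_1m - 10 * n * log10(d)

    Solving for d gives the expression below. The reference strength at
    one meter and the exponent n are environment-dependent calibration
    values (n = 2 corresponds to free space).
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

The resulting distances to three or more sources can then be fed into a trilateration routine to estimate the device's location.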
- a player location indicator that specifies a location of the player is received.
- the player location indicator may be wirelessly received.
- the player location indicator may specify GPS coordinates that indicate the location of the player.
- location indicator receipt module 412 receives the player location indicator.
- each mobile communication device may send a player location indicator that specifies the location of the player that corresponds to that mobile communication device to a central server, so that other mobile communication devices may access the indicators on the server.
- each mobile communication device may send a player location indicator that specifies the location of the player that corresponds to that mobile communication device to the other mobile communication devices without routing the indicators through a server.
- a distance between the location of the mobile communication device and the location of the player is determined.
- distance determination module 414 determines the distance between the location of the mobile communication device and the location of the player.
- an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player is determined based on the determined distance.
- time determination module 416 determines the estimated duration of the time period for the virtual bullet to travel from the mobile communication device to the player.
- the estimated duration of the time period is determined based on a virtual environmental condition.
- examples of a virtual environmental condition include but are not limited to wind, sunlight, moonlight, darkness, rain, snow, hail, fog, a sand storm, etc.
- some environmental conditions (e.g., wind speed, wind direction, rain, snow, etc.) may affect the speed and/or direction of the virtual bullet, and therefore the estimated duration of the time period.
- Virtual environmental conditions are discussed in further detail below with reference to environment module 418 and environment control module 420 .
- the estimated duration of the time period is determined based on a difference between an altitude of the mobile communication device and an altitude of the player indicator.
- distance determination module 414 may provide a vector representation of the distance between the location of the mobile communication device and the location of the player that specifies a horizontal distance and a vertical distance between the location of the mobile communication device and the location of the player.
- time determination module 416 may determine the estimated duration of the time period based on the vector representation of the distance. Further description of an example technique for determining a difference between an altitude of the mobile communication device and an altitude of the player indicator is provided below with reference to orientation determination module 426 and FIG. 14 .
- the estimated duration of the time period is determined based on an attribute of a virtual weapon that is used to fire the virtual bullet.
- examples of an attribute of a virtual weapon that is used to fire the virtual bullet include but are not limited to a size and/or weight of the virtual bullet that is fired by the virtual weapon, a force with which the virtual weapon fires the virtual bullet, a condition of the virtual weapon, a type of the virtual weapon, etc.
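Combining the determined distance with a virtual environmental condition (here, wind) and a virtual weapon attribute (here, weapon condition), the estimated travel time might be computed as in this sketch. The specific factors and how they are weighted are illustrative assumptions, not taken from the patent.

```python
def bullet_travel_time_s(distance_m: float,
                         muzzle_speed_m_s: float,
                         wind_along_path_m_s: float = 0.0,
                         weapon_condition: float = 1.0) -> float:
    """Estimate the duration of the time period for a virtual bullet to
    travel from the mobile communication device to the player.

    The base estimate is distance / speed. A tailwind (positive wind
    component along the flight path) shortens the trip, a headwind
    lengthens it, and a degraded weapon (condition < 1.0) fires more
    slowly.
    """
    effective_speed = muzzle_speed_m_s * weapon_condition + wind_along_path_m_s
    if effective_speed <= 0:
        raise ValueError("virtual bullet never reaches the target at this speed")
    return distance_m / effective_speed
```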
- one or more steps 302 , 304 , 306 , 308 , 310 , 312 , 314 , and/or 316 of flowchart 300 may not be performed. Moreover, steps in addition to or in lieu of steps 302 , 304 , 306 , 308 , 310 , 312 , 314 , and/or 316 may be performed.
- Environment module 418 is configured to modify the image of the physical real-world environment to include one or more virtual environmental conditions.
- the image of the physical real-world environment may include virtual objects and/or other information in addition to the virtual environmental condition(s), though the scope of the example embodiments is not limited in this respect.
- some virtual environmental conditions (e.g., sunlight or mitigation of rain, snow, fog, etc.) may increase visibility with respect to the augmented reality environment.
- Other virtual environmental conditions (e.g., darkness, rain, snow, fog, etc.) may reduce visibility with respect to the augmented reality environment.
- still other virtual environmental conditions (e.g., wind, a sand storm, hail, etc.) may affect other virtual characteristics of the augmented reality combat, such as virtual bullet speed and/or direction.
- Environment control module 420 is configured to control virtual environmental conditions that are incorporated into the image of the physical real-world environment in response to user-initiated environmental control commands. For instance, a user may acquire a virtual power that enables the user to initiate commands for controlling one or more of the virtual environmental conditions. Environment control module 420 may mitigate or intensify an environmental condition with respect to the user's view of the augmented reality environment (or the views of the other players) upon receipt of a user-initiated environmental control command from the user. The environmental condition may be changed for a designated time period or indefinitely in response to the user's environmental control command.
- the user may request that virtual sunlight be provided with respect to the user's view of the augmented reality environment to enhance the user's visibility.
- Environment control module 420 may provide the virtual sunlight with respect to the user's view of the augmented reality environment, but not with respect to the views of the other users.
- the user may request that the intensity of a virtual snowstorm in the augmented reality environment be mitigated (or that the virtual snowstorm be terminated) with respect to the user's view of the augmented reality environment.
- Environment control module 420 may mitigate (or terminate) the virtual snowstorm with respect to the user's view of the augmented reality environment, but not with respect to the views of the other users.
- the user may request that virtual rain be provided with respect to the other players' views of the augmented reality environment to reduce visibility of the other players.
- Environment control module 420 may provide the virtual rain with respect to the other players' views, but not with respect to the view of the user who requested the rain.
- the user may request that virtual sunlight be removed from the other players' views of the augmented reality environment. Environment control module 420 may remove the virtual sunlight from the views of the other players, but not from the view of the user who initiated the request.
- environment control module 420 may mitigate or intensify an environmental condition with respect to all players' views of the augmented reality environment in response to a user's environment control command.
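A minimal sketch of the per-view environmental control described above, tracking a simple intensity value per player per condition. The class, method, and scope names ("self", "others", "all") are hypothetical, not from the patent text.

```python
class EnvironmentControl:
    """Tracks per-player intensities of virtual environmental conditions.

    An environmental control command may target the requesting user's own
    view, the other players' views, or all players' views, matching the
    examples above. A condition absent from a player's map is not shown
    in that player's view of the augmented reality environment.
    """

    def __init__(self, players):
        self.views = {p: {} for p in players}

    def apply(self, condition, intensity, requester, scope):
        """Set `condition` to `intensity` in the views selected by `scope`."""
        if scope == "self":
            targets = [requester]
        elif scope == "others":
            targets = [p for p in self.views if p != requester]
        else:  # "all"
            targets = list(self.views)
        for p in targets:
            self.views[p][condition] = intensity
```

For instance, a player who requests rain for the other players' views would call `apply("rain", 0.8, requester, "others")`, leaving the requester's own view unchanged.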
- Identification module 422 is configured to modify the image to include identification information regarding a player whose player indicator is identified in the image. Examples of identification information include but are not limited to a player's name, photograph, team affiliation, rank, experience level, virtual attack success rate, Twitter® account address, instant message address, score in the game, virtual shield type, virtual shield strength, health condition, virtual weapons available to the player and/or virtual weapon currently being used by the player, etc. Identification module 422 may modify the image to selectively include designated identification information regarding players in accordance with instructions received from the user of mobile communication device 400 .
- identification module 422 may modify the image to include all of the identification information regarding the players when mobile communication device 400 is pointed at the players (e.g., when image recognition module 404 recognizes a player and/or when camera 402 is pointed toward an area having coordinates that correspond to a location of the player).
- Display module 424 is configured to render images that are captured by camera 402 and/or modified by environment module 418 and/or identification module 422 .
- Orientation determination module 426 is configured to determine an orientation (e.g., tilt) of mobile communication device 400. For instance, if camera 402 is pointed at a player, orientation determination module 426 is capable of determining a difference between an altitude of mobile communication device 400 and an altitude of the player. For example, orientation determination module 426 may include an accelerometer for determining the orientation of mobile communication device 400. In accordance with an embodiment, distance determination module 414 determines the distance between the location of mobile communication device 400 and the location of the player based on the altitude difference that is determined by orientation determination module 426.
- FIG. 14 illustrates a technique for determining a difference (labeled as “A”) between an altitude of a mobile communication device and an altitude of a player in accordance with an embodiment described herein.
- a mobile communication device 1402 is pointed at a player 1404 .
- a camera of mobile communication device 1402 may be pointed at player 1404 .
- Element 1406 represents a two-dimensional (e.g., GPS) location of mobile communication device 1402 .
- Element 1408 represents a two-dimensional (e.g., GPS) location of player 1404 .
- the distance “B” represents a two-dimensional distance between mobile communication device 1402 and player 1404 .
- Element 1410 represents a three-dimensional location of mobile communication device 1402 . Accordingly, the distance “C” represents a three-dimensional distance between mobile communication device 1402 and player 1404 .
- the altitude of mobile communication device 1402 is shown with reference to the altitude of player 1404 for ease of discussion. It will be recognized that mobile communication device 1402 and player 1404 may have any respective altitudes.
- the three-dimensional distance "C" between mobile communication device 1402 and player 1404 may be determined in accordance with the following equation: C = B/sin(θ), where B is the two-dimensional distance "B" between mobile communication device 1402 and player 1404, and θ is the angle between lines A and C.
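The FIG. 14 geometry can be expressed directly in code: A is the vertical altitude difference, B the horizontal (two-dimensional) distance, C the line of sight, and θ the angle between lines A and C. The function names are illustrative.

```python
import math

def three_d_distance(two_d_distance_m: float, tilt_angle_rad: float) -> float:
    """Line-of-sight distance C from the two-dimensional distance B and the
    angle θ between the vertical line A and the line of sight C:
    C = B / sin(θ)."""
    return two_d_distance_m / math.sin(tilt_angle_rad)

def altitude_difference(two_d_distance_m: float, tilt_angle_rad: float) -> float:
    """Altitude difference A for the same geometry: A = B / tan(θ)."""
    return two_d_distance_m / math.tan(tilt_angle_rad)
```

When the device is level with the player (θ = 90°), C equals B and the altitude difference A is zero, as expected.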
- mobile communication device 400 of FIG. 4 may not include one or more of camera 402 , image recognition module 404 , command receipt module 406 , outgoing attack module 408 , location module 410 , location indicator receipt module 412 , distance determination module 414 , time determination module 416 , environment module 418 , environment control module 420 , identification module 422 , display module 424 , and/or orientation determination module 426 .
- mobile communication device 400 may include modules in addition to or in lieu of camera 402 , image recognition module 404 , command receipt module 406 , outgoing attack module 408 , location module 410 , location indicator receipt module 412 , distance determination module 414 , time determination module 416 , environment module 418 , environment control module 420 , identification module 422 , display module 424 , and/or orientation determination module 426 .
- FIGS. 5-8 show mobile communication devices 500 , 600 , 700 , and 800 that display example views of an augmented reality environment in accordance with embodiments described herein.
- mobile communication device 500 includes a display 502 that displays an image of a physical real-world environment. The image shows a player 504 who has a player indicator 506 affixed to his shirt. It will be recognized that player indicator 506 may be associated with player 504 in any suitable manner and need not necessarily be affixed to the player's person.
- Mobile communication device 500 is configured to identify player indicator 506 .
- mobile communication device 500 may provide a sensory signal to a user of mobile communication device 500 to indicate that player indicator 506 has been identified, though the scope of the example embodiments is not limited in this respect.
- a sensory signal is a signal that is perceptible by a human.
- the sensory signal may be an audio signal having a frequency in the audible spectrum (e.g., in a range between 20 hertz (Hz) and 20 kilohertz (kHz)), a visual signal having a frequency in the visible spectrum (e.g., in a range between 400 terahertz (THz) and 790 THz), a tactile signal, or any other signal that is human-perceptible.
- a tactile signal is a signal that a human is capable of perceiving using the sense of touch.
- a tactile signal may be provided using a vibration mechanism of mobile communication device 500 .
- mobile communication device 600 displays an image of an augmented reality environment that includes the physical real-world environment as shown in FIG. 5 with the addition of a virtual environmental condition.
- the virtual environmental condition in this example is rain 602 . It will be recognized that rain 602 may reduce the visibility of a user of mobile communication device 600 .
- FIG. 7 illustrates that a user may control a virtual environmental condition with respect to a view of the augmented reality environment that is displayed to the user.
- mobile communication device 700 displays the augmented reality environment as shown in FIG. 6 , except that rain 702 in FIG. 7 is a mitigated version of rain 602 that is shown in FIG. 6 .
- FIG. 7 illustrates that a user of mobile communication device 700 moves her finger 706 downward on a touch screen of mobile communication device 700 , as depicted by arrow 704 .
- the downward motion is interpreted by mobile communication device 700 as an environmental control command, in response to which mobile communication device 700 mitigates the intensity of rain 602 to provide rain 702.
- mitigation of the environmental condition in this example increases visibility with respect to the augmented reality environment.
- mobile communication device 800 displays an image of an augmented reality environment that includes the physical real-world environment as shown in FIG. 5 with the addition of identification information 802 regarding player 504 .
- Identification information 802 is shown to include a name of player 504 and a team affiliation of player 504 for illustrative purposes and is not intended to be limiting. Identification information 802 may include any suitable information regarding player 504 .
- FIG. 9 depicts a flowchart 900 of a method for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with an embodiment described herein.
- flowchart 900 is described with respect to a mobile communication device 1000 shown in FIG. 10 , which is an example of a mobile communication device 102 , according to an embodiment.
- mobile communication device 1000 includes an incoming attack module 1002 , a sensory signal module 1004 , an authorization determination module 1006 , a command determination module 1008 , a speed control module 1010 , and a stationary determination module 1012 .
- At step 902, an incoming attack indicator that specifies that a virtual bullet is directed at a user of a mobile communication device is received.
- the incoming attack indicator may be wirelessly received.
- incoming attack module 1002 receives the incoming attack indicator.
- a sensory signal is provided to the user in response to receiving the incoming attack indicator.
- the sensory signal may be an audio signal having a frequency in the audible spectrum (e.g., in a range between 20 hertz (Hz) and 20 kilohertz (kHz)), a visual signal having a frequency in the visible spectrum (e.g., in a range between 400 terahertz (THz) and 790 THz), a tactile signal, or any other signal that is human-perceptible.
- sensory signal module 1004 provides the sensory signal.
- At step 906, a determination is made whether the user is authorized to provide a speed control command. For example, the determination may be based on attributes of the user, attributes of a player who fired the virtual bullet at the user, and/or attributes of the game.
- authorization determination module 1006 determines whether the user is authorized to provide a speed control command. For example, the determination may be based on whether the user has acquired an attribute power that authorizes the user to provide a speed control command. In accordance with this example, if the user has acquired the power, authorization determination module 1006 determines that the user is authorized to provide a speed control command.
- authorization determination module 1006 determines that the user is not authorized to provide a speed control command. If the user is authorized to provide a speed control command, flow continues to step 908 . Otherwise, flowchart 900 ends.
- At step 908, a determination is made whether a user-initiated speed control command is received.
- command determination module 1008 determines whether a user-initiated speed control command is received. If a user-initiated speed control command is received, flow continues to step 910 . Otherwise, flow continues to step 912 .
- players are capable of having a power-blocking power that blocks another player's ability to utilize a power.
- a player who has a power-blocking power may block use of the speed control command described in step 908 , thereby preventing the user-initiated speed control command from being received.
- the speed of the virtual bullet is controlled in response to the user-initiated speed control command.
- the speed of the virtual bullet may be reduced in response to the user-initiated speed control command.
- the virtual bullet's reduced speed may provide the user more time to react to the virtual bullet.
- the user may view the virtual bullet on a display of the communication device and take action in the physical real-world environment to avoid being hit by the virtual bullet in the augmented reality environment.
- speed control module 1010 controls the speed of the virtual bullet.
- the user may utilize any of a variety of powers in an attempt to avoid being hit by the virtual bullet and/or mitigate an effect of being hit by the virtual bullet.
- the user may increase the strength of the user's virtual shield for a specified duration or indefinitely, increase a speed with which the user is capable of moving for a specified duration or indefinitely, etc.
- At step 912, a determination is made whether the virtual bullet is stationary. For example, stationary determination module 1012 determines whether the virtual bullet is stationary. If the virtual bullet is stationary, flowchart 900 ends. Otherwise, flow returns to step 908.
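The step 908/910/912 loop can be sketched as follows. The stream of command polls and the slow-down factor are hypothetical; the patent describes only that a received speed control command reduces the virtual bullet's speed and that the loop ends once the bullet is stationary.

```python
def track_incoming_bullet(initial_speed: float, poll_commands, slow_factor: float = 0.5) -> float:
    """Sketch of steps 908, 910, and 912: on each iteration, check whether a
    user-initiated speed control command was received (step 908); if so,
    reduce the virtual bullet's speed (step 910); stop once the bullet is
    effectively stationary (step 912).

    `poll_commands` is an iterable yielding True whenever a speed control
    command was received on that iteration. Returns the final bullet speed.
    """
    speed = initial_speed
    for command_received in poll_commands:
        if command_received:
            speed *= slow_factor  # step 910: control (reduce) the speed
        if speed < 1e-6:          # step 912: bullet is stationary
            break
    return speed
```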
- one or more steps 902 , 904 , 906 , 908 , 910 , and/or 912 of flowchart 900 may not be performed. Moreover, steps in addition to or in lieu of steps 902 , 904 , 906 , 908 , 910 , and/or 912 may be performed.
- mobile communication device 1000 may not include one or more of incoming attack module 1002 , sensory signal module 1004 , authorization determination module 1006 , command determination module 1008 , speed control module 1010 , and/or stationary determination module 1012 . Furthermore, mobile communication device 1000 may include modules in addition to or in lieu of incoming attack module 1002 , sensory signal module 1004 , authorization determination module 1006 , command determination module 1008 , speed control module 1010 , and/or stationary determination module 1012 .
- FIG. 11 depicts a flowchart 1100 of a method for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with an embodiment described herein.
- flowchart 1100 is described with respect to a mobile communication device 1200 shown in FIG. 12 , which is an example of a mobile communication device 102 , according to an embodiment.
- mobile communication device 1200 includes an injury determination module 1202 , a distance determination module 1204 , and a recovery control module 1206 . Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 1100 . Flowchart 1100 is described as follows.
- step 1102 a determination is made that a user has incurred a virtual injury.
- injury determination module 1202 determines that the user has incurred the virtual injury.
- a distance between a location of a mobile communication device of the user and a designated location is determined
- distance determination module 1204 determines the distance between the location of the mobile communication device of the user and the designated location.
- a rate at which the user recovers from the virtual injury is controlled based on the determined distance. For example, if the designated location is a location of a physical or virtual hospital, the rate at which the user recovers from the virtual injury may be inversely proportional to the distance between the location of the mobile communication device and the hospital. For instance, the user may recover more quickly if the user is closer to the hospital. The user may recover more slowly if the user is farther from the hospital. In another example, if the designated location is a location of a toxic dump site, the rate at which the user recovers from the virtual injury may be directly proportional to the distance between the location of the mobile communication device and the toxic dump site. For instance, the user may recover more quickly if the user is farther from the toxic dump site. The user may recover more slowly if the user is closer to the toxic dump site. In an example implementation, recovery control module 1206 controls the rate at which the user recovers from the virtual injury.
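The two distance-based recovery behaviors described above (inversely proportional near a hospital, directly proportional near a toxic dump site) can be sketched as follows; the function names, base rate, and scaling constants are illustrative assumptions, not values taken from the specification.

```python
# Hypothetical recovery-rate functions based on the distance between the
# mobile communication device and a designated location.

def recovery_rate_near_hospital(distance_m, base_rate=10.0):
    """Inversely proportional: the closer the device is to the hospital,
    the faster the player recovers (e.g., hit-grade points per minute)."""
    return base_rate / (1.0 + distance_m)


def recovery_rate_near_toxic_site(distance_m, base_rate=10.0, scale=0.001):
    """Directly proportional: the farther the device is from the toxic
    dump site, the faster the player recovers."""
    return base_rate * scale * distance_m
```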
- Each hit may be graded according to physical and/or virtual factors, including but not limited to the type of virtual bullet, the type of weapon that fired the virtual bullet, the distance traveled by the virtual bullet before it hit the player, collected tools that each player is using, etc.
- a higher hit grade corresponds to a higher extent of injury
- a lower hit grade corresponds to a lower extent of injury.
- each player may use a virtual shield that offers protection from some weapons and/or bullets. Each shield may be more (or less) effective against weapons and/or bullets in designated virtual environmental conditions.
- a hit grade may have a designated effect on a virtual condition of a player.
- the player's view of the augmented reality environment may be changed such that aiming a virtual weapon is more difficult for the player.
- Other ways in which the player's view may be changed include but are not limited to showing fog, showing blood on the display, covering the player's view of the augmented reality environment (or a portion thereof), causing the player's view of the augmented reality environment to be unstable, out of focus, zoomed out, zoomed in, etc.
- a player may recover from a virtual injury as time passes or by collecting and/or using virtual curing objects. As the player recovers, the player's hit grade decreases.
- Each curing object, each virtual shield, and each unit of time (e.g., second, minute, etc.) may have a respective designated curing effect.
- some players may have virtual healing tools that they may use to heal other players. For example, a player who possesses a healing tool may stand near a virtually wounded player in the physical real-world environment to assist the recovery of the wounded player in the augmented reality environment.
- a player who possesses a virtual healing weapon may fire a virtual healing bullet at a wounded player to assist the recovery of the wounded player. When a player's hit grade reaches an upper threshold, the player may be considered as virtually dead. Any suitable technique may be used to revive the player from the deceased state.
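The hit-grade bookkeeping described above (graded hits, shield mitigation, curing effects, and an upper threshold at which a player is considered virtually dead) can be sketched as follows. The grading formula, the distance falloff, and the `DEAD_THRESHOLD` value are all illustrative assumptions.

```python
# Hypothetical hit-grade model; weights and threshold are assumptions.

DEAD_THRESHOLD = 100  # hit grade at which a player is considered virtually dead


class Player:
    def __init__(self):
        self.hit_grade = 0  # 0 = unhurt; higher = more severely injured

    def take_hit(self, bullet_damage, distance_m, shield_factor=1.0):
        """Grade a hit by physical/virtual factors: damage falls off with
        the distance the virtual bullet traveled and may be reduced by a
        virtual shield (shield_factor < 1.0)."""
        falloff = max(0.1, 1.0 - distance_m / 1000.0)
        self.hit_grade += int(bullet_damage * falloff * shield_factor)

    def cure(self, curing_effect):
        """Curing objects, healing bullets, or elapsed time each apply a
        designated curing effect that decreases the hit grade."""
        self.hit_grade = max(0, self.hit_grade - curing_effect)

    def is_dead(self):
        return self.hit_grade >= DEAD_THRESHOLD
```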
- each type of virtual weapon and/or virtual bullet may have respective characteristics.
- One type of characteristic is a hit effect.
- a hit effect is an effect that results when a virtual bullet that is fired by a virtual weapon hits an object.
- a virtual shotgun may have a wider diameter of hit, but a lower range, as compared to a virtual sniper rifle.
- a virtual cannon may have a substantially wide area of hit.
- a group of players may be targeted based solely on player location indicator(s) associated with the players, without the need for identifying the players using an image recognition technique. Targeting players in this manner may be useful if the players are hiding in such a way that a camera cannot be used to capture an image of the players.
- a virtual sniper rifle may be used to fire a relatively fast and accurate virtual bullet that is less affected by virtual environmental conditions and/or user-initiated speed control commands.
- Any suitable type of virtual weapons, virtual bullets, and/or virtual tools having user-defined characteristics may be created in the augmented reality environment. Such creations may be acquired by players and used in the augmented reality environment to affect targeting, shooting, and hitting simulation in the augmented reality environment.
- Virtual items may be located throughout the augmented reality environment.
- the virtual items may represent points, virtual weapons, virtual tools, and/or any other suitable virtual items.
- Players may collect a virtual item in any of a variety of ways. For example, a player may collect a virtual item by moving in the physical real-world environment such that the player moves closer to a virtual location of the virtual item in the augmented reality environment. Each virtual item has a location in the real world, and when a device location system (e.g., device location system 106 ) determines that the user is close enough to the virtual item's location, the user collects the virtual item.
- the player may fire a virtual bullet that hits the virtual item.
- the player may collect a virtual item by trading points for the virtual item.
- the player may collect a virtual item by purchasing the virtual item using virtual money, real money, or a combination thereof.
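The proximity-based collection described above can be sketched as a distance check between the device location and the virtual item's location. The pickup radius and the equirectangular distance approximation (reasonable for nearby points) are illustrative assumptions.

```python
# Hypothetical proximity check for collecting a virtual item.
import math

PICKUP_RADIUS_M = 5.0  # illustrative pickup radius


def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance between nearby lat/lon points
    (equirectangular approximation)."""
    k = 111_320.0  # approximate meters per degree of latitude
    dx = (lon2 - lon1) * k * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * k
    return math.hypot(dx, dy)


def try_collect(player_pos, item_pos, radius=PICKUP_RADIUS_M):
    """Return True if the device location is close enough to the item."""
    return distance_m(*player_pos, *item_pos) <= radius
```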
- Players may acquire points by completing tasks; firing virtual bullets that hit other players; collecting virtual points; purchasing the points using virtual money, real money, or a combination thereof; etc.
- the points may be traded for virtual items, virtual money, real money, or a combination thereof.
- Players may view a map showing some or all of the locations of the other players in the augmented reality environment. For instance, players may see the locations of their team members. Players may leave virtual markers, notes, voice messages, recordings, and/or virtual items on the map for other players to pick up. In another example, players can leave traps for other players, such as virtual mines, virtual grenades, etc., that may be triggered when those players come within a designated proximity of the traps. Players may communicate using audio and/or video conferencing. A variety of other features that are known in the relevant art(s) (e.g., the computer gaming art) may be incorporated into the multi-player augmented reality combat techniques described herein.
- the direction of a virtual bullet may be controlled by a user who fires the virtual bullet.
- the user may move the camera of the user's mobile communication device to cause the virtual bullet to shift direction toward the new camera direction.
- the ability to control the bullet may be a property of the bullet, the virtual weapon that is used to fire the virtual bullet, and/or a power that is associated with a virtual item that is collected by the user. Controlling the virtual bullet direction is similar to controlling a guided missile that can be used to target players as the players move (e.g., to get away from the virtual bullet) or to target players who are hiding behind shelters.
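The guided-bullet behavior described above can be sketched as a per-update steering rule: on each guidance update the bullet's heading shifts a bounded amount toward the current camera direction. The turn-rate limit and function name are illustrative assumptions.

```python
# Hypothetical guidance update for a user-controlled virtual bullet.

def steer_toward(bullet_heading_deg, camera_heading_deg, max_turn_deg=10.0):
    """Return the bullet's new heading after one guidance update,
    turning at most `max_turn_deg` toward the camera direction."""
    # Signed smallest angular difference, in (-180, 180].
    diff = (camera_heading_deg - bullet_heading_deg + 180.0) % 360.0 - 180.0
    turn = max(-max_turn_deg, min(max_turn_deg, diff))
    return (bullet_heading_deg + turn) % 360.0
```

Repeatedly applying this update as the user moves the camera gives the guided-missile-like behavior described above.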
- a user may initiate virtual mobile controlled objects that may be controlled by the user.
- virtual mobile controlled objects include but are not limited to virtual aircraft (e.g., helicopters, airplanes, gliders, steerable balloons, etc.), virtual vessels (e.g., ships, submarines, etc.), virtual land vehicles, etc.
- Each virtual mobile controlled object may have a corresponding speed, a duration of availability, and a pre-defined source (i.e., virtual base).
- the user may control the navigation of the virtual mobile controlled objects.
- a user of a virtual mobile controlled object may view location indicators of other players within a designated range, according to the location of each player's mobile communication device and the type of the virtual mobile controlled object.
- a virtual airplane may carry virtual bombs that the user can release on top of other players.
- the other players' mobile communication devices may have object indicators that are capable of indicating that a virtual mobile controlled object is nearby.
- the object indicators may provide a sound, cause the virtual mobile controlled object to be rendered on a screen of a mobile communication device when a camera of the mobile communication device is pointed at the virtual mobile controlled object, etc.
- the players may fire virtual bullets at the virtual mobile controlled object in order to hit it.
- Other types of virtual mobile controlled objects may be virtual soldiers or any other type of virtual mobile controlled object that a user may control. If a virtual mobile controlled object does not return to its virtual base within a specified time, the virtual mobile controlled object may be disabled. It is possible for a user to collect virtual mobile controlled objects like any other virtual item described herein.
- players may trigger virtual weapons, such as artillery, remotely.
- Each such virtual weapon has a specified range from its source position.
- a user may define the source position of a remotely triggered virtual weapon with respect to the user's location.
- the mobile communication devices of the players who are in the area of the expected hit may generate a sensory signal, such as a sound, vibration of the players' devices, etc. as described in previous examples.
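The remote-artillery alert described above can be sketched as a range check around the expected impact point: every player whose device lies within the blast radius receives a sensory signal. The planar coordinates and names are illustrative assumptions.

```python
# Hypothetical selection of devices that should generate a sensory signal
# (sound, vibration, etc.) for a remotely triggered virtual weapon.
import math


def players_to_alert(impact_xy, blast_radius_m, player_positions):
    """Return the ids of players within the area of the expected hit.

    `player_positions` maps player id -> (x, y) position in meters."""
    ix, iy = impact_xy
    return sorted(pid for pid, (x, y) in player_positions.items()
                  if math.hypot(x - ix, y - iy) <= blast_radius_m)
```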
- an augmented reality combat system may include a station controlled by a connected communication device that is connected to the network.
- the station may act as a command center that displays the positions of the players on a map, statuses of the players, etc.
- the command center may use a relatively large screen and/or a more powerful computer system that can receive and process substantial amounts of data.
- a user who uses the command center can assist the other players to perform and act as a team.
- the command center may collect information about the positions of opponents whose locations are being provided by their mobile communication devices or by virtual mobile controlled objects.
- the command center may display prior locations of the players, so that the user who uses the command center may have a better understanding of the movement patterns of the opponent players.
- players may be capable of using camera zoom capabilities to facilitate identification of player indicators.
- the ability to use the camera zoom capabilities may be based on possession of specified attributes, use of specified weapons, or any other suitable criteria. For example, although a virtual cannon may not use zoom, a virtual sniper rifle may.
- Each mobile communication device may be mounted on or incorporated in a physical device that is shaped like a weapon. Displays of the mobile communication devices may be presented using glasses. For instance, cameras of the mobile communication devices may be attached to the glasses, so that the players may point by looking in a direction. Any of the devices and/or components thereof may be carried on a player's person or in accessories that are available to the player.
- Elements of example augmented reality combat systems 100 and 200 , including server(s) 108 depicted in FIG. 1 , device location system 106 depicted in FIGS. 1 and 2 , any of the mobile communication devices 102 A- 102 N depicted in FIGS. 1 and 2 , and any of mobile communication devices 400 , 500 , 600 , 700 , 800 , 1000 , and 1200 depicted in respective FIGS. 4-8 , 10 , and 12 , may be implemented using a computer such as computer 1300 shown in FIG. 13 .
- Computer 1300 can be any commercially available and well known computer capable of performing the functions described herein, such as computers available from Apple, Dell, Gateway, HP, International Business Machines, Sony, etc.
- Computer 1300 may be any type of computer, including a desktop computer, a server, etc.
- computer 1300 includes one or more processors (e.g., central processing units (CPUs)), such as processor 1306 .
- processor 1306 may include camera 402 , image recognition module 404 , command receipt module 406 , outgoing attack module 408 , location module 410 , location indicator receipt module 412 , distance determination module 414 , time determination module 416 , environment module 418 , environment control module 420 , identification module 422 , display module 424 , and/or orientation determination module 426 of FIG. 4 ; incoming attack module 1002 , sensory signal module 1004 , authorization determination module 1006 , command determination module 1008 , speed control module 1010 , and/or stationary determination module 1012 of FIG.
- Processor 1306 is connected to a communication infrastructure 1302 , such as a communication bus. In some embodiments, processor 1306 can simultaneously operate multiple computing threads.
- Computer 1300 also includes a primary or main memory 1308 , such as a random access memory (RAM).
- Main memory has stored therein control logic 1324 A (computer software), and data.
- Computer 1300 also includes one or more secondary storage devices 1310 .
- Secondary storage devices 1310 include, for example, a hard disk drive 1312 and/or a removable storage device or drive 1314 , as well as other types of storage devices, such as memory cards and memory sticks.
- computer 1300 may include an industry standard interface, such as a universal serial bus (USB) interface for interfacing with devices such as a memory stick.
- Removable storage drive 1314 represents a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup, etc.
- Removable storage drive 1314 interacts with a removable storage unit 1316 .
- Removable storage unit 1316 includes a computer useable or readable storage medium 1318 having stored therein computer software 1324 B (control logic) and/or data.
- Removable storage unit 1316 represents a floppy disk, magnetic tape, compact disc (CD), digital versatile disc (DVD), Blu-ray disc, optical storage disk, memory stick, memory card, or any other computer data storage device.
- Removable storage drive 1314 reads from and/or writes to removable storage unit 1316 in a well known manner.
- Computer 1300 also includes input/output/display devices 1304 , such as monitors, keyboards, pointing devices, etc.
- Computer 1300 further includes a communication or network interface 1320 .
- Communication interface 1320 enables computer 1300 to communicate with remote devices.
- communication interface 1320 allows computer 1300 to communicate over communication networks or mediums 1322 (representing a form of a computer useable or readable medium), such as local area networks (LANs), wide area networks (WANs), the Internet, etc.
- Network interface 1320 may interface with remote sites or networks via wired or wireless connections. Examples of communication interface 1320 include but are not limited to a modem, a network interface card (e.g., an Ethernet card), a communication port, a Personal Computer Memory Card International Association (PCMCIA) card, etc.
- Control logic 1324 C may be transmitted to and from computer 1300 via the communication medium 1322 .
- Any apparatus or manufacture comprising a computer useable or readable medium having control logic (software) stored therein is referred to herein as a computer program product or program storage device.
- each of the elements of example mobile communication device 400 depicted in FIG. 4 including camera 402 , image recognition module 404 , command receipt module 406 , outgoing attack module 408 , location module 410 , location indicator receipt module 412 , distance determination module 414 , time determination module 416 , environment module 418 , environment control module 420 , identification module 422 , display module 424 , and orientation determination module 426 ; each of the elements of example mobile communication device 1000 depicted in FIG. 10 , including incoming attack module 1002 , sensory signal module 1004 , authorization determination module 1006 , command determination module 1008 , speed control module 1010 , and stationary determination module 1012 ; each of the elements of example mobile communication device 1200 depicted in FIG. 12 , including injury determination module 1202 , distance determination module 1204 , and recovery control module 1206 ;
- each of the steps of flowcharts 300 , 900 , and 1100 depicted in respective FIGS. 3 , 9 , and 11 can be implemented as control logic that may be stored on a computer useable medium or computer readable medium, which can be executed by one or more processors to operate as described herein.
- the invention can be put into practice using software, hardware, and/or operating system implementations other than those described herein. Any software, hardware, and operating system implementations suitable for performing the functions described herein can be used.
Abstract
Techniques are described herein for performing multi-player augmented reality combat. Each player wears (or is otherwise associated with) an indicator (e.g., an object that has a designated pattern, a visual tag, etc.) that identifies the player or a team thereof. Each player has a mobile communication device that is capable of identifying the players' indicators. A user may point a camera of the user's mobile communication device at another player (e.g., an opponent). The image that is captured by the camera includes the indicator that is associated with the opponent. The user may choose to fire a virtual bullet at the opponent using an audio and/or tactile command, which is processed by the user's mobile communication device. The mobile communication device determines the time that the virtual bullet takes to travel from the mobile communication device to the opponent based on the distance between the user and the opponent.
Description
- This application claims the benefit of U.S. Provisional Application No. 61/289,881, filed on Dec. 23, 2009, which is incorporated by reference herein in its entirety.
- 1. Field of the Invention
- The present invention relates to techniques for performing multi-player augmented reality combat.
- 2. Background
- Augmented reality is a representation of a physical real-world environment that is combined with (i.e., augmented by) virtual (i.e., computer-generated) imagery. The augmentation of the physical real-world environment is usually performed in real-time. For example, the augmented reality may be displayed to a user via a live-video stream. The user may view the live-video stream using any suitable type of display, such as a head-mounted display, a handheld display, a virtual retinal display, etc.
- Augmented reality systems commonly include hand-held devices each of which has network capabilities, a camera, and a display. The cameras capture the physical real-world environment, and the displays display the physical real-world environment in combination with virtual objects. Virtual objects may include text, images, or any other computer-generated object.
- Augmented reality may be used in a variety of applications. For example, augmented reality may be used to create a virtual object in a museum, an exhibition, or a theme park attraction. In another example, labels or text (e.g., operating instructions) may be superimposed on an object or parts thereof, such as surgical instruments or aircraft controls. In yet another example, virtual imagery of a digital mock-up may be compared side-by-side to a physical mock-up to determine discrepancies therebetween.
- Another application in which augmented reality may be used is gaming. In conventional augmented reality games, each user's hand-held device includes a camera and a display. A user's camera captures images of physical real-world objects using pre-defined patterns. The user's display displays the images of the physical real-world objects in combination with virtual objects. For example, the virtual objects may be superimposed on an image of the user's physical real-world environment. The user is often able to interact with the virtual objects. For instance, the user may use a virtual weapon to fire virtual bullets at the virtual objects. The virtual weapon may be associated with a virtual targeting axis, for example, that points to a location at which the virtual bullets are to be fired. For instance, the display may display the virtual targeting axis in the augmented reality. However, the virtual objects with which the user interacts typically do not correspond to physical objects in the physical real-world environment. Accordingly, users of conventional augmented reality games traditionally are not able to perform augmented reality combat with another person.
- Thus, systems, methods, and computer program products are needed that are capable of performing multi-player augmented reality combat.
- Various approaches are described herein for, among other things, performing multi-player augmented reality combat. Multi-player augmented reality combat is combat that is performed between multiple players using augmented reality. Each player wears (or is otherwise associated with) an indicator (e.g., an object that has a designated pattern, a visual tag, etc.) that identifies the player or a team in which the player is included. Each player has a mobile communication device, which executes software that enables the mobile communication device to identify the indicators of the players. For example, a user of a mobile communication device may point a camera of the mobile communication device at another player (e.g., an opponent). The image that is captured by the camera includes the indicator that is associated with the opponent. The user may choose to fire a virtual bullet (e.g., spear, slug, cannon ball, dart, flames, buckshot, etc.) at the opponent by providing an audio and/or tactile command to the user's mobile communication device. The mobile communication device determines an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the opponent based on the distance between the user and the opponent. For example, the distance between the player and the opponent can be calculated according to their positions, which may be determined by respective location modules.
- An example mobile communication device is described. The mobile communication device includes a camera, an image recognition module, a display, and an outgoing attack module. The camera is configured to capture an image. The image recognition module is configured to identify a player indicator in the image, the player indicator corresponding to an opponent player. The display is configured to display the image. The outgoing attack module is configured to transmit an outgoing attack indicator in accordance with a mobile communication protocol in response to a user-initiated attack command that is received in response to identification of the player indicator. The outgoing attack indicator specifies that a virtual bullet is fired at the opponent player.
- The example mobile communication device may further include a location module, a distance determination module, and a time determination module. The location module is configured to determine a location of the mobile communication device. For example, the location of the mobile communication device may be associated with a player indicator of the user of the mobile communication device. The location of the mobile communication device is available to mobile communication devices of other players for distance calculation via a central server, P2P communication, or any other communication technique. The distance determination module is configured to determine a distance between the location of the user's mobile communication device and a location of the opponent player whose player indicator is identified by the image recognition module. The time determination module is configured to determine an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player based on the determined distance.
- An example method for performing multi-player augmented reality combat with respect to a user of a mobile communication device is also described. In accordance with this method, an image is captured and displayed. A player indicator is identified in the image. The player indicator corresponds to an opponent player. A user-initiated attack command is received in response to identifying the player indicator. An attack indicator is transmitted in accordance with a mobile communication protocol in response to receiving the user-initiated attack command. The attack indicator specifies that a virtual bullet is fired at the opponent player.
- In some aspects, a location of the mobile communication device is determined. A player location indicator that specifies a location of the opponent player is received. A distance between the location of the mobile communication device and the location of the opponent player is determined. An estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the opponent player is determined based on the determined distance.
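The time determination described above reduces to dividing the determined distance by the virtual bullet's speed. The following is a minimal sketch; the nominal bullet speed is an illustrative assumption (different virtual weapons would use different speeds).

```python
# Hypothetical time determination: travel time of a virtual bullet
# from the mobile communication device to the opponent player.

def estimated_travel_time_s(distance_m, bullet_speed_mps=300.0):
    """Return the estimated duration, in seconds, for the virtual bullet
    to travel the determined distance at its nominal speed."""
    return distance_m / bullet_speed_mps
```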
- A computer program product is also described. The computer program product includes a computer-readable medium having computer program logic recorded thereon for enabling a processor-based system to perform multi-player augmented reality combat with respect to a user of a mobile communication device. The computer program logic includes first, second, and third program logic modules. The first program logic module is for enabling the processor-based system to capture an image. The second program logic module is for enabling the processor-based system to identify a player indicator in the image. The player indicator corresponds to a player. The third program logic module is for enabling the processor-based system to transmit an outgoing attack indicator in accordance with a mobile communication protocol in response to a user-initiated attack command that is received in response to identification of the player indicator. The outgoing attack indicator specifies that a virtual bullet is fired at the player.
- The computer program logic may further include fourth, fifth, and sixth program logic modules. The fourth program logic module is for enabling the processor-based system to determine a location of the mobile communication device. The fifth program logic module is for enabling the processor-based system to determine a distance between the location of the mobile communication device and a location of the player. The sixth program logic module is for enabling the processor-based system to determine an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player based on the determined distance.
- Further features and advantages of the disclosed technologies, as well as the structure and operation of various embodiments, are described in detail below with reference to the accompanying drawings. It is noted that the invention is not limited to the specific embodiments described herein. Such embodiments are presented herein for illustrative purposes only. Additional embodiments will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.
- The accompanying drawings, which are incorporated herein and form part of the specification, illustrate embodiments of the present invention and, together with the description, further serve to explain the principles involved and to enable a person skilled in the relevant art(s) to make and use the disclosed technologies.
-
FIGS. 1 and 2 are block diagrams of example augmented reality combat systems in accordance with embodiments described herein. -
FIGS. 3 , 9, and 11 depict flowcharts of methods for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with embodiments described herein. -
FIGS. 4 , 10, and 12 are block diagrams of example implementations of a mobile communication device shown inFIG. 1 in accordance with embodiments described herein. -
FIGS. 5-8 show mobile communication devices that display example views of an augmented reality environment in accordance with embodiments described herein. -
FIG. 13 is a block diagram of a computer in which embodiments may be implemented. -
FIG. 14 illustrates a technique for determining a difference between an altitude of a mobile communication device and an altitude of a player in accordance with an embodiment described herein.
- The features and advantages of the disclosed technologies will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the corresponding reference number.
- The following detailed description refers to the accompanying drawings that illustrate exemplary embodiments of the present invention. However, the scope of the present invention is not limited to these embodiments, but is instead defined by the appended claims. Thus, embodiments beyond those shown in the accompanying drawings, such as modified versions of the illustrated embodiments, may nevertheless be encompassed by the present invention.
- References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” or the like, indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to implement such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- Example embodiments are capable of performing multi-player augmented reality combat. Multi-player augmented reality combat is combat that is performed between multiple players using augmented reality. In some example embodiments, each player wears (or is otherwise associated with) an indicator (e.g., an object that has a designated pattern, a visual tag, etc.) that identifies the player or a team in which the player is included. Each player has a mobile communication device that is capable of identifying the indicators of the players. For example, a user of a mobile communication device may point a camera of the mobile communication device at another player (e.g., an opponent). The image that is captured by the camera includes the indicator that is associated with the opponent. The user may choose to fire a virtual bullet (e.g., spear, slug, cannon ball, dart, flames, buckshot, etc.) at the opponent when the camera is pointed at the opponent by providing an audio and/or tactile command to the user's mobile communication device. The mobile communication device determines an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the opponent based on the distance between the user and the opponent.
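The travel-time estimate described above can be sketched as a simple distance-over-speed computation. The function name and the nominal bullet speed below are illustrative assumptions, not values taken from the specification:

```python
# Hypothetical sketch: estimate a virtual bullet's travel time from the
# distance between shooter and target. The nominal speed is illustrative.
def estimated_travel_time(distance_m: float, bullet_speed_mps: float = 150.0) -> float:
    """Return the estimated flight duration in seconds for a virtual bullet.

    distance_m: estimated distance between the two players, in meters.
    bullet_speed_mps: nominal speed of the virtual projectile, in m/s.
    """
    if distance_m < 0 or bullet_speed_mps <= 0:
        raise ValueError("distance must be non-negative and speed positive")
    return distance_m / bullet_speed_mps

# A 300 m shot at 150 m/s takes an estimated 2.0 seconds.
print(estimated_travel_time(300.0))  # → 2.0
```

In practice the estimate would be refined by weapon attributes and virtual environmental conditions, as discussed later in the specification.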
- In other example embodiments, locations of the players are determined using location signals, such as global positioning system (GPS) signals, without the need for each player to be associated with an indicator. For instance, a user's mobile communication device may be capable of determining that a camera of the mobile communication device is pointed at an opponent based on a location of the user, a location of the opponent, and an orientation of the mobile communication device.
- Techniques described herein for performing multi-player augmented reality combat have a variety of benefits as compared to conventional augmented reality gaming techniques. For example, the techniques described herein enable players to interact with other players who exist in the physical real-world environment. Virtual environmental conditions may be introduced into the augmented reality that is perceived by the players. Such environmental conditions may affect the time that a virtual bullet takes to travel from a user's mobile communication device to an opponent whose indicator is identified by the user's mobile communication device. The environmental conditions may be controlled with respect to a particular user's augmented reality based on environmental control commands that are initiated by the user. The speed of a virtual bullet that is directed at a user may be controlled based on a speed control command that is initiated by the user.
-
FIG. 1 is a block diagram of an example augmented reality combat system 100 in accordance with an embodiment described herein. Generally speaking, augmented reality combat system 100 operates to perform multi-player augmented reality combat with respect to users of mobile communication devices based on commands that the users provide via the mobile communication devices. In accordance with example embodiments, when users (i.e., players) provide commands to fire virtual bullets at other players, augmented reality combat system 100 operates to determine the time that each virtual bullet takes to reach the respective targeted player. - As shown in
FIG. 1, augmented reality combat system 100 includes a plurality of mobile communication devices 102A-102N, a network 104, a device location system 106, and server(s) 108. Device location system 106 provides location signals to mobile communication devices 102A-102N via respective links 112A-112N in accordance with a wireless protocol, such as a mobile communication protocol, a global positioning system (GPS) protocol, or any other suitable protocol over a wireless network. Communications among mobile communication devices 102A-102N and server(s) 108 are carried out over network 104 using well-known network communication protocols. Network 104 may be a wide-area network (e.g., the Internet), a local area network (LAN), another type of network, or a combination thereof. Communications between network 104 and mobile communication devices 102A-102N are provided wirelessly via respective wireless links 110A-110N. -
Mobile communication devices 102A-102N are processing systems that are capable of communicating with server(s) 108. An example of a processing system is a system that includes at least one processor that is capable of manipulating data in accordance with a set of instructions. For instance, a processing system may be a computer, a personal digital assistant, etc. Mobile communication devices 102A-102N process data that is received from server(s) 108 via network 104 to display an augmented representation of the physical real-world environment to users of the mobile communication devices 102A-102N. The augmented representation of the physical real-world environment is referred to herein as an augmented reality environment. For instance, the mobile communication devices 102A-102N may combine virtual imagery, text, and/or any other type of data with images of the physical real-world environment to generate the augmented reality environment. - To initialize augmented
reality combat system 100, users register their identities (e.g., user names), player identifiers, device addresses, etc. using mobile communication devices 102A-102N, so that their identifiers may be associated with their identities. Mobile communication devices 102A-102N are capable of interpreting signals that are received from device location system 106 to determine their respective locations in the augmented reality environment. Each mobile communication device 102A-102N may be configured to provide information regarding its location to server(s) 108 via network 104 in response to the mobile communication device being moved and/or periodically in accordance with a designated schedule. -
Mobile communication devices 102A-102N are capable of interpreting commands that are received from users of the mobile communication devices 102A-102N to perform virtual actions in the augmented reality environment. For instance, mobile communication devices 102A-102N are capable of firing virtual bullets at users of other mobile communication devices in the augmented reality environment in response to user-initiated attack commands. Mobile communication devices 102A-102N are capable of identifying the players based on player identifiers that correspond to the players. Multiple players can use the same identity and/or player identifier (e.g., to indicate that they belong to the same group). For example, each of mobile communication devices 102A-102N may store a list that cross-references the players and the player identifiers. In another example, the list is stored on (or otherwise accessible to) server(s) 108, and each of mobile communication devices 102A-102N is configured to access the list from server(s) 108. Techniques for performing multi-player augmented reality combat are described in further detail in the following discussion. - Server(s) 108 is a processing system that is capable of communicating with
mobile communication devices 102A-102N. Server(s) 108 provides data to mobile communication devices 102A-102N that are to be combined with images of the physical real-world environment to provide an augmented reality environment. Server(s) 108 processes commands that are received from mobile communication devices 102A-102N, specifying actions to be taken with respect to objects in the augmented reality environment. For example, if a user of a first communication device 102A provides a command to fire a virtual bullet at a user of a second communication device 102B, server(s) 108 may provide an indicator to second communication device 102B that specifies that a virtual bullet is directed at the user of the second mobile communication device 102B. In accordance with this example, server(s) 108 may update the virtual imagery of the augmented reality environment to show the virtual bullet travelling toward the user of the second communication device 102B. For instance, when users aim cameras of their mobile communication devices at an area where the virtual bullet virtually exists, the mobile communication devices may draw the virtual bullet so that the users can see the virtual bullet on displays of their mobile communication devices. - In an example embodiment, server(s) 108 is capable of modifying the augmented reality environment to include virtual environmental conditions. In accordance with this example embodiment, server(s) 108 may control such environmental conditions with respect to one or more of the users in response to receiving user-initiated environmental control commands. For example, server(s) 108 may introduce or eliminate a virtual environmental condition or reduce or increase the intensity of the virtual environmental condition with respect to a user upon receiving a user-initiated environmental control command regarding the environmental condition from a mobile communication device of the user. 
In another example, server(s) 108 may introduce or eliminate the virtual environmental condition or reduce or increase the intensity of the virtual environmental condition with respect to users other than the user who initiated the environmental control command upon receiving the environmental control command.
- In another example embodiment,
mobile communication devices 102A-102N are capable of modifying the augmented reality environment to include environmental conditions. For instance, server(s) 108 may store attributes of users that include environmental control capabilities. Each user's mobile communication device may store a copy of that user's attributes for permitting the user to utilize environmental control commands that are associated with the user's environmental control capabilities. The user may initiate an environmental control command using a button or touch screen of the user's mobile communication device, an audible command, or any other suitable technique. Upon initiation of the environmental control command, the user's mobile communication device may change virtual environmental parameters, such that virtual environmental conditions that are associated with the parameters are incorporated into the versions of the augmented reality environment that are displayed to the other users. For instance, the environmental condition parameters may affect virtual objects, virtual attributes (e.g., health, visibility, etc.) of the user and/or other players, virtual bullet speed and/or direction, a hit effect that is associated with a virtual bullet, firing accuracy, range of explosion, or any other virtual characteristic of the augmented reality combat. -
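One simple way such virtual environmental parameters might be represented is as multipliers applied to a bullet's nominal speed. The condition names and factors below are illustrative assumptions, not values from the specification:

```python
# Hypothetical sketch: virtual environmental conditions modeled as speed
# multipliers. Condition names and factor values are illustrative only.
SPEED_FACTORS = {"rain": 0.75, "snow": 0.5, "fog": 0.9, "clear": 1.0}

def effective_bullet_speed(nominal_mps, conditions):
    """Apply each active condition's factor to the nominal bullet speed."""
    speed = nominal_mps
    for condition in conditions:
        # Unknown conditions leave the speed unchanged.
        speed *= SPEED_FACTORS.get(condition, 1.0)
    return speed

print(effective_bullet_speed(150.0, ["rain"]))  # → 112.5
```

A user-initiated environmental control command would then amount to adding, removing, or rescaling entries in such a parameter set for the affected players' views.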
Device location system 106 is a processing system that is configured to provide location signals to mobile communication devices 102A-102N via respective links 112A-112N. Links 112A-112N may be wireless links, GPS links, or any other suitable type of links. For example, the location signals may specify the locations of the respective mobile communication devices 102A-102N. In another example, the location signals may include information that may be used by the mobile communication devices 102A-102N to determine their respective locations. -
Links 112A-112N are shown in FIG. 1 to be unidirectional for illustrative purposes and are not intended to be limiting. It will be recognized that links 112A-112N may be bidirectional. For example, device location system 106 may provide ping signals to mobile communication devices 102A-102N for determining locations of the respective mobile communication devices 102A-102N. In accordance with this example, device location system 106 may receive response signals from the respective mobile communication devices 102A-102N in response to the respective ping signals. Device location system 106 may determine a location of each mobile communication device 102A-102N based on the time that elapses between providing the respective ping signal and receiving the respective response signal. - In accordance with an example embodiment, each of the
mobile communication devices 102A-102N reports its location to server(s) 108 via network 104 and accesses server(s) 108 to determine the locations of the other mobile communication devices. A mobile communication device may compare its location to a location of another mobile communication device to calculate a distance therebetween. The locations of the respective mobile communication devices 102A-102N, as indicated by the location signals that are provided by device location system 106, may be estimated locations. Accordingly, the calculated distances between the mobile communication devices 102A-102N may be estimated distances. -
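The device-to-device distance calculation can be sketched with the haversine formula, a standard way to compute great-circle distance from two latitude/longitude fixes (the function name is an illustrative assumption; an Earth radius of about 6371 km is used):

```python
import math

# Hypothetical sketch: estimate the distance between two devices from their
# reported GPS fixes using the haversine formula.
def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) fixes, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

Because the underlying fixes are themselves estimates, the resulting distance is an estimate, consistent with the paragraph above.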
Device location system 106 may be capable of providing a positioning accuracy that is greater than the positioning accuracy that is allowed by government regulations and/or laws. For instance, the full positioning accuracy capabilities of device location system 106 may be reserved for military applications. If restrictions regarding the positioning accuracy of device location system 106 are not imposed, and/or GPS (or another positioning technique) allows for accurate positioning of approximately one meter or less, device location system 106 may provide substantially greater positioning accuracy. For example, the location of each user may be determined using only the capabilities of device location system 106, without using image recognition techniques. -
FIG. 2 is a block diagram of another example augmented reality combat system 200 in accordance with an embodiment described herein. Augmented reality combat system 200 is similar to the augmented reality combat system 100 shown in FIG. 1, except that augmented reality combat system 200 does not include network 104 or server(s) 108, and mobile communication devices 202A-202N are processing systems that are capable of communicating with each other. Thus, communications between mobile communication devices 202A-202N are provided wirelessly using well-known communication protocols that do not require a server. For instance, first mobile communication device 202A and second mobile communication device 202B communicate via wireless link 204; second mobile communication device 202B and nth mobile communication device 202N communicate via wireless link 206; nth mobile communication device 202N and first mobile communication device 202A communicate via wireless link 208, and so on. - Accordingly, in augmented
reality combat system 200, a mobile communication device that performs an action with respect to the augmented reality environment may provide an indicator that specifies the action to each of the other mobile communication devices. Each mobile communication device 202A-202N may provide information regarding its location to the other communication devices in response to that mobile communication device being moved and/or periodically in accordance with a designated schedule. The mobile communication devices that receive such indicators and/or information may update their respective displays of the augmented reality environment based on the indicators and/or the information. - In accordance with some example embodiments, augmented
reality combat system 200 includes a network and a server. For instance, attributes that are associated with the players may be stored on (or otherwise accessible to) the server. Examples of attributes include but are not limited to identities of the players, network addresses of the mobile communication devices of the players, etc. Each mobile communication device may access the attributes that are stored on (or otherwise accessible to) the server via the network. In a first example implementation of augmented reality combat system 200, a connection is established directly between mobile communication devices for communication therebetween. In a second example implementation, each mobile communication device communicates with the server, and the server is responsible for transferring communications from the originating mobile communication devices to the recipient mobile communication devices. - Augmented
reality combat systems -
FIG. 3 depicts a flowchart 300 of a method for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with an embodiment described herein. Flowchart 300 is described from the perspective of a mobile communication device. Flowchart 300 may be performed by any of mobile communication devices 102A-102N of augmented reality combat system 100 shown in FIG. 1, for example. For illustrative purposes, flowchart 300 is described with respect to a mobile communication device 400 shown in FIG. 4, which is an example of a mobile communication device 102, according to an embodiment. - As shown in
FIG. 4, mobile communication device 400 includes a camera 402, an image recognition module 404, a command receipt module 406, an outgoing attack module 408, a location module 410, a location indicator receipt module 412, a distance determination module 414, a time determination module 416, an environment module 418, an environment control module 420, an identification module 422, a display module 424, and an orientation determination module 426. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 300. Flowchart 300 is described as follows. - As shown in
FIG. 3, the method of flowchart 300 begins at step 302. In step 302, an image of a physical real-world environment is captured. In an example implementation, camera 402 captures the image. For instance, display module 424 may render the image for viewing by a user of mobile communication device 400. The image may be augmented to include virtual objects, virtual environmental conditions, etc. before it is rendered, though the scope of the example embodiments is not limited in this respect. - At
step 304, a player indicator is identified in the image. The player indicator corresponds to a player in the physical real-world environment. For example, the player indicator may be a designated pattern that is provided on an article of the player's clothing or on another object that is associated with the player. In another example, the player indicator may be a visual tag that is associated with the player. The player indicator may be identified in substantially real-time as the image of the physical real-world environment is captured, though the scope of the example embodiments is not limited in this respect. In an example implementation, image recognition module 404 identifies the player indicator. - At
step 306, a user-initiated attack command is received in response to identifying the player indicator. For instance, the user of the mobile communication device may identify another player in the real world by aiming the device camera at the other player, so that the image recognition module identifies that player. The user then aims the center of the camera at the identified player and initiates the attack command by saying a word or phrase that is associated with the attack command, pressing a button on the mobile communication device that is associated with the attack command, touching a touch screen of the mobile communication device at the position on the screen where the targeted player is located in a manner that is associated with the attack command (e.g., moving the user's finger up, down, right, left, or diagonally on the touch screen; touching the touch screen in a designated location that is associated with the attack command; etc.), shaking the mobile device, or using any other suitable technique. In an example implementation, command receipt module 406 receives the user-initiated attack command. - It will be recognized that a user may initiate an attack command at any time, not only in response to identifying a player indicator. For example, a user may initiate an attack command to fire a virtual bullet at a virtual object. In another example, a user may use a virtual weapon (e.g., a cannon) that has a substantially wide hit area to fire at players without the need for identifying the players. In accordance with this example, the players may be fired upon even if the players are hiding (i.e., not in view of the user). For instance, location indicators that are associated with the players may be relied upon for determining the locations of those players. Accordingly, a user-initiated attack command may be received at any time. - At
step 308, an attack indicator is transmitted in response to receiving the user-initiated attack command. The attack indicator specifies that a virtual bullet is fired at the player. For instance, the attack indicator may be wirelessly transmitted in accordance with a mobile communication protocol. In an example implementation, outgoing attack module 408 transmits the attack indicator to the central server or directly to the targeted player's device. - At
step 310, a location of the mobile communication device is determined. In an example implementation, location module 410 determines the location of the mobile communication device. - In accordance with an example embodiment, the location of the mobile communication device is determined based on location signals (e.g., global positioning system (GPS) signals, wireless signals received from a base station, etc.). For instance, each location signal may include a location indicator that specifies a location of its source and a time indicator that specifies a time at which the location signal was transmitted by its source.
Location module 410 may combine the time indicators that are included in the respective location signals and the times at which the mobile communication device received the respective location signals to determine transmit times of the respective location signals. A transmit time is a duration of time for a location signal to travel from its source to the mobile communication device. Location module 410 may determine distances between the mobile communication device and the sources of the respective location signals based on the transmit times of the respective location signals. -
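The transmit-time-to-distance step attributed to location module 410 can be sketched as a time-of-flight computation. Radio location signals propagate at roughly the speed of light, and the calculation assumes tightly synchronized clocks between source and receiver, as in GPS; the function name is an illustrative assumption:

```python
# Hypothetical sketch: convert a location signal's transmit time (time of
# flight) into a source-to-device distance.
SPEED_OF_LIGHT_MPS = 299_792_458.0

def distance_from_transmit_time(sent_s, received_s):
    """Distance in meters implied by a signal's time of flight.

    sent_s: time the source transmitted the signal (from the time indicator).
    received_s: time the device received the signal.
    """
    time_of_flight = received_s - sent_s
    if time_of_flight < 0:
        raise ValueError("signal cannot be received before it is sent")
    return SPEED_OF_LIGHT_MPS * time_of_flight
```

One such distance per source yields the radii that the trilateration step described next would combine with the sources' known locations.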
Location module 410 may combine the distances between the mobile communication device and the sources of respective location signals with the locations of the respective sources to determine the location of the mobile communication device. For instance, location module 410 may use a trilateration technique to combine the distances with the sources' locations. Trilateration is a technique for determining intersections of the surfaces of three spheres based on the centers and radii of the spheres. In accordance with this example embodiment, the centers of the spheres correspond to the locations of the respective sources, and the radii correspond to the distances between the mobile communication device and the respective sources. - In accordance with another example embodiment, the location of the mobile communication device is determined based on a location indicator that specifies the location of the mobile communication device. For example, one or more servers (e.g., server(s) 108) may determine the transmit times of the respective location signals, determine the distances between the mobile communication device and the sources of the respective location signals, and combine the distances between the mobile communication device and the sources of the respective location signals with the locations of the respective sources to determine the location of the mobile communication device. In accordance with this example,
location module 410 receives a location indicator from the server(s) that specifies the location of the mobile communication device. Location module 410 interprets the location indicator to determine the location of the mobile communication device. - In another example, sources (e.g., base stations) may provide request signals to the mobile communication device. The mobile communication device may send response signals to the respective sources in response to the request signals. Each source may determine a distance between the mobile communication device and the source based on a duration of a time period between transmission of the respective request signal and receipt of the corresponding response signal. The wireless communication system may combine the distances between the mobile communication device and the respective sources with the locations of the respective sources to determine the location of the mobile communication device. The wireless communication system may then provide a location indicator that specifies the location of the mobile communication device to the mobile communication device, enabling the mobile communication device to determine its location based on the location indicator. - In accordance with another example embodiment, the location of the mobile communication device is determined based on the strengths of signals that the mobile communication device receives from respective sources. For instance, a trilateration technique may be used to determine the location of the mobile communication device based on the signal strengths. - At
step 312, a player location indicator that specifies a location of the player is received. For example, the player location indicator may be wirelessly received. For instance, the player location indicator may specify GPS coordinates that indicate the location of the player. In an example implementation, location indicator receipt module 412 receives the player location indicator. - For example, each mobile communication device may send a player location indicator that specifies the location of the player that corresponds to that mobile communication device to a central server, so that other mobile communication devices may access the indicators on the server. In another example, each mobile communication device may send a player location indicator that specifies the location of the player that corresponds to that mobile communication device to the other mobile communication devices without routing the indicators through a server. - At
step 314, a distance between the location of the mobile communication device and the location of the player is determined. In an example implementation, distance determination module 414 determines the distance between the location of the mobile communication device and the location of the player. - At
step 316, an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player is determined based on the determined distance. In an example implementation, time determination module 416 determines the estimated duration of the time period for the virtual bullet to travel from the mobile communication device to the player. - In accordance with an example embodiment, the estimated duration of the time period is determined based on a virtual environmental condition. Examples of a virtual environmental condition include but are not limited to wind, sunlight, moonlight, darkness, rain, snow, hail, fog, a sand storm, etc. For instance, some environmental conditions (e.g., wind speed, wind direction, rain, snow, etc.) may affect a flight time of the virtual bullet. Virtual environmental conditions are discussed in further detail below with reference to
environment module 418 and environment control module 420. - In accordance with another example embodiment, the estimated duration of the time period is determined based on a difference between an altitude of the mobile communication device and an altitude of the player indicator. For example,
distance determination module 414 may provide a vector representation of the distance between the location of the mobile communication device and the location of the player that specifies a horizontal distance and a vertical distance between the location of the mobile communication device and the location of the player. In accordance with this example, time determination module 416 may determine the estimated duration of the time period based on the vector representation of the distance. Further description of an example technique for determining a difference between an altitude of the mobile communication device and an altitude of the player indicator is provided below with reference to orientation determination module 426 and FIG. 14. -
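The vector representation described above can be sketched as a horizontal/vertical decomposition, with the slant range recovered from the two components. The function name and values are illustrative assumptions:

```python
import math

# Hypothetical sketch: represent the shooter-to-player distance as separate
# horizontal and vertical (altitude difference) components, plus the slant
# range computed from them.
def distance_vector(horizontal_m, shooter_alt_m, player_alt_m):
    """Return (horizontal, vertical, slant) distances in meters."""
    vertical = player_alt_m - shooter_alt_m
    slant = math.hypot(horizontal_m, vertical)  # sqrt(h^2 + v^2)
    return horizontal_m, vertical, slant

print(distance_vector(30.0, 2.0, 42.0))  # → (30.0, 40.0, 50.0)
```

A travel-time calculation could then use the slant range, or treat the two components differently (e.g., applying virtual gravity to the vertical component).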
- In some example embodiments, one or
more steps of flowchart 300 may not be performed. Moreover, steps in addition to or in lieu of steps -
Environment module 418 is configured to modify the image of the physical real-world environment to include one or more virtual environmental conditions. The image of the physical real-world environment may include virtual objects and/or other information in addition to the virtual environmental condition(s), though the scope of the example embodiments is not limited in this respect. For example, some virtual environmental conditions (e.g., sunlight or mitigation of rain, snow, fog, etc.) may enhance or change visibility with respect to physical and/or virtual objects in the image. Other virtual environmental conditions (e.g., darkness, rain, snow, fog, etc.) may inhibit visibility with respect to physical and/or virtual objects in the image. In another example, some virtual environmental conditions (e.g., mitigation of rain, snow, fog, etc.) may increase a speed of the virtual bullet. Other virtual environmental conditions (e.g., rain, snow, fog, etc.) may decrease the speed of the virtual bullet. In yet another example, some virtual environmental conditions (e.g., wind, sand storm, hail, etc.) may change a direction of the virtual bullet as it travels from the mobile communication device to the player. -
Environment control module 420 is configured to control virtual environmental conditions that are incorporated into the image of the physical real-world environment in response to user-initiated environmental control commands. For instance, a user may acquire a virtual power that enables the user to initiate commands for controlling one or more of the virtual environmental conditions. Environment control module 420 may mitigate or intensify an environmental condition with respect to the user's view of the augmented reality environment (or the views of the other players) upon receipt of a user-initiated environmental control command from the user. The environmental condition may be changed for a designated time period or indefinitely in response to the user's environmental control command. - For example, the user may request that virtual sunlight be provided with respect to the user's view of the augmented reality environment to enhance the user's visibility.
Environment control module 420 may provide the virtual sunlight with respect to the user's view of the augmented reality environment, but not with respect to the views of the other users. In another example, the user may request that the intensity of a virtual snowstorm in the augmented reality environment be mitigated (or that the virtual snowstorm be terminated) with respect to the user's view of the augmented reality environment. Environment control module 420 may mitigate (or terminate) the virtual snowstorm with respect to the user's view of the augmented reality environment, but not with respect to the views of the other users. - In yet another example, the user may request that virtual rain be provided with respect to the other players' views of the augmented reality environment to reduce visibility of the other players.
Environment control module 420 may provide the virtual rain with respect to the other players' views, but not with respect to the view of the user who requested the rain. In still another example, the user may request that virtual sunlight be removed from the other players' views of the augmented reality environment. Environment control module 420 may remove the virtual sunlight from the views of the other players, but not from the view of the user who initiated the request. - The environment control examples described above are provided for illustrative purposes and are not intended to be limiting. It will be recognized that
environment control module 420 may mitigate or intensify an environmental condition with respect to all players' views of the augmented reality environment in response to a user's environment control command. -
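The per-view behavior described above (one player's view mitigated while other views are unchanged, or all views changed at once) might be sketched as follows. The class and field names are assumptions for illustration only:

```python
# Minimal sketch of per-view environment control in the spirit of
# environment control module 420: shared condition intensities plus
# per-player overrides, so a command can affect one view or all views.
class EnvironmentState:
    def __init__(self, global_conditions):
        # Intensity of each condition, 0.0 (absent) to 1.0 (full),
        # shared by every player unless overridden.
        self.global_conditions = dict(global_conditions)
        # player_id -> {condition: intensity} overrides for that view only.
        self.overrides = {}

    def set_for_player(self, player_id, condition, intensity):
        """Mitigate or intensify a condition in one player's view only."""
        self.overrides.setdefault(player_id, {})[condition] = intensity

    def set_for_all(self, condition, intensity):
        """Change a condition in every player's view."""
        self.global_conditions[condition] = intensity
        for override in self.overrides.values():
            override.pop(condition, None)  # drop stale per-view values

    def view(self, player_id):
        """Conditions as seen in one player's view of the environment."""
        merged = dict(self.global_conditions)
        merged.update(self.overrides.get(player_id, {}))
        return merged
```

For instance, mitigating a snowstorm only for the requesting user would be `set_for_player(user, "snow", 0.2)`, while a command affecting all players' views would use `set_for_all`.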
Identification module 422 is configured to modify the image to include identification information regarding a player whose player indicator is identified in the image. Examples of identification information include but are not limited to a player's name, photograph, team affiliation, rank, experience level, virtual attack success rate, Twitter® account address, instant message address, score in the game, virtual shield type, virtual shield strength, health condition, virtual weapons available to the player, and/or virtual weapon currently being used by the player, etc. Identification module 422 may modify the image to selectively include designated identification information regarding players in accordance with instructions received from the user of mobile communication device 400. For instance, identification module 422 may modify the image to include all of the identification information regarding the players when mobile communication device 400 is pointed at the players (e.g., when image recognition module 404 recognizes a player and/or when camera 402 is pointed toward an area having coordinates that correspond to a location of the player). -
Display module 424 is configured to render images that are captured by camera 402 and/or modified by environment module 418 and/or identification module 422. -
Orientation determination module 426 is configured to determine an orientation (e.g., tilt) of mobile communication device 400. For instance, if camera 402 is pointed at a player, orientation determination module 426 is capable of determining a difference between an altitude of mobile communication device 400 and an altitude of the player. For example, orientation determination module 426 may include an accelerometer for determining the orientation of mobile communication device 400. In accordance with an embodiment, distance determination module 414 determines the distance between the location of mobile communication device 400 and the location of the player based on the altitude difference that is determined by orientation determination module 426. -
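An altitude-aware distance computation of this kind, anticipating the FIG. 14 geometry (altitude difference A, two-dimensional distance B, three-dimensional distance C), might be sketched as follows. The function names are illustrative assumptions:

```python
import math

# Sketch of the FIG. 14 geometry: A is the altitude difference, B the
# two-dimensional (e.g., GPS) distance, C the three-dimensional distance,
# and alpha the angle between lines A and C.
def distance_3d_from_angle(b, alpha_rad):
    """C = B / sin(alpha): recover the 3-D distance from the 2-D distance
    and the tilt-derived angle between lines A and C."""
    return b / math.sin(alpha_rad)

def distance_3d_from_altitude(a, b):
    """Equivalent Pythagorean form, C = sqrt(A^2 + B^2), using the
    altitude difference directly."""
    return math.hypot(a, b)
```

Both forms agree: for A = 3 and B = 4, the angle between A and C satisfies tan(α) = B/A, and each function yields C = 5.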
FIG. 14 illustrates a technique for determining a difference (labeled as "A") between an altitude of a mobile communication device and an altitude of a player in accordance with an embodiment described herein. As shown in FIG. 14, a mobile communication device 1402 is pointed at a player 1404. For instance, a camera of mobile communication device 1402 may be pointed at player 1404. Element 1406 represents a two-dimensional (e.g., GPS) location of mobile communication device 1402. Element 1408 represents a two-dimensional (e.g., GPS) location of player 1404. Accordingly, the distance "B" represents a two-dimensional distance between mobile communication device 1402 and player 1404. -
Element 1410 represents a three-dimensional location of mobile communication device 1402. Accordingly, the distance "C" represents a three-dimensional distance between mobile communication device 1402 and player 1404. The altitude of mobile communication device 1402 is shown with reference to the altitude of player 1404 for ease of discussion. It will be recognized that mobile communication device 1402 and player 1404 may have any respective altitudes. The three-dimensional distance "C" between mobile communication device 1402 and player 1404 may be determined in accordance with the following equation: -
C = B/sin(α) - where B is the two-dimensional distance between
mobile communication device 1402 and player 1404, and α is the angle between lines A and C. - It will be recognized that
mobile communication device 400 of FIG. 4 may not include one or more of camera 402, image recognition module 404, command receipt module 406, outgoing attack module 408, location module 410, location indicator receipt module 412, distance determination module 414, time determination module 416, environment module 418, environment control module 420, identification module 422, display module 424, and/or orientation determination module 426. Furthermore, mobile communication device 400 may include modules in addition to or in lieu of camera 402, image recognition module 404, command receipt module 406, outgoing attack module 408, location module 410, location indicator receipt module 412, distance determination module 414, time determination module 416, environment module 418, environment control module 420, identification module 422, display module 424, and/or orientation determination module 426. - The functionality of
environment module 418, environment control module 420, and identification module 422 is described in further detail below with reference to FIGS. 5-8. FIGS. 5-8 show mobile communication devices 500, 600, 700, and 800, respectively. As shown in FIG. 5, mobile communication device 500 includes a display 502 that displays an image of a physical real-world environment. The image shows a player 504 who has a player indicator 506 affixed to his shirt. It will be recognized that player indicator 506 may be associated with player 504 in any suitable manner and need not necessarily be affixed to the player's person. Mobile communication device 500 is configured to identify player indicator 506. - Upon identifying
player indicator 506, mobile communication device 500 may provide a sensory signal to a user of mobile communication device 500 to indicate that player indicator 506 has been identified, though the scope of the example embodiments is not limited in this respect. A sensory signal is a signal that is perceptible by a human. For instance, the sensory signal may be an audio signal having a frequency in the audible spectrum (e.g., in a range between 20 hertz (Hz) and 20 kilohertz (kHz)), a visual signal having a frequency in the visible spectrum (e.g., in a range between 400 terahertz (THz) and 790 THz), a tactile signal, or any other signal that is human-perceptible. A tactile signal is a signal that a human is capable of perceiving using the sense of touch. For example, a tactile signal may be provided using a vibration mechanism of mobile communication device 500. - As shown in
FIG. 6, mobile communication device 600 displays an image of an augmented reality environment that includes the physical real-world environment as shown in FIG. 5 with the addition of a virtual environmental condition. The virtual environmental condition in this example is rain 602. It will be recognized that rain 602 may reduce the visibility of a user of mobile communication device 600. -
FIG. 7 illustrates that a user may control a virtual environmental condition with respect to a view of the augmented reality environment that is displayed to the user. As shown in FIG. 7, mobile communication device 700 displays the augmented reality environment as shown in FIG. 6, except that rain 702 in FIG. 7 is a mitigated version of rain 602 that is shown in FIG. 6. FIG. 7 illustrates that a user of mobile communication device 700 moves her finger 706 downward on a touch screen of mobile communication device 700, as depicted by arrow 704. The downward motion is interpreted by mobile communication device 700 to be an environmental control command, in response to which mobile communication device 700 mitigates the intensity of virtual rain 602 to provide rain 702. It will be recognized that mitigation of the environmental condition in this example increases visibility with respect to the augmented reality environment. - As shown in
FIG. 8, mobile communication device 800 displays an image of an augmented reality environment that includes the physical real-world environment as shown in FIG. 5 with the addition of identification information 802 regarding player 504. Identification information 802 is shown to include a name of player 504 and a team affiliation of player 504 for illustrative purposes and is not intended to be limiting. Identification information 802 may include any suitable information regarding player 504. -
FIG. 9 depicts a flowchart 900 of a method for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with an embodiment described herein. For illustrative purposes, flowchart 900 is described with respect to a mobile communication device 1000 shown in FIG. 10, which is an example of a mobile communication device 102, according to an embodiment. - As shown in
FIG. 10, mobile communication device 1000 includes an incoming attack module 1002, a sensory signal module 1004, an authorization determination module 1006, a command determination module 1008, a speed control module 1010, and a stationary determination module 1012. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 900. Flowchart 900 is described as follows. - As shown in
FIG. 9, the method of flowchart 900 begins at step 902. In step 902, an incoming attack indicator that specifies that a virtual bullet is directed at a user of a mobile communication device is received. For instance, the incoming attack indicator may be wirelessly received. In an example implementation, incoming attack module 1002 receives the incoming attack indicator. - At
step 904, a sensory signal is provided to the user in response to receiving the incoming attack indicator. For instance, the sensory signal may be an audio signal having a frequency in the audible spectrum (e.g., in a range between 20 hertz (Hz) and 20 kilohertz (kHz)), a visual signal having a frequency in the visible spectrum (e.g., in a range between 400 terahertz (THz) and 790 THz), a tactile signal, or any other signal that is human-perceptible. In an example implementation, sensory signal module 1004 provides the sensory signal. - At
step 906, a determination is made whether the user is authorized to provide a speed control command for controlling a speed of the virtual bullet. The determination may be based on attributes of the user, attributes of a player who fired the virtual bullet at the user, and/or attributes of the game. In an example implementation, authorization determination module 1006 determines whether the user is authorized to provide a speed control command. For example, the determination may be based on whether the user has acquired a power that authorizes the user to provide a speed control command. In accordance with this example, if the user has acquired the power, authorization determination module 1006 determines that the user is authorized to provide a speed control command. In further accordance with this example, if the user has not acquired the power, authorization determination module 1006 determines that the user is not authorized to provide a speed control command. If the user is authorized to provide a speed control command, flow continues to step 908. Otherwise, flowchart 900 ends. - At
step 908, a determination is made whether a user-initiated speed control command is received. In an example implementation, command determination module 1008 determines whether a user-initiated speed control command is received. If a user-initiated speed control command is received, flow continues to step 910. Otherwise, flow continues to step 912. - In accordance with an example embodiment, players are capable of having a power-blocking power that blocks another player's ability to utilize a power. For instance, a player who has a power-blocking power may block use of the speed control command described in
step 908, thereby preventing the user-initiated speed control command from being received. - At
step 910, the speed of the virtual bullet is controlled in response to the user-initiated speed control command. For example, the speed of the virtual bullet may be reduced in response to the user-initiated speed control command. In accordance with this example, the virtual bullet's reduced speed may provide the user more time to react to the virtual bullet. For instance, the user may view the virtual bullet on a display of the mobile communication device and take action in the physical real-world environment to avoid being hit by the virtual bullet in the augmented reality environment. In an example implementation, speed control module 1010 controls the speed of the virtual bullet. Upon completion of step 910, flowchart 900 ends. - The user may utilize any of a variety of powers in an attempt to avoid being hit by the virtual bullet and/or mitigate an effect of being hit by the virtual bullet. For example, the user may increase the strength of the user's virtual shield for a specified duration or indefinitely, increase a speed with which the user is capable of moving for a specified duration or indefinitely, etc.
- At
step 912, a determination is made whether the virtual bullet is stationary. For instance, if the virtual bullet is no longer in motion, it may be non-productive to continue to determine whether a user-initiated speed control command is received. In an example implementation, stationary determination module 1012 determines whether the virtual bullet is stationary. If the virtual bullet is stationary, flowchart 900 ends. Otherwise, flow returns to step 908. - In some example embodiments, one or
more steps of flowchart 900 (e.g., steps 902, 904, 906, 908, 910, and/or 912) may not be performed. Moreover, steps in addition to or in lieu of steps 902, 904, 906, 908, 910, and/or 912 may be performed. - It will be recognized that
mobile communication device 1000 may not include one or more of incoming attack module 1002, sensory signal module 1004, authorization determination module 1006, command determination module 1008, speed control module 1010, and/or stationary determination module 1012. Furthermore, mobile communication device 1000 may include modules in addition to or in lieu of incoming attack module 1002, sensory signal module 1004, authorization determination module 1006, command determination module 1008, speed control module 1010, and/or stationary determination module 1012. -
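The flowchart-900 flow described above (alert the user at step 904, check authorization at step 906, then poll for a speed control command until the virtual bullet is stationary) might be sketched as follows. The bullet and user objects, their fields, and the callback are assumptions for illustration only:

```python
# Hedged sketch of the flowchart-900 control flow for an incoming
# virtual bullet. get_speed_command() returns a scale factor (e.g., 0.5
# to slow the bullet) or None if no user-initiated command is pending.
def handle_incoming_attack(bullet, user, get_speed_command):
    notify = ["sensory_signal"]          # step 904: alert the user
    if "speed_control" not in user["powers"]:
        return notify                    # step 906: not authorized
    while bullet["speed"] > 0:           # step 912: until stationary
        command = get_speed_command()    # step 908: poll for a command
        if command is not None:
            bullet["speed"] *= command   # step 910: control the speed
            notify.append("speed_changed")
            break
    return notify
```

A real implementation would also advance the bullet's state each iteration so the stationary check at step 912 can eventually end the loop; the sketch only shows the branch structure.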
FIG. 11 depicts a flowchart 1100 of a method for performing multi-player augmented reality combat with respect to a user of a mobile communication device in accordance with an embodiment described herein. For illustrative purposes, flowchart 1100 is described with respect to a mobile communication device 1200 shown in FIG. 12, which is an example of a mobile communication device 102, according to an embodiment. - As shown in
FIG. 12, mobile communication device 1200 includes an injury determination module 1202, a distance determination module 1204, and a recovery control module 1206. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 1100. Flowchart 1100 is described as follows. - As shown in
FIG. 11, the method of flowchart 1100 begins at step 1102. In step 1102, a determination is made that a user has incurred a virtual injury. In an example implementation, injury determination module 1202 determines that the user has incurred the virtual injury. - At
step 1104, a distance between a location of a mobile communication device of the user and a designated location is determined. In an example implementation, distance determination module 1204 determines the distance between the location of the mobile communication device of the user and the designated location. - At
step 1106, a rate at which the user recovers from the virtual injury is controlled based on the determined distance. For example, if the designated location is a location of a physical or virtual hospital, the rate at which the user recovers from the virtual injury may be inversely proportional to the distance between the location of the mobile communication device and the hospital. For instance, the user may recover more quickly if the user is closer to the hospital. The user may recover more slowly if the user is farther from the hospital. In another example, if the designated location is a location of a toxic dump site, the rate at which the user recovers from the virtual injury may be directly proportional to the distance between the location of the mobile communication device and the toxic dump site. For instance, the user may recover more quickly if the user is farther from the toxic dump site. The user may recover more slowly if the user is closer to the toxic dump site. In an example implementation,recovery control module 1206 controls the rate at which the user recovers from the virtual injury. - When a player is hit with a virtual bullet, that player's view of the augmented reality environment may be changed to simulate the hit. Each hit may be graded according to physical and/or virtual factors, including but not limited to the type of virtual bullet, the type of weapon that fired the virtual bullet, the distance traveled by the virtual bullet before it hit the player, collected tools that each player is using, etc. A higher hit grade corresponds to a higher extent of injury, and a lower hit grade corresponds to a lower extent of injury. For instance, each player may use a virtual shield that offers protection from some weapons and/or bullets. Each shield may be more (or less) effective against weapons and/or bullets in designated virtual environmental conditions.
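A hit grade of the kind discussed in this section, combining weapon type, distance traveled by the virtual bullet, and the target's virtual shield, might be computed as follows. The base grades, the falloff formula, and the shield model are illustrative assumptions, not values taken from the specification:

```python
# Minimal hit-grading sketch: higher grade = greater extent of virtual
# injury. Weapon base values and weights are hypothetical.
BASE_GRADE = {"sniper_rifle": 8.0, "shotgun": 5.0, "cannon": 9.0}

def hit_grade(weapon, distance_m, shield_strength):
    """Grade a hit from the firing weapon, the distance the virtual
    bullet traveled, and the target's shield strength in [0, 1]."""
    base = BASE_GRADE.get(weapon, 3.0)
    # Damage falls off with the distance the virtual bullet traveled.
    falloff = 1.0 / (1.0 + distance_m / 100.0)
    # The virtual shield absorbs a fraction of the hit.
    return base * falloff * (1.0 - shield_strength)
```

In this sketch a point-blank shotgun hit with no shield grades 5.0, while a full-strength shield absorbs the hit entirely; per-condition shield effectiveness could be layered on top.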
- A hit grade may have a designated effect on a virtual condition of a player. For instance, the player's view of the augmented reality environment may be changed such that aiming a virtual weapon is more difficult for the player. Other ways in which the player's view may be changed include but are not limited to showing fog, showing blood on the display, covering the player's view of the augmented reality environment (or a portion thereof), causing the player's view of the augmented reality environment to be unstable, out of focus, zoomed out, zoomed in, etc.
- In accordance with an example embodiment, a player may recover from a virtual injury as time passes or by collecting and/or using virtual curing objects. As the player recovers, the player's hit grade decreases. Each curing object, each virtual shield, and each unit of time (e.g., second, minute, etc.) may have a respective designated curing effect. In accordance with another example embodiment, some players may have virtual healing tools that they may use to heal other players. For example, a player who possesses a healing tool may stand near a virtually wounded player in the physical real-world environment to assist the recovery of the wounded player in the augmented reality environment. In another example, a player who possesses a virtual healing weapon may fire a virtual healing bullet at a wounded player to assist the recovery of the wounded player. When a player's hit grade reaches an upper threshold, the player may be considered as virtually dead. Any suitable technique may be used to revive the player from the deceased state.
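The distance-dependent recovery of step 1106 can be illustrated with a small sketch. The constant k and the two proportionality formulas are assumptions for illustration; the specification only states that the rate is inversely proportional to the distance for a hospital and directly proportional for a toxic dump site:

```python
# Sketch of a distance-based recovery rate: health units recovered per
# second, given the distance to the designated location. Values are
# hypothetical.
def recovery_rate(distance_m, location_kind, k=100.0):
    if location_kind == "hospital":
        # Closer to the hospital -> faster recovery (inverse proportion).
        return k / max(distance_m, 1.0)
    if location_kind == "toxic_dump":
        # Farther from the dump site -> faster recovery (direct proportion).
        return k * distance_m / 1000.0
    return 0.0  # other locations do not affect recovery in this sketch
```

Curing objects or healing tools, as described above, could simply add their designated curing effect to the rate returned here.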
- In accordance with another example embodiment, each type of virtual weapon and/or virtual bullet may have respective characteristics. One type of characteristic is a hit effect. A hit effect is an effect that results when a virtual bullet that is fired by a virtual weapon hits an object. For example, a virtual shotgun may have a wider diameter of hit, but a lower range, as compared to a virtual sniper rifle. A virtual cannon may have a substantially wide area of hit. For instance, using a virtual cannon, a group of players may be targeted based solely on player location indicator(s) associated with the players, without the need for identifying the players using an image recognition technique. Targeting players in this manner may be useful if the players are hiding in such a way that a camera cannot be used to capture an image of the players. In another example, a virtual sniper rifle may be used to fire a relatively fast and accurate virtual bullet that is less affected by virtual environmental conditions and/or user-initiated speed control commands.
- Any suitable type of virtual weapons, virtual bullets, and/or virtual tools having user-defined characteristics may be created in the augmented reality environment. Such creations may be acquired by players and used in the augmented reality environment to affect targeting, shooting, and hitting simulation in the augmented reality environment.
- Virtual items may be located throughout the augmented reality environment. The virtual items may represent points, virtual weapons, virtual tools, and/or any other suitable virtual items. Players may collect a virtual item in any of a variety of ways. For example, a player may collect a virtual item by moving in the physical real-world environment such that the player moves closer to a virtual location of the virtual item in the augmented reality environment. Each virtual item has a location in the real world, and when a device location system (e.g., device location system 106) determines that the user is close enough to the virtual item's location, the user collects the virtual item. In another example, the player may fire a virtual bullet that hits the virtual item. In yet another example, the player may collect a virtual item by trading points for the virtual item. In still another example, the player may collect a virtual item by purchasing the virtual item using virtual money, real money, or a combination thereof.
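Proximity-based collection of the kind described above might be sketched as follows. The pickup radius, the flat-plane distance approximation, and all names are assumptions for illustration (a real device location system would work with geodetic coordinates):

```python
import math

# Sketch of proximity-based item collection: when the device location
# system reports the player within an item's pickup radius, the item is
# collected and removed from the world.
def try_collect(player_xy, items, pickup_radius_m=5.0):
    """Remove and return the items within pickup range of the player."""
    collected = []
    for item in list(items):  # iterate over a copy so removal is safe
        dx = item["x"] - player_xy[0]
        dy = item["y"] - player_xy[1]
        if math.hypot(dx, dy) <= pickup_radius_m:
            items.remove(item)
            collected.append(item)
    return collected
```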
- Players may acquire points by completing tasks; firing virtual bullets that hit other players; collecting virtual points; purchasing the points using virtual money, real money, or a combination thereof; etc. The points may be traded for virtual items, virtual money, real money, or a combination thereof.
- Players may view a map showing some or all of the locations of the other players in the augmented reality environment. For instance, players may see the locations of their team members. Players may leave virtual markers, notes, voice messages, recordings, and/or virtual items on the map for other players to pick up. In another example, players can leave traps for other players, such as virtual mines, virtual grenades, etc., that may be triggered when those players come within a designated proximity of the traps. Players may communicate using audio and/or video conferencing. A variety of other features that are known in the relevant art(s) (e.g., the computer gaming art) may be incorporated into the multi-player augmented reality combat techniques described herein.
- In accordance with another example embodiment, the direction of a virtual bullet may be controlled by a user who fires the virtual bullet. For example, after firing the bullet, the user may move the camera of the user's mobile communication device to cause the virtual bullet to shift direction toward the new camera direction. The ability to control the bullet may be a property of the bullet, the virtual weapon that is used to fire the virtual bullet, and/or a power that is associated with a virtual item that is collected by the user. Controlling the virtual bullet direction is similar to controlling a guided missile that can be used to target players as the players move (e.g., to get away from the virtual bullet) or to target players who are hiding behind shelters.
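The guided-bullet behavior described above, where the bullet shifts toward the new camera direction, might be sketched with a simple per-update steering function. The maximum turn rate is an illustrative assumption (the specification does not specify how quickly the bullet turns):

```python
# Hedged sketch of guided-bullet steering: each update, the bullet's
# heading turns toward the camera's current heading, limited by a
# hypothetical maximum turn rate per update.
def steer_bullet(bullet_heading_deg, camera_heading_deg, max_turn_deg=10.0):
    """Return the bullet's new heading after one update step."""
    # Signed shortest angular difference, in (-180, 180].
    diff = (camera_heading_deg - bullet_heading_deg + 180.0) % 360.0 - 180.0
    turn = max(-max_turn_deg, min(max_turn_deg, diff))
    return (bullet_heading_deg + turn) % 360.0
```

Calling this once per frame with the device's current camera heading yields the guided-missile-like behavior described above: the bullet gradually homes toward wherever the user points the camera.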
- In accordance with another example embodiment, a user may initiate virtual mobile controlled objects that may be controlled by the user. Examples of virtual mobile controlled objects include but are not limited to virtual aircraft (e.g., helicopters, airplanes, gliders, steerable balloons, etc.), virtual vessels (e.g., ships, submarines, etc.), virtual land vehicles, etc. Each virtual mobile controlled object may have a corresponding speed, a duration of availability, and a pre-defined source (i.e., virtual base). The user may control the navigation of the virtual mobile controlled objects. A virtual mobile controlled object may detect location indicators of other players within a designated range, according to the location of each user's mobile communication device and the type of the virtual mobile controlled object.
- For example, a virtual airplane may carry virtual bombs that the user can release on top of other players. The other players' mobile communication devices may have object indicators that are capable of indicating that a virtual mobile controlled object is nearby. For instance, the object indicators may provide a sound, cause the virtual mobile controlled object to be rendered on a screen of a mobile communication device when a camera of the mobile communication device is pointed at the virtual mobile controlled object, etc. The players may fire virtual bullets at the virtual mobile controlled object in order to hit it. Other types of virtual mobile controlled objects may be virtual soldiers or any other type of virtual mobile controlled object that a user may control. If a virtual mobile controlled object does not return to its virtual base within a specified time, the virtual mobile controlled object may be disabled. It is possible for a user to collect virtual mobile controlled objects like any other virtual item described herein.
- In accordance with yet another example embodiment, players may trigger virtual weapons, such as artillery, remotely. Each such virtual weapon has a specified range from its source position. A user may define the source position of a remotely triggered virtual weapon with respect to the user's location. When the user fires the remotely triggered virtual weapon, the mobile communication devices of the players who are in the area of the expected hit may generate a sensory signal, such as a sound, vibration of the players' devices, etc. as described in previous examples.
- In still another example embodiment, an augmented reality combat system may include a station controlled by a communication device that is connected to the network. The station may act as a command center that displays the positions of the players on a map, statuses of the players, etc. The command center may use a relatively large screen and/or a relatively more powerful computer system that can consume and process substantial data. A user who operates the command center can help the other players perform and act as a team. The command center may collect information about the positions of opponents whose locations are being provided by their mobile communication devices or by virtual mobile controlled objects. The command center may display prior locations of the players, so that the user who operates the command center may have a better understanding of the movement patterns of the opponent players.
- In another example embodiment, players may be capable of using camera zoom capabilities to facilitate identification of player indicators. The ability to use the camera zoom capabilities may be based on possession of specified attributes, use of specified weapons, or any other suitable criteria. For example, although a virtual cannon may not use zoom, a virtual sniper rifle may.
- Each mobile communication device may be mounted on or incorporated in a physical device that is shaped like a weapon. Displays of the mobile communication devices may be presented using glasses. For instance, cameras of the mobile communication devices may be attached to the glasses, so that the players may aim by looking in a direction. Any of the devices and/or components thereof may be carried on a player's person or in accessories that are available to the player.
- The embodiments described herein, including systems, methods/processes, and/or apparatuses, may be implemented using well known computers, such as
computer 1300 shown in FIG. 13. For example, elements of the example augmented reality combat systems depicted in FIG. 1, device location system 106 depicted in FIGS. 1 and 2, any of the mobile communication devices 102A-102N depicted in FIGS. 1 and 2, any of the mobile communication devices depicted in FIGS. 4, 5, 6, 7, 8, 10, and 12 and elements thereof, and each of the steps of the flowcharts depicted in FIGS. 3, 9, and 11 can be implemented using one or more computers 1300. -
Computer 1300 can be any commercially available and well known computer capable of performing the functions described herein, such as computers available from Apple, Dell, Gateway, HP, International Business Machines, Sony, etc. Computer 1300 may be any type of computer, including a desktop computer, a server, etc. - As shown in
FIG. 13, computer 1300 includes one or more processors (e.g., central processing units (CPUs)), such as processor 1306. Processor 1306 may include camera 402, image recognition module 404, command receipt module 406, outgoing attack module 408, location module 410, location indicator receipt module 412, distance determination module 414, time determination module 416, environment module 418, environment control module 420, identification module 422, display module 424, and/or orientation determination module 426 of FIG. 4; incoming attack module 1002, sensory signal module 1004, authorization determination module 1006, command determination module 1008, speed control module 1010, and/or stationary determination module 1012 of FIG. 10; injury determination module 1202, distance determination module 1204, and/or recovery control module 1206 of FIG. 12; or any portion or combination thereof, for example, though the scope of the embodiments is not limited in this respect. Processor 1306 is connected to a communication infrastructure 1302, such as a communication bus. In some embodiments, processor 1306 can simultaneously operate multiple computing threads. -
Computer 1300 also includes a primary or main memory 1308, such as a random access memory (RAM). Main memory 1308 has stored therein control logic 1324A (computer software) and data. -
Computer 1300 also includes one or more secondary storage devices 1310. Secondary storage devices 1310 include, for example, a hard disk drive 1312 and/or a removable storage device or drive 1314, as well as other types of storage devices, such as memory cards and memory sticks. For instance, computer 1300 may include an industry standard interface, such as a universal serial bus (USB) interface, for interfacing with devices such as a memory stick. Removable storage drive 1314 represents a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup, etc. -
Removable storage drive 1314 interacts with a removable storage unit 1316. Removable storage unit 1316 includes a computer useable or readable storage medium 1318 having stored therein computer software 1324B (control logic) and/or data. Removable storage unit 1316 represents a floppy disk, magnetic tape, compact disc (CD), digital versatile disc (DVD), Blu-ray disc, optical storage disk, memory stick, memory card, or any other computer data storage device. Removable storage drive 1314 reads from and/or writes to removable storage unit 1316 in a well known manner. -
Computer 1300 also includes input/output/display devices 1304, such as monitors, keyboards, pointing devices, etc. -
Computer 1300 further includes a communication or network interface 1320. Communication interface 1320 enables computer 1300 to communicate with remote devices. For example, communication interface 1320 allows computer 1300 to communicate over communication networks or mediums 1322 (representing a form of a computer useable or readable medium), such as local area networks (LANs), wide area networks (WANs), the Internet, etc. Network interface 1320 may interface with remote sites or networks via wired or wireless connections. Examples of communication interface 1320 include but are not limited to a modem, a network interface card (e.g., an Ethernet card), a communication port, a Personal Computer Memory Card International Association (PCMCIA) card, etc. -
Control logic 1324C may be transmitted to and from computer 1300 via the communication medium 1322. - Any apparatus or manufacture comprising a computer useable or readable medium having control logic (software) stored therein is referred to herein as a computer program product or program storage device. This includes, but is not limited to,
computer 1300, main memory 1308, secondary storage devices 1310, and removable storage unit 1316. Such computer program products, having control logic stored therein that, when executed by one or more data processing devices, cause such data processing devices to operate as described herein, represent embodiments of the invention. - For example, each of the elements of example
mobile communication device 400 depicted in FIG. 4 , including camera 402, image recognition module 404, command receipt module 406, outgoing attack module 408, location module 410, location indicator receipt module 412, distance determination module 414, time determination module 416, environment module 418, environment control module 420, identification module 422, display module 424, and orientation determination module 426; each of the elements of example mobile communication device 1000 depicted in FIG. 10 , including incoming attack module 1002, sensory signal module 1004, authorization determination module 1006, command determination module 1008, speed control module 1010, and stationary determination module 1012; each of the elements of example mobile communication device 1200 depicted in FIG. 12 , including injury determination module 1202, distance determination module 1204, and recovery control module 1206; and each of the steps of the flowcharts of FIGS. 3 , 9, and 11 can be implemented as control logic that may be stored on a computer useable medium or computer readable medium, which can be executed by one or more processors to operate as described herein. - The invention can be put into practice using software, hardware, and/or operating system implementations other than those described herein. Any software, hardware, and operating system implementations suitable for performing the functions described herein can be used.
- While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art(s) that various changes in form and details can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Claims (23)
1. A mobile communication device comprising:
a camera configured to capture an image;
an image recognition module configured to identify a player indicator in the image, the player indicator corresponding to a player; and
an outgoing attack module configured to transmit an outgoing attack indicator in response to a user-initiated attack command that is received in response to identification of the player indicator, the outgoing attack indicator specifying that a virtual bullet is fired at the player.
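The patent discloses no source code; purely for illustration, the capture/identify/transmit sequence of claim 1 might be sketched as follows. All names (`AttackClient`, `PlayerIndicator`, the marker strings) are hypothetical, and a simple list of detected markers stands in for real image recognition:

```python
# Hypothetical sketch of claim 1's capture/identify/transmit sequence.
# Not from the disclosure; names and message format are illustrative only.
from dataclasses import dataclass

@dataclass
class PlayerIndicator:
    player_id: str  # the player this indicator corresponds to

class AttackClient:
    def __init__(self, known_markers):
        # map from a recognizable marker (e.g., a tag on a shirt) to a player
        self.known_markers = known_markers
        self.sent = []  # transmitted outgoing attack indicators

    def identify(self, image_markers):
        """Image recognition step: find the first known player indicator."""
        for marker in image_markers:
            if marker in self.known_markers:
                return PlayerIndicator(self.known_markers[marker])
        return None

    def attack(self, indicator):
        """Transmit an outgoing attack indicator for a user-initiated command."""
        msg = {"type": "outgoing_attack", "virtual_bullet": True,
               "target": indicator.player_id}
        self.sent.append(msg)  # stands in for a network transmit
        return msg
```

A round trip would be: identify a marker in the captured frame, then, on the user's attack command, emit the indicator naming that player.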
2. The mobile communication device of claim 1 , further comprising:
a location module configured to determine a location of the mobile communication device;
a distance determination module configured to determine a distance between the location of the mobile communication device and a location of the player; and
a time determination module configured to determine an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player based on the determined distance.
3. The mobile communication device of claim 2 , wherein the time determination module is configured to determine the estimated duration of the time period based on a virtual environmental condition.
4. The mobile communication device of claim 2 , wherein the time determination module is configured to determine the estimated duration of the time period based on a difference between an altitude of the mobile communication device and an altitude of the player indicator.
5. The mobile communication device of claim 2 , wherein the time determination module is configured to determine the estimated duration of the time period based on at least one of an attribute of the virtual bullet or an attribute of a virtual weapon that is used to fire the virtual bullet.
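Claims 2-5 describe estimating the virtual bullet's travel time from the device-to-player distance, optionally adjusted by a virtual environmental condition, an altitude difference, and attributes of the bullet or weapon. A minimal sketch of one such estimate (the function name, parameters, and formula are assumptions, not taken from the disclosure):

```python
import math

def estimated_travel_time(shooter, target, bullet_speed=300.0, wind_factor=1.0):
    """Hypothetical time-of-flight estimate for a virtual bullet.

    Straight-line distance, including the altitude difference (claim 4),
    divided by a speed attribute of the virtual bullet/weapon (claim 5),
    scaled by a virtual environmental condition such as a head wind
    (claim 3). Positions are (x, y, altitude) in meters; speed in m/s.
    """
    dx, dy, dalt = (t - s for s, t in zip(shooter, target))
    distance = math.sqrt(dx * dx + dy * dy + dalt * dalt)
    return distance * wind_factor / bullet_speed
```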
6. The mobile communication device of claim 1 , further comprising:
an environment module configured to modify the image to include a virtual environmental condition; and
a display configured to display the modified image.
7. The mobile communication device of claim 6 , further comprising:
an environment control module configured to control the virtual environmental condition in response to a user-initiated environment control command.
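Claims 6-7 modify the captured image to include a virtual environmental condition whose strength a user-initiated command can control. As a hedged illustration only (the disclosure specifies no rendering method), a fog overlay on a grayscale image might look like:

```python
def apply_fog(image, intensity):
    """Hypothetical sketch of claims 6-7: blend a virtual fog into an image.

    `image` is rows of 0-255 grayscale pixels; `intensity` in [0, 1] blends
    each pixel toward white, and would be driven by the user-initiated
    environment control command.
    """
    fog = 255  # fog renders as white
    return [[round(p * (1 - intensity) + fog * intensity) for p in row]
            for row in image]
```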
8. The mobile communication device of claim 1 , further comprising:
an identification module configured to modify the image to include identification information regarding the player; and
a display configured to display the modified image.
9. The mobile communication device of claim 1 , further comprising:
an incoming attack module configured to determine when a virtual bullet is directed at a user of the mobile communication device; and
a sensory signal module configured to provide a sensory signal to the user in response to determination that a virtual bullet is directed at the user.
10. The mobile communication device of claim 9 , further comprising:
a speed control module configured to control a speed of the virtual bullet that is directed at the user in response to a user-initiated speed control command.
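Claims 9-10 react to an incoming attack indicator with a sensory signal and let a user-initiated command control the incoming bullet's speed. One hypothetical handler (the message format and field names are assumptions):

```python
def handle_incoming(indicator, speed_command=None):
    """Hypothetical sketch of claims 9-10.

    When an incoming attack indicator says a virtual bullet is directed at
    the user, provide a sensory signal (here, a vibration request) and let
    a user-initiated speed control command rescale the bullet's speed
    (e.g., a "bullet time" style slow-down).
    """
    if not indicator.get("virtual_bullet"):
        return None
    speed = indicator.get("speed", 300.0)
    if speed_command is not None:
        speed *= speed_command  # e.g., 0.5 slows the bullet to half speed
    return {"sensory_signal": "vibrate", "bullet_speed": speed}
```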
11. A method of performing multi-player augmented reality combat with respect to a user of a mobile communication device, comprising:
capturing an image;
identifying a player indicator in the image, the player indicator corresponding to a player;
receiving a user-initiated attack command in response to identifying the player indicator; and
transmitting an attack indicator in response to receiving the user-initiated attack command, the attack indicator specifying that a virtual bullet is fired at the player.
12. The method of claim 11 , further comprising:
determining a location of the mobile communication device;
receiving a player location indicator that specifies a location of the player;
determining a distance between the location of the mobile communication device and the location of the player; and
determining an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player based on the determined distance.
13. The method of claim 12 , wherein determining the estimated duration of the time period comprises:
determining the estimated duration of the time period based on a virtual environmental condition.
14. The method of claim 12 , wherein determining the estimated duration of the time period comprises:
determining the estimated duration of the time period based on a difference between an altitude of the mobile communication device and an altitude of the player indicator.
15. The method of claim 12 , wherein determining the estimated duration of the time period comprises:
determining the estimated duration of the time period based on at least one of an attribute of the virtual bullet or an attribute of a virtual weapon that is used to fire the virtual bullet.
16. The method of claim 11 , further comprising:
modifying the image to include a virtual environmental condition; and
displaying the modified image.
17. The method of claim 16 , further comprising:
controlling the virtual environmental condition in response to a user-initiated environment control command.
18. The method of claim 11 , further comprising:
modifying the image to include identification information regarding the player; and
displaying the modified image.
19. The method of claim 11 , further comprising:
receiving an incoming attack indicator that specifies that a virtual bullet is directed at a user of the mobile communication device; and
providing a sensory signal to the user in response to receiving the incoming attack indicator.
20. The method of claim 19 , further comprising:
controlling a speed of the virtual bullet that is directed at the user in response to a user-initiated speed control command.
21. The method of claim 11 , further comprising:
controlling a rate at which the user recovers from a virtual injury based on a distance between the location of the mobile communication device and a designated location.
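Claim 21 ties the virtual-injury recovery rate to the distance between the device and a designated location. A possible falloff rule, offered only as an assumption-laden sketch (the "safe zone" framing and the specific curve are not in the disclosure):

```python
def recovery_rate(device_pos, safe_zone, base_rate=1.0, radius=50.0):
    """Hypothetical sketch of claim 21.

    Recovery from a virtual injury depends on distance from a designated
    location (say, a virtual infirmary). Inside `radius` meters recovery
    runs at `base_rate`; beyond it the rate falls off inversely with
    distance. Positions are (x, y) in meters.
    """
    dx = device_pos[0] - safe_zone[0]
    dy = device_pos[1] - safe_zone[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance <= radius:
        return base_rate
    return base_rate * radius / distance
```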
22. A computer program product comprising a computer-readable medium having computer program logic recorded thereon for enabling a processor-based system to perform multi-player augmented reality combat with respect to a user of a mobile communication device, the computer program product comprising:
a first program logic module for enabling the processor-based system to capture an image;
a second program logic module for enabling the processor-based system to identify a player indicator in the image, the player indicator corresponding to a player; and
a third program logic module for enabling the processor-based system to transmit an outgoing attack indicator in accordance with a mobile communication protocol in response to a user-initiated attack command that is received in response to identification of the player indicator, the outgoing attack indicator specifying that a virtual bullet is fired at the player.
23. The computer program product of claim 22 , further comprising:
a fourth program logic module for enabling the processor-based system to determine a location of the mobile communication device;
a fifth program logic module for enabling the processor-based system to determine a distance between the location of the mobile communication device and a location of the player; and
a sixth program logic module for enabling the processor-based system to determine an estimated duration of a time period for the virtual bullet to travel from the mobile communication device to the player based on the determined distance.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/956,646 US20110151955A1 (en) | 2009-12-23 | 2010-11-30 | Multi-player augmented reality combat |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US28988109P | 2009-12-23 | 2009-12-23 | |
US12/956,646 US20110151955A1 (en) | 2009-12-23 | 2010-11-30 | Multi-player augmented reality combat |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110151955A1 true US20110151955A1 (en) | 2011-06-23 |
Family
ID=44151853
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/956,646 Abandoned US20110151955A1 (en) | 2009-12-23 | 2010-11-30 | Multi-player augmented reality combat |
Country Status (1)
Country | Link |
---|---|
US (1) | US20110151955A1 (en) |
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110187745A1 (en) * | 2010-01-29 | 2011-08-04 | Pantech Co., Ltd. | Apparatus and method for providing augmented reality information |
US20110292076A1 (en) * | 2010-05-28 | 2011-12-01 | Nokia Corporation | Method and apparatus for providing a localized virtual reality environment |
US20120100912A1 (en) * | 2010-10-25 | 2012-04-26 | Electronics And Telecommunications Research Institute | Method of reusing physics simulation results and game service apparatus using the same |
US20120196684A1 (en) * | 2011-02-01 | 2012-08-02 | David Richardson | Combining motion capture and timing to create a virtual gaming experience |
US20120214562A1 (en) * | 2011-02-23 | 2012-08-23 | DOOBIC Co., Ltd. | Massively multiplayer online first person shooting game service system and method |
WO2013030672A2 (en) * | 2011-08-22 | 2013-03-07 | Glentworth Holdings Pty Ltd | Sensitive emission device and method of use |
US20130065692A1 (en) * | 2011-09-14 | 2013-03-14 | Steelseries Hq | Apparatus for adapting virtual gaming with real world information |
US20130125027A1 (en) * | 2011-05-06 | 2013-05-16 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
US20130316811A1 (en) * | 2011-02-15 | 2013-11-28 | Konami Digital Entertainment Co., Ltd. | Game device |
US20140049559A1 (en) * | 2012-08-17 | 2014-02-20 | Rod G. Fleck | Mixed reality holographic object development |
US20140283136A1 (en) * | 2013-03-13 | 2014-09-18 | Optio Labs, Inc. | Systems and methods for securing and locating computing devices |
US20150356812A1 (en) * | 2010-12-15 | 2015-12-10 | Bally Gaming, Inc. | System and method for augmented reality using a player card |
US20150379777A1 (en) * | 2013-03-06 | 2015-12-31 | Megachips Corporation | Augmented reality providing system, recording medium, and augmented reality providing method |
US9268406B2 (en) | 2011-09-30 | 2016-02-23 | Microsoft Technology Licensing, Llc | Virtual spectator experience with a personal audio/visual apparatus |
US9286711B2 (en) | 2011-09-30 | 2016-03-15 | Microsoft Technology Licensing, Llc | Representing a location at a previous time period using an augmented reality display |
US20160129344A1 (en) * | 2013-06-26 | 2016-05-12 | Sony Computer Entertainment Inc. | Information processor, control method of information processor, program, and information storage medium |
US9363670B2 (en) | 2012-08-27 | 2016-06-07 | Optio Labs, Inc. | Systems and methods for restricting access to network resources via in-location access point protocol |
US9498720B2 (en) | 2011-09-30 | 2016-11-22 | Microsoft Technology Licensing, Llc | Sharing games using personal audio/visual apparatus |
US9609020B2 (en) | 2012-01-06 | 2017-03-28 | Optio Labs, Inc. | Systems and methods to enforce security policies on the loading, linking, and execution of native code by mobile applications running inside of virtual machines |
US9606992B2 (en) | 2011-09-30 | 2017-03-28 | Microsoft Technology Licensing, Llc | Personal audio/visual apparatus providing resource management |
US20170140209A1 (en) * | 2015-11-13 | 2017-05-18 | Xiaomi Inc. | Image recognition method and device for game |
US20170168556A1 (en) * | 2015-12-11 | 2017-06-15 | Disney Enterprises, Inc. | Launching virtual objects using a rail device |
US9712530B2 (en) | 2012-01-06 | 2017-07-18 | Optio Labs, Inc. | Systems and methods for enforcing security in mobile computing |
US9773107B2 (en) | 2013-01-07 | 2017-09-26 | Optio Labs, Inc. | Systems and methods for enforcing security in mobile computing |
US9787681B2 (en) | 2012-01-06 | 2017-10-10 | Optio Labs, Inc. | Systems and methods for enforcing access control policies on privileged accesses for mobile devices |
CN107441714A (en) * | 2017-06-01 | 2017-12-08 | 杨玉苹 | A kind of image processing method and its device, shooting game fighting system and its method of work for realizing AR first person shooting games |
US9931566B2 (en) * | 2014-01-29 | 2018-04-03 | Eddie's Social Club, LLC | Game system with interactive show control |
CN108654086A (en) * | 2018-05-09 | 2018-10-16 | 腾讯科技(深圳)有限公司 | Attack object injury acquisition methods, device and equipment in virtual environment |
US20180332395A1 (en) * | 2013-03-19 | 2018-11-15 | Nokia Technologies Oy | Audio Mixing Based Upon Playing Device Location |
US20190015754A1 (en) * | 2017-07-16 | 2019-01-17 | Theodor Radu | Apparatus, computer-readable storage medium storing an application thereon, system and method |
CN109260703A (en) * | 2018-09-28 | 2019-01-25 | 重庆第五维科技有限公司 | True man's gunbattle game information exchange method based on AR scene |
US20190116457A1 (en) * | 2017-10-16 | 2019-04-18 | Christopher Anthony Silva | Method and System for 3-D Location of Mobile Devices |
CN110496389A (en) * | 2018-05-16 | 2019-11-26 | 恩希软件株式会社 | Game server and the method that memorandum is shared in game server |
US20200108312A1 (en) * | 2018-10-03 | 2020-04-09 | Song Chen | Gaming system |
US20200171380A1 (en) * | 2018-06-26 | 2020-06-04 | Sony Interactive Entertainment Inc. | Multipoint slam capture |
US10835828B1 (en) * | 2018-07-25 | 2020-11-17 | Facebook, Inc. | Augmented-reality game overlays in video communications |
US11392636B2 (en) | 2013-10-17 | 2022-07-19 | Nant Holdings Ip, Llc | Augmented reality position-based service, methods, and systems |
US11403823B2 (en) | 2019-02-14 | 2022-08-02 | Lego A/S | Toy system for asymmetric multiplayer game play |
US11410488B2 (en) * | 2019-05-03 | 2022-08-09 | Igt | Augmented reality virtual object collection based on symbol combinations |
US11854153B2 (en) | 2011-04-08 | 2023-12-26 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6972734B1 (en) * | 1999-06-11 | 2005-12-06 | Canon Kabushiki Kaisha | Mixed reality apparatus and mixed reality presentation method |
US20040157662A1 (en) * | 2002-12-09 | 2004-08-12 | Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) | Video game that displays player characters of multiple players in the same screen |
US20060223635A1 (en) * | 2005-04-04 | 2006-10-05 | Outland Research | method and apparatus for an on-screen/off-screen first person gaming experience |
US20090209336A1 (en) * | 2005-06-28 | 2009-08-20 | Konami Digital Entertainment Co., Ltd. | Game system, method for controlling game system, game device therefor, and program therefor |
US20090203445A1 (en) * | 2005-09-14 | 2009-08-13 | Nintendo Co., Ltd. | Pointing device system and method |
WO2008116982A2 (en) * | 2007-02-13 | 2008-10-02 | Parrot | Method for the recognition of objects in a shooting game for remote-controlled toys |
US20100178966A1 (en) * | 2007-02-13 | 2010-07-15 | Parrot | A method of recognizing objects in a shooter game for remote-controlled toys |
Cited By (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8736636B2 (en) * | 2010-01-29 | 2014-05-27 | Pantech Co., Ltd. | Apparatus and method for providing augmented reality information |
US20110187745A1 (en) * | 2010-01-29 | 2011-08-04 | Pantech Co., Ltd. | Apparatus and method for providing augmented reality information |
US20110292076A1 (en) * | 2010-05-28 | 2011-12-01 | Nokia Corporation | Method and apparatus for providing a localized virtual reality environment |
US9122707B2 (en) * | 2010-05-28 | 2015-09-01 | Nokia Technologies Oy | Method and apparatus for providing a localized virtual reality environment |
US20120100912A1 (en) * | 2010-10-25 | 2012-04-26 | Electronics And Telecommunications Research Institute | Method of reusing physics simulation results and game service apparatus using the same |
US9875600B2 (en) | 2010-12-15 | 2018-01-23 | Bally Gaming, Inc. | System and method for augmented reality using a user-specific card |
US10204476B2 (en) | 2010-12-15 | 2019-02-12 | Bally Gaming, Inc. | System and method for augmented reality using a user-specific object |
US9697676B2 (en) * | 2010-12-15 | 2017-07-04 | Bally Gaming, Inc. | System and method for augmented reality using a player card |
US20150356812A1 (en) * | 2010-12-15 | 2015-12-10 | Bally Gaming, Inc. | System and method for augmented reality using a player card |
US20120196684A1 (en) * | 2011-02-01 | 2012-08-02 | David Richardson | Combining motion capture and timing to create a virtual gaming experience |
US9314693B2 (en) * | 2011-02-15 | 2016-04-19 | Konami Digital Entertainment Co., Ltd. | Game device |
US20130316811A1 (en) * | 2011-02-15 | 2013-11-28 | Konami Digital Entertainment Co., Ltd. | Game device |
US20120214562A1 (en) * | 2011-02-23 | 2012-08-23 | DOOBIC Co., Ltd. | Massively multiplayer online first person shooting game service system and method |
US11869160B2 (en) | 2011-04-08 | 2024-01-09 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11854153B2 (en) | 2011-04-08 | 2023-12-26 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US11967034B2 (en) | 2011-04-08 | 2024-04-23 | Nant Holdings Ip, Llc | Augmented reality object management system |
US10101802B2 (en) * | 2011-05-06 | 2018-10-16 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
US11157070B2 (en) | 2011-05-06 | 2021-10-26 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
US10671152B2 (en) | 2011-05-06 | 2020-06-02 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
US20130125027A1 (en) * | 2011-05-06 | 2013-05-16 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
US11669152B2 (en) | 2011-05-06 | 2023-06-06 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
WO2013030672A2 (en) * | 2011-08-22 | 2013-03-07 | Glentworth Holdings Pty Ltd | Sensitive emission device and method of use |
WO2013030672A3 (en) * | 2011-08-22 | 2013-07-25 | Glentworth Holdings Pty Ltd | Sensitive emission device and method of use |
US11806623B2 (en) | 2011-09-14 | 2023-11-07 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US11547941B2 (en) | 2011-09-14 | 2023-01-10 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US11020667B2 (en) | 2011-09-14 | 2021-06-01 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US11273377B2 (en) | 2011-09-14 | 2022-03-15 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US10391402B2 (en) | 2011-09-14 | 2019-08-27 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US9155964B2 (en) * | 2011-09-14 | 2015-10-13 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US20130065692A1 (en) * | 2011-09-14 | 2013-03-14 | Steelseries Hq | Apparatus for adapting virtual gaming with real world information |
US10512844B2 (en) | 2011-09-14 | 2019-12-24 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US9861893B2 (en) | 2011-09-14 | 2018-01-09 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
US9268406B2 (en) | 2011-09-30 | 2016-02-23 | Microsoft Technology Licensing, Llc | Virtual spectator experience with a personal audio/visual apparatus |
US9606992B2 (en) | 2011-09-30 | 2017-03-28 | Microsoft Technology Licensing, Llc | Personal audio/visual apparatus providing resource management |
US9286711B2 (en) | 2011-09-30 | 2016-03-15 | Microsoft Technology Licensing, Llc | Representing a location at a previous time period using an augmented reality display |
US9498720B2 (en) | 2011-09-30 | 2016-11-22 | Microsoft Technology Licensing, Llc | Sharing games using personal audio/visual apparatus |
US9609020B2 (en) | 2012-01-06 | 2017-03-28 | Optio Labs, Inc. | Systems and methods to enforce security policies on the loading, linking, and execution of native code by mobile applications running inside of virtual machines |
US9712530B2 (en) | 2012-01-06 | 2017-07-18 | Optio Labs, Inc. | Systems and methods for enforcing security in mobile computing |
US9787681B2 (en) | 2012-01-06 | 2017-10-10 | Optio Labs, Inc. | Systems and methods for enforcing access control policies on privileged accesses for mobile devices |
US9429912B2 (en) * | 2012-08-17 | 2016-08-30 | Microsoft Technology Licensing, Llc | Mixed reality holographic object development |
US20140049559A1 (en) * | 2012-08-17 | 2014-02-20 | Rod G. Fleck | Mixed reality holographic object development |
US9363670B2 (en) | 2012-08-27 | 2016-06-07 | Optio Labs, Inc. | Systems and methods for restricting access to network resources via in-location access point protocol |
US9773107B2 (en) | 2013-01-07 | 2017-09-26 | Optio Labs, Inc. | Systems and methods for enforcing security in mobile computing |
US20150379777A1 (en) * | 2013-03-06 | 2015-12-31 | Megachips Corporation | Augmented reality providing system, recording medium, and augmented reality providing method |
US9578445B2 (en) | 2013-03-13 | 2017-02-21 | Optio Labs, Inc. | Systems and methods to synchronize data to a mobile device based on a device usage context |
US20140283136A1 (en) * | 2013-03-13 | 2014-09-18 | Optio Labs, Inc. | Systems and methods for securing and locating computing devices |
US11758329B2 (en) * | 2013-03-19 | 2023-09-12 | Nokia Technologies Oy | Audio mixing based upon playing device location |
US20180332395A1 (en) * | 2013-03-19 | 2018-11-15 | Nokia Technologies Oy | Audio Mixing Based Upon Playing Device Location |
US20160129344A1 (en) * | 2013-06-26 | 2016-05-12 | Sony Computer Entertainment Inc. | Information processor, control method of information processor, program, and information storage medium |
US10376777B2 (en) * | 2013-06-26 | 2019-08-13 | Sony Interactive Entertainment Inc. | Information processor, control method of information processor, program, and information storage medium |
US11392636B2 (en) | 2013-10-17 | 2022-07-19 | Nant Holdings Ip, Llc | Augmented reality position-based service, methods, and systems |
US9931566B2 (en) * | 2014-01-29 | 2018-04-03 | Eddie's Social Club, LLC | Game system with interactive show control |
US20170140209A1 (en) * | 2015-11-13 | 2017-05-18 | Xiaomi Inc. | Image recognition method and device for game |
US20170168556A1 (en) * | 2015-12-11 | 2017-06-15 | Disney Enterprises, Inc. | Launching virtual objects using a rail device |
US9904357B2 (en) * | 2015-12-11 | 2018-02-27 | Disney Enterprises, Inc. | Launching virtual objects using a rail device |
CN107441714A (en) * | 2017-06-01 | 2017-12-08 | 杨玉苹 | A kind of image processing method and its device, shooting game fighting system and its method of work for realizing AR first person shooting games |
US20190015754A1 (en) * | 2017-07-16 | 2019-01-17 | Theodor Radu | Apparatus, computer-readable storage medium storing an application thereon, system and method |
US10708715B2 (en) * | 2017-10-16 | 2020-07-07 | Christopher Anthony Silva | Method and system for 3-D location of mobile devices |
US20190116457A1 (en) * | 2017-10-16 | 2019-04-18 | Christopher Anthony Silva | Method and System for 3-D Location of Mobile Devices |
CN108654086A (en) * | 2018-05-09 | 2018-10-16 | 腾讯科技(深圳)有限公司 | Attack object injury acquisition methods, device and equipment in virtual environment |
US11224815B2 (en) | 2018-05-09 | 2022-01-18 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for obtaining damage value of attack item in virtual environment, method and apparatus for displaying damage value of attack item in virtual environment, and computer device |
CN110496389A (en) * | 2018-05-16 | 2019-11-26 | 恩希软件株式会社 | Game server and the method that memorandum is shared in game server |
US11318379B2 (en) * | 2018-05-16 | 2022-05-03 | Ncsoft Corporation | Game server and method of sharing note in the game server |
US20200171380A1 (en) * | 2018-06-26 | 2020-06-04 | Sony Interactive Entertainment Inc. | Multipoint slam capture |
US11590416B2 (en) * | 2018-06-26 | 2023-02-28 | Sony Interactive Entertainment Inc. | Multipoint SLAM capture |
US20230249086A1 (en) * | 2018-07-25 | 2023-08-10 | Meta Platforms, Inc. | Augmented-Reality Game Overlays in Video Communications |
US11628367B2 (en) * | 2018-07-25 | 2023-04-18 | Meta Platforms, Inc. | Augmented-reality game overlays in video communications |
US10835828B1 (en) * | 2018-07-25 | 2020-11-17 | Facebook, Inc. | Augmented-reality game overlays in video communications |
CN109260703A (en) * | 2018-09-28 | 2019-01-25 | 重庆第五维科技有限公司 | True man's gunbattle game information exchange method based on AR scene |
US10675536B2 (en) * | 2018-10-03 | 2020-06-09 | Song Chen | Gaming system that alters target images produced by an LED array |
US20200108312A1 (en) * | 2018-10-03 | 2020-04-09 | Song Chen | Gaming system |
US11403823B2 (en) | 2019-02-14 | 2022-08-02 | Lego A/S | Toy system for asymmetric multiplayer game play |
US11410488B2 (en) * | 2019-05-03 | 2022-08-09 | Igt | Augmented reality virtual object collection based on symbol combinations |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110151955A1 (en) | Multi-player augmented reality combat | |
CN111265869B (en) | Virtual object detection method, device, terminal and storage medium | |
US10341162B2 (en) | Augmented reality gaming system | |
CN110448891B (en) | Method, device and storage medium for controlling virtual object to operate remote virtual prop | |
CN110917619B (en) | Interactive property control method, device, terminal and storage medium | |
CN109529356B (en) | Battle result determining method, device and storage medium | |
CN110585710B (en) | Interactive property control method, device, terminal and storage medium | |
CN107469343B (en) | Virtual reality interaction method, device and system | |
JP2024045184A (en) | Method, apparatus and medium for controlling virtual object to mark virtual item | |
CN110538459A (en) | Method, apparatus, device and medium for throwing virtual explosives in virtual environment | |
CN110507990B (en) | Interaction method, device, terminal and storage medium based on virtual aircraft | |
CN111462307A (en) | Virtual image display method, device, equipment and storage medium of virtual object | |
KR101498610B1 (en) | The Tactical Simulation Training Tool by linking Trainee's movement with Virtual Character's movement, Interoperability Method and Trainee Monitoring Method | |
JP2022539289A (en) | VIRTUAL OBJECT AIMING METHOD, APPARATUS AND PROGRAM | |
WO2022083449A1 (en) | Virtual throwing prop using method and device, terminal, and storage medium | |
CN112870715B (en) | Virtual item putting method, device, terminal and storage medium | |
CN113398601B (en) | Information transmission method, information transmission device, computer-readable medium, and apparatus | |
CN110585706B (en) | Interactive property control method, device, terminal and storage medium | |
US20220161138A1 (en) | Method and apparatus for using virtual prop, device, and storage medium | |
CN112044084B (en) | Virtual item control method, device, storage medium and equipment in virtual environment | |
CN113041622A (en) | Virtual throwing object throwing method in virtual environment, terminal and storage medium | |
CN113144597A (en) | Virtual vehicle display method, device, equipment and storage medium | |
WO2023130807A1 (en) | Front sight control method and apparatus in virtual scene, electronic device, and storage medium | |
CN112704875B (en) | Virtual item control method, device, equipment and storage medium | |
CN113713383A (en) | Throwing prop control method and device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: EXENT TECHNOLOGIES, LTD., ISRAEL Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NAVE, ITAY;REEL/FRAME:025402/0746 Effective date: 20101128 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |