
Publication number: WO1993016776 A1
Publication type: Application
Application number: PCT/US1993/001159
Publication date: 2 Sep 1993
Filing date: 9 Feb 1993
Priority date: 24 Feb 1992
Also published as: CA2105669A1, EP0581943A1
Inventor: Albert F. Harvard
Applicant: Hughes Aircraft Company
Virtual image entertainment
WO 1993016776 A1
Abstract
A real-time, interactive, motion-based simulator entertainment system that employs a computer-generated video game (or network of video games) that interacts with a motion-based, operator-controlled control station or simulator. The system employs a computer processor, helmet-mounted projection display technology, a motion-based cockpit, control yokes or joysticks, a sound system, and computer-generated video games. A plurality of participants (typically two) interact with selective and timed video scenarios to achieve an objective. Interaction is achieved using the control yokes and buttons. Each simulator operates independently of the others, except that groups of participants may play the same scenario, possibly at the same time, by ganging or networking sets of simulators. Each motion-based simulator is designed and cosmetically enhanced to appear as an armed space vehicle, for example, and comprises an interactive video scenario virtually displayed on a windshield screen, allowing the participants to interact with the system to achieve the predetermined game objective. The simulator system incorporates selection devices (yokes or joysticks), display networks, and selection buttons and controls that permit interaction with the system by the participants in response to information from the scenarios presented on the display.
Claims
What is claimed is:
1. A virtual image entertainment system comprising: a simulator comprising a moveable cockpit; seating means disposed in the cockpit for seating a plurality of participants; a first moveable yoke means disposed in the cockpit and operable by one of the participants for causing movement of the cockpit in response to movement of the first yoke means; a virtual image display system comprising a virtual image projector and a display screen located within the view of the participants; a second moveable yoke means operable by another of the participants for providing real-time engagement interaction with a story portrayed as virtual images on the screen in response to movement of the second yoke means; and an audio system coupled to the virtual image display system for providing audio signals corresponding to the virtual images projected by the virtual image projector, whereby, a real-time motion-based simulator incorporating an interactive video game played by two persons as a team is provided.
2. The virtual image entertainment system of Claim 1 further comprising: a head up display for projecting data observable by the participants that includes video images, engagement reticles, and equipment status indicators that are displayed on the display screen.
3. The virtual image entertainment system of Claim 1 further comprising: a networked plurality of simulators coupled to a central control station which controls the plurality of simulators to engage in the same interactive video game, thus providing for team activities that allow multiple participants in one simulator to play against a like number of participants in another simulator.
Description

VIRTUAL IMAGE ENTERTAINMENT

BACKGROUND

The present invention relates generally to entertainment systems, and more particularly, to an entertainment system that employs a computer-generated network of video games interacting with a motion-based, operator-controlled control station, and wherein participants interact with selective and timed video scenarios to achieve a goal. The present invention was inspired by noticeable trends toward interactivity in the entertainment and theme park business. The present invention was conceived after a review of documentation available from Walt Disney Imagineering, the Wall Street Journal, the Los Angeles Times and the New York Times, the Orlando Sentinel, the Hollywood Reporter, Amusement Magazine, the International Association of Amusement Parks and Attractions, Inc., Tourist Attractions and Parks, and Omni.

In recent years, shopping malls have become a location for amusement-park-type attractions, with both iron rides and dark rides. The advent of these attractions, first installed on a large scale in a shopping mall in Edmonton, Alberta, Canada, launched a new concept called Family Fun Centers, which are a secondary target location for the invention.

Cockpit-type arcade video games are conventional in the entertainment business. However, they rely on crude and sometimes difficult-to-read computer graphics displayed on a distant screen. The fidelity is often poor in these conventional systems, resulting in fluttering and jerky images. Additionally, the motion bases are unreliable, resulting in considerable down time. These conventional systems do not employ head- or helmet-mounted display technology or uniquely designed sound systems, nor do they provide dual-player interactivity whereby each player may influence the outcome of the event (game).

It is therefore an objective of the present invention to provide an interactive video game ride that uses high fidelity (virtual) images in three dimensions and in full color, displayed on a video screen (windshield), or the like. Another objective of the present invention is to provide a basic building block for an amusement park or Family Fun Center attraction. A further objective of the invention is to develop a ride as a stand-alone system that may be used in multiple installations and networked at remote sites.

SUMMARY OF THE INVENTION

In accordance with these and other objectives, the present invention is an entertainment system that employs a computer-generated video game, or a network of video games, that interacts with a motion-based, operator-controlled flight or spacecraft simulator control station. The present system employs a microprocessor-based computer processor, helmet-mounted aerospace display technology, a motion-based servo-controlled cockpit, control yokes (joysticks) to control cockpit motion and game (scenario) interaction, a sound system, and a computer-generated video game or network of games. A plurality of participants (typically two) interact with selective and timed video game scenarios to achieve a game objective.

One aspect of the present invention is a stand-alone system that may be used in multiple installations. In its stand-alone configuration, the game objectives are individually carried out by each system. In this way, system owners may install as many systems as their needs dictate. There is an added provision that allows ganging of multiple systems for increased throughput and economy of computer processor usage.

More specifically, there is provided a real-time, interactive, motion-based simulator that involves two participants. The system comprises a microprocessor, or computer processor, and a two-seat motion-based simulator, or a series of such simulators electronically ganged together through a processor network. The computer processor controls the motion of the simulator in response to movement of one of the control yokes using a servo control system that is part of the simulator. Each simulator operates independently of the others, except that the participants may play the same scenario, possibly at the same time. Each station or motion-based simulator is designed and cosmetically enhanced to appear to be an armed space vehicle, for example, and comprises an interactive video scenario (electronic video game) virtually displayed on a windshield using helmet-mounted aerospace display technology, allowing the players to interact with the system to achieve a predetermined game objective. The simulator system incorporates selection devices such as yokes (joysticks), display networks, and selection buttons and controls that permit interaction with the system in response to information from the game scenarios presented on the display.

The stand-alone feature of the present invention allows independent operation by two persons. However, if desired, a series of simulators may be networked together, each having independent control, to engage in common or multiple scenarios. Under these circumstances, the computer processor is enhanced to accommodate a finite number of simulators, for example two sets of 15. In this case one set (Set A) engages scenario A, while the other set (Set B) engages scenario B. The combinations of these engagement techniques are limited only by the (system) installation capacity of the user.
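The ganging arrangement described above can be sketched in a few lines. The `assign_scenarios` helper below is an illustrative assumption, not the patent's actual control software; it simply splits a fleet of simulators evenly across the available scenarios, reproducing the two-sets-of-15 example.

```python
# Hypothetical helper illustrating the ganging scheme described above.
# Function name and structure are illustrative, not from the patent.
def assign_scenarios(simulator_ids, scenarios):
    """Split the simulators evenly across the available scenarios."""
    per_set = len(simulator_ids) // len(scenarios)
    return {
        sim: scenarios[min(i // per_set, len(scenarios) - 1)]
        for i, sim in enumerate(simulator_ids)
    }

# Two sets of 15: simulators 1-15 engage scenario A, 16-30 engage scenario B.
sets = assign_scenarios(list(range(1, 31)), ["A", "B"])
```

With thirty simulators and two scenarios, each set receives fifteen systems, matching the example in the text.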

The present invention incorporates a new display technology employed by the aerospace industry, a newly developed simulator, and an audio system arranged to provide a real-time, motion-based, interactive video game played by two players as a team, whereby each player can influence the outcome of the game. The invention may be used as a stand-alone coin/cash/credit card operated device or may be networked with any number of stations whose story line is centrally controlled.

The present invention thus provides a motion-based interactive video game that incorporates newly developed display technology, thus eliminating conventional CRTs, and provides textured, true-picture characteristics. Two persons comprise a team: a pilot responsible for controlling the simulator in the video game environment, and a gunner responsible for engaging targets as features of the video game story line; each participant influences the outcome of the game. The system can stand alone or be ganged (clustered) for maximum participant throughput.

BRIEF DESCRIPTION OF THE DRAWINGS

The various features and advantages of the present invention may be more readily understood with reference to the following detailed description taken in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:

FIGS. 1a and 1b show side and front cutaway views, respectively, of a simulator system in accordance with the principles of the invention;

FIG. 2 shows a perspective view of the simulator system of FIGS. 1a and 1b;

FIG. 3 is an over-the-shoulder view of the inside of the system of FIGS. 1 and 2 showing the front panel, control yoke, relative hand positions, and a displayed game scenario;

FIGS. 4a and 4b show pilot and gunner control yokes, respectively;

FIG. 5 shows a diagram illustrating the operation of the simulator system 10 and depicts major control lines associated with its operation; and

FIG. 6 shows a networked plurality of simulator systems in accordance with the present invention.

DETAILED DESCRIPTION

Referring to the drawing figures, FIG. 1a shows a side cutaway view of a simulator system 10 in accordance with the principles of the invention, FIG. 1b shows a front cutaway view thereof, and FIG. 2 shows a partially phantom perspective view of the system 10, illustrating the system 10 having a slidable canopy 11 (the slidability is illustrated by an arrow 29 in FIG. 2). FIGS. 1a and 1b portray two players or participants 12a, 12b that sit in seats 13a, 13b and operate separate control yokes 14a, 14b. The control yokes 14a, 14b control a motion base 15, control electronics comprising a command and control unit 16, a video projector 17, and a head-up display projector 18.

The simulator system 10 comprises a cockpit 20 or vehicle 20 mounted on a platform 21 that is moved by electromechanical or electrohydraulic actuators 22, for example. A maintenance access door 16a is shown that allows access to the command and control unit 16. Other access doors (not shown) are provided for access to the projectors 17, 18 and the speakers 26. The platform 21 provides three degrees of freedom of motion and is able to simulate the dynamic movements of flight or the motion of a vehicle that the system 10 emulates. This type of mounting arrangement is generally well known and is exemplified by products manufactured by Rediffusion Simulation Ltd., Wimborne, Dorset, England, for example the simulator products known as "Commander" or "Venturer". The cockpit 20 is decorated to simulate the appearance of a cabin or cockpit of a vessel or vehicle such as a space ship, or the like, having an opening that mates with the sliding canopy 11, thus allowing ingress and egress of the two participants 12a, 12b. The participants 12a, 12b, also hereafter referred to as a gunner 12a and a pilot 12b, occupy the seats 13a, 13b in a side-by-side relationship.
The gunner 12a and pilot 12b face forward and observe screen images 17a, 18a provided by the video and head-up display projectors 17, 18, which provide video images, engagement reticles, and equipment status indicators that are displayed on a screen 25 that comprises a front window of the cockpit 20. The video projector 17 and head-up display projector 18 are available from the Radar Systems Group of Hughes Aircraft Company, El Segundo, Ca. Sound effects for a simulated video game that is displayed are provided by a sound retrieval system (SRS) type sound system 19, or the like, available from the Industrial Products Division of Hughes Aircraft Company, Rancho Santa Margarita, Ca., which is incorporated in the command and control unit 16. Four speakers 26, for example, are located at strategic locations in the cockpit 20 and are coupled to the sound system 19.

Referring to FIG. 2, there is shown a partially phantom perspective view of the system 10. FIG. 2 shows the seat belts 27, which, when they are inserted into the seats 13a, 13b after the canopy 11 is closed, indicate through a signal provided by an interlock 27a in each seat 13a, 13b that the participants 12a, 12b are secure and ready to play. The seats 13a, 13b are mechanically linked to the movements of the canopy 11 and are thus retracted along tracks 31 in a conventional manner provided for each seat 13a, 13b. The seats 13a, 13b are brought to a forward position for play as the canopy 11 closes.
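The readiness condition implied by the seat-belt interlocks and canopy can be expressed as a simple predicate. The function below is an illustrative sketch of that logic, not the patent's actual control software.

```python
# Illustrative readiness check: play may begin only when the canopy is
# closed and every seat-belt interlock 27a reports an engaged belt.
def ready_to_play(canopy_closed, belt_interlocks):
    return canopy_closed and all(belt_interlocks)

print(ready_to_play(True, [True, True]))    # both participants secured
print(ready_to_play(True, [True, False]))   # one belt still open
```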

The video game scenario is projected by the video projector 17 and is displayed on the screen 25, which provides a front windscreen for the cockpit 20. The video projector 17 is made by the Radar Systems Group of Hughes Aircraft Company. The head-up display (HUD) projector 18, also made by Hughes Aircraft Company, provides weapons status indicators and an aiming reticle for the gunner 12a and pilot 12b to view. The weapons status indicators and aiming reticle are shown in FIG. 5 and are described with reference thereto. However, only the gunner 12a controls the aiming reticle. The various indicators are projected onto a dashboard panel 32, more for aesthetic effect than functional performance.

Referring to FIG. 3, it shows an over-the-shoulder view of the participants 12a, 12b and their view of a selected game scenario provided by the video image 17a. A dashboard panel 32 is provided for special effects presentations to each participant 12a, 12b, such as simulated gauges, dials, lights, and the like. Such special effects are provided by the head-up display projector 18.

Referring to FIGS. 4a and 4b, they show detailed views of the pilot's yoke 14b and the gunner's yoke 14a, which are identical in design and motion. Their operating features differ as described below. The pilot's yoke 14b pushes forward and pulls to the rear (arrow 41), and the simulator system 10 moves up and down (pitch) in response thereto. When the pilot's yoke 14b is rotated (arrow 42) around an axis through its "steering mechanism" 43b, the simulator system 10 responds thereto and rolls left and right in a like manner. A combination of both movements by the pilot 12b provides three degrees of freedom for the simulator system 10. The pilot 12b operates the motion base 15 in response to the displayed game scenario in a manner necessary to carry out a mission of the game that is presented.

There are spring-loaded buttons 33, 34 on the top of each end of the steering mechanism 43b of the yoke 14b. The left button 33 is used to select the scenario that is played. It is functional after both participants 12a, 12b have successfully engaged their seat belts 27 and the canopy 11 is closed. Instructions on how to select a scenario are elementary and are provided on the screen 25 and announced over the speakers 26. The right button 34 is used to create engine noise as a right hand grip 35 is rotated to increase or decrease simulated speed. Rotation of the hand grip 35 in a counterclockwise (looking down) direction increases the simulated speed of the cockpit 20 and story presentation, as shown by a visual display on the screen 25 or dashboard panel 32. Additionally, the cockpit 20 is caused to lurch forward, giving the participants 12a, 12b a feeling of acceleration. The participants 12a, 12b feel as though they are being propelled through the simulated display 17a on the screen 25.

Rotation of the hand grip 35 in the opposite direction (clockwise) reduces speed, and a corresponding decrease in cockpit 20 and story motion occurs. When the hand grip 35 reaches a clockwise stop (not shown), the cockpit 20 will appear to have stopped unless a computational program in the command and control unit 16 detects that gravity would necessarily pull the cockpit 20 toward the ground. At this time, alerts sound requiring the pilot 12b to accelerate to maintain an airborne condition. In selected scenarios, the cockpit 20 may have ground capabilities. In such a case, the pilot's yoke 14b acts in a manner similar to the steering wheel of an automobile. Under these conditions, the forward and rearward movement of the pilot's yoke 14b has no effect on the cockpit 20, but the rotation thereof provides a modified roll, thus imparting to the participants 12a, 12b the sensation that the cockpit 20 is moving through turns.

Referring to FIG. 4b, the gunner's yoke 14a has the same mechanical features as the pilot's yoke 14b. However, the functional translations are different. When the gunner's yoke 14a is pushed forward or pulled rearward (arrow 44), the effect is to position the sighting reticle 49 in elevation. When the steering mechanism of the gunner's yoke 14a is rotated (arrow 45), this positions the sighting reticle 49 in azimuth. The hand grip 38 of the gunner's yoke 14a allows the gunner 12a to slew the reticle 49 in and out in range. In this way the gunner 12a positions his sighting reticle 49 on any feature observed on the screen 25. The left button 36 of the gunner's yoke 14a is used to engage simulated targets. Time of flight and visual assessment are left to computational elements in the command and control unit 16. The right button 37 on the gunner's yoke 14a allows the gunner to shoot at a target on which the reticle 49 has been positioned.

In operation, and with reference to FIG. 2, once the gunner 12a and pilot 12b are seated, the canopy 11 is automatically closed after detection of proper installation of the seat belts 27. The opening and closing motion of the canopy 11 is indicated by the arrow 29. The detection of seat belt closure is facilitated by an interlock switch 27a, coupled to the command and control unit 16, that is activated when the seat belts 27 are securely engaged. The game is initiated by the pilot 12b, who selects a game scenario (several are available) to be played by depressing the game-select button 33 (shown in FIG. 4a) on the pilot's yoke 14b. The pilot's yoke 14b controls the hydraulic action of the simulator system 10 in a manner appropriate for an aircraft, space vehicle, or an automobile, for example. The gunner's yoke 14a controls the use of on-board simulated weapon systems, which are designated as lasers or missiles. Selection of weapons type is controlled by the thumb switches 36, 37 (shown in FIG. 4b) on the right and left sides of the gunner's yoke 14a. Display of the weapon selected and rounds or shots left is available on the screen 17a near the reticle 49.
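The yoke-to-reticle mapping just described can be sketched as a small state update: fore/aft motion sets elevation, rotation sets azimuth, and twisting the hand grip slews range. The gain constants and state layout below are illustrative assumptions only, not values from the patent.

```python
# Hedged sketch of the gunner-yoke-to-reticle mapping: push/pull (arrow 44)
# drives elevation, rotation (arrow 45) drives azimuth, and the hand grip 38
# slews the reticle 49 in and out in range. Gains are invented for illustration.
ELEV_GAIN = 0.1   # reticle elevation change per unit of yoke pitch
AZ_GAIN = 0.1     # reticle azimuth change per unit of yoke rotation

def update_reticle(elevation, azimuth, range_, yoke_pitch, yoke_rotation, grip_twist):
    """Return the new (elevation, azimuth, range) of the sighting reticle."""
    elevation += yoke_pitch * ELEV_GAIN
    azimuth += yoke_rotation * AZ_GAIN
    range_ = max(0.0, range_ + grip_twist)   # range never goes negative
    return elevation, azimuth, range_

state = update_reticle(0.0, 0.0, 100.0,
                       yoke_pitch=5.0, yoke_rotation=-2.0, grip_twist=10.0)
```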

In playing a particular scenario, movement of the simulator system 10 by the pilot 12b during missile engagement could cause a miss. An alert is provided to the pilot 12b that indicates that missiles are to be fired. To ensure that there is no movement of the simulator system 10 until a missile locked-on condition is established, an audible recommendation to the pilot 12b is provided over the speakers 26. This brief delay in the game provides excitement for both participants 12a, 12b and generates a team esprit.

Referring to FIG. 5, it shows a diagram illustrating the operation of the simulator system 10 and depicts the major control lines associated with its operation. FIG. 5 shows the projection of the game scenario to the participants 12a, 12b as a virtual image 17a displayed on the windshield or display screen 25 of the cockpit 20. The reticle, weapon states, and gauges 49 used by the gunner 12a are also displayed on the display screen 25 (or dashboard display 32) in a manner that is easily seen by the gunner 12a and pilot 12b.

The command and control unit 16 comprises a video generator 51, a central processing unit (CPU) 52 such as a microprocessor or computer processor, and the SRS-type sound system 19. A power supply 53 is provided to power each of the units of the command and control unit 16. The central processing unit 52 controls a motion base controller 54 that is coupled to the motion base 15 in a conventional manner.

Position commands are provided to the motion base controller 54, and servo controls are provided from the controller 54 to the motion base 15 to move it in response to movement of the pilot's yoke 14b. A feedback line is provided from the pilot's yoke 14b to the central processing unit 52 to close the control loop. The central processing unit 52 controls the projection of the image 17a and audio through the speakers 26. The central processing unit 52 also controls the projection of the image 17a representing the reticle and gauges 49. The signals from the interlocks 27a are also coupled to the central processing unit 52.
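The closed control loop just described (yoke command to CPU 52, to motion base controller 54, to motion base 15, with feedback closing the loop) can be sketched as a simple proportional servo update. The gain and iteration count are assumptions for illustration, not values from the patent.

```python
# Illustrative proportional servo loop for one axis of the motion base.
# Each step drives the base a fraction of the way toward the yoke command;
# the feedback line supplies the current base position for the error term.
def servo_step(commanded, actual, gain=0.5):
    error = commanded - actual      # closed-loop feedback
    return actual + gain * error    # new motion-base position

pitch = 0.0
for _ in range(10):                 # the base converges on the yoke command
    pitch = servo_step(commanded=1.0, actual=pitch)
```

After ten updates the simulated base position is within one percent of the commanded pitch, illustrating how the feedback line lets the loop settle.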

Referring to FIG. 6, it shows a layout of multiple simulator systems 10a in a clustered manner that allows for increased participant throughput. A central control station 60 is designed with appropriate processing systems that allow the simulator systems 10 to play the same scenario, or team up various combinations of systems 10 as opponents. This is provided for by processing logic or software in the central processing unit located in the central control station 60. Such processing logic or software may be readily designed by those skilled in the art and will not be detailed herein. Additionally, provisions may be made, by appropriately configuring the logic or software, to allow each pair of participants 12a, 12b in a particular simulator system 10 to know what other team, by system number, for example, is playing the same game. This allows for performance comparisons after play is over. Provision is also made so that, where multiple systems are installed in the user's facility, the computing elements and story generators are integrated into the central control station 60. In this way the central control station 60 is activated after all player systems 10 have logged on by virtue of closing the slidable canopy 11. Although FIG. 6 depicts a circular arrangement, any layout may be employed, limited only by the user's facilities. Power and data may be installed individually as shown by cables 61 or may be bussed as shown by cables 62.

The stand-alone feature of the simulator system 10 of the present invention allows independent operation by two persons. However, if desired, and as described with reference to FIG. 6, a plurality of simulators may be networked together, each having independent control, to engage in a common scenario. Under these circumstances, the central processing unit 52 is enhanced to accommodate processing for a finite number of simulators, for example two sets of 15. In this case one set (Set A) engages scenario A, while the other set (Set B) engages scenario B. The combinations of these engagement techniques are limited only by the system installation capacity of the user. This provides for team activities that allow multiple participants in one simulator to play against a like number of participants in another simulator.

Throughput of the networked embodiment of FIG. 6 is estimated at 24 players per hour per system, as follows (times in seconds; minutes indicated in parentheses):

Ingress and installation: 30
System intro/start: 10
System play: 230 (3.8 min)
Disengage/egress and system reset: 30
Total time: 300 (5 min)

As shown in FIG. 6, with sixteen systems in the network, throughput would be 384 players per hour.
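The throughput arithmetic above can be checked directly: a 300-second cycle gives twelve cycles per hour, two players per cockpit gives 24 players per system-hour, and sixteen networked systems give 384 players per hour.

```python
# Throughput check for the networked embodiment of FIG. 6, using the
# phase timings stated in the text (all values in seconds).
PHASES = {
    "ingress and installation": 30,
    "system intro/start": 10,
    "system play": 230,
    "disengage/egress and reset": 30,
}
PLAYERS_PER_COCKPIT = 2
SYSTEMS_IN_NETWORK = 16

cycle = sum(PHASES.values())                        # 300 s = 5 min per cycle
per_system = (3600 // cycle) * PLAYERS_PER_COCKPIT  # players per system-hour
network = per_system * SYSTEMS_IN_NETWORK           # players per network-hour
print(cycle, per_system, network)
```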

In summary, the simulator system 10 of the present invention incorporates virtual image display technology, a simulator, and an audio system adapted to present realistic sound effects, arranged to provide a real-time, motion-based, interactive video game for use by two players acting as a team, whereby each player can influence the outcome of the game. The invention may be used as a stand-alone coin/cash/credit card operated device or may be networked to any number of stations centrally controlled, and limited only by the installation capacity of the user.

Thus there has been described a new entertainment system that incorporates new display and audio technology and employs a computer-generated video game interacting with a motion-based, operator-controlled simulator, wherein two participants interact with selective and timed video scenarios to achieve a game objective. It is to be understood that the above-described embodiments are merely illustrative of some of the many specific embodiments which represent applications of the principles of the present invention. Clearly, numerous other arrangements can be readily devised by those skilled in the art without departing from the scope of the invention.

Patent Citations

WO1983002028A1 * (filed 30 Nov 1982, published 9 Jun 1983) Christopher James: Glider flight simulator
WO1992016922A1 * (filed 4 Mar 1992, published 1 Oct 1992) Atari Games Corporation: Vehicle simulator including cross-network feedback
WO1992021117A1 * (filed 22 May 1992, published 26 Nov 1992) Atari Games Corporation: Modular display simulator
US4066256 * (filed 17 Nov 1975, published 3 Jan 1978) Future General Corporation: Amusement ride
US4303394 * (filed 10 Jul 1980, published 1 Dec 1981) The United States Of America As Represented By The Secretary Of The Navy: Computer generated image simulator
US4322726 * (filed 19 Dec 1979, published 30 Mar 1982) The Singer Company: Apparatus for providing a simulated view to hand held binoculars
Classifications

International Classification: A63F13/00, G09B9/08, G09B9/00, A63F13/12, A63G31/00, G09B9/32, G09B9/30
Cooperative Classification: A63F13/12, A63F2300/50, A63F2300/8017, A63F13/28, G09B9/08, A63F2300/8076, A63F13/803, A63F13/843, A63F2300/8088, G09B9/30, G09B9/003
European Classification: G09B9/08, G09B9/30, G09B9/00B, A63F13/12
Legal Events

2 Sep 1993 (AL): Designated countries for regional patents. Kind code of ref document: A1. Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE.
2 Sep 1993 (AK): Designated states. Kind code of ref document: A1. Designated state(s): CA JP KR.
7 Sep 1993 (ENP): Entry into the national phase. Ref country code: CA. Ref document number: 2105669. Kind code of ref document: A. Format of ref document f/p: F.
7 Sep 1993 (WWE): WIPO information: entry into national phase. Ref document number: 2105669. Country of ref document: CA.
5 Nov 1993 (WWE): WIPO information: entry into national phase. Ref document number: 1993905003. Country of ref document: EP.
9 Feb 1994 (WWP): WIPO information: published in national office. Ref document number: 1993905003. Country of ref document: EP.
9 Apr 1995 (WWW): WIPO information: withdrawn in national office. Ref document number: 1993905003. Country of ref document: EP.