|Publication number||WO1993016776 A1|
|Publication date||2 Sep 1993|
|Filing date||9 Feb 1993|
|Priority date||24 Feb 1992|
|Also published as||CA2105669A1, EP0581943A1|
|Publication number||PCT/1993/1159, PCT/US/1993/001159, PCT/US/1993/01159, PCT/US/93/001159, PCT/US/93/01159, PCT/US1993/001159, PCT/US1993/01159, PCT/US1993001159, PCT/US199301159, PCT/US93/001159, PCT/US93/01159, PCT/US93001159, PCT/US9301159, WO 1993/016776 A1, WO 1993016776 A1, WO 1993016776A1, WO 9316776 A1, WO 9316776A1, WO-A1-1993016776, WO-A1-9316776, WO1993/016776A1, WO1993016776 A1, WO1993016776A1, WO9316776 A1, WO9316776A1|
|Inventors||Albert F. Harvard|
|Applicant||Hughes Aircraft Company|
|Patent Citations (6), Referenced by (30), Classifications (22), Legal Events (7)|
VIRTUAL IMAGE ENTERTAINMENT
The present invention relates generally to entertainment systems, and more particularly, to an entertainment system that employs a computer generated network of video games interacting with a motion-based, operator-controlled control station, and wherein participants interact with selective and timed video scenarios to achieve a goal. The present invention was inspired by noticeable trends toward interactivity in the entertainment and theme park business. The present invention was conceived after a review of documentation available from Walt Disney Imagineering, the Wall Street Journal, the Los Angeles Times and the New York Times, the Orlando Sentinel, the Hollywood Reporter, Amusement Magazine, the International Association of Amusement Parks and Attractions Inc., Tourist Attractions and Parks, and Omni.
In recent years, shopping malls have become a location for amusement park type attractions, with both iron rides and dark rides. The advent of these attractions, first installed on a large scale in a shopping mall in Edmonton, Alberta, Canada, launched a new concept called Family Fun Centers, which are a secondary target location for the invention.
Cockpit-type arcade video games are conventional in the entertainment business. However, they rely on crude and sometimes difficult to read computer graphics displayed on a distant screen. The fidelity is often poor in these conventional systems, resulting in fluttering and jerky images. Additionally, the motion bases are unreliable, resulting in considerable down time. These conventional systems do not employ head or helmet mounted display technology or uniquely designed sound systems, nor do they provide dual player interactivity, whereby each player may influence the outcome of the event (game).
It is therefore an objective of the present invention to provide an interactive video game ride that uses high fidelity (virtual) images in three dimensions and in full color, displayed on a video screen (windshield), or the like. Another objective of the present invention is to provide a basic building block for an amusement park or Family Fun Center attraction. A further objective of the invention is to develop a ride as a stand-alone system that may be used in multiple installations and networked at remote sites.
SUMMARY OF THE INVENTION
In accordance with these and other objectives, the present invention is an entertainment system that employs a computer generated video game, or a network of video games, that interacts with a motion-based, operator-controlled flight or spacecraft simulator control station. The present system employs a microprocessor based computer processor, helmet-mounted aerospace display technology, a motion-based servo controlled cockpit, control yokes (joy sticks) to control cockpit motion and game (scenario) interaction, a sound system, and a computer-generated video game or network of games. A plurality of participants (typically two) interact with selective and timed video game scenarios to achieve a game objective.
One aspect of the present invention is a stand-alone system that may be used in multiple installations. In its stand-alone configuration, the game objectives are individually carried out by each system. In this way, system owners may install as many systems as their needs dictate. There is an added provision that allows ganging of multiple systems for increased throughput and economy of computer processor usage.
More specifically, there is provided a real-time, interactive, motion-based simulator that involves two participants. The system comprises a microprocessor, or computer processor, and a two-seat motion-based simulator, or a series of such simulators electronically ganged together through a processor network. The computer processor controls the motion of the simulator in response to movement of one of the control yokes using a servo control system that is part of the simulator. Each simulator operates independently of the others, except that the participants may play the same scenario, possibly at the same time. Each station or motion-based simulator is designed and cosmetically enhanced to appear to be an armed space vehicle, for example, and comprises an interactive video scenario (electronic video game) virtually displayed on a windshield using helmet-mounted aerospace display technology, allowing the players to interact with the system to achieve a predetermined game objective. The simulator system incorporates selection devices such as yokes (joy-sticks), display networks, selection buttons and controls that permit interaction with the system in response to information from the game scenarios presented on the display.
The stand-alone feature of the present invention allows independent operation by two persons. However, if desired, a series of simulators may be networked together, each having independent control, to engage in common or multiple scenarios. Under these circumstances, the computer processor is enhanced to accommodate a finite number of simulators, for example two sets of 15. In this case one set (Set A) engages scenario A, while the other set (Set B) engages scenario B. The combinations of these engagement techniques are limited only by the (system) installation capacity of the user.
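By way of illustration only, the ganging of multiple simulators into scenario sets described above may be sketched as follows. This Python sketch is not part of the disclosure: the `Simulator` and `gang_simulators` names and the round-robin assignment scheme are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Simulator:
    """One two-seat, motion-based simulator station (illustrative)."""
    station_id: int
    scenario: str = ""

def gang_simulators(stations, scenarios):
    """Partition stations round-robin into sets, one scenario per set.

    Returns a dict mapping scenario name -> list of stations engaged in it.
    """
    sets = {name: [] for name in scenarios}
    names = list(scenarios)
    for i, station in enumerate(stations):
        name = names[i % len(names)]      # alternate stations between the sets
        station.scenario = name
        sets[name].append(station)
    return sets

# Thirty stations ganged as two sets of 15: Set A engages scenario A,
# while Set B engages scenario B, as in the text.
stations = [Simulator(station_id=i) for i in range(30)]
sets = gang_simulators(stations, ["scenario A", "scenario B"])
```

A larger installation would simply pass more stations or more scenario names; the combinations are limited only by the installation capacity of the user.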
The present invention incorporates a new display technology employed by the aerospace industry, a newly developed simulator, and an audio system, arranged to provide a real-time, motion-based interactive video game played by two players as a team, whereby each player can influence the outcome of the game. The invention may be used as a stand-alone coin/cash/credit card operated device or may be networked with any number of stations whose story line is centrally controlled.
The present invention thus provides a motion-based interactive video game that incorporates newly developed display technology, thus eliminating conventional CRTs, and provides textured true picture characteristics. Two persons comprise a team: a pilot responsible for controlling the simulator in the video game environment, and a gunner responsible for engaging targets as features of the video game story line; each participant influences the outcome of the game. The system can stand alone or be ganged (clustered) for maximum participant throughput.
BRIEF DESCRIPTION OF THE DRAWINGS
The various features and advantages of the present invention may be more readily understood with reference to the following detailed description taken in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
FIGS. 1a and 1b show side and front cutaway views, respectively, of a simulator system in accordance with the principles of the invention;
FIG. 2 shows a perspective view of the simulator system of FIGS. 1a and 1b;
FIG. 3 is an over-the-shoulder view of the inside of the system of FIGS. 1 and 2 showing the front panel, control yoke, relative hand positions, and a displayed game scenario;
FIGS. 4a and 4b show pilot and gunner control yokes, respectively;
FIG. 5 shows a diagram illustrating the operation of the simulator system 10 and depicts major control lines associated with its operation; and
FIG. 6 shows a networked plurality of simulator systems in accordance with the present invention.
DETAILED DESCRIPTION
Referring to the drawing figures, FIG. 1a shows a side cutaway view of a simulator system 10 in accordance with the principles of the invention, FIG. 1b shows a front cutaway view thereof, and FIG. 2 shows a partially phantom perspective view of the system 10, illustrating the system 10 having a slidable canopy 11 (the slidability is illustrated by an arrow 29 in FIG. 2). FIGS. 1a and 1b portray two players or participants 12a, 12b that sit in seats 13a, 13b and operate separate control yokes 14a, 14b. The control yokes 14a, 14b control a motion base 15, control electronics comprising a command and control unit 16, a video projector 17 and a head-up display projector 18.
The simulator system 10 comprises a cockpit 20 or vehicle 20 mounted on a platform 21 that is moved by electromechanical or electrohydraulic actuators 22, for example. A maintenance access door 16a is shown that allows access to the command and control unit 16. Other access doors (not shown) are provided for access to the projectors 17, 18 and the speakers 26. The platform 21 provides three degrees of freedom of motion and is able to simulate the dynamic movements of flight or the motion of a vehicle that the system 10 emulates. This type of mounting arrangement is generally well known and is well suited to products manufactured by Rediffusion Simulation Ltd., Wilborne-Dorset, England, such as the simulator products known as "Commander" or "Venturer". The cockpit 20 is decorated to simulate the appearance of a cabin or cockpit of a vessel or vehicle such as a space ship, or the like, having an opening that mates with the sliding canopy 11, thus allowing ingress and egress of the two participants 12a, 12b. The participants 12a, 12b, also hereafter referred to as a gunner 12a and a pilot 12b, occupy the seats 13a, 13b in a side-by-side relationship.
The gunner 12a and pilot 12b face forward and observe screen images 17a, 18a provided by the video and head-up display projectors 17, 18, which provide video images, engagement reticles, and equipment status indicators that are displayed on a screen 25 that comprises a front window of the cockpit 20. The video projector 17 and head-up display projector 18 are available from the Radar Systems Group of Hughes Aircraft Company, El Segundo, Ca. Sound effects for a simulated video game that is displayed are provided by a sound retrieval system (SRS) type sound system 19, or the like, available from the Industrial Products Division of Hughes Aircraft Company, Rancho Santa Margarita, Ca., which is incorporated in the command and control unit 16. Four speakers 26, for example, are located at strategic locations in the cockpit 20 and are coupled to the sound system 19.
Referring to FIG. 2, there is shown a partially phantom perspective view of the system 10. FIG. 2 shows the seat belts 27, which, when they are inserted into the seats 13a, 13b after the canopy 11 is closed, indicate through a signal provided by an interlock 27a in each seat 13a, 13b that the participants 12a, 12b are secure and ready to play. The seats 13a, 13b are mechanically connected to the movements of the canopy 11 and are thus retracted in a conventional manner along tracks 31 provided for each seat 13a, 13b. The seats 13a, 13b are brought to a forward position for play as the canopy 11 closes.
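The interlock logic described above reduces to a simple conjunction: a station is ready only when the canopy is closed and every seat-belt interlock reports engaged. The following sketch is illustrative only; the function name and the representation of the interlock signals are assumptions, not part of the disclosure.

```python
def ready_to_play(canopy_closed: bool, belt_interlocks: list) -> bool:
    """A station reports ready for play only when the canopy 11 is closed
    and every seat-belt interlock 27a signals that its belt is engaged."""
    return canopy_closed and all(belt_interlocks)

# Neither an open canopy nor a single unbuckled belt permits play.
a = ready_to_play(False, [True, True])   # canopy open -> not ready
b = ready_to_play(True, [True, False])   # one belt open -> not ready
c = ready_to_play(True, [True, True])    # both secure -> ready
```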
The video game scenario is projected by the video projector 17 and is displayed on the screen 25, which provides a front windscreen for the cockpit 20. The video projector 17 is made by the Radar Systems Group of Hughes Aircraft Company. The head-up display (HUD) projector 18, also made by Hughes Aircraft Company, provides weapons status indicators and an aiming reticle for the gunner 12a and pilot 12b to view. The weapons status indicators and aiming reticle are shown in FIG. 5 and are described with reference thereto. However, only the gunner 12a controls the aiming reticle. The various indicators are projected onto a dashboard panel 32, more for aesthetic effect than functional performance.
Referring to FIG. 3, there is shown an over-the-shoulder view of the participants 12a, 12b and their view of a selected game scenario provided by the video image 17a. The dashboard panel 32 is provided for special effects presentations to each participant 12a, 12b, such as simulated gauges, dials, lights, and the like. Such special effects are provided by the head-up display projector 18.
Referring to FIGS. 4a and 4b, there are shown detailed views of the pilot's yoke 14b and the gunner's yoke 14a, which are identical in design and motion. Their operating features differ as described below. The pilot's yoke 14b pushes forward and pulls to the rear (arrow 41), and the simulator system 10 moves up and down (pitch) in response thereto. When the pilot's yoke 14b is rotated (arrow 42) around an axis through its "steering mechanism" 43b, the simulator system 10 responds thereto and rolls left and right in a like manner. A combination of both movements by the pilot 12b provides three degrees of freedom for the simulator system 10. The pilot 12b operates the motion base 15 in response to the displayed game scenario in a manner necessary to carry out a mission of the game that is presented.
There are spring-loaded buttons 33, 34 on the top of each end of the steering mechanism 43b of the yoke 14b. The left button 33 is used to select the scenario that is played. It is functional after both participants 12a, 12b have successfully engaged their seat belts 27 and the canopy 11 is closed. Instructions on how to select a scenario are elementary, are provided on the screen 25, and are announced over the speakers 26. The right button 34 is used to create engine noise as a right hand grip 35 is rotated to increase or decrease simulated speed. Rotation of the hand grip 35 in a counterclockwise (looking down) direction increases the simulated speed of the cockpit 20 and story presentation, as shown by a visual display on the screen 25 or dashboard panel 32. Additionally, the cockpit 20 is caused to lurch forward, giving the participants 12a, 12b a feeling of acceleration. The participants 12a, 12b feel as though they are being propelled through the simulated display 17a on the screen 25.
Rotation of the hand grip 35 in the opposite direction (clockwise) reduces speed, and a corresponding decrease in cockpit 20 and story motion occurs. When the hand grip 35 reaches a clockwise stop (not shown), the cockpit 20 will appear to have stopped unless a computational program in the command and control unit 16 detects that gravity would necessarily pull the cockpit 20 toward the ground. At this time, alerts sound requiring the pilot 12b to accelerate to maintain an airborne condition. In selected scenarios, the cockpit 20 may have ground capabilities. In such a case, the pilot's yoke 14b acts in a manner similar to the steering wheel of an automobile. Under these conditions, the forward and rearward movement of the pilot's yoke 14b has no effect on the cockpit 20, but the rotation thereof provides a modified roll, thus imparting to the participants 12a, 12b the sensation that the cockpit 20 is moving through turns.
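The pilot-yoke translations described above (push/pull to pitch, rotation to roll, with ground mode suppressing pitch and modifying roll) may be sketched as follows. This is illustrative only: the normalized input ranges, the 0.5 ground-mode roll scaling, and the function name are assumptions, not part of the disclosure.

```python
def pilot_yoke_to_motion(push_pull, rotation, ground_mode=False):
    """Map normalized pilot-yoke deflections to motion-base commands.

    push_pull: -1.0 (full forward) .. +1.0 (full rear)  -> pitch
    rotation:  -1.0 .. +1.0 about the steering axis     -> roll
    In ground mode, push/pull has no effect on the cockpit, and
    rotation yields a modified roll that simulates steering through turns.
    """
    if ground_mode:
        return {"pitch": 0.0, "roll": 0.5 * rotation}
    return {"pitch": push_pull, "roll": rotation}

flight = pilot_yoke_to_motion(push_pull=1.0, rotation=-1.0)
ground = pilot_yoke_to_motion(push_pull=1.0, rotation=1.0, ground_mode=True)
```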
Referring to FIG. 4b, the gunner's yoke 14a has the same mechanical features as the pilot's yoke 14b. However, the functional translations are different. When the gunner's yoke 14a is pushed forward or pulled rearward (arrow 44), the effect is to position the sighting reticle 49 in elevation. When the steering mechanism of the gunner's yoke 14a is rotated (arrow 45), this positions the sighting reticle 49 in azimuth. The hand grip 38 of the gunner's yoke 14a allows the gunner 12a to slew the reticle 49 in and out in range. In this way the gunner 12a positions his sighting reticle 49 on any feature observed on the screen 25. The left button 36 of the gunner's yoke 14a is used to engage simulated targets. Time of flight and visual assessment is left to computational elements in the command and control unit 16. The right button 37 on the gunner's yoke 14a allows the gunner to shoot at a target on which the reticle 49 has been positioned.
In operation, and with reference to FIG. 2, once the gunner 12a and pilot 12b are seated, the canopy 11 is automatically closed after detection of proper installation of the seat belts 27. The opening and closing motion of the canopy 11 is indicated by the arrow 29. The detection of seat belt closure is facilitated by an interlock switch 27a, coupled to the command and control unit 16, that is activated when the seat belts 27 are securely engaged. The game is initiated by the pilot 12b, who selects a game scenario (several are available) to be played by depressing the game-select button 33 (shown in FIG. 4a) on the pilot's yoke 14b. The pilot's yoke 14b controls the hydraulic action of the simulator system 10 in a manner appropriate for an aircraft, space vehicle, or an automobile, for example. The gunner's yoke 14a controls the use of on-board simulated weapon systems, which are designated as lasers or missiles. Selection of weapons type is controlled by the thumb switches 36, 37 (shown in FIG. 4b) on the left and right sides of the gunner's yoke 14a. Display of the weapon selected and rounds or shots left is available on the screen 17a near the reticle 49.
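The gunner-yoke translations (push/pull to elevation, rotation to azimuth, hand grip to range) amount to a three-axis slew of the reticle. The sketch below is illustrative only; the slew rates and names are assumptions, since the disclosure gives no numeric values.

```python
# Assumed slew rates (units per normalized input step); not in the disclosure.
EL_RATE, AZ_RATE, RANGE_RATE = 1.0, 1.0, 1.0

def slew_reticle(reticle, push_pull, rotation, grip):
    """Update the sighting reticle 49: push/pull slews elevation, rotation
    slews azimuth, and the hand grip slews the reticle in and out in range."""
    return {
        "elevation": reticle["elevation"] + push_pull * EL_RATE,
        "azimuth": reticle["azimuth"] + rotation * AZ_RATE,
        "range": reticle["range"] + grip * RANGE_RATE,
    }

reticle = {"elevation": 0.0, "azimuth": 0.0, "range": 0.0}
reticle = slew_reticle(reticle, push_pull=0.2, rotation=-0.1, grip=0.5)
```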
In playing a particular scenario, movement of the simulator system 10 by the pilot 12b during missile engagement could cause a miss. An alert is therefore provided to the pilot 12b indicating that missiles are to be fired. To ensure that there is no movement of the simulator system 10 until a missile locked-on condition is established, an audible recommendation to the pilot 12b is provided over the speakers 26. This brief delay in the game provides excitement for both participants 12a, 12b and generates a team esprit.
Referring to FIG. 5, there is shown a diagram illustrating the operation of the simulator system 10 and depicting the major control lines associated with its operation. FIG. 5 shows the projection of the game scenario to the participants 12a, 12b as a virtual image 17a displayed on the windshield or display screen 25 of the cockpit 20. The reticle, weapon states, and gauges 49 used by the gunner 12a are also displayed on the display screen 25 (or dashboard display 32) in a manner that is easily seen by the gunner 12a and pilot 12b.
The command and control unit 16 comprises a video generator 51, a central processing unit (CPU) 52 such as a microprocessor or computer processor, and the SRS-type sound system 19. A power supply 53 is provided to power each of the units of the command and control unit 16. The central processing unit 52 controls a motion base controller 54 that is coupled to the motion base 15 in a conventional manner.
Position commands are provided to the motion base controller 54, and servo controls are provided from the controller 54 to the motion base 15 to move it in response to movement of the pilot's yoke 14b. A feedback line is provided from the pilot's yoke 14b to the central processing unit 52 to close the control loop. The central processing unit 52 controls the projection of the image 17a and the audio through the speakers 26. The central processing unit 52 also controls the projection of the image 17a representing the reticle and gauges 49. The signals from the interlocks 27a are also coupled to the central processing unit 52.
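The closed control loop described above (position command out, feedback in, servo drive shrinking the error) can be sketched as a simple proportional loop. This is a hedged illustration only: the proportional control law, the gain value, and the function name are assumptions; the disclosure does not specify the servo algorithm.

```python
def servo_step(commanded, measured, gain=0.5):
    """One iteration of a proportional position loop: the CPU issues a
    position command, the feedback line returns the measured position,
    and the servo drives the motion base to reduce the error."""
    error = commanded - measured
    return measured + gain * error

# Repeated iterations converge on the commanded position.
position = 0.0
for _ in range(20):
    position = servo_step(commanded=10.0, measured=position)
```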
Referring to FIG. 6, there is shown a layout of multiple simulator systems 10a clustered in a manner that allows for increased participant throughput. A central control station 60 is designed with appropriate processing systems that allow the simulator systems 10 to play the same scenario, or team up various combinations of systems 10 as opponents. This is provided for by processing logic or software in the central processing unit located in the central control station 60. Such processing logic or software may be readily designed by those skilled in the art and will not be detailed herein. Additionally, provisions may be made, by appropriately configuring the logic or software, to allow each pair of participants 12a, 12b in a particular simulator system 10 to know what other team, identified by system number, for example, is playing the same game. This allows for performance comparisons after play is over. Provision is also made so that, where multiple systems are to be installed in the user's facility, the computing elements and story generators are integrated into the central control station 60. In this way the central control station 60 is activated after all player systems 10 have logged on by virtue of closing the slidable canopy 11. Although FIG. 6 depicts a circular arrangement, any layout may be employed, limited only by the user's facilities. Power and data may be installed individually as shown by cables 61 or may be bussed as shown by cables 62.
The stand-alone feature of the simulator system 10 of the present invention allows independent operation by two persons. However, if desired, and as described with reference to FIG. 6, a plurality of simulators may be networked together, each having independent control, to engage in a common scenario. Under these circumstances, the central processing unit 52 is enhanced to accommodate processing for a finite number of simulators, for example two sets of 15. In this case one set (Set A) engages scenario A, while the other set (Set B) engages scenario B. The combinations of these engagement techniques are limited only by the system installation capacity of the user. This provides for team activities that allow multiple participants in one simulator to play against a like number of participants in another simulator.
Throughput of the networked embodiment of FIG. 6 is estimated at 24 players per hour per system, based on the following session times (in seconds, with minutes indicated in parentheses):

Ingress and installation: 30
System intro/start: 10
System play: 230 (3.8 min)
Disengage/egress/system reset: 30
Total time: 300 (5 min)

As shown in FIG. 6, with sixteen systems in the network, throughput would be 384 players per hour.
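The throughput figures above follow directly from the session times given: each five-minute cycle serves two players, yielding 24 players per hour per system and 384 per hour across a sixteen-system network.

```python
# Session timeline from the text, in seconds
ingress = 30          # ingress and installation
intro = 10            # system intro/start
play = 230            # system play (3.8 min)
egress = 30           # disengage/egress/system reset
cycle = ingress + intro + play + egress        # 300 s = 5 min per session

players_per_system = 2                         # pilot and gunner
sessions_per_hour = 3600 // cycle              # 12 sessions per hour
per_system_hourly = players_per_system * sessions_per_hour   # 24 players
network_hourly = per_system_hourly * 16        # 384 players with 16 systems
```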
In summary, the simulator system 10 of the present invention incorporates virtual image display technology, a simulator, and an audio system adapted to present realistic sound effects, arranged to provide a real-time, motion-based interactive video game for use by two players acting as a team, whereby each player can influence the outcome of the game. The invention may be used as a stand-alone coin/cash/credit card operated device or may be networked to any number of centrally controlled stations, limited only by the installation capacity of the user.
Thus there has been described a new entertainment system that incorporates new display and audio technology and employs a computer generated video game interacting with a motion based, operator controlled simulator, wherein two participants interact with selective and timed video scenarios to achieve a game objective. It is to be understood that the above-described embodiments are merely illustrative of some of the many specific embodiments which represent applications of the principles of the present invention. Clearly, numerous other arrangements can be readily devised by those skilled in the art without departing from the scope of the invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|WO1983002028A1 *||30 Nov 1982||9 Jun 1983||Christopher James||Glider flight simulator|
|WO1992016922A1 *||4 Mar 1992||1 Oct 1992||Atari Games Corporation||Vehicle simulator including cross-network feedback|
|WO1992021117A1 *||22 May 1992||26 Nov 1992||Atari Games Corporation||Modular display simulator|
|US4066256 *||17 Nov 1975||3 Jan 1978||Future General Corporation||Amusement ride|
|US4303394 *||10 Jul 1980||1 Dec 1981||The United States Of America As Represented By The Secretary Of The Navy||Computer generated image simulator|
|US4322726 *||19 Dec 1979||30 Mar 1982||The Singer Company||Apparatus for providing a simulated view to hand held binoculars|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|WO1995035140A1 *||20 Jun 1995||28 Dec 1995||Sega Enterprises Ltd.||Method of and apparatus for controlling direction of object|
|WO1996031831A1 *||28 Mar 1996||10 Oct 1996||Benkel Gerard||Electronic competition system and method for using same|
|WO1997003740A1 *||24 Jun 1996||6 Feb 1997||Latypov Nurakhmed Nurislamovic||Method of surrounding a user with virtual reality and a device for carrying out the method|
|WO1999030789A1 *||15 Dec 1998||24 Jun 1999||Phillip Craig Hourigan||Mini-theatre and frame assembly therefor|
|WO2000072930A1 *||1 Jun 2000||7 Dec 2000||Mark Rider||Large screen gaming system and facility therefor|
|CN103341271A *||25 Jun 2013||9 Oct 2013||上海德理孚自动化系统有限公司||Dynamic audio and video system with multiple freedom degrees|
|DE19934913A1 *||21 Jul 1999||25 Jan 2001||Deutsche Telekom Ag||Conducting virtual sports competition via telecommunications network, especially Internet, involves connecting competitors via speech input/output arrangement, video camera and screen|
|EP0691146A1 *||4 Jul 1995||10 Jan 1996||Sega Enterprises, Ltd.||A game apparatus using a video display device|
|EP0887682A1 *||19 Nov 1997||30 Dec 1998||Sony Corporation||Display|
|EP0887682A4 *||19 Nov 1997||17 Nov 1999||Sony Corp||Display|
|EP1358918A2 *||11 Apr 2003||5 Nov 2003||Nintendo Co., Limited||Game machine and game program|
|EP1358918A3 *||11 Apr 2003||13 Oct 2004||Nintendo Co., Limited||Game machine and game program|
|EP1745831A2 *||1 Jun 2000||24 Jan 2007||Mark Rider||Large screen gaming system and facility therefor|
|EP1745831A3 *||1 Jun 2000||28 Dec 2011||TimePlay IP Inc.||Large screen gaming system and facility therefor|
|US5766079 *||20 Jun 1995||16 Jun 1998||Sega Enterprises Ltd.||Object direction control method and apparatus|
|US5971853 *||15 Jun 1998||26 Oct 1999||Kabushiki Kaisha Sega Enterprises||Object direction control method and apparatus|
|US6179619 *||12 May 1998||30 Jan 2001||Shigenobu Tanaka||Game machine for moving object|
|US6257982||1 Jun 1999||10 Jul 2001||Mark Rider||Motion picture theater interactive gaming system|
|US6259565||19 Nov 1997||10 Jul 2001||Sony Corporation||Display apparatus|
|US6283757 *||8 Oct 1999||4 Sep 2001||Simulation Entertainment Group, Inc.||Full motion two seat interactive simulator|
|US7198568||6 Nov 2002||3 Apr 2007||Nintendo Co., Ltd.||Game machine and game program for changing the movement of one character based on the movement of another character|
|US8864566||23 Apr 2007||21 Oct 2014||Timeplay Inc.||System, method and handheld controller for multi-player gaming|
|US8951124||15 Dec 2009||10 Feb 2015||Timeplay, Inc.||System, method and handheld controller for multi-player gaming|
|US9643083||27 Jan 2016||9 May 2017||Timeplay Inc.||System, method and handheld controller for multi-player gaming|
|US9662570||27 Jan 2016||30 May 2017||Timeplay Inc.||System, method and handheld controller for multi-player gaming|
|US9669321||1 Oct 2015||6 Jun 2017||Figment Productions Limited||System for providing a virtual reality experience|
|US9675879||22 Jan 2014||13 Jun 2017||Timeplay Inc.||System, method and handheld controller for multi-player gaming|
|US9675880||12 Dec 2014||13 Jun 2017||Timeplay Inc.||System, method and handheld controller for multi-player gaming|
|US9682317||22 Jan 2014||20 Jun 2017||Timeplay Inc.||System, method and handheld controller for multi-player gaming|
|US9751009||22 Jan 2014||5 Sep 2017||Timeplay Inc.||System, method and handheld controller for multi-player gaming|
|International Classification||A63F13/00, G09B9/08, G09B9/00, A63F13/12, A63G31/00, G09B9/32, G09B9/30|
|Cooperative Classification||A63F13/12, A63F2300/50, A63F2300/8017, A63F13/28, G09B9/08, A63F2300/8076, A63F13/803, A63F13/843, A63F2300/8088, G09B9/30, G09B9/003|
|European Classification||G09B9/08, G09B9/30, G09B9/00B, A63F13/12|
|2 Sep 1993||AL||Designated countries for regional patents|
Kind code of ref document: A1
Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE
|2 Sep 1993||AK||Designated states|
Kind code of ref document: A1
Designated state(s): CA JP KR
|7 Sep 1993||ENP||Entry into the national phase in:|
Ref country code: CA
Ref document number: 2105669
Kind code of ref document: A
Format of ref document f/p: F
|7 Sep 1993||WWE||Wipo information: entry into national phase|
Ref document number: 2105669
Country of ref document: CA
|5 Nov 1993||WWE||Wipo information: entry into national phase|
Ref document number: 1993905003
Country of ref document: EP
|9 Feb 1994||WWP||Wipo information: published in national office|
Ref document number: 1993905003
Country of ref document: EP
|9 Apr 1995||WWW||Wipo information: withdrawn in national office|
Ref document number: 1993905003
Country of ref document: EP