WO2007100204A1 - Stereovision-based virtual reality device - Google Patents

Stereovision-based virtual reality device

Info

Publication number
WO2007100204A1
Authority
WO
WIPO (PCT)
Prior art keywords
marker
stereoscopic image
displayed
user
unit
Prior art date
Application number
PCT/KR2007/000994
Other languages
French (fr)
Inventor
Tae-Jeong Jang
Original Assignee
Knu-Industry Cooperation Foundation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Knu-Industry Cooperation Foundation filed Critical Knu-Industry Cooperation Foundation
Publication of WO2007100204A1 publication Critical patent/WO2007100204A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0325 Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1037 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/40 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0331 Finger worn pointing device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

Definitions

  • the present invention relates to a virtual reality device using a vibrotactile unit that can be realistically felt by users.
  • since the keyboard, touch pad, mouse, joystick, data glove, and the like are merely input devices, they cannot transfer information from a computer to a person.
  • the mouse and the joystick are two-dimensional (2D) input devices
  • the data glove and the three-dimensional (3D) space mouse are 3D input devices.
  • the 2D and 3D input devices have a disadvantage in that it is difficult to feel a sense of depth in the 3D virtual space projected on a plane. Consequently, a sense of reality is lowered.
  • if the 3D input device is used together with a stereovision-based display, a 3D input can be achieved more conveniently and more accurately because of the sense of reality of the stereovision.
  • the present invention is directed to a stereovision-based virtual reality device, which substantially obviates one or more problems due to limitations and disadvantages of the related art.
  • An object of the present invention is to provide a stereovision-based virtual reality device that can provide a sense of reality through a vibrotactile unit using stereovision technology.
  • a stereovision-based virtual reality device includes: a stereoscopic image providing unit 110 for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a marker 120 for pointing a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit 110; one or more cameras 130 for detecting a position of the marker 120; a control unit 150 for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit 110 and stereoscopic position information of the marker 120 detected by the cameras, determining if the marker 120 touches the displayed object, and generating a control signal when it is determined that the marker 120 touches the displayed object; and a vibrotactile unit 140 driven by the touch determination control signal received from the control unit 150 when it is determined that the marker 120 touches the displayed object.
  • the stereoscopic image providing unit 110 can include a monitor, stereoscopic glasses, and a head mount display (HMD).
  • the stereovision-based virtual reality device further includes a second marker for pointing a position and direction of a user's head, and the camera 130 further senses a movement of the second marker, and the stereoscopic image providing unit 110 adaptively changes the stereoscopic image according to the sensing result of the second marker.
  • a wireless communication/driving circuit is installed on a user's body region and the vibrotactile unit 140 is driven according to the control signal outputted from the control unit 150.
  • a stereovision-based virtual reality device includes: a stereoscopic image providing unit 110 for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a marker 120 for pointing a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit 110; one or more cameras 130 for detecting a position of the marker 120; a control unit 150 for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit 110 and stereoscopic position information of the marker 120 detected by the cameras, determining if the marker 120 touches the displayed object, and generating a control signal when it is determined that the marker 120 touches the displayed object, the control unit 150 including a marker controller 152 for controlling a position and direction sensor 121 mounted on the marker and a position and direction sensor mounted on a head mount display (HMD); and a vibrotactile unit 140 driven by the touch determination control signal received from the control unit 150 when it is determined that the marker 120 touches the displayed object.
  • a stereovision-based virtual reality device includes: a stereoscopic image providing unit 110 for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a position and direction sensor 121 mounted on a head mount display (HMD) 160 and a user's finger, for pointing a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit 110; one or more cameras 130 for detecting a position of the position and direction sensor 121; a control unit 150 for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit 110 and stereoscopic position information of the position and direction sensor 121 detected by the cameras 130, determining if the position and direction sensor 121 touches the displayed object, and generating a control signal when it is determined that the position and direction sensor 121 touches the displayed object, the control unit 150 including a marker controller 152 for controlling the position and direction sensor 121.
  • the stereovision-based virtual reality device can provide a realistic virtual reality.
  • FIG. 1 illustrates a virtual bubble breaking game using a stereovision-based virtual reality device according to an embodiment of the present invention
  • FIG. 2 illustrates a vibrotactile unit and a marker using a wireless communication in a stereovision-based virtual reality device according to an embodiment of the present invention
  • FIG. 3 illustrates an example when a user wears a vibrotactile unit, a marker, and a communication/driving circuit using a wireless communication on his/her fingers in a stereovision-based virtual reality device according to another embodiment of the present invention
  • FIG. 4 illustrates a virtual bubble breaking game using an HMD in a stereovision- based virtual reality device according to a further embodiment of the present invention
  • FIG. 5 illustrates a position and direction sensor in a stereovision-based virtual reality device according to a still further embodiment of the present invention
  • Fig. 6 illustrates the concept of the display when the user's eyes (head) move while the fixed display device shown in Fig. 1 is used;
  • Fig. 7 illustrates the concept of the display when the user's eyes (head) move while the HMD shown in Fig. 4 is used.

Best Mode for Carrying Out the Invention
  • the present invention provides a stereovision-based virtual reality device that can provide a reality through a vibrotactile unit using a stereovision technology.
  • FIG. 1 illustrates a virtual bubble breaking game according to an embodiment of the present invention.
  • a stereoscopic image providing unit 110 provides a 3D stereoscopic image.
  • the stereoscopic image providing unit 110 provides a user with a 3D stereoscopic image of a specific display target object.
  • stereoscopic position information about the display target object is stored in the control unit 150.
  • the stereoscopic image providing unit 110 provides the user with the 3D stereoscopic image of the display target object according to the stereoscopic position information, considering a size of a monitor 111 and a position and direction of a user's eyes.
  • the 3D image represents an image that is more realistic than the 2D image and provides depth and spatial shape information.
  • the 3D image is created on a 2D plane by adding factors providing a stereoscopic feeling and a binocular disparity effect to a plane.
  • the binocular disparity is caused because two eyes are spaced apart horizontally by an average of about 6.5 cm and is an important factor of the stereoscopic feeling. That is, the left eye and the right eye view different 2D images with respect to one object, and the two images are transferred to a brain and combined with each other, so that they are recognized as the 3D stereoscopic image providing the sense of the depth and reality.
  • Such a 3D stereoscopic image provides the feeling of actuality and virtual reality as if the user views a real object.
  • the 3D stereoscopic image display method is classified into a glass type display method and a glassless type display method. Both of the glass type display method and the glassless type display method can be implemented.
  • the glassless type display method has an advantage in that the user can view the 3D stereoscopic image without wearing special glasses.
  • the range where the position of the eyes can move is very narrow. Therefore, if the user moves while playing a game, the stereoscopic image may be distorted. Specifically, the stereoscopic image may disappear or be reversed when the position of the eyes is out of the range.
  • the stereoscopic image providing unit 110 includes a monitor 111 and a stereoscopic glass 112. According to the glass type display method, the image is not reversed depending on the position of the eyes.
  • the glass type display method can be used more effectively by attaching a second marker to the glasses.
  • the use of the second marker has an advantage in that the virtual object can be expressed as if it exists at an absolute position of the space even though the position of the eyes and the viewing direction are changed.
  • A single second marker can be installed. For example, if a rectangular marker of sufficient size is installed, the viewing direction or tilt can be determined from it.
  • a plurality of markers can be used. In this case, the markers are installed on both sides and the viewing direction or tilt can be checked from the relative positions of the markers.
  • a shutter glass blocks a right eye when a left image frame is displayed on the monitor 111, and blocks a left eye when a right image frame is displayed. In this way, the user views independent images with the left and right eyes.
  • a display device such as a monitor operating at 120 Hz or above is used in order to reduce eyestrain.
  • with a monitor operating at 120 Hz, 60 frames per second can be displayed alternately for the left eye and the right eye.
  • an emitter receives a signal from a graphics card and outputs an infrared signal.
  • the stereoscopic image providing unit 110 can be implemented with a head mount display (HMD).
  • the HMD is a combination of the monitor 111 and the stereoscopic glasses 112 and is worn on the user's head.
  • the HMD blocks the user's outside view, much as in a movie theater. Therefore, the HMD can make the user concentrate on the game.
  • a camera 130 detects the position and direction of the marker 120 and tracks the position in the 3D space.
  • the position and direction of the marker can be detected using an infrared sensor, an ultrasonic sensor, an image sensor, or a magnetic sensor.
  • in the case of the infrared sensor and the ultrasonic sensor, one sensor senses one direction. Therefore, a large number of sensors must be used when information about various directions is required simultaneously.
  • in case that the camera 130 is used, a plurality of cameras must be used in order to determine the stereoscopic position of the object and to cover the case where the object is temporarily hidden by a part of the user's body. Meanwhile, the distance (far and near) can be determined by measuring the relative magnitude of the marker 120; in this case, only one camera may be installed.
  • Fig. 2 illustrates a vibrotactile unit and a marker using a wireless communication in a stereovision-based virtual reality device according to an embodiment of the present invention
  • Fig. 3 illustrates an example when a user wears a vibrotactile unit, a marker, and a communication/driving circuit using a wireless communication on his/ her fingers in a stereovision-based virtual reality device according to another embodiment of the present invention.
  • the marker 120 shown in Figs. 2 and 3 is a device for pointing a stereoscopic object
  • the marker 120 can be installed in several body regions (e.g., the finger, the back of the user's hand, the arm, and the foot) together with the tactile driver according to the contents of the game. In addition, different types of markers can be installed, depending on the body regions. An application program can determine each position of the body regions.
  • the second marker may be installed in the stereoscopic glasses 112 in order to configure the user's stereoscopic vision.
  • the user's vision or position is measured by the position of the second marker and is reflected and modified in the display.
  • the stereoscopic display can be provided as if the display target object exists in a specific absolute space in the stereoscopic space.
  • the stereoscopic display can be provided by detecting the user's position using the second marker and modifying the stereoscopic display according to the position of the user's face or the direction change.
  • the soccer ball is displayed as if it comes toward the user even though the user turns his/her face or changes his/her face position.
  • the user's vision or position is measured using the second marker and the detected information is reflected on the displayed soccer ball image. Therefore, when the second marker mounted on the user's stereoscopic glasses is moved/changed, the soccer ball can be displayed in such a way that it is directed in a direction opposite to the movement/change of the second marker.
  • the vibrotactile unit 140 provides an effect of breaking a bubble 114 to the user when the control unit 150 determines that the marker 120 mounted on the finger touches the stereoscopic object.
  • One or more vibrotactile units 140 can be installed in each body region.
  • the vibrotactile unit 140 can be installed in several body regions (e.g., the finger, the back of the user's hand, the arm, and the foot) according to the contents of the provided virtual reality device.
  • the effect of the virtual reality device can be increased if the vibrotactile unit 140 is mounted at the places where the marker 120 is installed.
  • the vibrotactile unit 140 and the marker 120 are installed in different positions according to the application programs. For example, in case that an object is cut with a knife, the marker 120 is mounted on the knife and the vibrotactile unit 140 is mounted on the hand.
  • Examples of the vibrotactile unit 140 include an electrotactile unit, a vibrotactile unit, a pneumatic tactile unit, and the like.
  • the end portion of the index finger is most suitable for transferring the touch because of the attribute of the game.
  • only a small number of vibrotactile modules 141 can be used.
  • one to six vibrotactile modules 141 are used to generate the vibration.
  • the control unit 150 performs a control operation to vibrate the vibrotactile modules 141.
  • the control unit 150 must be able to drive the plurality of vibrotactile modules 141 independently and enable wired/wireless communication with a terminal such as a PC.
  • a communication/driving circuit 151 is installed in the user's body region. When a signal from the control unit 150 is recognized as the breaking of the bubble 114, the vibrotactile unit 140 is driven.
  • the communication/driving circuit 151 can use a wireless communication scheme and a wired communication scheme to communicate with the control unit 150.
  • the driving circuit 151 is connected to the marker 120 and the vibrotactile unit 140.
  • the driving circuit 151 can be placed on the back of the hand.
  • the communication/driving circuit 151 connected to the marker 120 and the vibrotactile unit 140 can be installed in a graspable form by using a length-adjustable cable or by providing a sufficient cable path and embedding a battery.
  • the vibrotactile unit 140 is vibrated using signals of several frequencies, and the sensations at the end portion of the finger are compared. In this way, the frequency at which the user's feeling is most similar to breaking a real bubble can be selected.
  • the control unit 150 can express other kinds of touch feeling by changing the waveform, frequency, and amplitude of the signal driving the vibrotactile unit 140 according to the contact with the virtual object, depending on conditions.
  • the experience can be made more realistic by changing the volume and kind of the sound according to the size or position of the virtual bubble 114 being broken.
  • Fig. 4 illustrates a virtual bubble breaking game using an HMD in a stereovision-based virtual reality device according to a further embodiment of the present invention.
  • the virtual bubble breaking game of Fig. 4 has a structure different from the virtual bubble breaking game of Figs. 1 to 3 using the fixed display device.
  • the user can view in all desired directions (e.g., up/down, left/right, and front/rear) by sensing the position and direction of the eyes (head).
  • the position shift and the direction shift with respect to all directions are possible by changing the virtual display screen according to the position and direction of the eyes (head) sensed using the HMD.
  • Fig. 5 illustrates a position and direction sensor in a stereovision-based virtual reality device according to a still further embodiment of the present invention.
  • the position and direction can be determined by recognizing the user's finger or the stereoscopic glasses 112 using the position and direction sensor 121 without a marker.
  • Fig. 6 illustrates the concept of the display when the user's eyes (head) move while the fixed display device shown in Fig. 1 is used.
  • Fig. 7 illustrates the concept of the display when the user's eyes (head) move while the HMD shown in Fig. 4 is used.
  • the virtual screen displayed according to the position and direction sensor 121 can be moved while maintaining a predetermined position relationship with the user's eyes (head or HMD).
  • the stereovision-based virtual reality device provides the realistic image through the vibrotactile unit using the stereovision technology.
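The single-camera distance estimate mentioned in the description (judging how far the marker 120 is by measuring its relative magnitude in the image) follows directly from the pinhole camera model. The function name and the numbers below are illustrative assumptions, not values from the patent:

```python
def marker_distance(focal_px: float, marker_width_m: float, observed_px: float) -> float:
    """Pinhole-model range estimate: Z = f * W / w.

    focal_px       -- camera focal length in pixels (from calibration)
    marker_width_m -- true physical width of the marker in metres
    observed_px    -- width of the marker in the captured image, in pixels
    """
    if observed_px <= 0:
        raise ValueError("marker not visible")
    return focal_px * marker_width_m / observed_px

# Example: with an 800 px focal length, a 2 cm marker that appears 40 px wide
# is roughly 0.4 m from the camera.
print(marker_distance(800.0, 0.02, 40.0))
```

As the marker moves closer, its image grows and the estimate shrinks proportionally, which is why a single camera can resolve "far and near" even without stereo triangulation.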

Abstract

Provided is a virtual reality device using a vibrotactile unit that can enable a user to feel a sense of touch realistically. The stereovision-based virtual reality device includes: a stereoscopic image providing unit for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a marker for pointing a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit; one or more cameras for detecting a position of the marker; a control unit for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit and stereoscopic position information of the marker detected by the cameras, determining if the marker touches the displayed object, and generating a control signal when it is determined that the marker touches the displayed object; and a vibrotactile unit driven by the touch determination control signal received from the control unit when it is determined that the marker touches the displayed object. Accordingly, the user can feel a tactile sense with respect to his/her action.

Description

STEREOVISION-BASED VIRTUAL REALITY DEVICE
Technical Field
[1] The present invention relates to a virtual reality device using a vibrotactile unit that can be realistically felt by users.
Background Art
[2] Since the keyboard, touch pad, mouse, joystick, data glove, and the like are merely input devices, they cannot transfer information from a computer to a person. The mouse and the joystick are two-dimensional (2D) input devices, and the data glove and the three-dimensional (3D) space mouse are 3D input devices. When a general monitor is used, the 2D and 3D input devices have a disadvantage in that it is difficult to feel a sense of depth in the 3D virtual space projected onto a plane. Consequently, the sense of reality is lowered.
[3] If the 3D input device is used together with a stereovision-based display, a 3D input can be achieved more conveniently and more accurately because of the sense of reality of the stereovision.
[4] However, even though a stereovision display is equipped, the user cannot point at a desired point of the virtual space with his/her finger if the location of the fingertip in the real space does not coincide with the location of the corresponding point in the virtual space when the fingertip approaches a virtual object displayed in stereovision.
[5] In order to enable the user to further feel the virtual space as the real space, the virtual space viewed in stereovision by the user's eyes must be perfectly matched with the real space. To this end, the coordinate systems of the real space and the virtual space must be matched with each other, considering the variation of the locations of the eyes due to the user's movement. To address this problem, one approach is disclosed in Korean Laid-open Patent Publication No. 2001-86807, entitled "SHOOTING GAME SYSTEM USING VIRTUAL REALITY AND METHOD THEREOF". In this application, when a user shoots toward a target displayed on a screen with a simulation gun just like in real shooting, a bullet fired from the gun hits the screen, and the corresponding coordinate on the screen is acquired and analyzed. In this way, the shooting game can be carried out on the same principle as real bullet shooting.
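Matching the coordinate systems of the real and virtual spaces, as described above, amounts to applying a calibrated transform to every tracked point before the touch comparison. The sketch below assumes a purely translational calibration for simplicity (placing the monitor surface at virtual z = 0, 0.6 m in front of the camera); a real system would also calibrate rotation and scale, and the offset values here are hypothetical:

```python
def to_virtual(p_cam, offset=(0.0, 0.0, -0.6)):
    """Map a camera-frame 3D point into the virtual frame by a rigid offset.

    `offset` stands in for a calibration result; here it shifts the camera's
    z axis so that the screen plane becomes virtual z = 0.
    """
    return tuple(c + o for c, o in zip(p_cam, offset))

# A fingertip tracked 0.6 m in front of the camera lands exactly on the
# virtual z = 0 plane, where on-screen objects appear to float.
print(to_virtual((0.1, 0.0, 0.6)))  # -> (0.1, 0.0, 0.0)
```

Once every tracked point (fingertip marker, glasses marker) lives in the same frame as the displayed objects, the pointing problem described in paragraph [4] disappears.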
[6] However, when the user points at a stereoscopic object through a sense organ in such a virtual reality world, only the object in the virtual space changes. Unlike the transmission of a sense (e.g., a tactile sensation) through wired/wireless communication, the user plays the game using only 2D information. Since real motion is not reflected in the game, it is still insufficient to provide a realistic feeling.
Disclosure of Invention
Technical Problem
[7] Accordingly, the present invention is directed to a stereovision-based virtual reality device that substantially obviates one or more problems due to limitations and disadvantages of the related art. An object of the present invention is to provide a stereovision-based virtual reality device that can provide a sense of reality through a vibrotactile unit using stereovision technology. Technical Solution
[8] According to an aspect of the present invention, as shown in Figs. 1 to 3, a stereovision-based virtual reality device includes: a stereoscopic image providing unit 110 for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a marker 120 for pointing to a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit 110; one or more cameras 130 for detecting a position of the marker 120; a control unit 150 for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit 110 and stereoscopic position information of the marker 120 detected by the cameras, determining if the marker 120 touches the displayed object, and generating a control signal when it is determined that the marker 120 touches the displayed object; and a vibrotactile unit 140 driven by the touch determination control signal received from the control unit 150 when it is determined that the marker 120 touches the displayed object.
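The touch determination performed by the control unit 150 can be sketched as a simple proximity test between the marker's tracked position and the object's stereoscopic position. The Python sketch below is illustrative only; the function names, the spherical touch radius, and the string control signal are assumptions, not details from the specification.

```python
import math


def is_touching(marker_pos, object_pos, radius):
    """Return True when the marker lies within the object's touch radius.

    marker_pos / object_pos are (x, y, z) tuples in the shared
    real/virtual coordinate frame; radius is a hypothetical touch
    threshold in the same units.
    """
    return math.dist(marker_pos, object_pos) <= radius


def control_loop_step(marker_pos, bubble_pos, bubble_radius):
    """One step of the control unit: emit a drive signal on contact."""
    if is_touching(marker_pos, bubble_pos, bubble_radius):
        return "DRIVE_VIBROTACTILE"  # the touch determination control signal
    return None
```

In this sketch, the camera tracking supplies `marker_pos` each frame, and a returned `"DRIVE_VIBROTACTILE"` signal would both trigger the vibrotactile unit 140 and update the stereoscopic display (e.g., remove the broken bubble).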
[9] In addition, the stereoscopic image providing unit 110 can include a monitor and stereoscopic glasses, or a head mount display (HMD).
[10] The stereovision-based virtual reality device further includes a second marker for pointing a position and direction of a user's head, and the camera 130 further senses a movement of the second marker, and the stereoscopic image providing unit 110 adaptively changes the stereoscopic image according to the sensing result of the second marker. A wireless communication/driving circuit is installed on a user's body region and the vibrotactile unit 140 is driven according to the control signal outputted from the control unit 150.
[11] According to another aspect of the present invention, as shown in Figs. 4 and 5, a stereovision-based virtual reality device includes: a stereoscopic image providing unit 110 for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a marker 120 for pointing to a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit 110; one or more cameras 130 for detecting a position of the marker 120; a control unit 150 for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit 110 and stereoscopic position information of the marker 120 detected by the cameras, determining if the marker 120 touches the displayed object, and generating a control signal when it is determined that the marker 120 touches the displayed object, the control unit 150 including a marker controller 152 for controlling a position and direction sensor 121 mounted on the marker and a position and direction sensor mounted on a head mount display (HMD); a vibrotactile unit 140 driven by the touch determination control signal received from the control unit 150 when it is determined that the marker 120 touches the displayed object; and the HMD 160, mounted on a user's head, for detecting a position and direction of the user's head, transmitting the detected position and direction to the control unit 150 by the marker controller 152, and changing a displayed image of the stereoscopic image providing unit according to the position and direction of the user's head.
[12] According to a further aspect of the present invention, as shown in Figs. 4 and 5, a stereovision-based virtual reality device includes: a stereoscopic image providing unit 110 for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a position and direction sensor 121 mounted on a head mount display (HMD) 160 and a user's finger, for pointing to a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit 110; one or more cameras 130 for detecting a position of the position and direction sensor 121; a control unit 150 for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit 110 and stereoscopic position information of the position and direction sensor 121 detected by the cameras 130, determining if the position and direction sensor 121 touches the displayed object, and generating a control signal when it is determined that the position and direction sensor 121 touches the displayed object, the control unit 150 including a marker controller 152 for controlling a position and direction sensor mounted on the user's finger and the HMD 160; a vibrotactile unit 140 driven by the touch determination control signal received from the control unit 150 when it is determined that the position and direction sensor 121 touches the displayed object; and the HMD 160, mounted on a user's head, for detecting a position and direction of the user's head, transmitting the detected position and direction to the control unit by the marker controller 152, and changing a displayed image of the stereoscopic image providing unit 110 in accordance with the position and direction of the user's head. Advantageous Effects
[13] According to the present invention, when the user touches a virtual object implemented in a stereoscopy, that is, when a body region such as a finger is matched with a contact point of a virtual object, a tactile sensation is provided by a vibrotactile unit so that the user can feel the sense of touch. Therefore, the stereovision-based virtual reality device according to the present invention can provide a realistic virtual reality.
[14] This concept can be applied to a virtual reality game, a virtual touch screen, a virtual experience, and a training device using a virtual reality technology. Brief Description of the Drawings
[15] Fig. 1 illustrates a virtual bubble breaking game using a stereovision-based virtual reality device according to an embodiment of the present invention;
[16] Fig. 2 illustrates a vibrotactile unit and a marker using a wireless communication in a stereovision-based virtual reality device according to an embodiment of the present invention;
[17] Fig. 3 illustrates an example when a user wears a vibrotactile unit, a marker, and a communication/driving circuit using a wireless communication on his/her fingers in a stereovision-based virtual reality device according to another embodiment of the present invention;
[18] Fig. 4 illustrates a virtual bubble breaking game using an HMD in a stereovision- based virtual reality device according to a further embodiment of the present invention;
[19] Fig. 5 illustrates a position and direction sensor in a stereovision-based virtual reality device according to a still further embodiment of the present invention;
[20] Fig. 6 illustrates a concept of a display when a user's eyes (head) move in case that a fixed display device shown in Fig. 1 is used; and
[21] Fig. 7 illustrates a concept of a display when a user's eyes (head) move in case that an HMD shown in Fig. 4 is used. Best Mode for Carrying Out the Invention
[22] Reference will now be made in detail to the embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings. Where a detailed description of a related known function or constitution could obscure the point of the invention, that description is omitted. The terms used below are defined in consideration of the functions of the invention and may differ according to the intention of a user or operator, or according to practice. Accordingly, the meanings of the terms should be interpreted on the basis of the overall contents of the specification.
[23] The present invention provides a stereovision-based virtual reality device that can provide a reality through a vibrotactile unit using a stereovision technology.
[24] Fig. 1 illustrates a virtual bubble breaking game according to an embodiment of the present invention.
[25] Referring to Fig. 1, a stereoscopic image providing unit 110 provides the user with a 3D stereoscopic image of a specific display target object. It is assumed here that stereoscopic position information about the display target object is stored in a control unit 150. The stereoscopic image providing unit 110 presents the 3D stereoscopic image of the display target object according to this stereoscopic position information, taking into account the size of a monitor 111 and the position and direction of the user's eyes.
[26] Meanwhile, a 3D image is more realistic than a 2D image and provides depth and spatial shape information. It is created on a 2D plane by adding factors that produce a stereoscopic feeling, in particular the binocular disparity effect.
[27] Binocular disparity arises because the two eyes are horizontally spaced apart by an average of about 6.5 cm, and it is an important factor in the stereoscopic feeling. That is, the left eye and the right eye view different 2D images of the same object; the two images are transferred to the brain and fused, so that they are perceived as a 3D stereoscopic image conveying a sense of depth and reality. Such a 3D stereoscopic image provides a feeling of actuality and virtual reality, as if the user were viewing a real object.
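The disparity described above can be illustrated with a pinhole projection of a single 3D point onto a screen plane, once for each eye. This Python sketch is purely illustrative: only the ~6.5 cm eye separation comes from the text, while the 50 cm screen distance, the coordinate conventions, and the function name are assumptions.

```python
def eye_projections(point, eye_separation=6.5, screen_distance=50.0):
    """Project a 3D point onto a screen plane for each eye (pinhole model).

    point = (x, y, z), with z the distance from the eye baseline in cm
    and the screen at z = screen_distance. Returns ((xl, y), (xr, y))
    screen coordinates; the horizontal difference xl - xr is the
    binocular disparity.
    """
    x, y, z = point
    half = eye_separation / 2.0
    # Each eye sits at (+/- half, 0, 0); project the point along the
    # ray from the eye through the point, onto the screen plane.
    xl = -half + (x + half) * screen_distance / z
    xr = half + (x - half) * screen_distance / z
    ys = y * screen_distance / z
    return (xl, ys), (xr, ys)
```

A point lying exactly on the screen plane projects to the same horizontal coordinate for both eyes (zero disparity); a point behind the screen yields crossed coordinates, which is the cue the brain fuses into depth.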
[28] 3D stereoscopic display methods are classified into a glass type display method and a glassless type display method, and either can be used. The glassless type display method has the advantage that the user can view the 3D stereoscopic image without wearing special glasses. However, the range within which the eyes can move is very narrow, so if the user moves while playing a game, the stereoscopic image may be distorted. Specifically, the stereoscopic image may disappear or be reversed when the position of the eyes moves out of that range.
[29] Fig. 1 shows the case where the stereoscopic image providing unit 110 is implemented using the glass type display method. The stereoscopic image providing unit 110 includes a monitor 111 and stereoscopic glasses 112. With the glass type display method, the image is not reversed depending on the position of the eyes. In addition, the glass type display method can be used more effectively by attaching a second marker to the glasses. The second marker has the advantage that the virtual object can be displayed as if it existed at an absolute position in space even when the position of the eyes and the viewing direction change. [30] A single second marker may suffice; for example, a rectangular marker of sufficient size allows the viewing direction or tilt to be checked. Alternatively, a plurality of markers can be used; in this case, the markers are installed on both sides and the viewing direction or tilt is determined from the relative positions of the markers.
[31] There are many variants of the glass type display method. Among them, shutter glasses block the right eye when a left image frame is displayed on the monitor 111, and block the left eye when a right image frame is displayed. In this way, each of the user's eyes views an independent image.
[32] Preferably, a display device such as a monitor operating at 120 Hz or above is used in order to reduce eyestrain. With a 120 Hz monitor, 60 frames per second can be displayed alternately for the left eye and for the right eye. For synchronization between the screen and the shutter glasses, an emitter receives a signal from the graphics card and outputs an infrared signal.
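The alternate-frame arrangement above can be summarized in two small helpers; a hedged sketch in which the even-frame-is-left convention is arbitrary, and only the 120 Hz / 60 frames-per-eye arithmetic comes from the text.

```python
def eye_for_frame(frame_index):
    """Alternate eyes frame by frame: even frames -> left, odd -> right.

    The shutter glasses would blank the opposite eye for each frame.
    """
    return "left" if frame_index % 2 == 0 else "right"


def per_eye_rate(display_hz=120):
    """Frames per second each eye receives with alternate-frame stereo."""
    return display_hz / 2
```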
[33] The stereoscopic image providing unit 110 can also be implemented with a head mount display (HMD). The HMD combines the monitor 111 and the stereoscopic glasses 112 and is worn on the user's head. It blocks the user's view of the surroundings, much like watching a movie, and therefore helps the user concentrate on the game.
[34] In addition, with the HMD, the screen follows the direction in which the user's head moves. Therefore, when the position and direction of the user's head are tracked, an omnidirectional display can be implemented without the viewing limitation of a monitor.
[35] A camera 130 detects the position and direction of the marker 120 and tracks the position in 3D space. The position and direction of the marker can also be detected using an infrared sensor, an ultrasonic sensor, an image sensor, or a magnetic sensor. In the case of the infrared sensor and the ultrasonic sensor, one sensor senses one direction, so a large number of sensors must be used when information about several directions is required simultaneously.
[36] When the camera 130 is used, a plurality of cameras should be used in order to determine the stereoscopic position of the object and to cover the case where the marker is temporarily hidden by a part of the user's body. Alternatively, the distance (near or far) can be determined by measuring the relative size of the marker 120; in this case, only one camera need be installed.
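The single-camera, size-based distance estimate follows the usual pinhole relation distance = real_size x focal_length / apparent_size. The sketch below assumes a focal length expressed in pixels; the function and parameter names are illustrative, not from the specification.

```python
def marker_distance(real_width_cm, pixel_width, focal_length_px):
    """Estimate marker distance from its apparent size (pinhole model).

    real_width_cm: physical width of the marker.
    pixel_width:   measured width of the marker in the camera image.
    focal_length_px: camera focal length expressed in pixels.
    Returns the distance in cm: a larger image means a nearer marker.
    """
    return real_width_cm * focal_length_px / pixel_width
```

For example, a 2 cm marker imaged at 100 px by a camera with a 1000 px focal length would be estimated at 20 cm; shrink the image to 50 px and the estimate doubles to 40 cm.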
[37] Fig. 2 illustrates a vibrotactile unit and a marker using a wireless communication in a stereovision-based virtual reality device according to an embodiment of the present invention, and Fig. 3 illustrates an example when a user wears a vibrotactile unit, a marker, and a communication/driving circuit using a wireless communication on his/ her fingers in a stereovision-based virtual reality device according to another embodiment of the present invention.
[38] The marker 120 shown in Figs. 2 and 3 is a device for pointing a stereoscopic object
(e.g., a bubble). The marker 120 can be installed in several body regions (e.g., the finger, the back of the user's hand, the arm, and the foot) together with the tactile driver according to the contents of the game. In addition, different types of markers can be installed, depending on the body regions. An application program can determine each position of the body regions.
[39] The second marker may be installed in the stereoscopic glasses 112 in order to configure the user's stereoscopic vision.
[40] The user's viewing direction or position is measured from the position of the second marker and reflected in the display by modifying it accordingly. In this way, the stereoscopic display can make the display target object appear to exist at a specific absolute position in the stereoscopic space. In other words, the stereoscopic display is maintained by detecting the user's position using the second marker and modifying the display according to changes in the position or direction of the user's face.
[41] In the related art, when the content of the stereoscopic image is, for example, a soccer ball coming toward the user, the ball is displayed as coming toward the user even if the user turns or moves his/her face. According to the present invention, by contrast, the user's viewing direction and position are measured using the second marker and reflected in the displayed soccer ball image. Therefore, when the second marker mounted on the user's stereoscopic glasses moves, the soccer ball can be displayed so that it shifts in the direction opposite to the movement of the second marker.
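The compensation described here — the object appearing to shift opposite to the second marker's movement — amounts to rendering the object at a head-relative position each frame. A minimal sketch under that assumption (coordinates and names are illustrative):

```python
def object_in_view(object_world, head_world):
    """Position of a world-fixed virtual object relative to the viewer.

    object_world / head_world are (x, y, z) tuples in the shared world
    frame. Rendering from this head-relative position means that when
    the head (second marker) moves by +d, the object's on-screen
    position shifts by -d, so it appears fixed in absolute space.
    """
    return tuple(o - h for o, h in zip(object_world, head_world))
```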
[42] The vibrotactile unit 140 provides the user with the sensation of breaking a bubble 114 when the control unit 150 determines that the marker 120 mounted on the finger touches the stereoscopic object. One or more vibrotactile units 140 can be installed on each body region. In addition, vibrotactile units 140 can be installed on several body regions (e.g., the finger, the back of the hand, the arm, and the foot) according to the contents of the provided virtual reality application. The effect of the virtual reality device is increased when the vibrotactile unit 140 is mounted in the places where the marker 120 is installed. In some cases, the vibrotactile unit 140 and the marker 120 are installed at different positions according to the application program. For example, when an object is cut with a knife, the marker 120 is mounted on the knife and the vibrotactile unit 140 is mounted on the hand.
[43] Examples of the vibrotactile unit 140 include an electrotactile unit, a vibrotactile unit, a pneumatic tactile unit, and the like. In the case of a virtual bubble breaking game, the end portion of the index finger is most suitable for transferring the touch because of the attribute of the game. However, due to the limitation in an area of the end portion of the index finger, only a small number of vibrotactile modules 141 can be used. Preferably, one to six vibrotactile modules 141 are used to generate the vibration.
[44] The control unit 150 performs a control operation to vibrate the vibrotactile modules 141 of the vibrotactile unit 140. Preferably, the control unit 150 can drive the plurality of vibrotactile modules 141 independently and supports wired/wireless communication with a terminal such as a PC. A communication/driving circuit 151 is installed on the user's body. When a signal from the control unit 150 indicates the breaking of a bubble 114, the vibrotactile unit 140 is driven.
[45] The communication/driving circuit 151 can use a wireless communication scheme and a wired communication scheme to communicate with the control unit 150.
[46] In the case of the wired connection, a wireless communication circuit is unnecessary and the driving circuit 151 can be installed inside the control unit 150.
[47] In the case of the wireless connection, the driving circuit 151 is connected to the marker 120 and the vibrotactile unit 140, and can be placed on the back of the hand. In addition, the communication/driving circuit 151 connected to the marker 120 and the vibrotactile unit 140 can be installed in a grasp type by using a length-adjustable cable, or by providing a sufficient cable path and embedding a battery.
[48] Since the feeling of breaking the virtual bubble 114 must be transferred realistically, the vibrotactile unit 140 is vibrated with signals of several frequencies and the resulting sensations at the fingertip are compared. In this way, the frequency at which the user's sensation most closely resembles breaking a real bubble can be selected.
[49] In addition, the control unit 150 can express other kinds of touch sensation by changing the waveform, frequency, and amplitude of the signal driving the vibrotactile unit 140 upon contact with the virtual object, depending on conditions.
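Varying waveform, frequency, and amplitude per touch event could be sketched as a small sample generator for the vibrotactile drive signal. Everything below (the 1 kHz sample rate, the waveform names, the function signature) is an illustrative assumption, not the patented drive scheme.

```python
import math


def vibration_samples(freq_hz, amplitude, duration_s,
                      sample_rate=1000, waveform="sine"):
    """Generate drive samples for a vibrotactile module.

    freq_hz, amplitude, and waveform are the parameters the control
    unit could vary to express different touch sensations.
    """
    n = int(duration_s * sample_rate)
    out = []
    for i in range(n):
        phase = 2 * math.pi * freq_hz * i / sample_rate
        if waveform == "square":
            s = amplitude if math.sin(phase) >= 0 else -amplitude
        else:  # default: sine
            s = amplitude * math.sin(phase)
        out.append(s)
    return out
```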
[50] While the game is played, a sound effect can be provided whenever a virtual bubble 114 is broken. The experience can be made more realistic by changing the volume and kind of the sound according to the size or position of the virtual bubble 114 being broken.
[51] Fig. 4 illustrates a virtual bubble breaking game using an HMD in a stereovision-based virtual reality device according to still another embodiment of the present invention. The virtual bubble breaking game of Fig. 4 has a structure different from that of the game of Figs. 1 to 3, which uses the fixed display device.
[52] When the HMD 160 is used, the user can look in any desired direction (e.g., up/down, left/right, and front/rear) because the position and direction of the eyes (head) are sensed.
[53] Unlike the fixed display device of Fig. 1, position and direction shifts in all directions become possible by changing the virtual display screen according to the position and direction of the eyes (head) sensed using the HMD 160.
[54] Fig. 5 illustrates a position and direction sensor in a stereovision-based virtual reality device according to a still further embodiment of the present invention.
[55] The position and direction can be determined by recognizing the user's finger or the stereoscopic glasses 112 using the position and direction sensor 121, without the marker 120.
[56] Fig. 6 illustrates the display concept when the user's eyes (head) move with the fixed display device of Fig. 1, and Fig. 7 illustrates the display concept when the user's eyes (head) move with the HMD of Fig. 4.
[57] When the HMD 160 is used, the virtual screen displayed according to the position and direction sensor 121 can be moved while maintaining a predetermined positional relationship with the user's eyes (head or HMD).
[58] As described above, the stereovision-based virtual reality device provides a realistic image through the vibrotactile unit using stereovision technology.
[59] Although a few embodiments of the present general inventive concept have been shown and described, it will be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.
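The virtual screen maintaining a predetermined positional relationship with the user's head can be sketched as placing the screen a fixed offset along the head's forward direction each frame. The 50 cm offset and the vector convention below are assumptions for illustration only.

```python
def virtual_screen_center(head_pos, head_forward, screen_offset=50.0):
    """Keep the virtual screen a fixed distance in front of the head.

    head_pos is the tracked head position (x, y, z); head_forward is a
    unit vector for the gaze direction. Recomputing this every frame
    makes the screen follow the head, as with the HMD 160.
    """
    return tuple(p + screen_offset * f
                 for p, f in zip(head_pos, head_forward))
```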

Claims
[1] A stereovision-based virtual reality device comprising: a stereoscopic image providing unit for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a marker for pointing a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit; one or more cameras for detecting a position of the marker; a control unit for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit and stereoscopic position information of the marker detected by the cameras, determining if the marker touches the displayed object, and generating a control signal when it is determined that the marker touches the displayed object; and a vibrotactile unit driven by the touch determination control signal received from the control unit when it is determined that the marker touches the displayed object.
[2] The stereovision-based virtual reality device according to claim 1, wherein the stereoscopic image providing unit includes a monitor and stereoscopic glasses.
[3] The stereovision-based virtual reality device according to claim 1, wherein the stereoscopic image providing unit includes a head mount display (HMD).
[4] The stereovision-based virtual reality device according to claim 2 or 3 further comprising a second marker for pointing a position and direction of a user's head, wherein the camera further senses a movement of the second marker, and the stereoscopic image providing unit adaptively changes the stereoscopic image in accordance with the sensing result of the second marker.
[5] The stereovision-based virtual reality device according to claim 4, wherein a wireless communication/driving circuit is installed on a user's body region and the vibrotactile unit is driven according to the control signal outputted from the control unit.
[6] A stereovision-based virtual reality device comprising: a stereoscopic image providing unit for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a marker for pointing a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit; one or more cameras for detecting a position of the marker; a control unit for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit and stereoscopic position information of the marker detected by the cameras, determining if the marker touches the displayed object, and generating a control signal when it is determined that the marker touches the displayed object, the control unit including a marker controller for controlling a position and direction sensor mounted on the marker and a position and direction sensor mounted on a head mount display (HMD); a vibrotactile unit driven by the touch determination control signal received from the control unit when it is determined that the marker touches the displayed object; and the HMD, mounted on a user's head, for detecting a position and direction of the user's head, transmitting the detected position and direction to the control unit by the marker controller, and changing a displayed image of the stereoscopic image providing unit according to the position and direction of the user's head.
[7] A stereovision-based virtual reality device comprising: a stereoscopic image providing unit for providing a stereoscopic image display with respect to an object and changing the stereoscopic image display according to a touch determination control signal with respect to the object; a position and direction sensor mounted on a head mount display (HMD) and a user's finger, for pointing a position of an object providing an interaction with respect to the object displayed by the stereoscopic image providing unit; one or more cameras for detecting a position of the position and direction sensor; a control unit for receiving stereoscopic position information of the object displayed by the stereoscopic image providing unit and stereoscopic position information of the position and direction sensor detected by the cameras, determining if the position and direction sensor touches the displayed object, and generating a control signal when it is determined that the position and direction sensor touches the displayed object, the control unit including a marker controller for controlling a position and direction sensor mounted on the user's finger and the HMD; a vibrotactile unit driven by the touch determination control signal received from the control unit when it is determined that the position and direction sensor touches the displayed object; and the HMD, mounted on a user's head, for detecting a position and direction of the user's head, transmitting the detected position and direction to the control unit by the marker controller, and changing a displayed image of the stereoscopic image providing unit in accordance with the position and direction of the user's head.
PCT/KR2007/000994 2006-03-02 2007-02-27 Stereovision-based virtual reality device WO2007100204A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2006-0020091 2006-03-02
KR20060020091 2006-03-02
KR10-2006-0118009 2006-11-28
KR1020060118009A KR100812624B1 (en) 2006-03-02 2006-11-28 Stereovision-Based Virtual Reality Device

Publications (1)

Publication Number Publication Date
WO2007100204A1 true WO2007100204A1 (en) 2007-09-07

Family

ID=38689071

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/000994 WO2007100204A1 (en) 2006-03-02 2007-02-27 Stereovision-based virtual reality device

Country Status (2)

Country Link
KR (1) KR100812624B1 (en)
WO (1) WO2007100204A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009136207A1 (en) * 2008-05-09 2009-11-12 Mbda Uk Limited Display of 3-dimensional objects
US20110096072A1 (en) * 2009-10-27 2011-04-28 Samsung Electronics Co., Ltd. Three-dimensional space interface apparatus and method
WO2011060966A1 (en) * 2009-11-19 2011-05-26 Sony Ericsson Mobile Communications Ab Hand-held input device, system comprising the input device and an electronic device and method for controlling the same
WO2010062117A3 (en) * 2008-11-26 2011-06-30 Samsung Electronics Co., Ltd. Immersive display system for interacting with three-dimensional content
EP2034390A3 (en) * 2007-09-10 2011-11-02 Canon Kabushiki Kaisha Information-processing apparatus and information-processing method
EP2469743A3 (en) * 2010-12-23 2012-10-24 Nagravision S.A. A system to identify a user of television services by using biometrics
WO2013154217A1 (en) * 2012-04-13 2013-10-17 Lg Electronics Inc. Electronic device and method of controlling the same
CN105339869A (en) * 2013-06-18 2016-02-17 桂宇硕 Apparatus for controlling physical effects in cyberspace and method therefor
EP3167610A4 (en) * 2014-07-09 2018-01-24 LG Electronics Inc. Display device having scope of accreditation in cooperation with depth of virtual object and controlling method thereof
JP2018125007A (en) * 2010-03-31 2018-08-09 イマージョン コーポレーションImmersion Corporation System and method for providing haptic stimulus based on position
US11353707B2 (en) 2014-10-15 2022-06-07 Samsung Electronics Co., Ltd. Method and apparatus for processing screen using device

Families Citing this family (15)

Publication number Priority date Publication date Assignee Title
KR100934614B1 (en) * 2008-01-30 2009-12-31 서경대학교 산학협력단 Mixed reality based mechanical training control system
KR100944027B1 (en) * 2008-06-16 2010-02-24 전남대학교산학협력단 Augmented reality providing system of the table top structure
KR100974900B1 (en) 2008-11-04 2010-08-09 한국전자통신연구원 Marker recognition apparatus using dynamic threshold and method thereof
KR101156728B1 (en) 2010-03-23 2012-06-14 강원대학교산학협력단 Apparatus and method for popping bubble
EP2812088A4 (en) 2012-02-06 2015-05-20 Hothead Games Inc Virtual competitive group management systems and methods
KR101725073B1 (en) * 2012-02-06 2017-04-11 핫헤드 게임즈 인크. Virtual opening of boxes and packs of cards
KR101958778B1 (en) * 2012-08-31 2019-03-15 엘지전자 주식회사 A Head Mounted Display and a Method for Controlling a Digital Device Using the Same
US9919213B2 (en) 2016-05-03 2018-03-20 Hothead Games Inc. Zoom controls for virtual environment user interfaces
US10004991B2 (en) 2016-06-28 2018-06-26 Hothead Games Inc. Systems and methods for customized camera views in virtualized environments
US10010791B2 (en) 2016-06-28 2018-07-03 Hothead Games Inc. Systems and methods for customized camera views and customizable objects in virtualized environments
KR101999953B1 (en) 2017-11-21 2019-07-15 대한민국 Treatment System and Method Based on Virtual-Reality
KR102055627B1 (en) * 2018-03-07 2019-12-13 케이애드에스엔씨 주식회사 Ar guide system for exhibition halls
CN116301384A (en) 2018-07-30 2023-06-23 宏达国际电子股份有限公司 Correction method
KR102200663B1 (en) 2019-05-31 2021-01-11 고려대학교 세종산학협력단 High-resolution magnetic resonance imaging system-compatible 3-dimensional virtual reality system for functional brain imaging study
KR102268293B1 (en) * 2019-10-28 2021-06-23 방신웅 Sensory device using magnetic material

Citations (3)

Publication number Priority date Publication date Assignee Title
KR20010095900A (en) * 2000-04-12 2001-11-07 Park Myung-Soo 3D Motion Capture analysis system and its analysis method
KR20030029684A (en) * 2001-10-08 2003-04-16 Hanyang Hakwon Educational Foundation Realistic Motion in Virtual Environment based on Real Human Motion
KR20030084401A (en) * 2002-04-26 2003-11-01 Tae-Jeong Jang A vibrotactile display module and a multi-channel vibrotactile device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000250689A (en) * 1999-03-03 2000-09-14 Minolta Co., Ltd. Virtual object presentation system
JP2003330582A (en) * 2002-05-14 2003-11-21 Waseda University Sense providing device using sense of sight and sense of hearing

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2034390A3 (en) * 2007-09-10 2011-11-02 Canon Kabushiki Kaisha Information-processing apparatus and information-processing method
US8553049B2 (en) 2007-09-10 2013-10-08 Canon Kabushiki Kaisha Information-processing apparatus and information-processing method
WO2009136207A1 (en) * 2008-05-09 2009-11-12 Mbda Uk Limited Display of 3-dimensional objects
WO2010062117A3 (en) * 2008-11-26 2011-06-30 Samsung Electronics Co., Ltd. Immersive display system for interacting with three-dimensional content
US9880698B2 (en) 2009-10-27 2018-01-30 Samsung Electronics Co., Ltd. Three-dimensional space interface apparatus and method
US9377858B2 (en) * 2009-10-27 2016-06-28 Samsung Electronics Co., Ltd. Three-dimensional space interface apparatus and method
US20110096072A1 (en) * 2009-10-27 2011-04-28 Samsung Electronics Co., Ltd. Three-dimensional space interface apparatus and method
WO2011060966A1 (en) * 2009-11-19 2011-05-26 Sony Ericsson Mobile Communications Ab Hand-held input device, system comprising the input device and an electronic device and method for controlling the same
JP2018125007A (en) * 2010-03-31 2018-08-09 イマージョン コーポレーションImmersion Corporation System and method for providing haptic stimulus based on position
EP2469743A3 (en) * 2010-12-23 2012-10-24 Nagravision S.A. A system to identify a user of television services by using biometrics
US9054819B2 (en) 2010-12-23 2015-06-09 Nagravision S.A. System to identify a user of television services by using biometrics
WO2013154217A1 (en) * 2012-04-13 2013-10-17 Lg Electronics Inc. Electronic device and method of controlling the same
CN105339869A (en) * 2013-06-18 2016-02-17 桂宇硕 Apparatus for controlling physical effects in cyberspace and method therefor
EP3167610A4 (en) * 2014-07-09 2018-01-24 LG Electronics Inc. Display device having scope of accreditation in cooperation with depth of virtual object and controlling method thereof
US11353707B2 (en) 2014-10-15 2022-06-07 Samsung Electronics Co., Ltd. Method and apparatus for processing screen using device
US11914153B2 (en) 2014-10-15 2024-02-27 Samsung Electronics Co., Ltd. Method and apparatus for processing screen using device

Also Published As

Publication number Publication date
KR100812624B1 (en) 2008-03-13
KR20070090730A (en) 2007-09-06

Similar Documents

Publication Publication Date Title
WO2007100204A1 (en) Stereovision-based virtual reality device
US10996757B2 (en) Methods and apparatus for generating haptic interaction for virtual reality
US8866739B2 (en) Display device, image display system, and image display method
JP5893605B2 (en) System and method for providing tactile stimulation based on position
US20170150108A1 (en) Autostereoscopic Virtual Reality Platform
KR102181587B1 (en) Virtual reality control system
EP3364272A1 (en) Automatic localized haptics generation system
US20120135803A1 (en) Game device utilizing stereoscopic display, method of providing game, recording medium storing game program, and game system
EP3106963B1 (en) Mediated reality
JP2012139318A (en) Display control program, display control apparatus, display control system, and display control method
KR20140043522A (en) Apparatus and method for controlling of transparent both-sided display
JP2012141939A (en) Display control program, display control device, display control system and display control method
US20180213213A1 (en) Stereoscopic display
CN102508562A (en) Three-dimensional interaction system
WO2018104732A1 (en) Head mounted display with user head rotation guidance
GB2583535A (en) Data processing
JPH07253773A (en) Three-dimensional display device
EP3502839A1 (en) Methods, apparatus, systems, computer programs for enabling mediated reality
CN102508561A (en) Operating rod
KR102218088B1 (en) Virtual Reality Control System
US20220230357A1 (en) Data processing
WO2015196877A1 (en) Autostereoscopic virtual reality platform
KR20230121953A (en) Stereovision-Based Virtual Reality Device
Perry et al. An investigation of current virtual reality interfaces
KR102212508B1 (en) Virtual reality control system

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: PCT application non-entry in European phase
Ref document number: 07715405
Country of ref document: EP
Kind code of ref document: A1