WO2014042320A1 - Apparatus and method of providing user interface on head mounted display and head mounted display thereof - Google Patents


Info

Publication number
WO2014042320A1
Authority
WO
WIPO (PCT)
Prior art keywords
hmd, physical, mode, detected, view angle
Application number
PCT/KR2013/000209
Other languages
French (fr)
Inventor
Jihwan Kim
Original Assignee
Lg Electronics Inc.
Application filed by LG Electronics Inc.
Priority to EP13836413.8A (EP2896205B1)
Publication of WO2014042320A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/34Graphical or visual programming

Definitions

  • The exemplary embodiments of the present invention relate to an apparatus and method of providing a User Interface (UI) and/or a User Experience (UX) (hereinafter, “UI”), and more particularly to an apparatus and method of determining an optimized UI and providing that UI on a head mounted display, and to the head mounted display itself.
  • The UI technology is an interface method that helps users conveniently utilize diverse digital devices.
  • In more detail, a UI is the part of a program through which the user interacts with a digital device so that the two can offer and obtain information.
  • Examples include a command-line interface, in which the user inputs a command to run a program; a menu-driven interface, operated by menu-selection commands; and a Graphic User Interface (GUI), in which a graphical display program is operated by using position locating devices such as an optical pen, mouse, trackball, or joystick.
  • Recently, a gesture UI operated by a user’s action commands and a voice recognition UI operated by the user’s voice without any action have also been developed and applied to digital devices.
  • As digital devices trend toward smaller size and lighter weight, diverse wearable digital devices have been developed, among them the head mounted display (HMD), which can be worn on the face like eyeglasses. Beyond simple display use, the HMD can be combined with technologies such as Augmented Reality and N Screen, providing users many diverse conveniences.
  • Accordingly, the aforementioned diverse UI technologies can be applied to the HMD.
  • However, it is difficult to determine an optimized UI for the HMD because the HMD is worn by a user and moves freely. Therefore, a technology is demanded that provides a highly efficient and convenient UI in consideration of the characteristics of the HMD and the ambient environmental conditions in its proximity.
  • the exemplary embodiments of present invention are directed to an apparatus and method of providing a User Interface that substantially obviates one or more problems due to limitations and disadvantages of the related art.
  • the exemplary embodiments of present invention provide a technology providing a very efficient and convenient UI considering the characteristics of the HMD and ambient environmental conditions in the proximity of the HMD.
  • FIGS. 1a and 1b show drawings to explain a HMD according to the exemplary embodiments of present invention
  • FIG. 2 shows a block diagram for a HMD internal configuration according to the exemplary embodiments of present invention
  • FIGS. 3 and 4 show flowcharts of UI mode determination process according to object location as a first exemplary embodiment of present invention
  • FIGS. 5a, 5b, and 5c are drawings to show how the UI mode determination process is applied to the first exemplary embodiment of present invention
  • FIGS. 6a and 6b show an example of physical UI mode (e.g., keyboard, drawing) according to the exemplary embodiments of present invention
  • FIGS. 7a and 7b show an example of non-physical UI mode (e.g., voice, gesture) according to the exemplary embodiments of present invention
  • FIGS. 8 and 9 show flowcharts of UI mode determination considering a HMD view angle as a second exemplary embodiment of present invention
  • FIGS. 10a and 10b are drawings to show how the UI mode determination process is applied to the second exemplary embodiment of present invention.
  • FIGS. 11 and 12 show flowcharts of UI mode determination considering an object type as a third exemplary embodiment of present invention
  • FIGS. 13a and 13b are drawings to show how the UI mode determination process is applied to the third exemplary embodiment of present invention.
  • FIGS. 14 and 15 show flowcharts of UI mode determination process utilizing a digital device within a view angle as a fourth exemplary embodiment of present invention
  • FIG. 16 is a drawing to show how the UI mode determination process is applied to the fourth exemplary embodiment of present invention.
  • FIGS. 17 and 18 show flowcharts of UI mode determination process utilizing a digital device as a fifth exemplary embodiment of present invention.
  • One object of the exemplary embodiments is, in providing a Head Mounted Display User Interface (HMD UI), to provide an optimized HMD UI considering the ambient environmental conditions in the proximity of the HMD.
  • Especially, another object of the exemplary embodiments is to apply the HMD UI differently based on whether a usable object for the HMD UI exists in the proximity of the HMD.
  • Another object of the exemplary embodiments is to change and provide an optimized HMD UI based on the ambient environmental conditions in the proximity of the HMD that is being used at present.
  • According to one aspect, a method of providing a User Interface (UI) includes the steps of detecting whether an object exists in the proximity of the HMD and, if the object is detected, determining a distance between the detected object and the HMD. The method further includes the step of applying a physical UI mode if the detected object is within a predetermined distance from the HMD. Finally, the method includes the step of applying a non-physical UI mode if the object is not detected or is not within the predetermined distance from the HMD.
  • a UI apparatus comprises a sensor unit detecting whether an object exists in the proximity of the HMD and if the object is detected, the sensor unit senses a distance between the object and the HMD.
  • the apparatus further comprises a processor controlling a User Interface (UI) of the HMD based on a result of the sensor unit.
  • the physical UI mode is applied if the detected object is within a predetermined distance from the HMD and the non-physical UI mode is applied if the object is not detected or is not within the predetermined distance from the HMD.
  • FIGS. 1a and 1b show drawings to explain a HMD as an example according to the exemplary embodiments of present invention.
  • FIG. 1a indicates an example of the external configuration of a HMD 100
  • FIG. 1b indicates an example of the HMD 100 worn on a user 10. Accordingly, the present invention is not limited to the external configuration of the HMD 100 and clearly any external configurations of a HMD can be utilized to realize the exemplary embodiments.
  • Especially, the HMD 100 to which a UI is applied includes a display screen 101 and at least one sensor 102. Through the display screen 101, the HMD provides the user 10 not only with all contents and images but also with information about the UI of the exemplary embodiments. Further, the HMD 100 includes at least one sensor 102 that detects the ambient environmental conditions in the proximity of the HMD 100, and the sensing result is utilized as an important element in determining the HMD UI. Further, the HMD 100 may include a supporting component 103 so that the user 10 can wear the HMD 100 on the head, and an audio outputting unit 104 wearable in the ear.
  • FIG. 1b shows a drawing of the HMD 100 worn on the user 10 in order to explain the status of the HMD 100 that is being used at present.
  • the HMD 100 includes a predetermined distance of a view angle area 200.
  • the view angle area 200 is a predetermined area that corresponds to the user 10 wearing the HMD 100 and can include an area with a certain angle in the forward direction of the HMD 100.
  • the HMD 100 and an external digital device can be connected to communicate by a network 300.
  • For example, usable wireless networks are Near Field Communication (NFC), Zigbee, Infrared Communications, Bluetooth, and WIFI. That is, in the exemplary embodiments, communications between the HMD 100 and the digital device can be realized by one of the networks 300 enumerated above or by a combination of them. Yet, the exemplary embodiments are not limited to those mentioned above.
  • FIG. 2 shows a block diagram for a HMD internal configuration.
  • The HMD 100 of the exemplary embodiment includes a processor 110, a sensor unit 120, a storage unit 130, a communications unit 140, a user input unit 150, a display control unit 160, and a UI control unit 170.
  • The sensor unit 120 can be equipped internally or externally in the HMD 100 and informs the processor 110 of the ambient environmental conditions that the HMD 100 recognizes. The sensor unit 120 can use a plurality of sensing methods. For example, the sensor unit 120 can include an object sensor 121 that not only detects objects or things in the proximity of the HMD 100 (hereinafter, “objects”) but also identifies the type of a detected object and senses the distance between the detected object and the HMD. Further, the sensor unit 120 can include a view angle sensor 122 sensing the view angle of the HMD 100. More detailed functions and operations of the object sensor 121 and the view angle sensor 122 are described below. The sensor unit can, for example, be equipped internally or externally in the HMD like the sensor 102 in FIG. 1a.
  • a plurality of the sensing methods the sensor unit 120 can be configured with includes, for example, a gravity sensor, magnetic sensor, motion sensor, gyro sensor, acceleration sensor, infrared sensor, inclination sensor, brightness sensor, elevation sensor, olfactory sensor, temperature sensor, depth sensor, pressure sensor, bending sensor, audio sensor, video sensor, Global Positioning System (GPS) sensor, and touch sensor.
  • the sensor unit 120 senses a HMD user and ambient environmental conditions in the proximity of him or her and sends the result of the sensing in order for the processor 110 to be operated accordingly, and the detailed sensing method of the sensor unit 120 is not limited to the enumerated sensing methods.
  • the storage unit 130 can store diverse digital data such as video, audio, pictures, movie clips, and applications.
  • the storage unit 130 indicates diverse digital data storage space such as flash memory, Random Access Memory (RAM), and Solid State Drive (SSD).
  • The communications unit 140 transmits and receives data by performing communications with external digital devices using diverse protocols.
  • the HMD 100 of the exemplary embodiment performs pairing with and connecting communications with digital devices in the proximity of the HMD 100 by using the communications unit 140.
  • the communications unit 140 can include a plurality of antennas.
  • the HMD 100 detects the location of a digital device that is being communicated with the HMD by using a plurality of the antennas. That is, the HMD 100 detects the location of the digital device that is being communicated with the HMD by using the time and altitude differences between the transmitted or received signals through a plurality of the antennas.
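  • Purely as an illustration (not part of the disclosure), a minimal sketch of estimating the bearing of a paired device from the arrival-time difference across two antennas; the antenna spacing and the use of time differences alone are assumptions:

```python
import math

SPEED_OF_LIGHT = 3.0e8   # m/s, signal propagation speed
ANTENNA_SPACING = 0.15   # m, distance between two HMD antennas (assumed value)

def estimate_bearing(arrival_time_a: float, arrival_time_b: float) -> float:
    """Estimate the angle of arrival (radians) of a device's signal from the
    time-difference-of-arrival measured at two antennas."""
    path_difference = SPEED_OF_LIGHT * (arrival_time_b - arrival_time_a)
    ratio = max(-1.0, min(1.0, path_difference / ANTENNA_SPACING))  # clamp against noise
    return math.asin(ratio)  # 0 rad = broadside; +/- pi/2 = along the antenna axis
```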
  • The user input unit 150 is a device that receives the user control commands controlling the use of the HMD 100.
  • the user control commands can include user configuration commands.
  • For example, by using the user control information from the user input unit 150, the user 10 may set the HMD UI to operate in correspondence with the ambient environmental conditions in the proximity of the HMD, or may fix a certain UI mode regardless of the ambient environmental conditions.
  • When the HMD UI is set to operate in correspondence with the ambient environmental conditions, the UI mode can change automatically according to the embodiments described below.
  • Meanwhile, when a certain UI mode is fixed regardless of the ambient environmental conditions, the fixed UI mode operates in the “On” state if it is in an operable status; otherwise the UI mode operates in the “Off” state.
  • The display control unit 160 outputs motion pictures or images on the HMD display screen 101. Further, according to an embodiment, the display control unit 160 provides an imaginary UI screen on an object external to the HMD, or performs the controlling functions for providing an external digital device with a UI screen. In addition, according to an embodiment, the display control unit 160 provides information used to determine the UI for the HMD display screen 101, or images according to the determined UI. A detailed description is provided below.
  • the UI control unit 170 provides a HMD user with an UI and controls the provided UI.
  • the UI control unit 170 includes a physical UI control unit 171 and a non-physical UI control unit 172.
  • The physical UI control unit 171 provides a UI with which the user can have physical contact; examples include, as embodiments, a virtual keyboard UI and a drawing UI.
  • the virtual keyboard UI indicates a UI method that displays a virtual keyboard on the surface of the detected object in the proximity of the HMD, receives a command by the user’s keyboard touch, and operates accordingly.
  • The drawing UI indicates a UI method that provides an imaginary drawing panel on the surface of the detected object in the proximity of the HMD, on which the user inputs a command using a drawing tool such as an electronic pen or a finger.
  • The non-physical UI control unit 172 provides a UI that does not require physical contact with the user; examples include, as embodiments, a gesture UI and a voice recognition UI. Detailed descriptions of these UI methods are provided below.
  • the physical and non-physical UIs are certain UI methods and are not limited to only the aforementioned virtual keyboard, drawing, gesture, and voice recognition UIs. That is, the physical UI indicates all UIs that can have a physical contact with an object and the non-physical UI indicates all UIs that do not need a physical contact with an object.
  • The processor 110 is the main HMD controller; it controls not only each block of the HMD 100 but also the transmission and reception of information and data between the blocks. In the following, the detailed processes of the exemplary embodiments are operated mainly under the control of the processor 110. The internal configuration block diagram of the HMD 100 shown in FIG. 2 is illustrated as an embodiment for explanation purposes only; the blocks shown in FIG. 2 may be combined into a whole, or some necessary blocks may be separated out and the rest combined. For example, the processor 110 may be combined with the UI control unit 170 into a single controller.
  • FIGS. 3 and 4 show flowcharts of the HMD UI mode determination process according to the object location, as a first embodiment of the exemplary embodiments. Further, FIGS. 5a, 5b, 5c, 6a, 6b, 7a, and 7b are drawings to explain the exemplary embodiments of the present invention.
  • Herein, an object is a physical thing or entity that exists in the proximity of the user wearing the HMD; it includes, for example, a wall, a table, and a ball.
  • the first embodiment of the exemplary embodiments applies a UI mode based on the detection of an object in the proximity of the HMD 100. Further description is as follows.
  • The UI mode determination process applied to the HMD is initiated by a user’s request or by automatic system settings.
  • S110 can be operated based on settings determined by the user.
  • For example, the user 10 can predetermine settings so that the HMD UI follows the ambient environmental conditions in the proximity of the HMD; in that case, the processor 110 can control the HMD so that the HMD UI automatically changes according to those conditions.
  • the exemplary embodiments of present invention will describe an optimized HMD UI determination process in the case that the HMD UI is preset to be operated according to the ambient environmental conditions.
  • the HMD UI mode process comprises the steps of object location determination S120 and HMD UI mode determination S130.
  • The HMD processor 110 detects an object in the proximity of the HMD and determines the location of the object through the object sensor 121 (S121). Based on the result of S121, the processor 110 classifies the relationship between the HMD and the object into one of three statuses, listed below (see the sketch after this list).
  • F1 status: an object is detected and stays within the distance at which physical feedback is possible (S122, FIG. 5a).
  • F2 status: an object is detected but does not stay within the distance at which physical feedback is possible (S123, FIG. 5b).
  • F3 status: no object exists in the proximity of the HMD (S124, FIG. 5c).
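  • The following is an illustrative sketch only (not the claimed implementation) of the F1/F2/F3 classification listed above; the threshold value is an assumption standing in for the predetermined distance Th:

```python
from enum import Enum

class Status(Enum):
    F1 = "object within physical-feedback distance"
    F2 = "object detected, beyond physical-feedback distance"
    F3 = "no object detected"

TH_METERS = 0.7  # predetermined distance Th within which the user can touch the object (assumed)

def classify_status(object_detected: bool, distance_m: float | None) -> Status:
    if not object_detected or distance_m is None:
        return Status.F3
    return Status.F1 if distance_m <= TH_METERS else Status.F2
```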
  • The F1, F2, and F3 statuses are further illustrated in FIGS. 5a, 5b, and 5c as follows.
  • the HMD 100 processor 110 determines whether an object 400 exists and where it is located through the object sensor 121.
  • FIGS. 5a and 5b demonstrate the cases in which the object 400 is detected in the proximity of the HMD, and FIG. 5c demonstrates the case in which no object exists.
  • FIG. 5a shows when a distance, called D1, between the detected object 400 and the HMD is less than a predetermined distance, called Th. S122.
  • FIG. 5b shows when a distance, called D2, between the detected object 400 and the HMD is larger than the predetermined distance, called Th.
  • the predetermined distance Th can be set as a distance in which the user can have a physical contact with and touch the object 400. Therefore, the F1 status of FIG. 5a indicates a status in which the user can touch the object and the F2 status of FIG. 5b indicates a status in which the user cannot touch the object 400 even if it exists. Further, the F3 status of FIG. 5c is a status in which the object 400 does not exist in the proximity of the HMD.
  • The HMD processor 110 then selects a HMD UI and operates it through the UI control unit 170. For example, in the case of F1 status (S122), the aforementioned physical UI mode is applied (S131), and in the case of F2 status (S123) or F3 status (S124), the aforementioned non-physical UI mode is applied (S132). These physical and non-physical UI modes can also be referred to as Object and Non-object modes, respectively.
  • The processor 110 continuously detects an object and determines its location (S121), and when the status changes, for example from F1 to F3 or from F2 to F1, the HMD UI mode can be changed automatically. In an embodiment, the user 10 may be informed of the change when the UI mode changes automatically; a sketch of this monitoring behavior follows.
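  • As a sketch of the mode selection and automatic-change notification described above (the sensor and notification interfaces are assumptions, not part of the disclosure):

```python
TH_METERS = 0.7  # assumed physical-feedback distance ("Th")

def select_ui_mode(object_detected: bool, distance_m: float | None) -> str:
    # F1 (object within Th) -> physical UI (S131); F2/F3 -> non-physical UI (S132)
    if object_detected and distance_m is not None and distance_m <= TH_METERS:
        return "physical"
    return "non-physical"

def monitor_ui_mode(sensor, notify_user) -> None:
    """Continuously re-evaluate the object location (S121), switch the HMD UI mode
    when the status changes, and inform the user of the change."""
    current_mode = None
    while True:
        detected, distance = sensor.read_object()              # assumed sensor API
        new_mode = select_ui_mode(detected, distance)
        if new_mode != current_mode:
            if current_mode is not None:
                notify_user(f"UI mode changed to {new_mode}")   # assumed callback
            current_mode = new_mode
```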
  • The UI mode applied in F1 status is one in which the user 10 can directly contact or touch the UI; it can be, for example, the virtual keyboard UI method shown in FIG. 6a or the drawing UI method shown in FIG. 6b. Yet, these are only embodiments of the exemplary embodiments, and other diverse UI methods that can be physically touched clearly can exist.
  • the virtual keyboard UI method of FIG. 6a displays the virtual keyboard 410 on the surface of the detected object 400 and generates a command that the user directly inputs by touching the virtual keyboard 410. Then, the corresponding object 400 provides the user 10 with the touch feeling so that the user 10 can efficiently use the virtual keyboard 410.
  • the drawing UI method of FIG. 6b is a method, for example, in which a virtual window 420 that can be drawn is displayed on the surface of the detected object 400 and the user 10 generates desired commands by using a pen 430. Then, the corresponding object 400 provides the user 10 with the touch feeling so that the user 10 can efficiently use the pen 430.
  • one of a plurality of the physical mode UIs can be selected by the user’s settings or the system’s settings.
  • For example, the user 10 can predetermine the settings by using the user input unit 150 so that, when the physical UI mode is determined, either the virtual keyboard UI shown in FIG. 6a or the drawing UI method shown in FIG. 6b is set in advance as the default.
  • As another example, the UI control unit 170 can determine whether a drawing input device such as the pen 430 exists. If a drawing input device exists, the drawing UI method shown in FIG. 6b can be selected in advance; if not, the virtual keyboard shown in FIG. 6a can be selected in advance.
  • Even after a UI method has been selected in advance, a different UI method can be used if conditions change.
  • For example, the drawing UI method shown in FIG. 6b can be used automatically when the user grabs a drawing input device by hand.
  • the user can change the UI mode at any time when the user desires to change to a certain UI mode.
  • When the virtual keyboard UI method shown in FIG. 6a is applied as the physical UI, the location of the virtual keyboard 410 on the surface of the object can be controlled in various ways. Further illustration is as follows.
  • First, the virtual keyboard 410 can be created at the point where the hand of the user 10 is located. That is, the processor 110 determines whether the hand of the user 10 is approaching or touching the surface of the object 400 and controls the HMD to generate the virtual keyboard 410 at the point where that hand is located. Accordingly, since the virtual keyboard is created at the point on the object surface that the user wants, the user can conveniently utilize the virtual keyboard.
  • Further, the UI control unit 170 is equipped with a one-hand virtual keyboard (a small-size keyboard) and a two-hand virtual keyboard (a large-size keyboard), and the processor 110 generates either the one-hand or the two-hand keyboard by determining the number of fingers that are approaching or touching the object, as sketched below.
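  • A minimal illustrative sketch of choosing between the one-hand and two-hand keyboard layouts; the finger-count rule is an assumption:

```python
def choose_keyboard_layout(finger_count: int) -> str:
    """Pick a virtual keyboard layout from the number of fingers approaching
    or touching the object surface (more than five implies both hands)."""
    return "two_hand_large" if finger_count > 5 else "one_hand_small"
```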
  • the location of the virtual keyboard 410 can be determined based on the user’s view angle.
  • For example, the processor 110 can determine whether the user is using a first view angle (the view angle of the right eye), a second view angle (the view angle of the left eye), or both.
  • the processor 110 then controls the virtual keyboard 410 so that the virtual keyboard 410 is located at an appropriate point corresponding to the view angle.
  • the appropriate point corresponding to the view angle can be the center point of the corresponding view angle when only one view angle is used or the overlapping point of the corresponding view angles when the both view angles are used.
  • the processor 110 can determine the type of the virtual keyboard by determining whether the user 10 is using only one hand or two hands and generate the determined virtual keyboard on the surface of the object that the user hand or hands 10 is/are approaching near or touching. Also, the processor 110 can first determine the type of the virtual keyboard by determining whether the user 10 is using one hand or two hands, and generate the determined virtual keyboard at an appropriate point of the view angle that the user is using.
  • the processor 110 can first determine the type of the virtual keyboard by determining whether the user 10 is using one hand or two hands, generate the determined virtual keyboard at an appropriate point of the view angle that the user is using, and move the generated virtual keyboard on the surface of the object that the user hand 10 is approaching near or touching.
  • Alternatively, the processor 110 can first determine the type of the virtual keyboard by determining whether the user 10 is using one hand or two hands, and then generate the determined virtual keyboard at the most appropriate location by comparing the appropriate point of the view angle with the location on the object surface that the user 10 is approaching or touching. For example, when the hand of the user 10 is not within the view angle, the virtual keyboard can be created at the boundary of the view angle, because it is determined that the hand has nothing to do with the virtual keyboard. On the other hand, when the hand of the user 10 is within the view angle, the virtual keyboard can be created at the location of that hand first, because it is determined that the hand is preparing to use the virtual keyboard.
  • The diverse examples above regarding the type and location of the virtual keyboard 410 can be applied in the same way to determining the type and location of the window 420 for the drawing UI shown in FIG. 6b; a sketch of this placement policy follows.
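  • As a sketch only (the geometric helpers and data types are assumed, not part of the disclosure), the placement policy described above could look like:

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def place_virtual_keyboard(view_center: Point,
                           view_boundary_toward_hand: Point,
                           hand_position: Point | None,
                           hand_in_view: bool) -> Point:
    """Placement policy sketched from the description:
    hand inside the view angle  -> place at the hand (user is preparing to type);
    hand outside the view angle -> place at the view-angle boundary;
    no hand detected            -> place at an appropriate point, e.g. the center."""
    if hand_position is None:
        return view_center
    return hand_position if hand_in_view else view_boundary_toward_hand
```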
  • The non-physical UI mode, applied in the F2 and F3 statuses, is a UI mode in which the user 10 does not physically utilize the object 400; it can be, for example, the voice recognition UI method shown in FIG. 7a, the gesture UI method shown in FIG. 7b, or a UI using eye movements. Yet, these are only embodiments of the present invention and, as mentioned, other diverse UI methods that involve no physical contact with an object 400 clearly exist.
  • The voice recognition UI method of FIG. 7a displays, on the HMD display screen, an icon 440 indicating that the voice recognition UI is being executed. Once the voice of the user 10 is inputted, the voice command is recognized and converted through the voice recognition process in the non-physical UI control unit 172, and the corresponding command is then performed through the processor 110.
  • The gesture UI method of FIG. 7b displays, on the HMD display screen, an icon 450 indicating that the gesture UI is being executed, and gestures of the user 10 such as a finger movement 451 or a head movement (not shown) are used to input commands. Once the user’s gesture is inputted, the gesture command is recognized and converted through the gesture recognition process in the non-physical UI control unit 172, and the corresponding command is performed by the processor 110.
  • any one of a plurality of the aforementioned non-physical mode UIs can be selected by the user’s settings or the system settings.
  • For example, the user 10 can predetermine the settings by using the user input unit 150 so that either the voice recognition UI method or the gesture UI method is selected in advance as the default when the non-physical UI mode is determined.
  • As another example, the UI control unit 170 analyzes the noise around the user 10: the voice recognition UI method shown in FIG. 7a is selected if the noise is below a certain level, and the gesture UI method shown in FIG. 7b is selected otherwise (a sketch of this selection appears after the examples below). Further, the voice recognition UI method shown in FIG. 7a may be changed automatically to the gesture UI method shown in FIG. 7b, for example when the ambient noise condition changes.
  • the user can change the UI method any time the user desires.
  • the voice recognition mode shown in FIG. 7a and the gesture UI method shown in FIG. 7b can be set to be operated at the same time and in that case the user 10 can utilize the voice recognition UI or the gesture UI or both to send the commands.
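  • For illustration only, a sketch of the noise-based choice between the voice and gesture UIs; the threshold value and its unit are assumptions:

```python
NOISE_THRESHOLD_DB = 60.0  # assumed "certain level" of ambient noise

def choose_non_physical_ui(ambient_noise_db: float) -> str:
    """Quiet surroundings favor voice recognition; noisy ones favor gestures."""
    return "voice_recognition" if ambient_noise_db < NOISE_THRESHOLD_DB else "gesture"
```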
  • FIGS. 8 and 9 show flowcharts of the UI mode method considering the HMD view angle as a second embodiment of the exemplary embodiments.
  • FIGS. 10a and 10b are drawings to show how the UI mode determination process is applied in the second embodiment. The second embodiment additionally considers the HMD view angle, compared with the first embodiment, when the UI mode is determined. Further illustration is as follows.
  • the HMD UI mode determination process includes the steps of object location determination S220, view angle determination S230, and HMD UI mode determination S240.
  • the processor 110 detects an object in the proximity of the HMD and determines the location of the detected object S221.
  • The processor 110 then determines whether the relationship between the HMD and the object is one of the aforementioned F1, F2, or F3 statuses. That is, it is called F1 status when an object is detected and stays within the distance at which physical feedback is possible (S222); F2 status when an object is detected but does not stay within that distance (S223); and F3 status when no object exists in the proximity of the HMD (S224).
  • After determining F1 status (S222), the HMD 100 processor 110 further determines, through the view angle sensor 122, whether the detected object is within the HMD view angle (S231).
  • the view angle can have two statuses. Further description is as follows in reference to FIGS. 10a and 10b.
  • FIG. 10a shows the case in which the object 400 is within the HMD view angle, which is called S1 status.
  • FIG. 10b shows the case in which the object 400 is not within the HMD view angle, which is called S2 status. That is, S1 status indicates that the object 400 in the proximity of the HMD is not only within the predetermined distance but also within the view angle of the user 10. On the other hand, S2 status indicates that the object 400 in the proximity of the HMD is within the predetermined distance but not within the view angle of the user.
  • In the case of S1 status, the HMD UI mode may be determined as the physical UI mode and operated as the virtual keyboard UI or the drawing UI through the aforementioned physical UI control unit 171 (S241).
  • the physical UI type and operation method shown in the first embodiment can be applied to the second embodiment the same way.
  • If the HMD processor 110 confirms S2 status (S233) through the step of S231, the location of the object 400 is continuously monitored for a predetermined period of time, for example five seconds (5s) or ten seconds (10s) (S234). In the step of S234, if the object 400 is re-detected within the view angle within the predetermined period of time, that is, if S2 status changes to S1 status, the physical UI mode is maintained (S242). Further, if the non-physical UI mode is currently applied, it is changed to the physical UI mode (S242).
  • The fact that the status of the user 10 changes from S2 to S1 within the predetermined period of time indicates that the user 10 only temporarily looked away from the object 400 and did not intend to look away from it for good. In that case, the physical UI mode is maintained on the presumption of a short-term intention, and if the then-existing UI mode is the non-physical UI mode, it is appropriate to change it to the physical UI mode.
  • On the other hand, if the object 400 is not re-detected within the view angle within the predetermined period of time, that is, if S2 status is maintained or the status changes to F2 or F3, the UI mode changes to the non-physical UI mode; if the physical UI mode is currently applied, it is changed to the non-physical UI mode.
  • In the step of S234, if the S2 status of the user 10 is maintained for the predetermined period of time, it is deemed that the user 10 intended to look away from the object 400 for the long term. In that case, it is appropriate to stay in the non-physical UI mode, or to change to the non-physical UI mode if the then-existing UI mode is the physical UI mode. Further, through the step of S234, if the status of the user 10 changes to F2 or F3 within the predetermined period of time, it is deemed that the user 10 is moving away from the object 400; since the object 400 can then no longer be utilized for the physical UI, the non-physical UI mode is maintained, or the physical UI mode is changed to the non-physical UI mode. A sketch of this timed decision follows.
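  • A minimal sketch, under assumed timing and sensor interfaces, of the grace-period behavior described above (a temporary look-away keeps the physical UI; a sustained look-away or moving away switches to the non-physical UI):

```python
import time

GRACE_PERIOD_S = 5.0  # predetermined period, e.g. five or ten seconds (5 s assumed)
TH_METERS = 0.7       # assumed physical-feedback distance ("Th")

def resolve_mode_after_losing_view(sensor) -> str:
    """Object left the view angle (S2 status): keep checking for the grace period (S234).
    Re-entering the view angle keeps or restores the physical UI; a sustained
    look-away, or the object leaving the feedback distance, falls back to the
    non-physical UI."""
    deadline = time.monotonic() + GRACE_PERIOD_S
    while time.monotonic() < deadline:
        detected, distance, in_view = sensor.read()   # assumed sensor API
        if not detected or distance > TH_METERS:
            return "non-physical"   # F2/F3: the user moved away from the object
        if in_view:
            return "physical"       # S2 -> S1: the look-away was only temporary
        time.sleep(0.1)
    return "non-physical"           # S2 sustained: long-term look-away
```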
  • While the physical UI mode is applied (S241/S242), the processor 110 continuously determines whether the object 400 used for the physical UI moves out of the HMD view angle (returning to S231).
  • While the non-physical UI mode is applied, the processor 110 continuously determines whether an object is detected within the predetermined distance (returning to S221).
  • FIGS. 11 and 12 show flowcharts of the UI mode determination process regarding the object type, as a third embodiment of the exemplary embodiments. Further, FIGS. 13a and 13b show drawings of how the UI mode determination is applied in the third embodiment. The third embodiment additionally considers the object type, compared with the second embodiment. Further illustration is as follows.
  • the HMD UI mode determination process includes the steps of object location determination S320, view angle determination S330, object type determination S340, and HMD UI mode determination S350.
  • the HMD processor 110 detects an object in the proximity of the HMD by the object sensor 121 and determines the location of the detected object. S321.
  • Based on the result, the processor 110 determines the relationship between the HMD and the object as one of the aforementioned F1, F2, and F3 statuses: F1 status when an object is detected and stays within the distance at which physical feedback is possible (S322); F2 status when an object is detected but does not stay within that distance (S323); and F3 status when no object exists in the proximity of the HMD (S324).
  • the HMD 100 processor 110 further determines whether the detected object is within the HMD view angle S331. For example, according to the step of S331, it can determine either S1 or S2 status.
  • If the HMD processor 110 confirms that the relationship between the HMD and the object is S2 status (S333), the location of the object 400 is continuously monitored for the predetermined period of time (e.g., five or ten seconds) (S334).
  • If the object is re-detected within the view angle within the predetermined period of time, that is, if the status changes to S1, the physical UI mode is maintained or the mode is changed to the physical UI mode (S352).
  • That is, through the step of S334, the fact that the status of the user 10 changes from S2 to S1 within the predetermined period of time indicates that the user 10 only temporarily looked away from the object 400 and did not intend to keep looking away. In that case, the physical UI mode may be maintained, or the then-existing UI mode may be changed to the physical UI mode if it is the non-physical UI mode.
  • In the step of S334, if the object is not re-detected within the view angle for the predetermined period of time, that is, if S2 status is maintained or the status changes to F2 or F3, the non-physical UI mode may be maintained or the UI mode may be changed to the non-physical UI mode.
  • In the step of S334, if the S2 status of the user 10 is maintained for the predetermined period of time, it is deemed that the user 10 looked away from the object and intended to keep looking away.
  • In that case, the non-physical UI mode may be maintained, or the UI mode may be changed to the non-physical UI mode if the then-existing UI mode is the physical UI mode.
  • Also in the step of S334, if the status of the user 10 changes to F2 or F3 within the predetermined period of time, it is deemed that the user 10 is moving away from the object 400. In that case, since the object 400 can no longer be utilized for the UI, the non-physical UI mode may be maintained, or the UI mode may be changed to the non-physical UI mode if it is the physical UI mode.
  • the HMD processor 110 further determines the object type.
  • the object type is the external shape of an object and can be categorized based on whether the object is user-interfaceable.
  • For example, the wall 461 and the table 462 shown in FIG. 13a are Type 1 objects, which the user can easily contact and which are user-interfaceable.
  • On the other hand, the basketball 463 shown in FIG. 13b is a Type 2 object, which the user cannot easily contact and which is not user-interfaceable.
  • If the HMD processor 110 determines, through the object type determination step (S341), that the corresponding object is Type 1, it determines the physical UI mode as the HMD UI mode (S351). Then, through the aforementioned physical UI control unit 171, UI methods such as the virtual keyboard UI and the drawing UI, which can contact or touch Type 1 objects such as the wall 461 and the table 462, are operated.
  • If the HMD processor 110 determines, through the object type determination step (S341), that the corresponding object is Type 2, it selects the non-physical UI mode as the HMD UI mode (S353). Then, through the aforementioned non-physical UI control unit 172, applicable non-physical UI methods such as the voice recognition UI and the gesture UI are operated, regardless of the Type 2 object 463 existing within the predetermined distance and the view angle. The types and operating methods of the physical and non-physical UIs shown in the first embodiment apply in the same way in the third embodiment.
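  • As an illustrative sketch only, with a simple surface-area criterion standing in as an assumption for the object-shape analysis described above:

```python
def is_user_interfaceable(flat_surface_area_m2: float) -> bool:
    """Type 1 vs. Type 2 stand-in: a sufficiently large flat surface (wall, table)
    is treated as user-interfaceable; small or curved objects (e.g. a ball) are not.
    The 0.05 m^2 threshold is an assumption."""
    return flat_surface_area_m2 >= 0.05

def select_mode_by_object_type(flat_surface_area_m2: float) -> str:
    # Type 1 -> physical UI mode (S351); Type 2 -> non-physical UI mode (S353)
    return "physical" if is_user_interfaceable(flat_surface_area_m2) else "non-physical"
```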
  • While the physical UI mode is applied (S351/S352), the processor 110 continuously determines whether the object 400 used for the physical UI moves out of the HMD view angle (returning to S331).
  • While the non-physical UI mode is applied (S343/S344), the processor 110 continuously determines whether an object is detected within the predetermined distance (returning to S321).
  • FIGS. 14 and 15 show flowcharts of the UI mode determination process considering the detection of a new digital device, as a fourth embodiment. Further, FIG. 16 shows a drawing of how the UI mode determination is applied in the fourth embodiment. The fourth embodiment additionally considers the detection of a new digital device, compared with the second embodiment. Further description is as follows.
  • the HMD UI mode determination process includes the steps of object location determination S420, view angle determination S430, digital device detection and location determination S440 and HMD UI mode determination S450.
  • the HMD processor 110 detects an object in the proximity of the HMD by the object sensor 121 and determines the location of the object S421.
  • Based on the result, the processor 110 determines the relationship between the HMD and the object as one of the aforementioned F1, F2, and F3 statuses: F1 status when an object is detected and stays within the distance at which physical feedback is possible (S422); F2 status when an object is detected but does not stay within that distance (S423); and F3 status when no object exists in the proximity of the HMD (S424).
  • The HMD 100 processor 110 then further determines, by the view angle sensor 122, whether the detected object is within the HMD view angle (S431). According to the step of S431, either S1 or S2 status is determined.
  • The HMD processor 110 then determines whether a new digital device exists within the predetermined distance (S441). For example, according to FIG. 16, the HMD user 10 can look away from the originally detected object 400 toward a nearby digital device 500. In that case, if the new device 500 is detected within the predetermined distance through the step of S441, the HMD processor 110 tries to establish a communications connection with the corresponding digital device 500 by using the communications unit 140. Once communications between the HMD 100 and the device 500 are connected, the HMD processor 110 utilizes the display 510 of the digital device 500 for the physical UI mode by using the UI control unit 170 (S452); a sketch follows.
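  • Illustration only: a sketch of the device-detection branch described above; the communication and UI-control interfaces are assumptions:

```python
def handle_new_device(device_in_range: bool, comms, ui_control) -> str:
    """If a new digital device is detected within the predetermined distance (S441),
    try to connect to it and use its display as the physical UI (S452); otherwise
    keep, or fall back to, the non-physical UI mode."""
    if device_in_range and comms.connect():                 # assumed pairing API
        ui_control.use_device_display_as_physical_ui()      # assumed UI-control API
        return "physical (device display)"
    return "non-physical"
```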
  • If no new digital device is detected, the HMD processor 110 interprets that the user intended to look away for the long term and maintains the non-physical UI mode; if the original UI mode was the physical UI mode, it may be changed to the non-physical UI mode.
  • On the other hand, if the HMD processor 110 determines that the detected object is within the HMD view angle (S1 status, S432), it operates the physical UI mode as the HMD UI mode (S451).
  • The types and operating methods of the physical and non-physical UIs shown in the first embodiment apply in the same way in the fourth embodiment.
  • the display function of the corresponding new digital device can be utilized as the physical UI mode.
  • For example, the physical UI on the surface of the object may be removed and applied instead to the display of the digital device.
  • Alternatively, the physical UI may remain on the surface of the object, and the display function of the digital device can be used as another physical UI, different from the physical UI on the object surface.
  • the physical UI of the object surface can be used as the virtual keyboard UI and the display function of the digital device can be used as the drawing UI.
  • the physical UI of the object surface can be used as the numbers pad of the virtual keyboard and the display function of the digital device can be used as the letters pad of the virtual keyboard.
  • While the physical UI mode is applied (S451/S452), the processor 110 continuously determines whether the object 400 used for the physical UI or the digital device 500 moves out of the HMD view angle (returning to S431). On the other hand, while the non-physical UI mode is applied (S453/S454), the processor 110 continuously determines whether an object is detected within the predetermined distance (returning to S421).
  • FIGS. 17 and 18 as the fifth embodiment of exemplary embodiments show flowcharts of the UI mode determination considering the detection of a digital device. Further illustration is as follows.
  • The HMD UI mode determination process includes the steps of digital device detection and location determination (S520) and HMD UI mode determination (S530).
  • the HMD processor 110 detects a digital device in the proximity of the HMD using the object sensor 121 and the communications unit 140 and determines the location of the detected digital device. S521.
  • Based on the result, the processor 110 determines the relationship between the HMD and the digital device as one of the aforementioned F1, F2, and F3 statuses: F1 status when a device is detected and stays within the distance at which physical feedback is possible (S522); F2 status when a device is detected but does not stay within that distance (S523); and F3 status when no device exists in the proximity of the HMD (S524).
  • If the HMD processor 110 determines that it is F1 status (S522), it establishes a communications connection through the communications unit 140 (S531). Once the communications connection is completed, the HMD processor 110 operates the physical UI mode by using the display in the device through the aforementioned UI control unit 170 (S531); that is, the display equipped in the corresponding device can be utilized as the HMD virtual keyboard. On the other hand, if the HMD processor 110 determines that it is F2 status (S523) or F3 status (S524), it operates the non-physical UI mode through the aforementioned UI control unit 170 (S533).
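  • A final illustrative sketch (interfaces assumed) of the fifth embodiment's decision: pair with a nearby device and use its display as the physical UI when the device is within reach, otherwise fall back to the non-physical UI:

```python
TH_METERS = 0.7  # assumed physical-feedback distance ("Th")

def fifth_embodiment_mode(device_detected: bool, distance_m: float | None,
                          comms, ui_control) -> str:
    """F1 (device within physical-feedback distance): connect and use the device
    display as the physical UI (S531). F2/F3: non-physical UI (S533)."""
    if device_detected and distance_m is not None and distance_m <= TH_METERS:
        if comms.connect():                                  # assumed pairing API
            ui_control.use_device_display_as_physical_ui()   # assumed UI-control API
            return "physical (device display)"
    return "non-physical"
```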

Abstract

An apparatus and method of providing a user interface (UI) on a head mounted display (HMD), and the HMD thereof, are disclosed. The apparatus comprises a sensor unit detecting whether an object exists in the proximity of the HMD and, if the object is detected, sensing a distance between the object and the HMD. The apparatus further comprises a processor controlling a User Interface (UI) of the HMD based on the result from the sensor unit. A physical UI mode is applied if the detected object is within a predetermined distance from the HMD, and a non-physical UI mode is applied if the object is not detected or is not within the predetermined distance from the HMD. The method includes the steps of detecting whether an object exists in the proximity of the HMD and, if the object is detected, determining a distance between the detected object and the HMD. The method further includes the step of applying the physical UI mode if the detected object is within a predetermined distance from the HMD. Finally, the method includes the step of applying the non-physical UI mode if the object is not detected or is not within the predetermined distance from the HMD.

Description

APPARATUS AND METHOD OF PROVIDING USER INTERFACE ON HEAD MOUNTED DISPLAY AND HEAD MOUNTED DISPLAY THEREOF
The exemplary embodiments of the present invention relate to an apparatus and method of providing a User Interface (UI) and/or a User Experience (UX) (hereinafter, “UI”), and more particularly to an apparatus and method of determining an optimized UI and providing that UI on a head mounted display, and to the head mounted display itself.
The UI technology is an interface method that helps users conveniently utilize diverse digital devices. In more detail, a UI is the part of a program through which the user interacts with a digital device so that the two can offer and obtain information. Examples include a command-line interface, in which the user inputs a command to run a program; a menu-driven interface, operated by menu-selection commands; and a Graphic User Interface (GUI), in which a graphical display program is operated by using position locating devices such as an optical pen, mouse, trackball, or joystick. Further, recently, a gesture UI operated by a user’s action commands and a voice recognition UI operated by the user’s voice without any action have also been developed and applied to digital devices.
In addition, as the trend goes toward minimizing the weight and size of the digital devices, diverse wearable digital devices have been developed. As one of the wearable digital devices, a head mounted display (HMD) that can be worn on the face, as if eye glasses are, has been developed. The HMD can be collaborated with technologies such as Augmented Reality technology and N Screen technology beyond the simple display use, providing users many diverse conveniences.
Accordingly, the aforementioned diverse UI technologies can be applied to the HMD. However, it is difficult to determine an optimized UI for the HMD because the HMD can be worn on a user and freely moved. Therefore, a technology providing a very efficient and convenient UI is demanded considering the characteristics of the HMD and ambient environmental conditions in the proximity of the HMD.
Accordingly, the exemplary embodiments of present invention are directed to an apparatus and method of providing a User Interface that substantially obviates one or more problems due to limitations and disadvantages of the related art.
The exemplary embodiments of present invention provide a technology providing a very efficient and convenient UI considering the characteristics of the HMD and ambient environmental conditions in the proximity of the HMD.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate exemplary embodiment(s) of the invention and together with the description serve to explain the principle of the exemplary embodiments. In the drawings:
FIGS. 1a and 1b show drawings to explain a HMD according to the exemplary embodiments of present invention;
FIG. 2 shows a block diagram for a HMD internal configuration according to the exemplary embodiments of present invention;
FIGS. 3 and 4 show flowcharts of UI mode determination process according to object location as a first exemplary embodiment of present invention;
FIGS. 5a, 5b, and 5c are drawings to show how the UI mode determination process is applied to the first exemplary embodiment of present invention;
FIGS. 6a and 6b show an example of physical UI mode (e.g., keyboard, drawing) according to the exemplary embodiments of present invention;
FIGS. 7a and 7b show an example of non-physical UI mode (e.g., voice, gesture) according to the exemplary embodiments of present invention;
FIGS. 8 and 9 show flowcharts of UI mode determination considering a HMD view angle as a second exemplary embodiment of present invention;
FIGS. 10a and 10b are drawings to show how the UI mode determination process is applied to the second exemplary embodiment of present invention;
FIGS. 11 and 12 show flowcharts of UI mode determination considering an object type as a third exemplary embodiment of present invention;
FIGS. 13a and 13b are drawings to show how the UI mode determination process is applied to the third exemplary embodiment of present invention;
FIGS. 14 and 15 show flowcharts of UI mode determination process utilizing a digital device within a view angle as a fourth exemplary embodiment of present invention;
FIG. 16 is a drawing to show how the UI mode determination process is applied to the fourth exemplary embodiment of present invention; and
FIGS. 17 and 18 show flowcharts of UI mode determination process utilizing a digital device as a fifth exemplary embodiment of present invention.
One object of the exemplary embodiments is, in providing a Head Mounted Display User Interface (HMD UI), to provide an optimized HMD UI considering the ambient environmental conditions in the proximity of the HMD. Especially, another object of the exemplary embodiments is to apply the HMD UI differently based on whether a usable object for the HMD UI exists in the proximity of the HMD.
Another object of the exemplary embodiments is to change and provide an optimized HMD UI based on the ambient environmental conditions in the proximity of the HMD that is being used at present.
Additional advantages, objects, and features of the exemplary embodiments will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the exemplary embodiments. The objectives and other advantages of the exemplary embodiments may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
To achieve these objects and other advantages and in accordance with the purpose of the exemplary embodiments, as embodied and broadly described herein, a method of providing a User Interface (UI) includes the steps of detecting whether an object exists in the proximity of the HMD and, if the object is detected, determining a distance between the detected object and the HMD. The method further includes the step of applying a physical UI mode if the detected object is within a predetermined distance from the HMD. Finally, the method includes the step of applying a non-physical UI mode if the object is not detected or is not within the predetermined distance from the HMD.
In another aspect of the exemplary embodiments, a UI apparatus comprises a sensor unit detecting whether an object exists in the proximity of the HMD and if the object is detected, the sensor unit senses a distance between the object and the HMD. The apparatus further comprises a processor controlling a User Interface (UI) of the HMD based on a result of the sensor unit. The physical UI mode is applied if the detected object is within a predetermined distance from the HMD and the non-physical UI mode is applied if the object is not detected or is not within the predetermined distance from the HMD.
It is to be understood that both the foregoing general description and the following detailed description of the embodiments are exemplary and explanatory and are intended to provide further explanation of the exemplary embodiments as claimed.
Reference will now be made in detail to the exemplary embodiments of present invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
FIGS. 1a and 1b show drawings to explain a HMD as an example according to the exemplary embodiments of present invention. In more details, FIG. 1a indicates an example of the external configuration of a HMD 100 and FIG. 1b indicates an example of the HMD 100 worn on a user 10. Accordingly, the present invention is not limited to the external configuration of the HMD 100 and clearly any external configurations of a HMD can be utilized to realize the exemplary embodiments.
Especially, the HMD 100 to which a UI is applied includes a display screen 101 and at least one sensor 102. Through the display screen 101, the HMD provides the user 10 not only with all contents and images but also with information about the UI of the exemplary embodiments. Further, the HMD 100 includes at least one sensor 102 that detects the ambient environmental conditions in the proximity of the HMD 100, and the sensing result is utilized as an important element in determining the HMD UI. Further, the HMD 100 may include a supporting component 103 so that the user 10 can wear the HMD 100 on the head, and an audio outputting unit 104 wearable in the ear.
FIG. 1b shows a drawing of the HMD 100 worn on the user 10 in order to explain the status of the HMD 100 that is being used at present. According to an embodiment to be mentioned, the HMD 100 includes a predetermined distance of a view angle area 200. The view angle area 200 is a predetermined area that corresponds to the user 10 wearing the HMD 100 and can include an area with a certain angle in the forward direction of the HMD 100. Further, according to the embodiment, the HMD 100 and an external digital device (not shown in the drawing) can be connected to communicate by a network 300. For example, usable wireless networks are Near Field Communication (NFC), Zigbee, Infrared Communications, Bluetooth, and WIFI. That is, in the exemplary embodiments, communications between the HMD 100 and the digital device can be realized by one of the enumerated networks 300 above or a combination of them. Yet, the exemplary embodiments is not limited to those mentioned above.
FIG. 2 shows a block diagram of the internal configuration of an HMD.
According to FIG. 2, the HMD 100 of the exemplary embodiment includes a processor 110, a sensor unit 120, a storage unit 130, a communications unit 140, a user input unit 150, a display control unit 160, and a UI control unit 170.
The sensor unit 120 can be equipped internally or externally in the HMD 100 and informs the processor 110 of the ambient environmental conditions recognized by the HMD 100. The sensor unit 120 can use a plurality of sensing methods. For example, the sensor unit 120 can include an object sensor 121 that detects objects or things in the proximity of the HMD 100 (hereinafter called "objects"), identifies the type of a detected object, and senses the distance between the detected object and the HMD. Further, the sensor unit 120 can include a view angle sensor 122 sensing the view angle of the HMD 100. More detailed functions and operations of the object sensor 121 and the view angle sensor 122 will be described below. The sensor unit can, for example, be equipped internally or externally in the HMD, like the sensor 102 in FIG. 1a.
Also, the plurality of sensing methods with which the sensor unit 120 can be configured includes, for example, a gravity sensor, magnetic sensor, motion sensor, gyro sensor, acceleration sensor, infrared sensor, inclination sensor, brightness sensor, elevation sensor, olfactory sensor, temperature sensor, depth sensor, pressure sensor, bending sensor, audio sensor, video sensor, Global Positioning System (GPS) sensor, and touch sensor. Yet, the exemplary embodiments are not limited to those enumerated. That is, it is sufficient that the sensor unit 120 senses an HMD user and the ambient environmental conditions in his or her proximity and sends the sensing result so that the processor 110 can operate accordingly; the detailed sensing method of the sensor unit 120 is not limited to the enumerated sensing methods.
Further, the storage unit 130 can store diverse digital data such as video, audio, pictures, movie clips, and applications. The storage unit 130 represents diverse digital data storage spaces such as flash memory, Random Access Memory (RAM), and a Solid State Drive (SSD).
Further, the communications unit 140 transmits and receives data by communicating with external digital devices using diverse protocols. The HMD 100 of the exemplary embodiment performs pairing and communication connection with digital devices in the proximity of the HMD 100 by using the communications unit 140. The communications unit 140 can include a plurality of antennas. The HMD 100 detects the location of a digital device that is communicating with the HMD by using the plurality of antennas, that is, by using the time and altitude differences between the signals transmitted or received through the plurality of antennas.
Further, the user input unit 150 is a device that receives user control commands for controlling the HMD 100. The user control commands can include user configuration commands. For example, according to an embodiment of the exemplary embodiments, the user 10 may use the user control information from the user input unit 150 to set the HMD UI to operate in correspondence to the ambient environmental conditions in the proximity of the HMD, or to predetermine a certain UI mode regardless of the ambient environmental conditions. Thus, when the HMD UI is set, based on the predetermined setting by the user, to operate in correspondence to the ambient environmental conditions, the UI mode can automatically change according to the embodiment of the exemplary embodiments. Meanwhile, when a certain UI mode is fixed, based on the predetermined setting by the user, regardless of the ambient environmental conditions, the fixed UI mode operates in an "On" status if it is in an operable status and otherwise operates in an "Off" status.
Also, the display control unit 160 outputs moving pictures or images on the HMD display screen 101. Further, according to an embodiment, the display control unit 160 provides an imaginary UI screen on an object external to the HMD or performs a controlling function for providing a UI screen to an external digital device. In addition, according to an embodiment, the display control unit 160 provides information for determining the UI for the HMD display screen 101, or images according to the determined UI. A detailed description will be provided below.
Also, the UI control unit 170 provides the HMD user with a UI and controls the provided UI. The UI control unit 170 includes a physical UI control unit 171 and a non-physical UI control unit 172.
The physical UI control unit 171 controls a UI with which the user can have physical contact; examples include a virtual keyboard UI and a drawing UI. The virtual keyboard UI is a UI method that displays a virtual keyboard on the surface of an object detected in the proximity of the HMD, receives a command through the user's keyboard touch, and operates accordingly. In addition, the drawing UI is a UI method that provides an imaginary drawing panel on the surface of the detected object in the proximity of the HMD, on which the user inputs a command using a drawing tool such as an electronic pen or a finger. Further, the non-physical UI control unit 172 controls a UI that does not involve physical contact with the user; examples include a gesture UI and a voice recognition UI. Detailed descriptions of these UI methods will be provided below.
The physical and non-physical UIs illustrated in the exemplary embodiments are merely examples of UI methods and are not limited to the aforementioned virtual keyboard, drawing, gesture, and voice recognition UIs. That is, the physical UI includes all UIs that involve physical contact with an object, and the non-physical UI includes all UIs that do not need physical contact with an object.
The processor 110 is the main controller of the HMD; it controls not only each block of the HMD 100 but also the transmission and reception of information and data between the blocks. In the following, the detailed processes of the exemplary embodiments are performed mainly under the control of the processor 110. The internal configuration block diagram of the HMD 100 shown in FIG. 2 is illustrated as an embodiment for explanation purposes only. Thus, the blocks shown in FIG. 2 may be combined as a whole, or some blocks may be separated and combined as necessary. For example, the processor 110 may be combined with the UI control unit 170 into a single controller.
FIGS. 3 and 4 show flowcharts of the HMD UI mode determination process according to the object location, as a first embodiment of the exemplary embodiments. Further, FIGS. 5a, 5b, 5c, 6a, 6b, 7a, and 7b are drawings to explain the exemplary embodiments of the present invention.
In the exemplary embodiments, an object is a physical thing or entity that exists in the proximity of the user wearing the HMD; examples include a wall, a table, and a ball. The first embodiment of the exemplary embodiments applies a UI mode based on the detection of an object in the proximity of the HMD 100. Further description is as follows.
The UI mode determination process to be applied to the HMD is initiated by a user's request or by automatic system settings (S110). For example, S110 can be operated based on settings determined by the user. In more detail, the user 10 can predetermine settings so that the HMD UI operates according to the ambient environmental conditions in the proximity of the HMD, in which case the processor 110 controls the HMD so that the HMD UI automatically changes according to the ambient environmental conditions. In the following, the exemplary embodiments of the present invention describe an optimized HMD UI determination process for the case in which the HMD UI is preset to operate according to the ambient environmental conditions.
According to the first embodiment of the exemplary embodiments, the HMD UI mode determination process comprises the steps of object location determination (S120) and HMD UI mode determination (S130). When the HMD UI mode process begins, the HMD processor 110 detects an object in the proximity of the HMD and determines the location of the object through the object sensor 121 (S121). Based on the result of the determination in S121, the processor 110 classifies the relationship between the HMD and the object into three statuses.
For example, the case in which an object is detected and the detected object stays within a distance in which physical feedback is possible is called the F1 status (S122, FIG. 5a). The case in which an object is detected but does not stay within the distance in which physical feedback is possible is called the F2 status (S123, FIG. 5b). Lastly, the case in which an object does not exist in the proximity of the HMD is called the F3 status (S124, FIG. 5c).
The F1, F2, and F3 statuses of FIGS. 5a, 5b, and 5c are further illustrated as follows. The processor 110 of the HMD 100 determines whether an object 400 exists and where it is located through the object sensor 121. For example, FIGS. 5a and 5b show cases in which the object 400 is detected in the proximity of the HMD, and FIG. 5c shows a case in which an object does not exist. Further, FIG. 5a shows a case in which the distance D1 between the detected object 400 and the HMD is less than a predetermined distance Th (S122). On the other hand, FIG. 5b shows a case in which the distance D2 between the detected object 400 and the HMD is larger than the predetermined distance Th. The predetermined distance Th can be set as a distance within which the user can make physical contact with and touch the object 400. Therefore, the F1 status of FIG. 5a indicates a status in which the user can touch the object 400, and the F2 status of FIG. 5b indicates a status in which the user cannot touch the object 400 even though it exists. Further, the F3 status of FIG. 5c is a status in which the object 400 does not exist in the proximity of the HMD.
When the status of the object in the proximity of the HMD is determined as F1 (S122), F2 (S123), or F3 (S124) through the step of S121, the HMD processor 110 selects an HMD UI and operates it through the UI control unit 170. For example, in the case of the F1 status (S122), the aforementioned physical UI mode is applied (S131), and in the case of the F2 status (S123) or the F3 status (S124), the aforementioned non-physical UI mode is applied (S132, S133). These physical and non-physical UI modes can also be referred to as the object mode and the non-object mode, respectively.
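The distance test above can be summarized in a few lines of pseudocode-like Python. The following is only a minimal sketch and not part of the original disclosure; the threshold value, function names, and mode labels are illustrative assumptions.

```python
# Minimal sketch of the first embodiment's mode selection (illustrative values only).
PHYSICAL_FEEDBACK_DISTANCE_TH = 0.6  # meters; stands in for the predetermined distance "Th"

def determine_object_status(object_detected: bool, distance_m: float) -> str:
    """Classify the HMD-object relationship as F1, F2, or F3 (cf. S121-S124)."""
    if not object_detected:
        return "F3"          # no object exists in the proximity of the HMD
    if distance_m <= PHYSICAL_FEEDBACK_DISTANCE_TH:
        return "F1"          # object is close enough for physical feedback
    return "F2"              # object detected, but out of physical reach

def select_ui_mode(status: str) -> str:
    """Apply the physical UI mode only in the F1 status (cf. S131/S132)."""
    return "physical" if status == "F1" else "non-physical"

# Example: an object detected 0.4 m away yields the physical UI mode.
print(select_ui_mode(determine_object_status(True, 0.4)))   # -> physical
```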
In addition, even while a certain HMD UI is being applied through S131, S132, or S133, the processor 110 continuously detects an object and determines its location (S121), and when the status changes, for example from F1 to F3 or from F2 to F1, the HMD UI mode can be automatically changed. In an embodiment, the user 10 may be informed of the change when the UI mode automatically changes.
The UI mode applied in the F1 status is one that the user 10 can directly contact or touch; it can be, for example, the virtual keyboard UI method shown in FIG. 6a or the drawing UI method shown in FIG. 6b. Yet, these are only embodiments of the exemplary embodiments, and other diverse UI methods that can be physically touched can clearly exist.
For example, the virtual keyboard UI method of FIG. 6a displays a virtual keyboard 410 on the surface of the detected object 400 and generates a command that the user directly inputs by touching the virtual keyboard 410. The corresponding object 400 provides the user 10 with a touch feeling so that the user 10 can efficiently use the virtual keyboard 410. Also, the drawing UI method of FIG. 6b is a method in which, for example, a virtual window 420 that can be drawn on is displayed on the surface of the detected object 400 and the user 10 generates desired commands by using a pen 430. The corresponding object 400 provides the user 10 with a touch feeling so that the user 10 can efficiently use the pen 430.
When the physical UI mode is selected as the HMD UI in S131, one of a plurality of physical mode UIs can be selected by the user's settings or by the system's settings. For example, since the user 10 can predetermine the settings by using the user input unit 150, either the virtual keyboard UI shown in FIG. 6a or the drawing UI method shown in FIG. 6b can be preliminarily set as a default for the physical UI mode. Or, when the user's settings do not exist, the UI control unit 170 can determine whether a drawing input device such as the pen 430 exists. If a drawing input device exists, the drawing UI method shown in FIG. 6b can be preliminarily selected, and if a drawing input device does not exist, the virtual keyboard shown in FIG. 6a can be preliminarily selected. Also, although a UI method has been preliminarily selected, a different UI method can be used if there is any change. For example, when the user 10 is using the virtual keyboard UI method shown in FIG. 6a and grabs a drawing input device by hand, the drawing UI method shown in FIG. 6b can be automatically used instead. Also, with regard to the originally determined UI mode, the user can change the UI mode at any time when the user desires to change to a certain UI mode.
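As a rough illustration of the default/tool-based choice just described, a sketch might look as follows. This is not part of the original disclosure; the setting names and the "drawing tool present" flag are assumptions made only for illustration.

```python
# Hypothetical sketch of choosing between the virtual keyboard UI and the drawing UI.
def select_physical_ui(user_default, drawing_tool_present: bool) -> str:
    """Prefer the user's preset default; otherwise pick by drawing-tool availability."""
    if user_default in ("virtual_keyboard", "drawing"):
        return user_default                  # preset via the user input unit 150
    return "drawing" if drawing_tool_present else "virtual_keyboard"

print(select_physical_ui(None, drawing_tool_present=True))   # -> drawing
```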
Further, when the virtual keyboard UI method shown in FIG. 6a is applied as the physical UI, the location of the virtual keyboard 410 on the surface of the object can be controlled in various ways. Further illustration is as follows.
For example, the virtual keyboard 410 can be created at a point where the hand of the user 10 is located. That is, the processor 110 determines whether the hand of the user 10 is approaching near or touching the surface of the object 400 and controls the virtual keyboard 410 to be generated at the point where the hand is located. Accordingly, as the virtual keyboard is created at the desired point on the surface of the object, the user can conveniently utilize the virtual keyboard.
Also, different types and sizes of virtual keyboards can be created depending on whether the user is using only one hand or both hands. For example, the UI control unit 170 is equipped with a one-hand virtual keyboard, such as a small-size keyboard, and a two-hand virtual keyboard, such as a large-size keyboard, and the processor 110 controls generation of either the one-hand or the two-hand keyboard by determining the number of fingers that are approaching near or touching the object.
In addition, the location of the virtual keyboard 410 can be determined based on the user's view angle. For example, the processor 110 can determine whether the user is using a first view angle (the view angle of the right eye), a second view angle (the view angle of the left eye), or both. The processor 110 then controls the virtual keyboard 410 so that it is located at an appropriate point corresponding to the view angle. For example, the appropriate point corresponding to the view angle can be the center point of the corresponding view angle when only one view angle is used, or the overlapping point of the corresponding view angles when both view angles are used.
Further, all the aforementioned embodiments can be combined and used. That is, for example, the processor 110 can determine the type of the virtual keyboard by determining whether the user 10 is using one hand or two hands and generate the determined virtual keyboard on the surface of the object that the hand or hands of the user 10 are approaching near or touching. Also, the processor 110 can first determine the type of the virtual keyboard by determining whether the user 10 is using one hand or two hands and generate the determined virtual keyboard at an appropriate point of the view angle that the user is using. Also, the processor 110 can first determine the type of the virtual keyboard by determining whether the user 10 is using one hand or two hands, generate the determined virtual keyboard at an appropriate point of the view angle that the user is using, and move the generated virtual keyboard onto the surface of the object that the hand of the user 10 is approaching near or touching.
Further, the processor 110 can first determine the type of the virtual keyboard by determining whether the user 10 is using one hand or two hands, and generate the determined virtual keyboard at the most appropriate location by comparing the appropriate point of the view angle with the location on the surface of the object that the user 10 is approaching or touching. For example, when the hand of the user 10 is not within the view angle, the virtual keyboard can be created at the boundary of the view angle, because it is determined that the hand has nothing to do with the virtual keyboard. On the other hand, when the hand of the user 10 is within the view angle, the virtual keyboard can be created at the location of the hand first, because it is determined that the hand is preparing to use the virtual keyboard.
The diverse examples regarding the type and location of the aforementioned virtual keyboard 410 can be applied in the same way to the type of the window 420 for the drawing UI shown in FIG. 6b and to the determination of the location of the window 420.
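To make the placement rules above concrete, the following sketch combines the hand-count, hand-position, and view-angle checks. It is not part of the original disclosure; the geometric helpers and return values are hypothetical and only illustrate the comparisons described in the text.

```python
# Hypothetical sketch of virtual keyboard type and placement (applies equally to the drawing window 420).
def place_virtual_keyboard(fingers_near_object: int, hand_pos, view_angle):
    """hand_pos: (x, y) on the object surface, or None; view_angle: an object assumed to
    provide contains(point), boundary_point(), and center() helpers."""
    kb_type = "two_hand" if fingers_near_object > 5 else "one_hand"
    if hand_pos is None:
        return kb_type, view_angle.center()          # no hand detected: use the view-angle point
    if view_angle.contains(hand_pos):
        return kb_type, hand_pos                     # hand inside the view angle: follow the hand
    return kb_type, view_angle.boundary_point()      # hand outside: keep the keyboard at the boundary
```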
Also, the non-physical UI mode applied in the F2 and F3 statuses is a UI mode in which the user 10 does not physically utilize the object 400; it can be, for example, the voice recognition UI method shown in FIG. 7a, the gesture UI method shown in FIG. 7b, or a UI using eye movements. Yet, these are only embodiments of the present invention and, as mentioned, other diverse UI methods that have no physical contact with the object 400 clearly exist.
For example, the voice recognition UI method of FIG. 7a displays, on the screen of the HMD display, an icon 440 indicating that the voice recognition UI is being executed; once the voice of the user 10 is inputted, the voice command is recognized and converted through the voice recognition process in the non-physical UI mode control unit 172, and the corresponding command is performed by the processor 110. Also, the gesture UI method of FIG. 7b displays, on the screen of the HMD display, an icon 450 indicating that the gesture UI is being executed, and gestures of the user 10 such as a finger movement 451 and a head movement (not shown) are used to input commands. Once the user's gesture is inputted, the gesture command is recognized and converted through the gesture recognition process in the non-physical UI mode control unit 172, and the corresponding command is performed by the processor 110.
When the non-physical UI mode is determined to be applied as the HMD UI in the steps of S132 and S133, any one of the plurality of aforementioned non-physical mode UIs can be selected by the user's settings or by the system settings. For example, since the user 10 can predetermine the settings by using the user input unit 150, either the voice recognition UI method or the gesture UI method can be preliminarily selected as a default when the non-physical UI mode is determined. In addition, when the user's settings do not exist, the UI control unit 170 analyzes the noise around the user 10; the voice recognition UI method shown in FIG. 7a is selected if the noise is below a certain level, and the gesture UI method shown in FIG. 7b is preliminarily selected if the noise is above the certain level. Also, although a UI method is originally determined, a different UI method can be used if there is any change. For example, when the noise around the user 10 changes from below to above the certain level, the voice recognition UI method shown in FIG. 7a may be automatically changed to the gesture UI method shown in FIG. 7b. Further, with regard to the originally determined UI mode, the user can change the UI method any time the user desires. Moreover, as for the non-physical UI mode, the voice recognition UI method shown in FIG. 7a and the gesture UI method shown in FIG. 7b can, for example, be set to operate at the same time, in which case the user 10 can utilize the voice recognition UI, the gesture UI, or both to send commands.
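A sketch of the noise-based selection of the non-physical UI method follows. The decibel threshold is an assumption, since the disclosure only refers to "a certain level" of noise, and the method labels are illustrative.

```python
# Hypothetical sketch of selecting a non-physical UI method by ambient noise level.
NOISE_LEVEL_TH_DB = 60.0   # assumed stand-in for the "certain level" of noise

def select_non_physical_ui(user_default, ambient_noise_db: float) -> str:
    if user_default in ("voice", "gesture"):
        return user_default                      # preset via the user input unit 150
    return "voice" if ambient_noise_db < NOISE_LEVEL_TH_DB else "gesture"

print(select_non_physical_ui(None, 72.0))        # noisy environment -> gesture
```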
FIGS. 8 and 9 show flowcharts of the UI mode determination method considering the HMD view angle, as a second embodiment of the exemplary embodiments. Further, FIGS. 10a and 10b are drawings showing how the UI mode determination process is applied in the second embodiment. The second embodiment of the exemplary embodiments additionally considers the view angle, compared to the first embodiment, when the UI mode is determined. Further illustration is as follows.
According to the second embodiment of the exemplary embodiments, the HMD UI mode determination process includes the steps of object location determination (S220), view angle determination (S230), and HMD UI mode determination (S240). Once the HMD UI mode determination process begins (S210), the processor 110 detects an object in the proximity of the HMD and determines the location of the detected object (S221). After the step of S221, the processor 110 determines whether the relationship between the HMD and the object corresponds to one of the aforementioned F1, F2, or F3 statuses. For example, the case in which an object is detected and the detected object stays within a distance in which physical feedback is possible is called the F1 status (S222). The case in which an object is detected but does not stay within the distance in which physical feedback is possible is called the F2 status (S223). Lastly, the case in which an object does not exist in the proximity of the HMD is called the F3 status (S224).
Then, after determining the F1 status (S222), the processor 110 of the HMD 100 further determines whether the detected object is within the HMD view angle through the view angle sensor 122. For example, according to S231, the view angle determination can result in one of two statuses. Further description is as follows with reference to FIGS. 10a and 10b.
FIG. 10a shows the case in which the object 400 is within the HMD view angle, which is called the S1 status. Further, FIG. 10b shows the case in which the object 400 is not within the HMD view angle, which is called the S2 status. That is, the S1 status indicates that the object 400 in the proximity of the HMD exists not only within the predetermined distance but also within the view angle of the user 10. On the other hand, the S2 status indicates that the object 400 in the proximity of the HMD is within the predetermined distance but not within the view angle of the user.
If the HMD processor 110 confirms the S1 status (S232) through the step of S231, the HMD UI mode may be determined as the physical UI mode and operated as the virtual keyboard UI or the drawing UI through the aforementioned physical UI mode control unit 171 (S241). The physical UI types and operation methods shown in the first embodiment can be applied to the second embodiment in the same way.
Further, if the HMD processor 110 confirms the S2 status (S233) through the step of S231, the location of the object 400 is continuously confirmed for a predetermined period of time, for example five seconds (5s) or ten seconds (10s) (S234). In the step of S234, if the object 400 is re-detected within the view angle within the predetermined period of time, that is, if the S2 status changes to the S1 status, the physical UI mode is maintained (S242). Further, if the non-physical UI mode is currently applied, it is changed to the physical UI mode (S242). The fact that the status of the user 10 changes from the S2 status to the S1 status within the predetermined period of time indicates that the user 10 temporarily looked away from the object 400 and did not intend to look away from it for good. That is, in that case, due to the user's temporary eye movement (presuming a short-term intention), it is appropriate that the physical UI mode be maintained, or that the UI mode be changed to the physical UI mode if the then-existing UI mode is the non-physical UI mode.
On the other hand, in the step of S234, if the object 400 is not detected within the view angle within the predetermined period of time (e.g., five or ten seconds), that is, if the S2 status is maintained or changes to the F2 or F3 status, the UI mode changes to the non-physical UI mode (S243). Or, if the physical UI mode is currently applied, it is changed to the non-physical UI mode (S243).
In the step of S234, if the S2 status of the user 10 is maintained for the predetermined period of time, it is deemed that the user 10 intended to look away from the object 400 for a long term. That is, in that case, due to the user's intention of looking away for a long term, it is appropriate to maintain the non-physical UI mode, or to change to the non-physical UI mode if the then-existing UI mode is the physical UI mode. Further, through the step of S234, if the status of the user 10 changes to the F2 or F3 status within the predetermined period of time, it is deemed that the user 10 is moving away from the object 400. That is, in that case, as the object 400 can no longer be utilized for the physical UI, the non-physical UI mode may be maintained, and if the then-existing UI mode was the physical UI mode, it may be changed to the non-physical UI mode.
Further, in the case of the physical UI mode being applied as shown in the steps of S241 and S242, the processor 110 continuously determines whether the object 400 used for the physical UI has moved out of the HMD view angle (S241/S242 → S231). On the other hand, in the case of the non-physical UI mode being applied as shown in the steps of S243 and S244, the processor 110 continuously determines whether the object 400 is detected within the predetermined distance (S243/S244 → S221).
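The look-away timer of the second embodiment can be sketched as a simple polling loop. This is not part of the original disclosure; the polling interval, default timeout, and callback are assumptions used only to illustrate the flow of steps S231 through S243.

```python
# Hypothetical sketch of the second embodiment's look-away timer (steps S231-S243).
import time

def ui_mode_with_view_angle(object_in_view_angle, timeout_s: float = 5.0) -> str:
    """object_in_view_angle: callable returning True while the object is in the view angle."""
    if object_in_view_angle():
        return "physical"                            # S1 status -> physical UI mode (S241)
    deadline = time.monotonic() + timeout_s          # S2 status: start the predetermined period (S234)
    while time.monotonic() < deadline:
        if object_in_view_angle():
            return "physical"                        # temporary glance away -> keep/restore physical (S242)
        time.sleep(0.1)
    return "non-physical"                            # long-term look-away -> non-physical UI mode (S243)
```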
FIGS. 11 and 12 show flowcharts of the UI mode determination process regarding the object type, as a third embodiment of the exemplary embodiments. Further, FIGS. 13a and 13b show drawings of how the UI mode determination is applied in the third embodiment. The third embodiment additionally considers the object type, compared to the second embodiment. Further illustration is as follows.
According to the third embodiment of the exemplary embodiments, the HMD UI mode determination process includes the steps of object location determination (S320), view angle determination (S330), object type determination (S340), and HMD UI mode determination (S350). Once the HMD UI mode determination process begins (S310), the HMD processor 110 detects an object in the proximity of the HMD by the object sensor 121 and determines the location of the detected object (S321). After the step of S321, the processor 110 determines the relationship between the HMD and the object as one of the aforementioned F1, F2, and F3 statuses. For example, the case in which an object is detected and the detected object stays within a distance in which physical feedback is possible is called the F1 status (S322). The case in which an object is detected but does not stay within the distance in which physical feedback is possible is called the F2 status (S323). Lastly, the case in which an object does not exist in the proximity of the HMD is called the F3 status (S324).
Then, in the case of having determined the F1 status (S322), the processor 110 of the HMD 100 further determines whether the detected object is within the HMD view angle (S331). For example, according to the step of S331, either the S1 or the S2 status can be determined.
Through the step of S331, if the HMD processor 110 confirms that the relationship between the current HMD and the object is the S2 status (S333), the location of the object 400 is continuously confirmed for the predetermined period of time (e.g., five or ten seconds). In the step of S334, if the object is detected within the view angle within the predetermined period of time, that is, if the status changes to the S1 status, the physical UI mode is maintained or the mode is changed to the physical UI mode (S352). That is, through the step of S334, the fact that the status of the user 10 changes from S2 to S1 within the predetermined period of time indicates that the user 10 temporarily looked away from the object 400 and did not intend to look away from it for good. In that case, due to the user's intention of looking away for a short term, it is appropriate that the physical UI mode be maintained, or that the then-existing UI mode be changed to the physical UI mode if it is the non-physical UI mode.
On the other hand, in the step of S334, if the object is not detected within the view angle for the predetermined period of time, that is, if the S2 status is maintained or changes to the F2 or F3 status, the non-physical UI mode may be maintained or the UI mode may be changed to the non-physical UI mode (S353). That is, through the step of S334, if the S2 status of the user 10 is maintained for the predetermined period of time, it is deemed that the user 10 looked away from the object and intended to do so for a long term. Thus, in that case, the non-physical UI mode may be maintained, or the UI mode may change to the non-physical UI mode if the then-existing UI mode is the physical UI mode. Further, through the step of S334, if the status of the user 10 changes to the F2 or F3 status within the predetermined period of time, it is deemed that the user 10 is moving away from the object 400. That is, in that case, as the object 400 can no longer be utilized for the physical UI, the non-physical UI mode may be maintained, or the UI mode may change to the non-physical UI mode if the UI mode is the physical UI mode.
On the other hand, if, through the step of view angle determination S331, the relationship between the current HMD and the object is confirmed to be the S1 status (S332), the HMD processor 110 further determines the object type (S341). The object type relates to the external shape of an object and can be categorized based on whether the object is user-interfaceable. For example, the wall 461 and the table 462 shown in FIG. 13a are of Type 1, with which the user can easily make contact, i.e., which is user-interfaceable. In contrast, the basketball 463 shown in FIG. 13b, for example, is of Type 2, with which the user cannot easily make contact, i.e., which is not user-interfaceable.
When the HMD processor 110 determines through the step of object type determination S341 that the corresponding object is of Type 1, it determines the physical UI mode as the HMD UI mode (S351). Then, through the aforementioned physical UI mode control unit 171, UI methods such as the virtual keyboard UI and the drawing UI, which can involve contacting or touching Type 1 objects such as the wall 461 and the table 462, are operated.
Further, when the HMD processor 110 determines through the step of object type determination S341 that the corresponding object is of Type 2, it selects the non-physical UI mode as the HMD UI mode (S353). Then, through the aforementioned non-physical UI mode control unit 172, applicable non-physical UI methods such as the voice recognition UI and the gesture UI are operated, even though the Type 2 object 463 exists within the predetermined distance and the view angle. The types and operating methods of the physical and non-physical UIs shown in the first embodiment are applied in the same way in the third embodiment.
Further, in the case of the physical UI mode being applied as shown in S351 and S352, the processor 110 continuously determines whether the object 400 used for the physical UI has moved out of the HMD view angle (S351/S352 → S331). On the other hand, in the case of the non-physical UI mode being applied as shown in S353 and S354, the processor 110 continuously determines whether the object 400 is detected within the predetermined distance (S353/S354 → S321).
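The object-type branch of the third embodiment reduces to a membership test. In the sketch below, the type labels are taken from the examples in FIGS. 13a and 13b; the helper itself and the assumption that the S1 status has already been confirmed are illustrative only.

```python
# Hypothetical sketch of the third embodiment's object-type branch (step S341).
USER_INTERFACEABLE_TYPES = {"wall", "table"}    # Type 1 examples (FIG. 13a); a ball would be Type 2

def ui_mode_by_object_type(object_type: str) -> str:
    """Assumes the S1 status (object within distance and view angle) has already been confirmed."""
    return "physical" if object_type in USER_INTERFACEABLE_TYPES else "non-physical"

print(ui_mode_by_object_type("basketball"))     # Type 2 object -> non-physical
```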
FIGS. 14 and 15 show flowcharts of the UI mode determination considering detection of a new digital device, as a fourth embodiment. Further, FIG. 16 shows a drawing of how the UI mode determination is applied in the fourth embodiment. The fourth embodiment of the exemplary embodiments additionally considers detection of a new digital device, compared to the second embodiment. Further description is as follows.
According to the fourth embodiment of the exemplary embodiments, the HMD UI mode determination process includes the steps of object location determination (S420), view angle determination (S430), digital device detection and location determination (S440), and HMD UI mode determination (S450). Once the HMD UI mode determination process begins (S410), the HMD processor 110 detects an object in the proximity of the HMD by the object sensor 121 and determines the location of the object (S421). After the step of S421, the processor 110 determines the relationship between the HMD and the object as one of the aforementioned F1, F2, and F3 statuses. For example, the case in which an object is detected and the detected object stays within a distance in which physical feedback is possible is called the F1 status (S422). The case in which an object is detected but does not stay within the distance in which physical feedback is possible is called the F2 status (S423). Lastly, the case in which an object does not exist in the proximity of the HMD is called the F3 status (S424).
Then, in the case of having determined the F1 status (S422), the processor 110 of the HMD 100 further determines, by the view angle sensor 122, whether the detected object is within the HMD view angle (S431). For example, according to the step of S431, either the S1 or the S2 status can be determined.
Through the step of S431, if the HMD processor 110 determines that the detected object is not within the view angle (S2 status, S433), it determines whether a new digital device exists within the predetermined distance (S441). For example, according to FIG. 16, the HMD user 10 can look away from the originally detected object 400 toward a nearby digital device 500. In that case, if the new device 500 is detected within the predetermined distance through the step of S441, the HMD processor 110 tries to establish a communication connection with the corresponding digital device 500 by using the communications unit 140. Once communications between the HMD 100 and the device 500 are connected, the HMD processor 110 utilizes a display means 510 of the digital device 500 as the physical UI mode by using the UI control unit 170 (S452).
Also, in the step of S441, if a new digital device is not detected within the predetermined distance, the HMD processor 110 interprets that the user intended to look away for a long term and maintains the non-physical UI mode. If the original UI mode is the physical UI mode, it may be changed to the non-physical UI mode.
On the other hand, through the step of view angle determination S431, if the HMD processor 110 determines that the detected object is within the view angle (S1 status, S432), the HMD processor 110 operates the physical UI mode as the HMD UI mode (S451). The types and operating methods of the physical and non-physical UIs shown in the first embodiment are applied in the same way in the fourth embodiment.
In addition, although it is not shown in the drawing, according to the step of S451, if a new digital device is detected within the predetermined distance while the object within the predetermined distance is being used for the physical UI, the display function of the corresponding new digital device can also be utilized as the physical UI mode. For example, the physical UI on the surface of the object may be removed and applied instead to the display of the digital device. Also, the physical UI may be maintained on the surface of the object while the display function of the digital device is used as another physical UI, different from the physical UI on the surface of the object. For example, the physical UI on the object surface can be used as the virtual keyboard UI and the display function of the digital device can be used as the drawing UI. Also, for example, the physical UI on the object surface can be used as the number pad of the virtual keyboard and the display function of the digital device can be used as the letter pad of the virtual keyboard.
Further, in the case of the physical UI mode being applied as shown in S451 and S452, the processor 110 continuously determines whether the object 400 used for the physical UI or the digital device 500 has moved out of the HMD view angle (S451/S452 → S431). On the other hand, in the case of the non-physical UI mode being applied as shown in S453 and S454, the processor 110 continuously determines whether the object is detected within the predetermined distance (S453/S454 → S421).
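The device handover of the fourth embodiment can be sketched as follows. The connection callback and the mode labels are assumptions illustrating steps S441 and S452, not part of the original disclosure.

```python
# Hypothetical sketch of the fourth embodiment's handover to a nearby digital device.
def ui_mode_on_look_away(new_device, connect) -> str:
    """new_device: device found within the predetermined distance, or None;
    connect: callable that pairs with the device and returns True on success."""
    if new_device is None:
        return "non-physical"                    # nothing to hand over to: long-term look-away
    if connect(new_device):                      # pair via the communications unit 140 (S441)
        return "physical_on_device_display"      # use the device display as the physical UI (S452)
    return "non-physical"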
FIGS. 17 and 18 show flowcharts of the UI mode determination considering the detection of a digital device, as a fifth embodiment of the exemplary embodiments. Further illustration is as follows.
According to the fifth embodiment, the HMD UI mode determination process includes the steps of digital device detection and location determination (S520) and HMD UI mode determination (S530). When the HMD UI mode determination process begins (S510), the HMD processor 110 detects a digital device in the proximity of the HMD using the object sensor 121 and the communications unit 140 and determines the location of the detected digital device (S521). After the step of S521, the processor 110 determines the relationship between the HMD and the digital device as one of the aforementioned F1, F2, and F3 statuses. For example, the case in which a device is detected and the detected device stays within a distance in which physical feedback is possible is called the F1 status (S522). The case in which a device is detected but does not stay within the distance in which physical feedback is possible is called the F2 status (S523). Lastly, the case in which a device does not exist in the proximity of the HMD is called the F3 status (S524).
If the HMD processor 110 determines the F1 status (S522), the HMD processor 110 establishes a communication connection through the communications unit 140 (S531). When the communication connection is completed, the HMD processor 110 operates the physical UI mode by using the display in the device through the aforementioned UI control unit 170 (S532). That is, the display equipped in the corresponding device can be utilized as the HMD virtual keyboard. On the other hand, if the HMD processor 110 determines the F2 status (S523) or the F3 status (S524), the HMD processor 110 operates the non-physical UI mode through the aforementioned UI control unit 170 (S533).
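A compact sketch of the fifth embodiment's device-based selection follows; the helper names and parameters are assumptions illustrating the flow of steps S521 through S533.

```python
# Hypothetical sketch of the fifth embodiment: a nearby digital device drives the mode choice.
def ui_mode_for_device(device_detected: bool, distance_m: float,
                       threshold_m: float, connect) -> str:
    if device_detected and distance_m <= threshold_m:      # F1 status (S522)
        if connect():                                       # communication connection (S531)
            return "physical_on_device_display"             # device display used as the physical UI (S532)
    return "non-physical"                                   # F2/F3 status -> non-physical UI mode (S533)
```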
It will be apparent to those skilled in the art that various modifications and variations can be made in the exemplary embodiments without departing from the spirit or scope of the exemplary embodiments. Thus, it is intended that the exemplary embodiments cover the modifications and variations of the exemplary embodiments provided they come within the scope of the appended claims and their equivalents.

Claims (20)

  1. A User Interface (UI) apparatus for a Head Mounted Display (HMD) comprising:
    a sensor unit detecting whether an object exists in the proximity of the HMD and if the object is detected, sensing a distance between the object and the HMD; and
    a processor controlling a User Interface (UI) of the HMD based on a result of the sensor unit such that a physical User Interface (UI) mode is applied if the detected object is within a predetermined distance from the HMD and a non-physical User Interface (UI) mode is applied if the object is not detected or is not within the predetermined distance from the HMD.
  2. The apparatus of claim 1, wherein the physical UI mode displays an imaginary User Interface (UI) function on a surface of the detected object.
  3. The apparatus of claim 2, wherein the physical UI mode comprises at least one of a virtual keyboard User Interface (UI) or a drawing User Interface (UI).
  4. The apparatus of claim 1, wherein the non-physical UI mode operates in a status in which the non-physical UI does not have a physical contact with the object.
  5. The apparatus of claim 4, wherein the non-physical UI mode comprises at least one of a voice recognition User Interface (UI) or a gesture User Interface (UI).
  6. The apparatus of claim 1, wherein the processor selects the physical UI mode if the detected object is within the predetermined distance and the detected object is within a view angle.
  7. The apparatus of claim 6, wherein the processor changes to the non-physical UI mode if the detected object is within the view angle but is not user-interfaceable.
  8. The apparatus of claim 1, wherein the processor selects the non-physical UI mode if the detected object is within the predetermined distance from the HMD but not within the view angle.
  9. The apparatus of claim 8, wherein the processor changes to the physical UI mode if the detected object is not within the view angle but returns within the view angle within a predetermined period of time.
  10. The apparatus of claim 8, wherein the processor stays the non-physical UI mode if the detected object is not within the view angle and does not return within the view angle within the predetermined period of time.
  11. The apparatus of claim 1, wherein the processor selects the non-physical UI mode if the detected object is within the predetermined distance from the HMD but not within the view angle, and the detected object does not return within the view angle within the predetermined period of time.
  12. The apparatus of claim 1, wherein the processor applies a display function of a new digital device as the physical UI mode if the detected object is within the predetermined distance from the HMD but not within the view angle, and the new digital device is detected within the predetermined distance within a certain period of time.
  13. The apparatus of claim 1, wherein the processor applies the display of the new digital device as the physical UI mode if the object is not detected or is not within the predetermined distance from the HMD, and the new digital device is detected within the predetermined distance within the certain period of time.
  14. A User Interface (UI) apparatus for a Head Mounted Display (HMD) comprising:
    a communications unit performing communications with a digital device in the proximity of the HMD;
    a sensor unit detecting whether the digital device exists in the proximity of the HMD, and if the digital device is detected, sensing a distance between the digital device and the HMD; and
    a processor controlling a User Interface (UI) of the HMD based on a result of the sensor unit, wherein a display function of the digital device is applied as a physical User Interface (UI) mode if the detected digital device is within a predetermined distance from the HMD and a non-physical User Interface (UI) mode is applied if the digital device is not detected or is not within the predetermined distance from the HMD.
  15. A method providing a User Interface (UI) for a Head Mounted Display (HMD), the method comprising:
    detecting whether an object exists in the proximity of the HMD;
    if the object is detected, determining a distance between the detected object and the HMD; and
    if the object is within a predetermined distance from the HMD, applying a physical User Interface (UI) mode, and if the object is not detected or is not within the predetermined distance from the HMD, applying a non-physical UI mode.
  16. The method of claim 15 comprises selecting the physical UI mode if the detected object is within the predetermined distance from the HMD and within a view angle.
  17. The method of claim 16 comprises changing to the non-physical UI mode if the detected object is within the view angle but is not user-interfaceable.
  18. The method of claim 15 comprises selecting the non-physical UI mode if the detected object is within the predetermined distance from the HMD but not within the view angle.
  19. The method of claim 18 comprises changing to the physical UI mode if the detected object is not within the view angle but returns within the view angle within a predetermined period of time.
  20. The method of claim 18 comprises continuously staying the non-physical UI mode if the detected object is not within the view angle and does not return within the view angle within the predetermined period of time.
PCT/KR2013/000209 2012-09-14 2013-01-10 Apparatus and method of providing user interface on head mounted display and head mounted display thereof WO2014042320A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP13836413.8A EP2896205B1 (en) 2012-09-14 2013-01-10 Apparatus and method of providing user interface on head mounted display and head mounted display thereof

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2012-0102277 2012-09-14
KR1020120102277 2012-09-14
US13/708,561 US8482527B1 (en) 2012-09-14 2012-12-07 Apparatus and method of providing user interface on head mounted display and head mounted display thereof
US13/708,561 2012-12-07

Publications (1)

Publication Number Publication Date
WO2014042320A1 true WO2014042320A1 (en) 2014-03-20

Family

ID=48701422

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/KR2013/000209 WO2014042320A1 (en) 2012-09-14 2013-01-10 Apparatus and method of providing user interface on head mounted display and head mounted display thereof
PCT/KR2013/008300 WO2014042458A1 (en) 2012-09-14 2013-09-13 Apparatus and method of providing user interface on head mounted display and head mounted display thereof

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/008300 WO2014042458A1 (en) 2012-09-14 2013-09-13 Apparatus and method of providing user interface on head mounted display and head mounted display thereof

Country Status (5)

Country Link
US (2) US8482527B1 (en)
EP (2) EP2896205B1 (en)
KR (1) KR102138533B1 (en)
CN (1) CN104641318B (en)
WO (2) WO2014042320A1 (en)

Families Citing this family (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9378592B2 (en) 2012-09-14 2016-06-28 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
US8482527B1 (en) * 2012-09-14 2013-07-09 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
US9671874B2 (en) * 2012-11-08 2017-06-06 Cuesta Technology Holdings, Llc Systems and methods for extensions to alternative control of touch-based devices
US10359841B2 (en) * 2013-01-13 2019-07-23 Qualcomm Incorporated Apparatus and method for controlling an augmented reality device
US20140267049A1 (en) * 2013-03-15 2014-09-18 Lenitra M. Durham Layered and split keyboard for full 3d interaction on mobile devices
GB201310358D0 (en) * 2013-06-11 2013-07-24 Sony Comp Entertainment Europe Head-Mountable apparatus and systems
US10451874B2 (en) 2013-09-25 2019-10-22 Seiko Epson Corporation Image display device, method of controlling image display device, computer program, and image display system
JP6206099B2 (en) * 2013-11-05 2017-10-04 セイコーエプソン株式会社 Image display system, method for controlling image display system, and head-mounted display device
JP2015090547A (en) * 2013-11-05 2015-05-11 ソニー株式会社 Information input device, information input method, and computer program
US9529194B2 (en) * 2013-11-21 2016-12-27 Samsung Electronics Co., Ltd. Head-mounted display apparatus
US9448621B2 (en) 2013-12-20 2016-09-20 Nokia Technologies Oy Causation of display of information on a see through display
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US9615742B2 (en) 2014-01-21 2017-04-11 Osterhout Group, Inc. Eye imaging in head worn computing
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9529199B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
EP2946266B1 (en) 2014-03-21 2023-06-21 Samsung Electronics Co., Ltd. Method and wearable device for providing a virtual input interface
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
JP6362391B2 (en) * 2014-04-10 2018-07-25 キヤノン株式会社 Information processing terminal, information processing method, and computer program
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
KR102262086B1 (en) * 2014-05-28 2021-06-09 삼성전자 주식회사 Apparatus and method for processing image
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US9092898B1 (en) 2014-07-03 2015-07-28 Federico Fraccaroli Method, system and apparatus for the augmentation of radio emissions
KR20160033376A (en) 2014-09-18 2016-03-28 (주)에프엑스기어 Head-mounted display controlled by line of sight, method for controlling the same and computer program for controlling the same
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
KR102345911B1 (en) 2015-01-16 2022-01-03 삼성전자주식회사 Virtual input apparatus and method for receiving user input using thereof
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
JP5981591B1 (en) * 2015-03-17 2016-08-31 株式会社コロプラ Computer program and computer system for controlling object operations in an immersive virtual space
KR102313485B1 (en) 2015-04-22 2021-10-15 삼성전자주식회사 Method and apparatus for transmitting and receiving image data for virtual reality streaming service
CN104820492A (en) * 2015-04-23 2015-08-05 济南大学 Three-dimensional haptic system
US10139966B2 (en) 2015-07-22 2018-11-27 Osterhout Group, Inc. External user interface for head worn computing
US20170017323A1 (en) * 2015-07-17 2017-01-19 Osterhout Group, Inc. External user interface for head worn computing
US11003246B2 (en) 2015-07-22 2021-05-11 Mentor Acquisition One, Llc External user interface for head worn computing
US9298283B1 (en) 2015-09-10 2016-03-29 Connectivity Labs Inc. Sedentary virtual reality method and systems
US10416776B2 (en) * 2015-09-24 2019-09-17 International Business Machines Corporation Input device interaction
EP3379736B1 (en) * 2015-11-19 2022-01-19 Sony Interactive Entertainment Inc. Antenna control device, head-mounted display, antenna control method, and program
CN105657407B (en) * 2015-12-31 2018-11-23 深圳纳德光学有限公司 Head-mounted display and its binocular 3D image display method and device
CN105425397A (en) * 2016-01-01 2016-03-23 赵山山 Automatic adjusting method, automatic adjusting system and automatic adjusting device for head mounted display
KR102610120B1 (en) 2016-01-20 2023-12-06 삼성전자주식회사 Head mounted display and control method thereof
US10824253B2 (en) 2016-05-09 2020-11-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
CN106123916B (en) * 2016-06-13 2019-11-15 上海临奇智能科技有限公司 It is a kind of for calibrating the method and apparatus of Inertial Measurement Unit in VR equipment
US10591988B2 (en) * 2016-06-28 2020-03-17 Hiscene Information Technology Co., Ltd Method for displaying user interface of head-mounted display device
CN108073432B (en) * 2016-11-07 2020-12-22 亮风台(上海)信息科技有限公司 User interface display method of head-mounted display equipment
CN106157930A (en) * 2016-06-30 2016-11-23 腾讯科技(深圳)有限公司 Brightness adjusting method based on wear-type visual device and device
KR102519578B1 (en) 2016-07-05 2023-04-07 삼성전자주식회사 Screen display method and apparatus in electronic device
CN106227345A (en) * 2016-07-21 2016-12-14 深圳市金立通信设备有限公司 Virtual reality glasses and control method thereof
CN106502323A (en) * 2016-10-20 2017-03-15 精元电脑股份有限公司 Display device capable of superimposing a virtual keyboard
CN106802761A (en) * 2017-01-03 2017-06-06 北京掌阔移动传媒科技有限公司 Advertisement placement method and system based on gravity sensing
US10366540B2 (en) 2017-03-23 2019-07-30 Htc Corporation Electronic apparatus and method for virtual reality or augmented reality system
WO2018177521A1 (en) 2017-03-29 2018-10-04 Vestel Elektronik Sanayi Ve Ticaret A.S. Improved method and system for vr interaction
US10152141B1 (en) 2017-08-18 2018-12-11 Osterhout Group, Inc. Controller movement tracking with light emitters
KR102431712B1 (en) 2017-09-04 2022-08-12 삼성전자 주식회사 Electronic apparatus, method for controlling thereof and computer program product thereof
WO2019059044A1 (en) * 2017-09-20 2019-03-28 日本電気株式会社 Information processing device, control method, and program
US11861136B1 (en) * 2017-09-29 2024-01-02 Apple Inc. Systems, methods, and graphical user interfaces for interacting with virtual reality environments
KR102306790B1 (en) * 2017-11-08 2021-09-30 삼성전자주식회사 Device and method to visualize content
US11164380B2 (en) * 2017-12-05 2021-11-02 Samsung Electronics Co., Ltd. System and method for transition boundaries and distance responsive interfaces in augmented and virtual reality
US10643362B2 (en) * 2018-03-26 2020-05-05 Lenovo (Singapore) Pte Ltd Message location based on limb location
US10854169B2 (en) 2018-12-14 2020-12-01 Samsung Electronics Co., Ltd. Systems and methods for virtual displays in virtual, mixed, and augmented reality
US10902250B2 (en) 2018-12-21 2021-01-26 Microsoft Technology Licensing, Llc Mode-changeable augmented reality interface
US11023036B1 (en) * 2019-07-09 2021-06-01 Facebook Technologies, Llc Virtual drawing surface interaction using a peripheral device in artificial reality environments
US10976804B1 (en) 2019-07-09 2021-04-13 Facebook Technologies, Llc Pointer-based interaction with a virtual surface using a peripheral device in artificial reality environments
US11023035B1 (en) 2019-07-09 2021-06-01 Facebook Technologies, Llc Virtual pinboard interaction using a peripheral device in artificial reality environments
KR20220012073A (en) * 2020-07-22 2022-02-03 삼성전자주식회사 Method and apparatus for performing virtual user interaction
KR20230046008A (en) * 2021-09-29 2023-04-05 삼성전자주식회사 Method for displaying content by augmented reality device and augmented reality device for displaying content
KR20230088100A (en) * 2021-12-10 2023-06-19 삼성전자주식회사 Electronic device for using of virtual input device and method of operating the same
WO2024043466A1 (en) * 2022-08-25 2024-02-29 삼성전자 주식회사 Augmented reality device controlled by external device, and method of operating same

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002318652A (en) 2001-04-20 2002-10-31 Foundation For Nara Institute Of Science & Technology Virtual input device and its program
US7774075B2 (en) 2002-11-06 2010-08-10 Lin Julius J Y Audio-visual three-dimensional input/output
KR100800859B1 (en) 2004-08-27 2008-02-04 삼성전자주식회사 Apparatus and method for inputting key in head mounted display information terminal
US8957835B2 (en) * 2008-09-30 2015-02-17 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US20100238161A1 (en) 2009-03-19 2010-09-23 Kenneth Varga Computer-aided system for 360º heads up display of safety/mission critical data
KR20110010906A (en) * 2009-07-27 2011-02-08 삼성전자주식회사 Apparatus and method for controlling of electronic machine using user interaction
JP4679661B1 (en) 2009-12-15 2011-04-27 株式会社東芝 Information presenting apparatus, information presenting method, and program
KR101123115B1 (en) 2010-02-09 2012-03-20 유태영 The apparatus of 3d-pc room with head mounted display
JP2013521576A (en) 2010-02-28 2013-06-10 オスターハウト グループ インコーポレイテッド Local advertising content on interactive head-mounted eyepieces
US20110214082A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
US8508347B2 (en) * 2010-06-24 2013-08-13 Nokia Corporation Apparatus and method for proximity based input
US10019962B2 (en) 2011-08-17 2018-07-10 Microsoft Technology Licensing, Llc Context adaptive user interface for augmented reality display
CN102508546B (en) * 2011-10-31 2014-04-09 冠捷显示科技(厦门)有限公司 Three-dimensional (3D) virtual projection and virtual touch user interface and achieving method
US8482527B1 (en) * 2012-09-14 2013-07-09 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6061064A (en) * 1993-08-31 2000-05-09 Sun Microsystems, Inc. System and method for providing and using a computer user interface with a view space having discrete portions
US6771294B1 (en) 1999-12-29 2004-08-03 Petri Pulli User interface
GB2376397A (en) 2001-06-04 2002-12-11 Hewlett Packard Co Virtual or augmented reality
GB2465280A (en) 2008-11-17 2010-05-19 Honeywell Int Inc Augmented reality system that marks and tracks the position of a real world object on a see-through display
US8199974B1 (en) * 2011-07-18 2012-06-12 Google Inc. Identifying a target object using optical occlusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2896205A4

Also Published As

Publication number Publication date
WO2014042458A1 (en) 2014-03-20
US8482527B1 (en) 2013-07-09
EP2895911B1 (en) 2018-12-19
EP2896205B1 (en) 2019-01-02
EP2895911A4 (en) 2016-07-27
EP2896205A1 (en) 2015-07-22
EP2896205A4 (en) 2016-07-27
CN104641318B (en) 2018-02-02
US9448624B2 (en) 2016-09-20
CN104641318A (en) 2015-05-20
US20140078043A1 (en) 2014-03-20
EP2895911A1 (en) 2015-07-22
KR102138533B1 (en) 2020-08-11
KR20150054825A (en) 2015-05-20

Similar Documents

Publication Title
WO2014042320A1 (en) Apparatus and method of providing user interface on head mounted display and head mounted display thereof
US9378592B2 (en) Apparatus and method of providing user interface on head mounted display and head mounted display thereof
EP2778881B1 (en) Multi-input control method and system, and electronic device supporting the same
CN111083684B (en) Method for controlling electronic equipment and electronic equipment
KR101491045B1 (en) Apparatus and methdo for sharing contents
WO2014107005A1 (en) Mouse function provision method and terminal implementing the same
KR100931403B1 (en) Device and information controlling system on network using hand gestures
KR20140112920A (en) Method for providing user's interaction using multi hovering gesture
KR101878144B1 (en) An Apparatus and Method of Providing User Interface on Head Mounted Display and a Head Mounted Display Thereof
US20150067550A1 (en) Dual screen system and method
CN106896920B (en) Virtual reality system, virtual reality equipment, virtual reality control device and method
WO2020173235A1 (en) Task switching method and terminal device
WO2020130356A1 (en) System and method for multipurpose input device for two-dimensional and three-dimensional environments
WO2013118987A1 (en) Control method and apparatus of electronic device using control device
KR20130123343A (en) Electronic apparatus, remote control apparatus and control method thereof
KR20150144641A (en) user terminal apparatus and control method thereof
US11526320B2 (en) Multi-screen interface control method and terminal device
WO2022199434A1 (en) Method and apparatus for transmitting target between devices, and electronic device
KR20150080831A (en) Virtual keyboard for Dual Remote Control between mobile devices
KR20140136854A (en) Application operating method and electronic device implementing the same
KR20140142629A (en) Method and apparatus for processing key pad input received on touch screen of mobile terminal
US20230350568A1 (en) Human-machine interaction system for projection system
KR20150054451A (en) Set-top box system and Method for providing set-top box remote controller functions
KR20150108591A (en) Method for controlling wearable device and apparatus thereof
JP5511023B2 (en) Video display system and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13836413

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE