US20100053151A1 - In-line mediation for manipulating three-dimensional content on a display device


Info

Publication number
US20100053151A1
Authority
US
United States
Prior art keywords
hand
user
recited
tracking
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/421,363
Inventor
Stefan Marti
Seung Wook Kim
Francisco Imai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US12/421,363 priority Critical patent/US20100053151A1/en
Priority to KR1020090080586A priority patent/KR20100027976A/en
Publication of US20100053151A1 publication Critical patent/US20100053151A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 19/00 Manipulating 3D models or images for computer graphics
                    • G06T 19/006 Mixed reality
                • G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
                    • G06T 2219/20 Indexing scheme for editing of 3D models
                        • G06T 2219/2016 Rotation, translation, scaling
                        • G06T 2219/2021 Shape modification
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                        • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
                        • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
                    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
                        • G06F 3/0481 Based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                            • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
                    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
                        • G06F 3/1423 Controlling a plurality of local displays, e.g. CRT and flat panel display
                            • G06F 3/1446 Display composed of modules, e.g. video walls
                        • G06F 3/147 Using display panels
        • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
            • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
                • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
                    • G09G 3/001 Using specific devices not provided for in groups G09G 3/02 - G09G 3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
                        • G09G 3/002 To project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
                • G09G 2356/00 Detection of the display position w.r.t. other display screens
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
                    • H04N 13/20 Image signal generators
                        • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
                            • H04N 13/279 The virtual viewpoint locations being selected by the viewers or determined by tracking
            • H04M TELEPHONIC COMMUNICATION
                • H04M 2203/00 Aspects of automatic or semi-automatic exchanges
                    • H04M 2203/35 Related to information services provided via a voice call
                        • H04M 2203/359 Augmented reality

Definitions

  • the present invention relates generally to hardware and software for user interaction with digital three-dimensional data. More specifically, it relates to devices having displays and to human interaction with data displayed on the devices.
  • Three-dimensional content may be found in medical imaging (e.g., examining MRIs), modeling and prototyping, information visualization, architecture, tele-immersion and collaboration, geographic information systems (e.g., Google Earth), and in other fields.
  • stereoscopic displays requiring the user to wear a pair of special glasses
  • auto-stereoscopic displays based on lenticular lenses or parallax barriers, which commonly cause eye strain and headaches as side effects
  • head-mounted displays requiring heavy head gear or goggles
  • volumetric displays such as those based on oscillating mirrors or screens (which do not allow bare hand direct manipulation of 3-D content).
  • mobile device displays, such as displays on cell phones, only allow for a limited field of view (FOV).
  • the mobile device display size is generally limited by the size of the device.
  • the size of a non-projection display cannot be larger than the mobile device that contains the display. Therefore, existing solutions for mobile displays (which are generally light-emitting displays) limit the immersive experience for the user.
  • Mobile devices do not provide satisfactory user awareness of virtual surroundings, another important aspect of creating an immersive experience.
  • Some display systems require a user to reach behind the monitor. However, in these systems the user's hands must physically touch the back of the monitor, and the interaction is intended only to manipulate 2-D images, such as moving images on the screen.
  • a user is able to use a mobile device having a display, such as a cell phone or a media player, to view and manipulate 3D content displayed on the device by reaching behind the device and manipulating a perceived 3D object.
  • the user's eyes, device, and a perceived 3D object are aligned or “in-line,” such that the device performs as a type of in-line mediator between the user and the perceived 3D object.
  • This alignment results in a visual coherency to the user when reaching behind the device to make hand gestures and movements to manipulate the 3D content. That is, the user's hand movements behind the device are at a natural and intuitive distance and are aligned with the 3D object displayed on the device monitor so that the user has a natural visual impression that she is actually handling the 3D object shown on the monitor.
  • One embodiment of the present invention is a method of detecting manipulation of a digital 3D object displayed on a device having a front side with a display monitor facing the user and a back side having a sensor facing away from the user.
  • a hand or other object may be detected within a specific area of the back side of the device having the sensor, such as a camera.
  • the hand is displayed on the monitor and its movements within a specific area of the back side of the device are tracked.
  • the movements are the result of the user intending to manipulate the displayed 3D object and are made by the user in manipulating a perceived 3D object behind the device, but without having to physically touch the backside of the device.
  • a collision between the displayed hand and the displayed 3D object may be detected by the device resulting in a modification of the image of the 3D object displayed on the device. In this manner the device functions as a 3D in-line mediator between the user and the 3D object.
  • in another embodiment, a display device includes a processor and a memory component storing digital 3D content data.
  • the device also includes a tracking sensor component for tracking movement of an object that is in proximity of the device.
  • the tracking sensor component faces the back of the device (away from the user) and is able to detect movements and gestures of a hand of a user who reaches behind the device.
  • a hand tracking module processes movement data from the tracking sensor and a collision detection module detects collisions between a user's hand and a 3D object.
  • FIG. 1A is an illustration of a user using a mobile device as a 3D in-line mediator to manipulate digital 3D content displayed on the device in accordance with one embodiment
  • FIG. 1B is an illustration of a user using a laptop computer as a 3D in-line mediator to manipulate digital 3D content displayed on the device in accordance with one embodiment
  • FIG. 1C is an illustration of a user using a desktop computer as a 3D in-line mediator to manipulate digital 3D content displayed on a desktop monitor in accordance with one embodiment
  • FIGS. 2A and 2B are top views illustrating 3D in-line mediation
  • FIG. 2C is an illustration of a side perspective of the user shown in FIG. 2A;
  • FIG. 3A is a more detailed top view of a user utilizing a mobile device as a 3D in-line mediator for manipulating digital 3D content in accordance with one embodiment
  • FIG. 3B shows a scene that a user sees when facing a device and when reaching behind the device
  • FIG. 4 is a flow diagram of a process of enabling in-line mediation in accordance with one embodiment
  • FIG. 5 is a block diagram showing relevant components of a device capable of functioning as a 3D in-line mediator in accordance with one embodiment.
  • FIGS. 6A and 6B illustrate a computer system suitable for implementing embodiments of the present invention.
  • a display device as a three-dimensional (3D) in-line mediator for interacting with digital 3D content displayed on the device.
  • the use of a display device as an in-line mediator enables intuitive bare hand manipulation of digital 3D content by allowing a user to see the direct effect of the user's handling of the 3D content on the display by reaching behind the display device.
  • the device display functions as an in-line mediator between the user and the 3D content, enabling a type of visual coherency for the user. That is, the 3D content is visually coherent or in-line from the user's perspective.
  • the user's hand, or a representation of it, is shown on the display, maintaining the visual coherency.
  • the user's view of the 3D content on the display is not obstructed by the user's arm or hand.
  • FIG. 1A is an illustration of a user using a mobile device as a 3D in-line mediator to manipulate digital 3D content displayed on the device in accordance with one embodiment.
  • 3D in-line mediation refers to the user's eyes, the 3D content on the display, and user's hands behind the display (but not touching the back of the display) being aligned in real 3D space.
  • a user 102 holds a mobile device 104 with one hand 105 and reaches behind device 104 with another hand 106.
  • One or more sensors (not shown), collectively referred to as a sensor component, on device 104 face away from user 102 in the direction of the user's hand behind device 104.
  • representation 108 may be an unaltered (actual) image of the user's hand that is composited onto scene 107 or may be a one-to-one mapping of real hand 106 to a virtual representation (not shown), such as an avatar hand which becomes part of 3D or virtual scene 107.
  • the term “hand” as used herein may include, in addition to the user's hand, fingers, and thumb, the user's wrist and forearm, all of which may be detected by the sensor component.
  • Mobile device 104 may be a cell phone, a media player (e.g., MP3 player), portable gaming device, or any type of smart handset device having a display. It is assumed that the device is IP-enabled or capable of connecting to a suitable network to access 3D content over the Internet. However, the various embodiments described below do not necessarily require that the device be able to access a network. For example, the 3D content displayed on the device may be resident on a local storage of the device, such as on a hard disk or other mass storage component or on a local cache.
  • the sensor component on mobile device 104 and the accompanying sensor software may be one or more of various types of sensors, a typical one being a conventional camera. Implementations of various sensor components are described below.
  • FIG. 1B is an illustration of a user 110 using a laptop computer 112 or similar nomadic computing device (e.g., netbook computer, mini laptop, etc.) as a 3D in-line mediator to manipulate digital 3D content displayed on device 112 in accordance with one embodiment.
  • a laptop computer has a sensor component (not shown) facing away from user 110.
  • The sensor component may be an internal component of laptop 112 or a peripheral 113 attached to it with associated software installed on laptop 112.
  • Both hands 114 and 115 of user 110 are composited onto a 3D scene 116 on display 117.
  • FIG. 1C is an illustration of a user 118 using a desktop computer monitor 119 as a 3D in-line mediator to manipulate digital 3D content displayed on desktop monitor 119 in accordance with one embodiment.
  • it is preferable that monitor 119 be some type of flat panel monitor, such as an LCD or plasma monitor, so that the user is physically able to reach behind the monitor.
  • a tracking sensor component 120 detects a user's hand 122 behind desktop monitor 119.
  • hand 122 is mapped to a digital representation, such as an avatar hand 124.
  • User 118 may also move his other hand behind monitor 119 as in FIG. 1B.
  • FIGS. 2A and 2B are top views illustrating 3D in-line mediation. They show a user 200 holding a mobile device 202 with her left hand 204. User 200 extends her right hand 206 behind device 202. An area 208 behind mobile device 202, indicated by the solid angled lines, is a so-called virtual 3D space, and the area 210 surrounding device 202 is the physical or real environment or world (“RW”) that user 200 is in. A segment of the user's hand 206 in virtual space 208 is shaded. A sensor component (not shown) detects the presence and movement in area 208. User 200 can move hand 206 around in area 208 to manipulate 3D objects shown on device 202. FIG. 2B shows a user 212 reaching behind a laptop 214 with both hands 216 and 218.
  • the sensor component also referred to as a tracking component, may be implemented using various types of sensors. These sensors may be used to detect the presence of a user's hand (or any object) behind the mobile device's monitor.
  • a standard or conventional mobile device or cell phone camera is used to sense the presence of a hand and its movements or gestures. Image differentiation or optic flow may also be used to detect and track hand movement.
  • a conventional camera may be replaced with infrared detection components to perform hand detection and tracking.
  • a mobile device camera facing away from the user and that is IR sensitive (or has its IR filter removed), possibly in combination with additional IR illumination (e.g., LED), may look for the brightest object within the range of the camera, which will likely be the user's hand.
  • Dedicated infrared sensors with IR illumination may also be used.
  • redshift thermal imaging may be used. This option provides passive optical components that redshift a standard CMOS imager to be able to detect long wavelength and thermal infrared radiation.
  • Another type of sensor may be ultrasonic gesture detection sensors.
  • Sensor software options include off-the-shelf gesture recognition tools, such as software for detecting hands using object segmentation and/or optic flow. Other options include spectral imaging software for detecting skin tones, pseudo-thermal imaging, and 3D depth cameras using time-of-flight.
  • FIG. 2C is an illustration of a side perspective of user 200 shown initially in FIG. 2A.
  • User hand 206 is in front of a sensor component of device 202, the sensor facing away from user 200.
  • User 200 holds device 202 with hand 204.
  • Hand 206 is behind device 202 in virtual 3D space 208.
  • Space 208 is the space in proximity of the backside of device 202 that is tracked by the backward facing sensor.
  • Real or physical world 210 is outside the proximity or tracked area of device 202.
  • Gestures and movements of hand 206 in 3D space 208 are made by user 200 in order to manipulate a perceived 3D object (not shown) on the display of device 202.
  • the gestures may not be for manipulating an object but simply for the purpose of making a gesture (e.g., waving, pointing, etc.) in a virtual world environment.
  • FIG. 3A is a more detailed top view of a user utilizing a mobile device as a 3D in-line mediator for manipulating digital 3D content in accordance with one embodiment. It shows a user's head 303 , a “perceived” digital 3D object 304 and a display device 306 as being in-line or aligned, with display device 306 functioning as the in-line mediator. It also shows a user's hand 308 reaching behind device 306 .
  • FIG. 3B shows a scene 310 that user head 303 sees when facing device 306 and when reaching behind device 306 . The user sees digital 3D object 312 on the screen and user hand 308 or a representation of it touching or otherwise manipulating object 312 .
  • the user is also able to manipulate a digital 3D object in other ways, such as by touching, lifting, holding, moving, pushing, pulling, dropping, throwing, rotating, deforming, bending, stretching, compressing, squeezing, or pinching it.
  • the user can move her hand(s) to manipulate 3D objects she sees on the display.
  • there is a depth component. If the 3D object is in front of the 3D scene she is viewing, she will not have to reach far behind the device/monitor. If the object is further back in the 3D scene, the user may have to physically reach further behind the device/monitor in order to touch (or collide with) the 3D object.
  • FIG. 4 is a flow diagram of a process of enabling in-line mediation in accordance with one embodiment.
  • the process described in FIG. 4 begins after the user has powered on the device, which may be mobile, nomadic, or stationary. A tracking component has also been activated.
  • the device displays 3D content, such as an online virtual world or any other form of 3D content, examples of which are provided above.
  • the user directly faces the display; that is, sits squarely in front of the laptop or desktop monitor or holds the cell phone directly in front of her.
  • there may be a 3D object displayed on the screen that the user wants to manipulate (e.g., pick up a ball, move a chair, etc.), or there may be a 3D world scene in which the user wants to perform a hand gesture or movement (e.g., wave to a 3D person or an avatar).
  • Other examples that do not involve online 3D content may include moving or changing the orientation of 3D medical imaging data, playing a 3D video game, interacting with 3D content, such as a movie or show, and so on.
  • a tracking component detects the presence of the user's hand.
  • One conventional way is by detecting the skin tone of the user's hand.
  • there are numerous types of tracking components or sensors that may be used. Which one is most suitable will likely depend on the features and capabilities of the device (i.e., mobile, nomadic, stationary, etc.).
  • a typical cell phone camera is capable of detecting the presence of a human hand.
  • An image of the hand (or hands) is transmitted to a compositing component.
  • the hand is displayed on the screen.
  • the user sees either an unaltered view of her hand (not including the background behind and around the hand) or an altered representation of the hand.
  • known compositing techniques may be used. For example, some techniques may involve combining two video sources: one for the 3D content and another representing video images of the user's hand. Other techniques for overlaying or compositing the images of the hand over the 3D content data may be used, and which technique is most suitable will likely depend on the type of device.
  • software from the 3D content provider or other conventional software may be used to perform a mapping of the user hand images to an avatar image, such as a robotic hand.
  • a representation of a stationary user's hand can be seen on the device. That is, its presence has been detected and is being represented on the device.
  • the user starts moving the hand, either by moving it up, down, left, right, or inward or outward (relative to the device) or by gesturing (or both).
  • the initial position of the hand and its subsequent movement can be described in terms of x, y, and z coordinates.
  • the tracking component begins tracking hand movement and gesturing, which has horizontal, vertical, and depth components. For example, a user may be viewing a 3D virtual world room on the device and wants to move an object that is in the far left corner of the room (which has a certain depth) to the near right corner of the room. In one embodiment of the invention, the user may have to move his hand to a position that is, for example, about 12 inches behind and slightly left of the device.
  • the user moves her hand to a position that is maybe 2-3 inches behind and to the right of the device. This example illustrates that there is a depth component in the hand tracking that is implemented to maintain the in-line mediation performed by the device.
  • the digital representation of the user's hand on the device collides with or touches an object. This collision is detected by comparing sensor data from the tracking sensor and geometrical data from the 3D data repository.
  • the user moves her hand behind the device in a way that causes the digital representation of her hand on the screen to collide with the object, at which point she can grab, pick up, or otherwise manipulate the object.
  • the user's hand may be characterized as colliding with the perceived object that is “floating” behind the device, as described in FIG. 3A .
  • the user's eyes are looking straight at the middle of the screen. That is, there is a vertical and horizontal alignment of the user's head with the device and the 3D content.
  • the user's face may also be tracked which may enable changes in the 3D content images to reflect movement in the user's head (i.e., perspective).
  • an “input-output coincidence” model is used to close a human-computer interaction feature referred to as a perception-action loop, where perception is what the user sees and action is what the user does. This enables a user to see the consequences of an interaction, such as touching a 3-D object, immediately.
  • a user's hand is aligned with or in the same position as the 3-D object that is being manipulated. That is, from the user's perspective, the hand is aligned with the 3-D object so that it looks like the user is lifting or moving a 3-D object as if it were a physical object. What the user sees makes sense based on the action being taken by the user.
  • the system provides tactile feedback to the user upon detecting a collision between the user's hand and the 3-D object.
  • the image of the 3D scene is modified to reflect the user's manipulation of the 3D object. If there is no manipulation of a 3D object (and thus no object collision), the image on the screen changes as the user moves her hand, as it does when the user manipulates a 3D object.
  • the changes in the 3D image on the screen may be done using known methods for processing 3D content data. These methods or techniques may vary depending on the type of device, the source of the data, and other factors.
  • the process then repeats by returning to step 402 where the presence of the user's hand is again detected.
  • the process described in FIG. 4 is continuous in that the user's hand movement is tracked as long as it is within the range of the tracking component.
  • the device is able to perform as a 3D in-line mediator as long as the user's head or perspective is kept in line with the device which, in turn, allows the user's hand movements behind the device to be visually coherent with the hand movements shown on the screen and vice versa. That is, the user moves her hand in the physical world based on actions she wants to perform in the digital 3D environment shown on the screen.
  • FIG. 5 is a block diagram showing relevant components of a device capable of functioning as a 3D in-line mediator in accordance with one embodiment. Many of the components shown here have been described above.
  • a device 500 has a display component (not shown) for displaying digital 3D content data 501 which may be stored in mass storage or in a local cache (not shown) on device 500 or may be downloaded from the Internet or from another source.
  • a tracking sensor component 502 may include one or more conventional (2D) cameras, 3D (depth) cameras, and non-camera peripherals.
  • a 3-D camera may provide depth data, which simplifies gesture recognition by use of depth keying.
  • a wide angle lens may be used in a camera, which may require less processing by an imaging system but may produce more distortion.
  • Component 502 may also have other capabilities as described above, such as infrared detection, optic flow, image differentiation, redshift thermal imaging, and spectral processing.
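  • The following is a minimal sketch (in Python, not part of the patent) of the depth keying idea mentioned above: with a depth camera in tracking sensor component 502, the hand can be segmented simply by keeping pixels whose depth falls inside the tracked volume behind the device. The function name and the depth band are illustrative assumptions.

```python
import numpy as np

def segment_hand_by_depth(depth_mm: np.ndarray,
                          near_mm: float = 100.0,
                          far_mm: float = 450.0) -> np.ndarray:
    """Depth keying: keep only pixels whose depth falls inside the tracked
    volume behind the device (here assumed to be 10-45 cm from the sensor).

    depth_mm: HxW array of per-pixel depth readings in millimeters,
              with 0 marking invalid readings.
    Returns a boolean mask that is True where the hand (or any object
    inside the tracked volume) is present.
    """
    valid = depth_mm > 0
    return valid & (depth_mm >= near_mm) & (depth_mm <= far_mm)

# Example with a synthetic 4x4 depth frame (values in mm).
frame = np.array([[0,   900, 900, 900],
                  [300, 320, 900, 900],
                  [310, 305, 900, 900],
                  [0,   900, 900, 900]], dtype=float)
print(segment_hand_by_depth(frame).astype(int))
```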
  • Tracking sensor component 502 is responsible for tracking the position of body parts within the range of detection. This position data is transmitted to hand tracking module 504 and to face tracking module 506, and each module identifies the features relevant to it.
  • Hand tracking module 504 identifies features of the user's hand positions, including the positions of the fingers, wrist, and arm. It determines the location of these body parts in the 3D environment. Data from module 504 goes to two components related to hand and arm position: gesture detection module 508 and hand collision detection module 510 .
  • a user “gesture” results in a modification of 3D content 501 .
  • a gesture may include lifting, holding, squeezing, pinching, or rotating a 3D object. These actions typically result in some type of modification of the object in the 3D environment.
  • a modification of an object may include a change in its location (lifting or turning) without there being an actual deformation or change in shape of the object.
  • the gesture detection data may be applied directly to the graphics data representing 3D content 501 .
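  • As an illustration of the data flow just described (tracking sensor 502 to hand tracking module 504 to gesture detection 508 and hand collision detection 510, and from there to a modification of 3D content 501), the following Python sketch wires up stand-in classes. The class and field names are hypothetical; the patent does not define these interfaces.

```python
from dataclasses import dataclass

@dataclass
class HandState:
    position: tuple           # (x, y, z) of the tracked hand in scene coordinates
    gesture: str = "open"     # e.g. "open", "grab", "pinch"

@dataclass
class Object3D:
    name: str
    position: tuple
    radius: float

class HandTrackingModule:
    """Stand-in for hand tracking module 504: turns a raw sensor sample into a HandState."""
    def update(self, sensor_sample: dict) -> HandState:
        return HandState(position=sensor_sample["xyz"],
                         gesture=sensor_sample.get("gesture", "open"))

class HandCollisionDetection:
    """Stand-in for collision detection module 510: tests the hand against scene geometry."""
    def collides(self, hand: HandState, obj: Object3D) -> bool:
        dist = sum((h - o) ** 2 for h, o in zip(hand.position, obj.position)) ** 0.5
        return dist <= obj.radius

# One pass through the pipeline sketched for FIG. 5.
sample = {"xyz": (0.02, 0.01, 0.30), "gesture": "grab"}
ball = Object3D("ball", (0.0, 0.0, 0.30), radius=0.05)
hand = HandTrackingModule().update(sample)
if HandCollisionDetection().collides(hand, ball) and hand.gesture == "grab":
    ball.position = hand.position    # the detected gesture modifies the 3D content
print(ball.position)
```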
  • tracking sensor component 502 may also track the user's face.
  • face tracking data is transmitted to face tracking module 506 .
  • Face tracking may be utilized in cases where the user is not vertically aligned (i.e., the user's head is not looking directly at the middle of the screen) with the device and the perceived object.
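  • One way face tracking module 506 could compensate for an off-center head is to shift the virtual camera by an amount derived from the detected face position, as in the sketch below. The normalization and scaling constants are assumptions for illustration; the patent does not specify this mapping.

```python
def view_offset_from_head(head_xy_px, frame_size_px, max_offset_m=0.10):
    """Map the detected face center in the user-facing camera image to a small
    horizontal/vertical offset of the virtual camera, so the rendered 3D scene
    stays visually in line when the user's head moves off-center.

    head_xy_px:    (x, y) pixel position of the detected face center.
    frame_size_px: (width, height) of the camera frame.
    max_offset_m:  assumed camera shift for a head at the edge of the frame.
    """
    w, h = frame_size_px
    nx = (head_xy_px[0] - w / 2) / (w / 2)   # -1 (far left) .. +1 (far right)
    ny = (head_xy_px[1] - h / 2) / (h / 2)   # -1 (top) .. +1 (bottom)
    # Shift the virtual camera opposite to the head offset to mimic parallax.
    return (-nx * max_offset_m, ny * max_offset_m)

# Head detected right of center: the virtual camera shifts slightly left.
print(view_offset_from_head((400, 240), (640, 480)))
```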
  • data from hand collision detection module 510 may be transmitted to a tactile feedback controller 512, which is connected to one or more actuators 514, which are external to device 500.
  • the user may receive haptic feedback when the user's hand collides with a 3D object.
  • it is preferable that actuators 514 be as unobtrusive as possible.
  • in one embodiment, they are vibrating wristbands, which may be wired or wireless. Using wristbands allows for bare hand manipulation of 3D content as described above.
  • Tactile feedback controller 512 receives a signal that there is a collision or contact and causes tactile actuators 514 to provide a physical sensation to the user. For example, with vibrating wristbands, the user's wrist will sense a vibration or similar physical sensation indicating contact with the 3-D object.
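  • A hedged sketch of how tactile feedback controller 512 might drive actuators 514: a collision event is turned into a short vibration pulse whose intensity grows with how far the hand has penetrated the 3D object. The WristbandActuator here only prints the command; a real wristband interface (Bluetooth, USB, GPIO) is not specified by the patent, and the intensity formula is an assumption.

```python
class WristbandActuator:
    """Placeholder for a vibrating wristband (actuator 514). A real device would
    be driven over Bluetooth, USB, or GPIO; this stand-in only prints the command."""
    def vibrate(self, intensity: float, duration_s: float) -> None:
        print(f"vibrate: intensity={intensity:.2f}, duration={duration_s:.2f}s")

class TactileFeedbackController:
    """Stand-in for tactile feedback controller 512: turns collision events into pulses."""
    def __init__(self, actuators):
        self.actuators = actuators

    def on_collision(self, penetration_depth_m: float) -> None:
        # Deeper penetration into the 3D object produces a stronger pulse.
        intensity = min(1.0, penetration_depth_m / 0.05)
        for actuator in self.actuators:
            actuator.vibrate(intensity, duration_s=0.1)

controller = TactileFeedbackController([WristbandActuator()])
controller.on_collision(penetration_depth_m=0.02)   # light contact with a 3D object
```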
  • the present invention enables a user to interact with digital 3D content in a natural and immersive way by enabling visual coherency, thereby creating an immersive volumetric interaction with the 3D content.
  • a user uploads or executes 3D content onto a mobile computing device, such as a cell phone.
  • This 3D content may be a virtual world that the user has visited using a browser on the mobile device (e.g., Second Life or any other site that provides virtual world content).
  • Other examples include movies, video games, online virtual cities, medical imaging (e.g., examining MRIs), modeling and prototyping, information visualization, architecture, tele-immersion and collaboration, and geographic information systems (e.g., Google Earth).
  • the user holds the display of the device upright at a comfortable distance in front of the user's eyes, for example at 20-30 centimeters.
  • the display of the mobile device is used as a window into the virtual world.
  • using the mobile device as an in-line mediator between the user and the user's hand, the user is able to manipulate 3D objects shown on the display by reaching behind the display of the device and making hand gestures and movements around a perceived object behind the display. The user sees the gestures and movements on the display and the 3D object that they are affecting.
  • one aspect of creating an immersive and natural user interaction with 3D content using a mobile device is enabling the user to have bare-hand interaction with objects in the virtual world. That is, allowing the user to manipulate and “touch” digital 3D objects using the mobile device and not requiring the user to use any peripheral devices, such as gloves, finger sensors, motion detectors, and the like.
  • FIGS. 6A and 6B illustrate a computing system 600 suitable for implementing embodiments of the present invention.
  • FIG. 6A shows one possible physical form of the computing system.
  • the computing system may have many physical forms including an integrated circuit, a printed circuit board, a small handheld device (such as a mobile telephone, handset or PDA), a personal computer or a super computer.
  • Computing system 600 includes a monitor 602, a display 604, a housing 606, a disk drive 608, a keyboard 610 and a mouse 612.
  • Disk 614 is a computer-readable medium used to transfer data to and from computer system 600.
  • FIG. 6B is an example of a block diagram for computing system 600 .
  • Attached to system bus 620 are a wide variety of subsystems.
  • Processor(s) 622 are also referred to as central processing units, or CPUs.
  • Memory 624 includes random access memory (RAM) and read-only memory (ROM).
  • a fixed disk 626 is also coupled bi-directionally to CPU 622; it provides additional data storage capacity and may also include any of the computer-readable media described below.
  • Fixed disk 626 may be used to store programs, data and the like and is typically a secondary storage medium (such as a hard disk) that is slower than primary storage. It will be appreciated that the information retained within fixed disk 626 may, in appropriate cases, be incorporated in standard fashion as virtual memory in memory 624.
  • Removable disk 614 may take the form of any of the computer-readable media described below.
  • CPU 622 is also coupled to a variety of input/output devices such as display 604, keyboard 610, mouse 612 and speakers 630.
  • an input/output device may be any of: video displays, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, biometrics readers, or other computers.
  • CPU 622 optionally may be coupled to another computer or telecommunications network using network interface 640. With such a network interface, it is contemplated that the CPU might receive information from the network, or might output information to the network in the course of performing the above-described method steps.
  • method embodiments of the present invention may execute solely upon CPU 622 or may execute over a network such as the Internet in conjunction with a remote CPU that shares a portion of the processing.

Abstract

A user holds the mobile device upright or sits in front of a nomadic or stationary device, views the monitor from a suitable distance, and physically reaches behind the device with her hand to manipulate a 3D object displayed on the monitor. The device functions as a 3D in-line mediator that provides visual coherency to the user when she reaches behind the device to use hand gestures and movements to manipulate a perceived object behind the device and sees that the 3D object on the display is being manipulated. The perceived object that the user manipulates behind the device with bare hands corresponds to the 3D object displayed on the device. The visual coherency arises from the alignment of the user's head or eyes, the device, and the 3D object. The user's hand may be represented as an image of the actual hand or as a virtualized representation of the hand, such as part of an avatar.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority of U.S. provisional patent application No. 60/093,651 filed Sep. 2, 2008 entitled “GESTURE AND MOTION-BASED NAVIGATION AND INTERACTION WITH THREE-DIMENSIONAL VIRTUAL CONTENT ON A MOBILE DEVICE,” which is hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to hardware and software for user interaction with digital three-dimensional data. More specifically, it relates to devices having displays and to human interaction with data displayed on the devices.
  • 2. Description of the Related Art
  • The amount of three-dimensional content available on the Internet and in other contexts is increasing at a rapid pace. Consumers are getting more accustomed to hearing about “3D” in various contexts, such as movies, video games, and online virtual cities. Three-dimensional content may be found in medical imaging (e.g., examining MRIs), modeling and prototyping, information visualization, architecture, tele-immersion and collaboration, geographic information systems (e.g., Google Earth), and in other fields. Current systems, including computers and cell phones and, more generally, content display systems (e.g., TVs), fall short of taking advantage of 3D content by not providing an immersive user experience. For example, they do not provide an intuitive, natural and unobtrusive interaction with 3-D objects.
  • With respect to mobile devices, such devices presently do not provide users who are seeking interaction with digital 3D content a natural, intuitive, and immersive experience. Mobile device users are not able to make gestures or manipulate 3D objects using bare hands in a natural and intuitive way.
  • Although some displays allow users to manipulate 3D content with bare hands in front of the display (monitor), current display systems that are able to provide some interaction with 3D content require inconvenient or intrusive peripherals that make the experience unnatural to the user. For example, some current methods of providing tactile or haptic feedback require vibro-tactile gloves. In other examples, current methods of rendering 3-D content include stereoscopic displays (requiring the user to wear a pair of special glasses), auto-stereoscopic displays (based on lenticular lenses or parallax barriers that commonly cause eye strain and headaches as side effects), head-mounted displays (requiring heavy head gear or goggles), and volumetric displays, such as those based on oscillating mirrors or screens (which do not allow bare hand direct manipulation of 3-D content).
  • In addition, mobile device displays, such as displays on cell phones, only allow for a limited field of view (FOV). This is due to the fact that the mobile device display size is generally limited by the size of the device. For example, the size of a non-projection display cannot be larger than the mobile device that contains the display. Therefore, existing solutions for mobile displays (which are generally light-emitting displays) limit the immersive experience for the user. Furthermore, it is presently difficult to navigate through virtual worlds and 3-D content via a first-person view on mobile devices, which is one aspect of creating an immersive experience. Mobile devices do not provide satisfactory user awareness of virtual surroundings, another important aspect of creating an immersive experience.
  • Some display systems require a user to reach behind the monitor. However, in these systems the user's hands must physically touch the back of the monitor, and the interaction is intended only to manipulate 2-D images, such as moving images on the screen.
  • SUMMARY OF THE INVENTION
  • A user is able to use a mobile device having a display, such as a cell phone or a media player, to view and manipulate 3D content displayed on the device by reaching behind the device and manipulating a perceived 3D object. The user's eyes, device, and a perceived 3D object are aligned or “in-line,” such that the device performs as a type of in-line mediator between the user and the perceived 3D object. This alignment results in a visual coherency to the user when reaching behind the device to make hand gestures and movements to manipulate the 3D content. That is, the user's hand movements behind the device are at a natural and intuitive distance and are aligned with the 3D object displayed on the device monitor so that the user has a natural visual impression that she is actually handling the 3D object shown on the monitor.
  • One embodiment of the present invention is a method of detecting manipulation of a digital 3D object displayed on a device having a front side with a display monitor facing the user and a back side having a sensor facing away from the user. A hand or other object may be detected within a specific area of the back side of the device having the sensor, such as a camera. The hand is displayed on the monitor and its movements within a specific area of the back side of the device are tracked. The movements are the result of the user intending to manipulate the displayed 3D object and are made by the user in manipulating a perceived 3D object behind the device, but without having to physically touch the backside of the device. A collision between the displayed hand and the displayed 3D object may be detected by the device, resulting in a modification of the image of the 3D object displayed on the device. In this manner the device functions as a 3D in-line mediator between the user and the 3D object.
  • In another embodiment, a display device includes a processor and a memory component storing digital 3D content data. The device also includes a tracking sensor component for tracking movement of an object that is in proximity of the device. In one embodiment, the tracking sensor component faces the back of the device (away from the user) and is able to detect movements and gestures of a hand of a user who reaches behind the device. A hand tracking module processes movement data from the tracking sensor and a collision detection module detects collisions between a user's hand and a 3D object.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • References are made to the accompanying drawings, which form a part of the description and in which are shown, by way of illustration, particular embodiments:
  • FIG. 1A is an illustration of a user using a mobile device as a 3D in-line mediator to manipulate digital 3D content displayed on the device in accordance with one embodiment;
  • FIG. 1B is an illustration of a user using a laptop computer as a 3D in-line mediator to manipulate digital 3D content displayed on the device in accordance with one embodiment;
  • FIG. 1C is an illustration of a user using a desktop computer as a 3D in-line mediator to manipulate digital 3D content displayed on a desktop monitor in accordance with one embodiment;
  • FIGS. 2A and 2B are top views illustrating 3D in-line mediation;
  • FIG. 2C is an illustration of a side perspective of the user shown in FIG. 2A;
  • FIG. 3A is a more detailed top view of a user utilizing a mobile device as a 3D in-line mediator for manipulating digital 3D content in accordance with one embodiment;
  • FIG. 3B shows a scene that a user sees when facing a device and when reaching behind the device;
  • FIG. 4 is a flow diagram of a process of enabling in-line mediation in accordance with one embodiment;
  • FIG. 5 is a block diagram showing relevant components of a device capable of functioning as a 3D in-line mediator in accordance with one embodiment; and
  • FIGS. 6A and 6B illustrate a computer system suitable for implementing embodiments of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Methods and systems for using a display device as a three-dimensional (3D) in-line mediator for interacting with digital 3D content displayed on the device are described in the various figures. The use of a display device as an in-line mediator enables intuitive bare hand manipulation of digital 3D content by allowing a user to see the direct effect of the user's handling of the 3D content on the display by reaching behind the display device. In this manner, the device display functions as an in-line mediator between the user and the 3D content, enabling a type of visual coherency for the user. That is, the 3D content is visually coherent or in-line from the user's perspective. The user's hand, or a representation of it, is shown on the display, maintaining the visual coherency. Furthermore, by reaching behind the device, the user's view of the 3D content on the display is not obstructed by the user's arm or hand.
  • FIG. 1A is an illustration of a user using a mobile device as a 3D in-line mediator to manipulate digital 3D content displayed on the device in accordance with one embodiment. The term “3D in-line mediation” refers to the user's eyes, the 3D content on the display, and the user's hands behind the display (but not touching the back of the display) being aligned in real 3D space. A user 102 holds a mobile device 104 with one hand 105 and reaches behind device 104 with another hand 106. One or more sensors (not shown), collectively referred to as a sensor component, on device 104 face away from user 102 in the direction of the user's hand behind device 104. User's hand 106 is detected and a representation 108 of the hand is displayed on a device monitor 109 displaying a 3D scene 107. As discussed in greater detail below, representation 108 may be an unaltered (actual) image of the user's hand that is composited onto scene 107 or may be a one-to-one mapping of real hand 106 to a virtual representation (not shown), such as an avatar hand which becomes part of 3D or virtual scene 107. The term “hand” as used herein may include, in addition to the user's hand, fingers, and thumb, the user's wrist and forearm, all of which may be detected by the sensor component.
  • Mobile device 104 may be a cell phone, a media player (e.g., MP3 player), portable gaming device, or any type of smart handset device having a display. It is assumed that the device is IP-enabled or capable of connecting to a suitable network to access 3D content over the Internet. However, the various embodiments described below do not necessarily require that the device be able to access a network. For example, the 3D content displayed on the device may be resident on a local storage of the device, such as on a hard disk or other mass storage component or on a local cache. The sensor component on mobile device 104 and the accompanying sensor software may be one or more of various types of sensors, a typical one being a conventional camera. Implementations of various sensor components are described below. Although the methods and systems of the various embodiments are described using a mobile device, they may equally apply to nomadic devices, such as laptops and netbook computers (i.e., devices that are portable), and to stationary devices, such as desktop computers, workstations, and the like, as shown in FIGS. 1B and 1C.
  • FIG. 1B is an illustration of a user 110 using a laptop computer 112 or similar nomadic computing device (e.g., netbook computer, mini laptop, etc.) as a 3D in-line mediator to manipulate digital 3D content displayed on device 112 in accordance with one embodiment. A laptop computer has a sensor component (not shown) facing away from user 110. The sensor component may be an internal component of laptop 112 or a peripheral 113 attached to it with associated software installed on laptop 112. Both hands 114 and 115 of user 110 are composited onto a 3D scene 116 on display 117.
  • Similarly, FIG. 1C is an illustration of a user 118 using a desktop computer monitor 119 as a 3D in-line mediator to manipulate digital 3D content displayed on desktop monitor 119 in accordance with one embodiment. As a practical matter, it is preferable that monitor 119 be some type of flat panel monitor, such as an LCD or plasma monitor, so that the user is physically able to reach behind the monitor. A tracking sensor component 120 detects a user's hand 122 behind desktop monitor 119. In this example, hand 122 is mapped to a digital representation, such as an avatar hand 124. User 118 may also move his other hand behind monitor 119 as in FIG. 1B.
  • FIGS. 2A and 2B are top views illustrating 3D in-line mediation. They show a user 200 holding a mobile device 202 with her left hand 204. User 200 extends her right hand 206 behind device 202. An area 208 behind mobile device 202, indicated by the solid angled lines, is a so-called virtual 3D space, and the area 210 surrounding device 202 is the physical or real environment or world (“RW”) that user 200 is in. A segment of the user's hand 206 in virtual space 208 is shaded. A sensor component (not shown) detects the presence and movement in area 208. User 200 can move hand 206 around in area 208 to manipulate 3D objects shown on device 202. FIG. 2B shows a user 212 reaching behind a laptop 214 with both hands 216 and 218.
  • The sensor component, also referred to as a tracking component, may be implemented using various types of sensors. These sensors may be used to detect the presence of a user's hand (or any object) behind the mobile device's monitor. In a preferred embodiment, a standard or conventional mobile device or cell phone camera is used to sense the presence of a hand and its movements or gestures. Image differentiation or optic flow may also be used to detect and track hand movement. In other embodiments, a conventional camera may be replaced with infrared detection components to perform hand detection and tracking. For example, a mobile device camera facing away from the user and that is IR sensitive (or has its IR filter removed), possibly in combination with additional IR illumination (e.g., LED), may look for the brightest object within the range of the camera, which will likely be the user's hand. Dedicated infrared sensors with IR illumination may also be used. In another embodiment, redshift thermal imaging may be used. This option provides passive optical components that redshift a standard CMOS imager to be able to detect long wavelength and thermal infrared radiation. Another type of sensor may be ultrasonic gesture detection sensors. Sensor software options include off-the-shelf gesture recognition tools, such as software for detecting hands using object segmentation and/or optic flow. Other options include spectral imaging software for detecting skin tones, pseudo-thermal imaging, and 3D depth cameras using time-of-flight.
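  • As a concrete, hedged example of the skin-tone option above, the sketch below uses OpenCV (which the patent does not name) to flag a bare hand in a rear-camera frame by thresholding in YCrCb space; the threshold values and the 2% area test are illustrative assumptions rather than values from the patent.

```python
import cv2
import numpy as np

def skin_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Rough skin-tone segmentation in YCrCb space, one common way a
    conventional rear camera could detect a bare hand behind the device.
    Returns a uint8 mask (255 = candidate skin pixel)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)    # loose Cr/Cb bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Morphological opening removes speckle so only hand-sized regions remain.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

def hand_present(frame_bgr: np.ndarray, min_fraction: float = 0.02) -> bool:
    """Report a hand as present if enough of the frame is skin-toned."""
    return (skin_mask(frame_bgr) > 0).mean() >= min_fraction

# Synthetic 100x100 frame with a skin-colored patch (BGR roughly (120, 150, 200)).
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[30:70, 30:70] = (120, 150, 200)
print(hand_present(frame))
```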
  • FIG. 2C is an illustration of a side perspective of user 200 shown initially in FIG. 2A. User hand 206 is in front of a sensor component of device 202, the sensor facing away from user 200. User 200 holds device 202 with hand 204. Hand 206 is behind device 202 in virtual 3D space 208. Space 208 is the space in proximity of the backside of device 202 that is tracked by the backward facing sensor. Real or physical world 210 is outside the proximity or tracked area of device 202. Gestures and movements of hand 206 in 3D space 208 are made by user 200 in order to manipulate a perceived 3D object (not shown) on the display of device 202. In another scenario, the gestures may not be for manipulating an object but simply for the purpose of making a gesture (e.g., waving, pointing, etc.) in a virtual world environment.
  • FIG. 3A is a more detailed top view of a user utilizing a mobile device as a 3D in-line mediator for manipulating digital 3D content in accordance with one embodiment. It shows a user's head 303, a “perceived” digital 3D object 304 and a display device 306 as being in-line or aligned, with display device 306 functioning as the in-line mediator. It also shows a user's hand 308 reaching behind device 306. FIG. 3B shows a scene 310 that user head 303 sees when facing device 306 and when reaching behind device 306. The user sees digital 3D object 312 on the screen and user hand 308 or a representation of it touching or otherwise manipulating object 312. It is helpful to note that although the figures only show a user touching an object or making gestures, the user is also able to manipulate a digital 3D object in other ways, such as by touching, lifting, holding, moving, pushing, pulling, dropping, throwing, rotating, deforming, bending, stretching, compressing, squeezing, or pinching it. When the user reaches behind the monitor or device, she can move her hand(s) to manipulate 3D objects she sees on the display. As explained below, there is a depth component. If the 3D object is in front of the 3D scene she is viewing, she will not have to reach far behind the device/monitor. If the object is further back in the 3D scene, the user may have to physically reach further behind the device/monitor in order to touch (or collide with) the 3D object.
  • FIG. 4 is a flow diagram of a process of enabling in-line mediation in accordance with one embodiment. The process described in FIG. 4 begins after the user has powered on the device, which may be mobile, nomadic, or stationary. A tracking component has also been activated. The device displays 3D content, such as an online virtual world or any other form of 3D content, examples of which are provided above. The user directly faces the display; that is, sits squarely in front of the laptop or desktop monitor or holds the cell phone directly in front of her. There may be a 3D object displayed on the screen that the user wants to manipulate (e.g., pick up a ball, move a chair, etc.) or there may be a 3D world scene in which the user wants to perform a hand gesture or movement (e.g., wave to a 3D person or an avatar). Other examples that do not involve online 3D content may include moving or changing the orientation of 3D medical imaging data, playing a 3D video game, interacting with 3D content, such as a movie or show, and so on.
  • The user begins by moving a hand behind the device (hereafter, for ease of illustration, the term “device” may convey mobile device screens and laptop/desktop monitors). At step 402 a tracking component detects the presence of the user's hand. There are various ways this can be done. One conventional way is by detecting the skin tone of the user's hand. As described above, there are numerous types of tracking components or sensors that may be used. Which one is most suitable will likely depend on the features and capabilities of the device (i.e., mobile, nomadic, stationary, etc.). A typical cell phone camera is capable of detecting the presence of a human hand. An image of the hand (or hands) is transmitted to a compositing component.
  • At step 404 the hand is displayed on the screen. The user sees either an unaltered view of her hand (not including the background behind and around the hand) or an altered representation of the hand. If an image of the user's hand is displayed, known compositing techniques may be used. For example, some techniques may involve combining two video sources: one for the 3D content and another representing video images of the user's hand. Other techniques for overlaying or compositing the images of the hand over the 3D content data may be used, and which technique is most suitable will likely depend on the type of device. If the user's hand is mapped to an avatar hand or other digital representation, software from the 3D content provider or other conventional software may be used to perform a mapping of the user hand images to an avatar image, such as a robotic hand. Thus, after step 404, a representation of the user's stationary hand can be seen on the device. That is, its presence has been detected and is being represented on the device.
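As a rough illustration of the compositing step, the sketch below assumes the renderer supplies an RGB frame of the 3D content and the skin mask from the previous sketch isolates the hand; the function name, array layout, and mask convention are assumptions made for the example, not the patent's implementation.

    import numpy as np

    def composite_hand(scene_rgb, hand_rgb, hand_mask):
        """Overlay the masked camera image of the hand on the rendered 3D content.

        scene_rgb and hand_rgb are HxWx3 uint8 arrays; hand_mask is an HxW uint8
        mask (255 where skin was detected), e.g. from the detection sketch above.
        The background around the hand is discarded, as described in step 404."""
        alpha = (hand_mask.astype(np.float32) / 255.0)[..., None]
        blended = hand_rgb * alpha + scene_rgb * (1.0 - alpha)
        return blended.astype(np.uint8)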
  • At step 406 the user starts moving the hand, either by moving it up, down, left, right, inward, or outward (relative to the device), by gesturing, or both. The initial position of the hand and its subsequent movement can be described in terms of x, y, and z coordinates. The tracking component begins tracking hand movement and gesturing, which has horizontal, vertical, and depth components. For example, a user may be viewing a 3D virtual world room on the device and want to move an object from the far left corner of the room (which has a certain depth) to the near right corner of the room. In one embodiment of the invention, the user may have to move her hand to a position that is, for example, about 12 inches behind and slightly left of the device. This may require that the user extend her arm out a little further than what would be considered a normal or natural distance. After grabbing the object, as discussed in step 408 below, the user moves her hand to a position that is roughly 2-3 inches behind and to the right of the device. This example illustrates that there is a depth component in the hand tracking that is implemented to maintain the in-line mediation performed by the device.
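The depth-aware tracking described here can be illustrated by a simple coordinate mapping such as the one below. The HandSample fields and the scene_depth_scale constant are hypothetical and only show that horizontal, vertical, and depth components are carried from the sensor into the 3D scene.

    from dataclasses import dataclass

    @dataclass
    class HandSample:
        x: float  # horizontal offset from the device center, in meters
        y: float  # vertical offset from the device center, in meters
        z: float  # depth behind the device, in meters

    def sensor_to_scene(sample, scene_depth_scale=2.0):
        """Map a tracked hand position behind the device into scene coordinates.

        A hand about 12 inches (0.3 m) behind the device should reach deeper
        into the scene than one 2-3 inches behind it; scene_depth_scale is an
        illustrative constant, not a value taken from the patent."""
        return (sample.x, sample.y, sample.z * scene_depth_scale)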
  • At step 408 the digital representation of the user's hand on the device collides with or touches an object. This collision is detected by comparing sensor data from the tracking sensor with geometrical data from the 3D data repository. The user moves her hand behind the device in a way that causes the digital representation of her hand on the screen to collide with the object, at which point she can grab, pick up, or otherwise manipulate the object. The user's hand may be characterized as colliding with the perceived object that is "floating" behind the device, as described in FIG. 3A. In the described embodiment, in order to maintain the 3D in-line mediation or visual coherency, the user's eyes are looking straight at the middle of the screen. That is, there is a vertical and horizontal alignment of the user's head with the device and the 3D content. In another embodiment, the user's face may also be tracked, which may enable changes in the 3D content images to reflect movement of the user's head (i.e., a change in perspective).
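A hedged sketch of the collision test follows: it compares the tracked hand position against an object's bounding sphere taken from the 3D data repository. The radii and helper names are illustrative assumptions, not values or structures defined in the patent.

    import math

    def hand_collides(hand_pos, obj_center, obj_radius, hand_radius=0.03):
        """Report a collision when the tracked hand point enters an object's
        bounding sphere. Positions are in the scene coordinates produced by
        sensor_to_scene() above; the radii are illustrative values in meters."""
        dx, dy, dz = (h - o for h, o in zip(hand_pos, obj_center))
        distance = math.sqrt(dx * dx + dy * dy + dz * dz)
        return distance <= obj_radius + hand_radius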
  • In one embodiment, an "input-output coincidence" model is used to close what is referred to in human-computer interaction as the perception-action loop, where perception is what the user sees and action is what the user does. This enables a user to see the consequences of an interaction, such as touching a 3D object, immediately. As described above, the user's hand is aligned with or in the same position as the 3D object that is being manipulated. That is, from the user's perspective, the hand is aligned with the 3D object so that it looks like the user is lifting or moving a 3D object as if it were a physical object. What the user sees makes sense based on the action being taken by the user. In one embodiment, the system provides tactile feedback to the user upon detecting a collision between the user's hand and the 3D object.
  • At step 410 the image of the 3D scene is modified to reflect the user's manipulation of the 3D object. If there is no manipulation of a 3D object (and thus no object collision), the image on the screen still changes as the user moves her hand, just as it does when the user manipulates a 3D object. The changes in the 3D image on the screen may be made using known methods for processing 3D content data. These methods or techniques may vary depending on the type of device, the source of the data, and other factors. The process then repeats by returning to step 402, where the presence of the user's hand is again detected. The process described in FIG. 4 is continuous in that the user's hand movement is tracked for as long as it is within the range of the tracking component. In the described embodiment, the device is able to perform as a 3D in-line mediator as long as the user's head or perspective is kept in line with the device, which, in turn, allows the user's hand movements behind the device to be visually coherent with the hand movements shown on the screen and vice versa. That is, the user moves her hand in the physical world based on actions she wants to perform in the digital 3D environment shown on the screen.
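Tying the steps together, the loop below is an illustrative restatement of the FIG. 4 process. All five parameters are hypothetical stand-ins for the sensor, the tracker, the renderer, the display path, and the 3D content, and the loop reuses the hand_collides() sketch shown earlier; it is not the patent's implementation.

    def mediation_loop(capture_frame, track_hand, render_scene, show, scene_objects):
        """Continuous restatement of the FIG. 4 process as a loop (illustrative)."""
        while True:
            frame = capture_frame()                # step 402: sense behind the device
            hand = track_hand(frame)               # detection plus x, y, z tracking
            if hand is None:
                continue                           # no hand within tracking range
            for obj in scene_objects:              # step 408: test for collisions
                if hand_collides(hand.position, obj.center, obj.radius):
                    obj.move_to(hand.position)     # step 410: modify the 3D object
            show(render_scene(scene_objects), hand)  # step 404: display hand over scene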
  • FIG. 5 is a block diagram showing relevant components of a device capable of functioning as a 3D in-line mediator in accordance with one embodiment. Many of the components shown here have been described above. A device 500 has a display component (not shown) for displaying digital 3D content data 501, which may be stored in mass storage or in a local cache (not shown) on device 500 or may be downloaded from the Internet or another source. A tracking sensor component 502 may include one or more conventional (2D) cameras, 3D (depth) cameras, and non-camera peripherals. A 3D camera may provide depth data, which simplifies gesture recognition by use of depth keying. In another embodiment, a wide-angle lens may be used in a camera, which may require less processing by an imaging system but may produce more distortion. Component 502 may also have other capabilities as described above; for example, infrared detection, optic flow, image differentiation, redshift thermal imaging, spectral processing, and other techniques may be used in tracking component 502. Tracking sensor component 502 is responsible for tracking the position of body parts within the range of detection. This position data is transmitted to hand tracking module 504 and to face tracking module 506, each of which identifies the features relevant to it.
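One possible, purely illustrative way to wire the FIG. 5 modules together in code is sketched below; the class names and the dictionary-based position records are assumptions made for the example, not structures defined in the patent.

    class HandTrackingModule:
        """Extracts finger, wrist, and arm positions from sensor data (module 504)."""
        def update(self, positions):
            self.hand_points = [p for p in positions if p.get("part") == "hand"]

    class FaceTrackingModule:
        """Tracks the user's head so rendering can follow perspective (module 506)."""
        def update(self, positions):
            self.face_points = [p for p in positions if p.get("part") == "face"]

    class Device500:
        """How the FIG. 5 components might be wired together (illustrative only)."""
        def __init__(self, content_3d):
            self.content_3d = content_3d               # 3D content data 501
            self.hand_tracker = HandTrackingModule()   # hand tracking module 504
            self.face_tracker = FaceTrackingModule()   # face tracking module 506

        def on_sensor_frame(self, positions):
            # Tracking sensor component 502 sends position data to both modules;
            # each module then identifies the features relevant to it.
            self.hand_tracker.update(positions)
            self.face_tracker.update(positions)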
  • Hand tracking module 504 identifies features of the user's hand positions, including the positions of the fingers, wrist, and arm. It determines the location of these body parts in the 3D environment. Data from module 504 goes to two components related to hand and arm position: gesture detection module 508 and hand collision detection module 510. In one embodiment, a user “gesture” results in a modification of 3D content 501. A gesture may include lifting, holding, squeezing, pinching, or rotating a 3D object. These actions typically result in some type of modification of the object in the 3D environment. A modification of an object may include a change in its location (lifting or turning) without there being an actual deformation or change in shape of the object. The gesture detection data may be applied directly to the graphics data representing 3D content 501.
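A minimal sketch of how gesture detection module 508 might label a pose from tracked fingertip positions is shown below; the 2 cm pinch threshold, the upward-motion test, and the gesture names are illustrative assumptions rather than details from the patent.

    import math

    def classify_gesture(thumb_tip, index_tip, prev_thumb_positions):
        """Label a hand pose for gesture detection module 508 (illustrative only).

        thumb_tip and index_tip are (x, y, z) tuples in scene coordinates."""
        if math.dist(thumb_tip, index_tip) < 0.02:
            return "pinch"                       # thumb and index nearly touching
        if prev_thumb_positions and thumb_tip[1] - prev_thumb_positions[-1][1] > 0.05:
            return "lift"                        # hand moved upward since last sample
        return "none"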
  • In another embodiment, tracking sensor component 502 may also track the user's face. In this case, face tracking data is transmitted to face tracking module 506. Face tracking may be utilized in cases where the user is not vertically aligned (i.e., the user's head is not looking directly at the middle of the screen) with the device and the perceived object.
  • In another embodiment, data from hand collision detection module 510 may be transmitted to a tactile feedback controller 512, which is connected to one or more actuators 514 that are external to device 500. In this embodiment, the user may receive haptic feedback when the user's hand collides with a 3D object. Generally, it is preferred that actuators 514 be as unobtrusive as possible. In one embodiment, they are vibrating wristbands, which may be wired or wireless. Using wristbands allows for bare-hand manipulation of 3D content as described above. Tactile feedback controller 512 receives a signal that there is a collision or contact and causes tactile actuators 514 to provide a physical sensation to the user. For example, with vibrating wristbands, the user's wrist will sense a vibration or similar physical sensation indicating contact with the 3D object.
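The tactile path could look roughly like the following sketch, in which the wristband interface is represented by an arbitrary callable; none of these names correspond to a real wristband API, and the pulse parameters are illustrative.

    class TactileFeedbackController:
        """Sketch of tactile feedback controller 512 driving wristband actuators 514.

        send_pulse is a stand-in for whatever wired or wireless interface the
        wristband exposes; it is an assumption, not a real device API."""
        def __init__(self, send_pulse):
            self.send_pulse = send_pulse

        def on_collision(self, strength=1.0, duration_ms=80):
            # Called by hand collision detection module 510 when contact occurs.
            self.send_pulse(strength, duration_ms)

    # Illustrative wiring with a fake actuator that just prints the pulse.
    controller = TactileFeedbackController(lambda s, ms: print(f"buzz {s:.1f} for {ms} ms"))
    controller.on_collision()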
  • As is evident from the figures and the various embodiments, the present invention enables a user to interact with digital 3D content in a natural and immersive way by providing visual coherency, thereby creating an immersive volumetric interaction with the 3D content. In one embodiment, a user uploads or executes 3D content onto a mobile computing device, such as a cell phone. This 3D content may be a virtual world that the user has visited using a browser on the mobile device (e.g., Second Life or any other site that provides virtual world content). Other examples include movies, video games, online virtual cities, medical imaging (e.g., examining MRIs), modeling and prototyping, information visualization, architecture, tele-immersion and collaboration, and geographic information systems (e.g., Google Earth). The user holds the display of the device upright at a comfortable distance in front of the user's eyes, for example at 20-30 centimeters. The display of the mobile device is used as a window into the virtual world. Using the mobile device as an in-line mediator between the user and the user's hand, the user is able to manipulate 3D objects shown on the display by reaching behind the display of the device and making hand gestures and movements around a perceived object behind the display. The user sees the gestures and movements on the display along with the 3D object that they are affecting.
  • As discussed above, one aspect of creating an immersive and natural user interaction with 3D content using a mobile device is enabling the user to have bare-hand interaction with objects in the virtual world. That is, allowing the user to manipulate and “touch” digital 3D objects using the mobile device and not requiring the user to use any peripheral devices, such as gloves, finger sensors, motion detectors, and the like.
  • FIGS. 6A and 6B illustrate a computing system 600 suitable for implementing embodiments of the present invention. FIG. 6A shows one possible physical form of the computing system. Of course, the computing system may have many physical forms, including an integrated circuit, a printed circuit board, a small handheld device (such as a mobile telephone, handset, or PDA), a personal computer, or a supercomputer. Computing system 600 includes a monitor 602, a display 604, a housing 606, a disk drive 608, a keyboard 610, and a mouse 612. Disk 614 is a computer-readable medium used to transfer data to and from computer system 600.
  • FIG. 6B is an example of a block diagram for computing system 600. Attached to system bus 620 are a wide variety of subsystems. Processor(s) 622 (also referred to as central processing units, or CPUs) are coupled to storage devices including memory 624. Memory 624 includes random access memory (RAM) and read-only memory (ROM). As is well known in the art, ROM acts to transfer data and instructions uni-directionally to the CPU, and RAM is typically used to transfer data and instructions in a bi-directional manner. Both of these types of memories may include any suitable computer-readable media described below. A fixed disk 626 is also coupled bi-directionally to CPU 622; it provides additional data storage capacity and may also include any of the computer-readable media described below. Fixed disk 626 may be used to store programs, data, and the like and is typically a secondary storage medium (such as a hard disk) that is slower than primary storage. It will be appreciated that the information retained within fixed disk 626 may, in appropriate cases, be incorporated in standard fashion as virtual memory in memory 624. Removable disk 614 may take the form of any of the computer-readable media described below.
  • CPU 622 is also coupled to a variety of input/output devices such as display 604, keyboard 610, mouse 612, and speakers 630. In general, an input/output device may be any of: video displays, trackballs, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, biometric readers, or other computers. CPU 622 optionally may be coupled to another computer or telecommunications network using network interface 640. With such a network interface, it is contemplated that the CPU might receive information from the network, or might output information to the network, in the course of performing the above-described method steps. Furthermore, method embodiments of the present invention may execute solely upon CPU 622 or may execute over a network such as the Internet in conjunction with a remote CPU that shares a portion of the processing.
  • Although illustrative embodiments and applications of this invention are shown and described herein, many variations and modifications are possible which remain within the concept, scope, and spirit of the invention, and these variations would become clear to those of ordinary skill in the art after perusal of this application. Accordingly, the embodiments described are illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (32)

1. A method of detecting manipulation of a digital 3D object displayed on a device having a front side with a display and a back side, the method comprising:
detecting a hand within a specific area of the back side of the device, the back side having a sensor;
displaying the hand on the display;
tracking movement of the hand within the specific area of the back side, wherein said movement is caused by a user intending to manipulate the displayed 3D object;
detecting a collision between the displayed hand and the displayed 3D object; and
modifying an image of the 3D object displayed on the device,
wherein the device is a 3D in-line mediator between the user and the 3D object.
2. A method as recited in claim 1 further comprising:
detecting a hand gesture within the specific area.
3. A method as recited in claim 1 wherein modifying an image of the 3D object further comprises:
deforming the image of the 3D object.
4. A method as recited in claim 1 wherein modifying an image of the 3D object further comprises:
moving the image of the 3D object.
5. A method as recited in claim 1 further comprising:
displaying the modified image on the device.
6. A method as recited in claim 1 wherein the user reaches behind the device to manipulate a perceived object corresponding to the 3D object, such that the hand is within the specific area of the back side of the device.
7. A method as recited in claim 6 further comprising:
providing the user with visual coherency when the user reaches behind the device.
8. A method as recited in claim 1 wherein tracking movement of the hand further comprises:
processing depth data of said hand movement.
9. A method as recited in claim 1 further comprising:
executing tracking software.
10. A method as recited in claim 1 wherein the sensor is a tracking component that faces outward from the back side of the device and wherein the sensor is a camera.
11. A method as recited in claim 1 wherein displaying the hand further comprises:
displaying a composited image of the hand on the display.
12. A method as recited in claim 1 wherein displaying the hand further comprises:
displaying a virtualized image of the hand on the display.
13. A method as recited in claim 1 further comprising:
providing haptic feedback to the hand when a collision is detected between the displayed hand and the 3D object.
14. A method as recited in claim 1 wherein there is no contact between the hand and either the back side of the device or the display.
15. A device having a display, the device comprising:
a processor;
a memory storing digital 3D content data;
a tracking sensor component for tracking movement of an object in proximity of the device, wherein the tracking sensor component is on a back side of the device facing away from a user;
a hand tracking module for processing movement data related to a user hand; and
a hand-3D object collision module for detecting a collision between the user hand and a 3D object.
16. A device as recited in claim 15 further comprising:
a face tracking sensor component for tracking face movement in proximity of a front side of a device; and
a face tracking module for processing face movement data related to user face movement in front of the device.
17. A device as recited in claim 15 further comprising:
a hand gesture detection module for detecting user hand gestures made within range of the tracking sensor component.
18. A device as recited in claim 15 further comprising:
a tactile feedback controller for providing tactile feedback to the user hand.
19. A device as recited in claim 15 wherein the tracking sensor component is a camera-based component.
20. A device as recited in claim 15 wherein the tracking sensor component is one of an image differentiator, infrared detector, optic flow component, and spectral processor.
21. A device as recited in claim 15 wherein the tracking sensor component tracks movements of the hand when the user moves the hand behind the device within the range of the tracking sensor component.
22. A device as recited in claim 15 wherein the device is one of a mobile device, a nomadic device, and a stationary device.
23. A device as recited in claim 15 further comprising a network interface for connecting to a network to receive digital 3D content data.
24. An apparatus for manipulating digital 3D content, the apparatus having a front side with a display and a back side, the apparatus comprising:
means for detecting a hand within a specific area of the back side of the apparatus, the back side having a sensor;
means for displaying the hand on the apparatus;
means for tracking movement of the hand within the specific area of the back side, wherein said movement is caused by a user intending to manipulate a displayed 3D object;
means for detecting a collision between the displayed hand and the displayed 3D object; and
means for modifying an image of the 3D object displayed on the apparatus,
wherein the apparatus is a 3D in-line mediator between the user and the 3D object.
25. An apparatus as recited in claim 24 further comprising:
means for detecting a hand gesture within the specific area.
26. An apparatus as recited in claim 24 wherein means for modifying an image of the 3D object further comprises:
means for moving the image of the 3D object.
27. An apparatus as recited in claim 24 further comprising:
means for displaying the modified image on the apparatus.
28. An apparatus as recited in claim 24 wherein the user reaches behind the apparatus to manipulate a perceived object corresponding to the 3D object, such that the hand is within the specific area of the back side of the apparatus.
29. An apparatus as recited in claim 24 wherein means for tracking movement of the hand further comprises:
means for processing depth data of said hand movement.
30. An apparatus as recited in claim 24 wherein the sensor is a tracking component that faces outward from the back side of the apparatus and wherein the sensor is a camera.
31. An apparatus as recited in claim 24 wherein means for displaying the hand further comprises:
means for displaying a composited image of the hand on the apparatus.
32. An apparatus as recited in claim 24 further comprising:
means for providing haptic feedback to the hand when a collision is detected between the displayed hand and the 3D object.

US9575710B2 (en) 2012-03-19 2017-02-21 Lenovo (Beijing) Co., Ltd. Electronic device and information processing method thereof
KR20140135839A (en) * 2012-04-20 2014-11-26 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Online game experience using multiple devices
US9201495B2 (en) * 2012-04-24 2015-12-01 Mobitv, Inc. Control of perspective in multi-dimensional media
US9746916B2 (en) 2012-05-11 2017-08-29 Qualcomm Incorporated Audio user interaction recognition and application interface
US9736604B2 (en) 2012-05-11 2017-08-15 Qualcomm Incorporated Audio user interaction recognition and context refinement
US9381427B2 (en) * 2012-06-01 2016-07-05 Microsoft Technology Licensing, Llc Generic companion-messaging between media platforms
US9690465B2 (en) 2012-06-01 2017-06-27 Microsoft Technology Licensing, Llc Control of remote applications using companion device
EP2677500B1 (en) 2012-06-19 2021-06-23 Samsung Electronics Co., Ltd. Event-based image processing apparatus and method
US9092184B2 (en) * 2012-06-22 2015-07-28 Harborside Press, LLC Interactive synchronized multi-screen display
JP5910880B2 (en) * 2012-06-22 2016-04-27 コニカミノルタ株式会社 Operation display system, program, portable terminal, operation display device, portable terminal program
US8532675B1 (en) 2012-06-27 2013-09-10 Blackberry Limited Mobile communication device user interface for manipulation of data items in a physical space
US20140006472A1 (en) * 2012-06-28 2014-01-02 David Van Brink Locating a Device
US9235373B2 (en) * 2012-06-30 2016-01-12 At&T Intellectual Property I, L.P. Real-time management of content depicted on a plurality of displays
US10076685B2 (en) 2012-07-02 2018-09-18 Russell Brands, Llc Operations with instrumented game ball
WO2014008134A1 (en) 2012-07-02 2014-01-09 Infomotion Sports Technologies, Inc. Computer-implemented capture of live sporting event data
US20140089850A1 (en) * 2012-09-22 2014-03-27 Tourwrist, Inc. Systems and Methods of Using Motion Control to Navigate Panoramas and Virtual Tours
KR20140004448A (en) * 2012-07-03 2014-01-13 삼성전자주식회사 Method and apparatus for supplying image
TW201403446A (en) * 2012-07-09 2014-01-16 Hon Hai Prec Ind Co Ltd System and method for displaying software interface
US9305514B1 (en) * 2012-07-31 2016-04-05 Trend Micro Inc. Detection of relative positions of tablet computers
US20140037135A1 (en) * 2012-07-31 2014-02-06 Omek Interactive, Ltd. Context-driven adjustment of camera parameters
US9838573B2 (en) * 2012-09-18 2017-12-05 Samsung Electronics Co., Ltd Method for guiding controller to move to within recognizable range of multimedia apparatus, the multimedia apparatus, and target tracking apparatus thereof
GB2506203B (en) * 2012-09-25 2016-12-14 Jaguar Land Rover Ltd Method of interacting with a simulated object
US9026924B2 (en) * 2012-10-05 2015-05-05 Lenovo (Singapore) Pte. Ltd. Devices, systems, and methods for moving electronic windows between displays
JP6178066B2 (en) * 2012-11-06 2017-08-09 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus, information processing method, program, and information storage medium
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
US20140132833A1 (en) * 2012-11-12 2014-05-15 Hulu, LLC Combining Multiple Screens from Multiple Devices in Video Playback
KR101480300B1 (en) * 2012-11-30 2015-01-12 한국과학기술원 Interactive contents provision system and method using relative distance between the main screen and the second screen
US10101905B1 (en) * 2012-12-07 2018-10-16 American Megatrends, Inc. Proximity-based input device
US8976172B2 (en) 2012-12-15 2015-03-10 Realitycap, Inc. Three-dimensional scanning using existing sensors on portable electronic devices
KR101956073B1 (en) 2012-12-20 2019-03-08 삼성전자주식회사 3d volumetric display device for providing user interface using visual indicator and method thereof
CN103902195B (en) * 2012-12-28 2017-02-22 鸿富锦精密工业(武汉)有限公司 Automatic regulation system and method for display screen
CN104364750B (en) * 2013-01-06 2019-07-16 英特尔公司 The pretreated methods, devices and systems of distribution controlled for touch data and display area
US10609285B2 (en) 2013-01-07 2020-03-31 Ultrahaptics IP Two Limited Power consumption in motion-capture systems
US9626015B2 (en) 2013-01-08 2017-04-18 Leap Motion, Inc. Power consumption in motion-capture systems with audio and optical signals
US9459697B2 (en) 2013-01-15 2016-10-04 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
US10042510B2 (en) 2013-01-15 2018-08-07 Leap Motion, Inc. Dynamic user interactions for display control and measuring degree of completeness of user gestures
CN103945030B (en) * 2013-01-17 2017-06-30 信泰光学(深圳)有限公司 Modular appliance and its operating method
KR101822463B1 (en) 2013-01-21 2018-01-26 삼성전자주식회사 Apparatus for arranging a plurality of Icons on Screen and Operation Method Thereof
US9161167B2 (en) * 2013-01-23 2015-10-13 Qualcomm Incorporated Visual identifier of third party location
TWI510084B (en) * 2013-01-25 2015-11-21 Sintai Optical Shenzhen Co Ltd Combination apparatus and operating method thereof
US9330471B2 (en) * 2013-02-14 2016-05-03 Qualcomm Incorporated Camera aided motion direction and speed estimation
US20140236726A1 (en) * 2013-02-18 2014-08-21 Disney Enterprises, Inc. Transference of data associated with a product and/or product package
US9858031B2 (en) * 2013-03-11 2018-01-02 International Business Machines Corporation Colony desktop hive display: creating an extended desktop display from multiple mobile devices using near-field or other networking
US9210526B2 (en) * 2013-03-14 2015-12-08 Intel Corporation Audio localization techniques for visual effects
US20140274384A1 (en) * 2013-03-15 2014-09-18 Electronic Arts Inc. Delivering and consuming interactive video gaming content
WO2014200589A2 (en) 2013-03-15 2014-12-18 Leap Motion, Inc. Determining positional information for an object in space
US9933987B2 (en) * 2013-03-26 2018-04-03 Nec Display Solutions, Ltd. Multi-display system
US10620709B2 (en) 2013-04-05 2020-04-14 Ultrahaptics IP Two Limited Customized gesture interpretation
KR102536174B1 (en) * 2013-04-08 2023-05-26 스냅 아이엔씨 Distance estimation using multi-camera device
US20140315489A1 (en) * 2013-04-22 2014-10-23 Htc Corporation Method for performing wireless display sharing, and associated apparatus and associated computer program product
US9395764B2 (en) * 2013-04-25 2016-07-19 Filippo Costanzo Gestural motion and speech interface control method for 3d audio-video-data navigation on handheld devices
US9916009B2 (en) 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
EP2801891B1 (en) 2013-05-09 2018-12-26 Samsung Electronics Co., Ltd Input Apparatus, Pointing Apparatus, Method for Displaying Pointer, and Recordable Medium
US9417835B2 (en) * 2013-05-10 2016-08-16 Google Inc. Multiplayer game for display across multiple devices
US9747696B2 (en) 2013-05-17 2017-08-29 Leap Motion, Inc. Systems and methods for providing normalized parameters of motions of objects in three-dimensional space
KR101469178B1 (en) 2013-05-22 2014-12-04 주식회사 인피니트헬스케어 System and method for medical image displaying using multiple mobile devices
US9727298B2 (en) * 2013-05-28 2017-08-08 Sony Corporation Device and method for allocating data based on an arrangement of elements in an image
US20140365558A1 (en) * 2013-06-05 2014-12-11 Wolfgis, Llc System and method for visualizing complex gis location-based datasets
KR20150000783A (en) * 2013-06-25 2015-01-05 삼성전자주식회사 Display method and apparatus with multi-screens
US10281987B1 (en) 2013-08-09 2019-05-07 Leap Motion, Inc. Systems and methods of free-space gestural interaction
US10846942B1 (en) 2013-08-29 2020-11-24 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
JP2015061107A (en) * 2013-09-17 2015-03-30 株式会社リコー Distribution management device and distribution system
US10132635B2 (en) * 2013-09-17 2018-11-20 Invensense, Inc. Method and apparatus for misalignment between device and pedestrian using vision
US9456148B1 (en) * 2013-09-18 2016-09-27 Amazon Technologies, Inc. Multi-setting preview for image capture
JP6427858B2 (en) * 2013-09-19 2018-11-28 セイコーエプソン株式会社 Display system, image display apparatus, and display system control method
US9632572B2 (en) 2013-10-03 2017-04-25 Leap Motion, Inc. Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US9536353B2 (en) 2013-10-03 2017-01-03 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
US9715764B2 (en) * 2013-10-03 2017-07-25 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
US9630631B2 (en) 2013-10-03 2017-04-25 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
US9547173B2 (en) 2013-10-03 2017-01-17 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
US9996638B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Predictive information for free space gesture control and communication
NL2011867C2 (en) * 2013-11-29 2015-06-01 Jasper Vis Method and device for processing content for display.
JP2015115046A (en) * 2013-12-16 2015-06-22 ソニー株式会社 Information processing apparatus, program, information processing method, and information processing system
KR101548228B1 (en) * 2013-12-27 2015-08-28 주식회사 케이티 Apparatus for synchronizing user interface based on user state and method thereof
JP2015127897A (en) * 2013-12-27 2015-07-09 ソニー株式会社 Display control device, display control system, display control method, and program
US9613262B2 (en) 2014-01-15 2017-04-04 Leap Motion, Inc. Object detection and tracking for providing a virtual device experience
US20150242179A1 (en) * 2014-02-21 2015-08-27 Smart Technologies Ulc Augmented peripheral content using mobile device
CN106062808A (en) 2014-02-28 2016-10-26 罗素商标有限责任公司 Data processing inside gaming device
US20150268917A1 (en) * 2014-03-20 2015-09-24 Nokia Technologies Oy Apparatus, method, and computer program product for aligning images viewed across multiple displays
KR20150117018A (en) * 2014-04-09 2015-10-19 삼성전자주식회사 Computing apparatus, method for controlling computing apparatus thereof, and multi-display system
KR20150130845A (en) * 2014-05-14 2015-11-24 삼성전자주식회사 Apparatus and Device for Position Measuring of Electronic Apparatuses
DE202014103729U1 (en) 2014-08-08 2014-09-09 Leap Motion, Inc. Augmented reality with motion detection
US9880799B1 (en) 2014-08-26 2018-01-30 Sprint Communications Company L.P. Extendable display screens of electronic devices
CN105573687B (en) * 2014-10-10 2019-10-15 鸿富锦精密工业(深圳)有限公司 Display control unit, electronic equipment and display control method
GB2531531A (en) * 2014-10-20 2016-04-27 Bae Systems Plc Optical inertial measurement apparatus and method
US11504029B1 (en) 2014-10-26 2022-11-22 David Martin Mobile control using gait cadence
US11045116B1 (en) * 2017-09-15 2021-06-29 David Martin Enhanced determination of cadence for control in mobile
US10342462B2 (en) * 2014-10-26 2019-07-09 David Martin Application of gait characteristics for mobile
TWI521387B (en) * 2014-12-25 2016-02-11 國立臺灣大學 A re-anchorable virtual panel in 3d space
US10762534B1 (en) * 2014-12-29 2020-09-01 Groupon, Inc. Motion data based consumer interfaces
US9607428B2 (en) 2015-06-30 2017-03-28 Ariadne's Thread (Usa), Inc. Variable resolution virtual reality display system
KR101808759B1 (en) 2015-06-30 2017-12-13 인천대학교 산학협력단 Hand Gesture Interaction System and Method
US10261985B2 (en) 2015-07-02 2019-04-16 Microsoft Technology Licensing, Llc Output rendering in dynamic redefining application
US10198252B2 (en) 2015-07-02 2019-02-05 Microsoft Technology Licensing, Llc Transformation chain application splitting
US9733993B2 (en) 2015-07-02 2017-08-15 Microsoft Technology Licensing, Llc Application sharing using endpoint interface entities
US9658836B2 (en) 2015-07-02 2017-05-23 Microsoft Technology Licensing, Llc Automated generation of transformation chain compatible class
US9712472B2 (en) 2015-07-02 2017-07-18 Microsoft Technology Licensing, Llc Application spawning responsive to communication
US9785484B2 (en) 2015-07-02 2017-10-10 Microsoft Technology Licensing, Llc Distributed application interfacing across different hardware
US9860145B2 (en) 2015-07-02 2018-01-02 Microsoft Technology Licensing, Llc Recording of inter-application data flow
US9733915B2 (en) 2015-07-02 2017-08-15 Microsoft Technology Licensing, Llc Building of compound application chain applications
US10198405B2 (en) 2015-07-08 2019-02-05 Microsoft Technology Licensing, Llc Rule-based layout of changing information
US10031724B2 (en) 2015-07-08 2018-07-24 Microsoft Technology Licensing, Llc Application operation responsive to object spatial status
US9454010B1 (en) 2015-08-07 2016-09-27 Ariadne's Thread (Usa), Inc. Wide field-of-view head mounted display system
US9606362B2 (en) 2015-08-07 2017-03-28 Ariadne's Thread (Usa), Inc. Peripheral field-of-view illumination system for a head mounted display
US9990008B2 (en) 2015-08-07 2018-06-05 Ariadne's Thread (Usa), Inc. Modular multi-mode virtual reality headset
KR102433879B1 (en) * 2015-08-21 2022-08-18 삼성전자주식회사 Display apparatus and control method thereof
WO2017034886A1 (en) 2015-08-24 2017-03-02 Pcms Holdings, Inc. Systems and methods for enhancing augmented reality experience with dynamic output mapping
US10277582B2 (en) 2015-08-27 2019-04-30 Microsoft Technology Licensing, Llc Application service architecture
US10572209B2 (en) 2015-09-16 2020-02-25 Nec Display Solutions, Ltd. Multi-display system
WO2017062289A1 (en) 2015-10-08 2017-04-13 Pcms Holdings, Inc. Methods and systems of automatic calibration for dynamic display configurations
KR102402048B1 (en) * 2015-10-14 2022-05-26 삼성전자주식회사 Electronic apparatus and the controlling method thereof
US11609427B2 (en) 2015-10-16 2023-03-21 Ostendo Technologies, Inc. Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays
WO2017068926A1 (en) * 2015-10-21 2017-04-27 ソニー株式会社 Information processing device, control method therefor, and computer program
US11106273B2 (en) * 2015-10-30 2021-08-31 Ostendo Technologies, Inc. System and methods for on-body gestural interfaces and projection displays
US10007339B2 (en) * 2015-11-05 2018-06-26 Oculus Vr, Llc Controllers with asymmetric tracking patterns
WO2017086508A1 (en) * 2015-11-19 2017-05-26 엘지전자 주식회사 Mobile terminal and control method therefor
US10061552B2 (en) * 2015-11-25 2018-08-28 International Business Machines Corporation Identifying the positioning in a multiple display grid
CN106817508B (en) * 2015-11-30 2019-11-22 华为技术有限公司 A kind of synchronization object determines methods, devices and systems
US10345594B2 (en) 2015-12-18 2019-07-09 Ostendo Technologies, Inc. Systems and methods for augmented near-eye wearable displays
US10578882B2 (en) 2015-12-28 2020-03-03 Ostendo Technologies, Inc. Non-telecentric emissive micro-pixel array light modulators and methods of fabrication thereof
KR102418003B1 (en) * 2016-01-04 2022-07-07 삼성전자주식회사 Display Content using a Plurality of Display Devices
US10021373B2 (en) 2016-01-11 2018-07-10 Microsoft Technology Licensing, Llc Distributing video among multiple display zones
CN105824593B (en) * 2016-03-09 2018-11-13 京东方科技集团股份有限公司 Splice screen display system and splicing display method
US10115234B2 (en) 2016-03-21 2018-10-30 Accenture Global Solutions Limited Multiplatform based experience generation
US20170285813A1 (en) * 2016-03-29 2017-10-05 Microsoft Technology Licensing, Llc Touch-Input Support for an External Touch-Capable Display Device
US9459692B1 (en) * 2016-03-29 2016-10-04 Ariadne's Thread (Usa), Inc. Virtual reality headset with relative motion head tracker
US10353203B2 (en) 2016-04-05 2019-07-16 Ostendo Technologies, Inc. Augmented/virtual reality near-eye displays with edge imaging lens comprising a plurality of display devices
US10453431B2 (en) 2016-04-28 2019-10-22 Ostendo Technologies, Inc. Integrated near-far light field display systems
US10522106B2 (en) 2016-05-05 2019-12-31 Ostendo Technologies, Inc. Methods and apparatus for active transparency modulation
US10168976B2 (en) * 2016-05-20 2019-01-01 International Business Machines Corporation Dynamic display arrangement
CN109964321A (en) 2016-10-13 2019-07-02 六度空间有限责任公司 Method and apparatus for indoor positioning
CN108733211B (en) * 2017-04-21 2020-05-22 宏达国际电子股份有限公司 Tracking system, operation method thereof, controller and computer readable recording medium
KR101957417B1 (en) * 2017-06-14 2019-03-12 (주)이더블유비엠 Motion recognition control system and method for separately displaying shot image
US11093197B2 (en) * 2017-07-31 2021-08-17 Stmicroelectronics, Inc. System and method to increase display area utilizing a plurality of discrete displays
US11231896B2 (en) * 2017-08-25 2022-01-25 Intel Corporation Configuring display screen coordinates
US10569172B2 (en) * 2017-09-19 2020-02-25 Canon Kabushiki Kaisha System and method of configuring a virtual camera
EP3502837B1 (en) * 2017-12-21 2021-08-11 Nokia Technologies Oy Apparatus, method and computer program for controlling scrolling of content
CN107976811B (en) * 2017-12-25 2023-12-29 河南诺控信息技术有限公司 Virtual reality mixing-based laboratory simulation method
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
CN110830704B (en) * 2018-08-07 2021-10-22 纳宝株式会社 Method and device for generating rotating image
CN110796116A (en) * 2018-11-08 2020-02-14 英属开曼群岛商麦迪创科技股份有限公司 Multi-panel display system, vehicle with multi-panel display system and display method
WO2020112900A2 (en) 2018-11-30 2020-06-04 Pcms Holdings, Inc. Method for mirroring 3d objects to light field displays
US10867402B2 (en) 2019-03-01 2020-12-15 Here Global B.V. System and method for determining distance to object on road
US10849179B1 (en) 2019-05-29 2020-11-24 Bank Of America Corporation Mobile network tool
US11087489B2 (en) * 2019-06-03 2021-08-10 Disney Enterprises, Inc. Systems and methods to facilitate interaction by one or more participants with content presented across multiple distinct physical locations
KR20210069491A (en) 2019-12-03 2021-06-11 삼성전자주식회사 Electronic apparatus and Method for controlling the display apparatus thereof
US11120639B1 (en) 2020-04-24 2021-09-14 Microsoft Technology Licensing, Llc Projecting telemetry data to visualization models
US11594192B2 (en) * 2020-05-21 2023-02-28 Dell Products, L.P. Generating multi-monitor recommendations
CN112399148B (en) * 2020-11-11 2023-05-23 郑州捷安高科股份有限公司 Virtual monitoring method and device based on virtual three-dimensional scene
JP7287379B2 (en) * 2020-12-10 2023-06-06 セイコーエプソン株式会社 DISPLAY METHOD, DETECTION DEVICE, AND PROGRAM
US11550404B2 (en) * 2021-05-14 2023-01-10 Microsoft Technology Licensing, Llc Tilt-responsive techniques for sharing content
US11451709B1 (en) 2021-11-29 2022-09-20 Unity Technologies Sf Increasing dynamic range of a virtual production display
US20230021589A1 (en) * 2022-09-30 2023-01-26 Intel Corporation Determining external display orientation using ultrasound time of flight

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06508222A (en) * 1991-05-23 1994-09-14 アタリ ゲームズ コーポレーション modular display simulator
US5764871A (en) * 1993-10-21 1998-06-09 Eastman Kodak Company Method and apparatus for constructing intermediate images for a depth image from stereo images using velocity vector fields
US5956046A (en) * 1997-12-17 1999-09-21 Sun Microsystems, Inc. Scene synchronization of multiple computer displays
JP2001091218A (en) 1999-09-14 2001-04-06 Mitsubishi Electric Inf Technol Center America Inc Device and method for detecting three-dimensional movement
US7091926B2 (en) * 2002-02-08 2006-08-15 Kulas Charles J Computer display system using multiple screens
JP3880561B2 (en) * 2002-09-05 2007-02-14 株式会社ソニー・コンピュータエンタテインメント Display system
CN100468515C (en) * 2003-12-19 2009-03-11 思比驰盖尔公司 Display of visual data as a function of position of display device
US20060020562A1 (en) * 2004-07-21 2006-01-26 University Of Southern Mississippi Apparatus and method for estimating optical flow
JP5080273B2 (en) 2005-01-07 2012-11-21 クアルコム,インコーポレイテッド Tilt sensor based on optical flow
JP2006208566A (en) 2005-01-26 2006-08-10 Nec Corp Liquid crystal display apparatus, mobile communication terminal device and liquid crystal display method
US7860301B2 (en) * 2005-02-11 2010-12-28 Macdonald Dettwiler And Associates Inc. 3D imaging system
EP1950708A4 (en) 2005-09-15 2010-11-24 Oleg Stanilasvovich Rurin Method and system for visualising virtual three-dimensional objects
US8855848B2 (en) * 2007-06-05 2014-10-07 GM Global Technology Operations LLC Radar, lidar and camera enhanced methods for vehicle dynamics estimation
KR20090011316A (en) 2007-07-25 2009-02-02 중앙대학교 산학협력단 Active display apparatus for stereoscopic and an operation method thereof
EP2061008B1 (en) * 2007-11-16 2011-01-26 Honda Research Institute Europe GmbH Method and device for continuous figure-ground segmentation in images from dynamic visual scenes
US8269842B2 (en) 2008-06-11 2012-09-18 Nokia Corporation Camera gestures for user interface control
KR101665034B1 (en) * 2008-08-22 2016-10-24 구글 인코포레이티드 Navigation in a three dimensional environment on a mobile device
US20100053151A1 (en) * 2008-09-02 2010-03-04 Samsung Electronics Co., Ltd In-line mediation for manipulating three-dimensional content on a display device
US20100058205A1 (en) * 2008-09-04 2010-03-04 Motorola, Inc. Reconfigurable multiple-screen display
US8803816B2 (en) * 2008-09-08 2014-08-12 Qualcomm Incorporated Multi-fold mobile device with configurable interface
JP5504645B2 (en) * 2009-02-20 2014-05-28 パナソニック株式会社 Resin coating device and resin coating data creation device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6222465B1 (en) * 1998-12-09 2001-04-24 Lucent Technologies Inc. Gesture-based computer interface
US20080042981A1 (en) * 2004-03-22 2008-02-21 Itay Katz System and Method for Inputing User Commands to a Processor
US20070091037A1 (en) * 2005-10-21 2007-04-26 Yee-Chun Lee Energy Efficient Compact Display For Mobile Device
US20070126733A1 (en) * 2005-12-02 2007-06-07 Electronics And Telecommunications Research Institute Apparatus and method for immediately creating and controlling virtual reality interactive human body model for user-centric interface
US20090219224A1 (en) * 2008-02-28 2009-09-03 Johannes Elg Head tracking for enhanced 3d experience using face detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Tang, A., Neustaedter, C., Greenberg, S. (2004). VideoArms: Supporting Remote Embodiment in Groupware. Video Proceedings of CSCW '04. *
Wigdor, Daniel et al.; Oct. 7-10, 2007; UIST '07; LucidTouch: A See-Through Mobile Device; http://www.patrickbaudisch.com/publications/2007-Wigdor-UIST07-LucidTouch.pdf *

Cited By (360)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090059035A1 (en) * 2002-11-20 2009-03-05 Sony Corporation Picture production system, and picture production apparatus and method
US20090071375A1 (en) * 2005-06-15 2009-03-19 Halliburton Energy Services, Inc. Gas-Generating Additives Having Improved Shelf Lives for Use in Cement Compositions
US11650713B2 (en) 2005-12-30 2023-05-16 Apple Inc. Portable electronic device with interface reconfiguration mode
US10884579B2 (en) 2005-12-30 2021-01-05 Apple Inc. Portable electronic device with interface reconfiguration mode
US10915224B2 (en) 2005-12-30 2021-02-09 Apple Inc. Portable electronic device with interface reconfiguration mode
US11449194B2 (en) 2005-12-30 2022-09-20 Apple Inc. Portable electronic device with interface reconfiguration mode
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US11240362B2 (en) 2006-09-06 2022-02-01 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US10778828B2 (en) 2006-09-06 2020-09-15 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US11736602B2 (en) 2006-09-06 2023-08-22 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US10732821B2 (en) 2007-01-07 2020-08-04 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11169691B2 (en) 2007-01-07 2021-11-09 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11586348B2 (en) 2007-01-07 2023-02-21 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11604559B2 (en) 2007-09-04 2023-03-14 Apple Inc. Editing interface
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US20090289188A1 (en) * 2008-05-20 2009-11-26 Everspring Industry Co., Ltd. Method for controlling an electronic device through infrared detection
US20110193938A1 (en) * 2008-07-17 2011-08-11 Nederlandse Organisatie Voor Toegepast-Natuurwetenschappelijk Onderzoek Tno System, a method and a computer program for inspection of a three-dimensional environment by a user
US9138135B2 (en) * 2008-07-17 2015-09-22 Nederlandse Organisatie Voor Toegepast-Natuurwetenschappelijk Onderzoek Tno System, a method and a computer program for inspection of a three-dimensional environment by a user
US20100138797A1 (en) * 2008-12-01 2010-06-03 Sony Ericsson Mobile Communications Ab Portable electronic device with split vision content sharing control and method
US20120204133A1 (en) * 2009-01-13 2012-08-09 Primesense Ltd. Gesture-Based User Interface
US11334239B2 (en) 2009-01-23 2022-05-17 Samsung Electronics Co., Ltd. Mobile terminal having dual touch screen and method of controlling content therein
US10705722B2 (en) * 2009-01-23 2020-07-07 Samsung Electronics Co., Ltd. Mobile terminal having dual touch screen and method of controlling content therein
US20170177211A1 (en) * 2009-01-23 2017-06-22 Samsung Electronics Co., Ltd. Mobile terminal having dual touch screen and method of controlling content therein
US20120121128A1 (en) * 2009-04-20 2012-05-17 Bent 360: Medialab Inc. Object tracking system
US20100302138A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Methods and systems for defining or modifying a visual representation
US20100315413A1 (en) * 2009-06-16 2010-12-16 Microsoft Corporation Surface Computer User Interaction
US8687845B2 (en) * 2009-09-04 2014-04-01 Sony Corporation Information processing apparatus, method for controlling display, and program for controlling display
US20110058711A1 (en) * 2009-09-04 2011-03-10 Takurou Noda Information Processing Apparatus, Method for Controlling Display, and Program for Controlling Display
US20110084983A1 (en) * 2009-09-29 2011-04-14 Wavelength & Resonance LLC Systems and Methods for Interaction With a Virtual Environment
US9104275B2 (en) * 2009-10-20 2015-08-11 Lg Electronics Inc. Mobile terminal to display an object on a perceived 3D space
US20110093778A1 (en) * 2009-10-20 2011-04-21 Lg Electronics Inc. Mobile terminal and controlling method thereof
US9513700B2 (en) 2009-12-24 2016-12-06 Sony Interactive Entertainment America Llc Calibration of portable devices in a shared virtual space
US10007393B2 (en) * 2010-01-19 2018-06-26 Apple Inc. 3D view of file structure
US20110179368A1 (en) * 2010-01-19 2011-07-21 King Nicholas V 3D View Of File Structure
US20140235311A1 (en) * 2010-03-05 2014-08-21 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US10424077B2 (en) * 2010-03-05 2019-09-24 Sony Interactive Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US9310883B2 (en) * 2010-03-05 2016-04-12 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US20160214011A1 (en) * 2010-03-05 2016-07-28 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US9395195B2 (en) 2010-03-30 2016-07-19 Ns Solutions Corporation System, method and program for managing and displaying product information
US9689688B2 (en) 2010-03-30 2017-06-27 Ns Solutions Corporation Image display system, image display method and program
EP3392746A1 (en) * 2010-03-31 2018-10-24 Immersion Corporation System and method for providing haptic stimulus based on position
WO2011123599A1 (en) * 2010-03-31 2011-10-06 Immersion Corporation System and method for providing haptic stimulus based on position
US8540571B2 (en) 2010-03-31 2013-09-24 Immersion Corporation System and method for providing haptic stimulus based on position
CN102822772A (en) * 2010-03-31 2012-12-12 英默森公司 System and method for providing haptic stimulus based on position
US9987555B2 (en) 2010-03-31 2018-06-05 Immersion Corporation System and method for providing haptic stimulus based on position
US11500516B2 (en) 2010-04-07 2022-11-15 Apple Inc. Device, method, and graphical user interface for managing folders
US10788953B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders
US11281368B2 (en) 2010-04-07 2022-03-22 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US11809700B2 (en) 2010-04-07 2023-11-07 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US9535493B2 (en) 2010-04-13 2017-01-03 Nokia Technologies Oy Apparatus, method, computer program and user interface
WO2011127646A1 (en) 2010-04-13 2011-10-20 Nokia Corporation An apparatus, method, computer program and user interface
EP2558924A4 (en) * 2010-04-13 2016-05-18 Nokia Technologies Oy An apparatus, method, computer program and user interface
WO2011147561A3 (en) * 2010-05-28 2012-04-12 Chao Zhang Mobile unit, method for operating the same and network comprising the mobile unit
US9330470B2 (en) 2010-06-16 2016-05-03 Intel Corporation Method and system for modeling subjects from a depth map
US20120200600A1 (en) * 2010-06-23 2012-08-09 Kent Demaine Head and arm detection for virtual immersion systems and methods
US20120007854A1 (en) * 2010-07-12 2012-01-12 Lg Electronics Inc. Mobile terminal and controlling method thereof
US8791944B2 (en) * 2010-07-12 2014-07-29 Lg Electronics Inc. Mobile terminal and controlling method thereof
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US20150253951A1 (en) * 2010-09-01 2015-09-10 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
US20120050155A1 (en) * 2010-09-01 2012-03-01 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
JP2012053674A (en) * 2010-09-01 2012-03-15 Canon Inc Imaging apparatus, control method and program thereof, and recording medium
KR20120028743A (en) * 2010-09-15 2012-03-23 엘지전자 주식회사 Mobile terminal and operation control method thereof
US9513710B2 (en) * 2010-09-15 2016-12-06 Lg Electronics Inc. Mobile terminal for controlling various operations using a stereoscopic 3D pointer on a stereoscopic 3D image and control method thereof
KR101708696B1 (en) * 2010-09-15 2017-02-21 엘지전자 주식회사 Mobile terminal and operation control method thereof
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
WO2012052612A1 (en) * 2010-10-21 2012-04-26 Nokia Corporation Apparatus and method for user input for controlling displayed information
US9043732B2 (en) 2010-10-21 2015-05-26 Nokia Corporation Apparatus and method for user input for controlling displayed information
US9164581B2 (en) 2010-10-22 2015-10-20 Hewlett-Packard Development Company, L.P. Augmented reality display system and method of display
US8854802B2 (en) 2010-10-22 2014-10-07 Hewlett-Packard Development Company, L.P. Display with rotatable display screen
US20120098806A1 (en) * 2010-10-22 2012-04-26 Ramin Samadani System and method of modifying lighting in a display system
EP2453344A1 (en) * 2010-11-11 2012-05-16 Sony Corporation Information processing apparatus, stereoscopic display method, and program
US9456203B2 (en) 2010-11-11 2016-09-27 Sony Corporation Information processing apparatus, stereoscopic display method, and program
US8988499B2 (en) 2010-11-11 2015-03-24 Sony Corporation Information processing apparatus, stereoscopic display method, and program
EP3048515A1 (en) * 2010-11-11 2016-07-27 Sony Corporation Information processing apparatus, stereoscopic display method, and program
US10349034B2 (en) 2010-11-11 2019-07-09 Sony Corporation Information processing apparatus, stereoscopic display method, and program
US10652515B2 (en) 2010-11-11 2020-05-12 Sony Corporation Information processing apparatus, stereoscopic display method, and program
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US20120162384A1 (en) * 2010-12-22 2012-06-28 Vesely Michael A Three-Dimensional Collaboration
US20120192067A1 (en) * 2011-01-20 2012-07-26 Research In Motion Corporation Three-dimensional, multi-depth presentation of icons in association with differing input components of a user interface
US9618972B2 (en) * 2011-01-20 2017-04-11 Blackberry Limited Three-dimensional, multi-depth presentation of icons in association with differing input components of a user interface
US8638297B2 (en) 2011-01-27 2014-01-28 Blackberry Limited Portable electronic device and method therefor
US8421752B2 (en) * 2011-01-27 2013-04-16 Research In Motion Limited Portable electronic device and method therefor
US20120194432A1 (en) * 2011-01-27 2012-08-02 Research In Motion Limited Portable electronic device and method therefor
US9342146B2 (en) 2011-02-09 2016-05-17 Apple Inc. Pointing-based display interaction
US9454225B2 (en) 2011-02-09 2016-09-27 Apple Inc. Gaze-based display control
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
JP2014506709A (en) * 2011-02-22 2014-03-17 クアルコム,インコーポレイテッド Providing position-based correction images for users' mobile platforms
US9507416B2 (en) 2011-02-22 2016-11-29 Robert Howard Kimball Providing a corrected view based on the position of a user with respect to a mobile platform
US9354719B2 (en) 2011-02-28 2016-05-31 Stmicroelectronics (Research & Development) Limited Optical navigation devices
JPWO2012124250A1 (en) * 2011-03-15 2014-07-17 パナソニック株式会社 Object control apparatus, object control method, object control program, and integrated circuit
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US20120242793A1 (en) * 2011-03-21 2012-09-27 Soungmin Im Display device and method of controlling the same
US9504920B2 (en) 2011-04-25 2016-11-29 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US20120274589A1 (en) * 2011-04-28 2012-11-01 De Angelo Michael J Apparatus, system, and method for remote interaction with a computer display or computer visualization or object
US8702507B2 (en) 2011-04-28 2014-04-22 Microsoft Corporation Manual and camera-based avatar control
US8788973B2 (en) 2011-05-23 2014-07-22 Microsoft Corporation Three-dimensional gesture controlled avatar configuration interface
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
CN102981606A (en) * 2011-06-07 2013-03-20 索尼公司 Information processing apparatus, information processing method, and program
US20120317510A1 (en) * 2011-06-07 2012-12-13 Takuro Noda Information processing apparatus, information processing method, and program
US11048333B2 (en) 2011-06-23 2021-06-29 Intel Corporation System and method for close-range movement tracking
US9910498B2 (en) 2011-06-23 2018-03-06 Intel Corporation System and method for close-range movement tracking
EP2538305A3 (en) * 2011-06-23 2013-08-21 Omek Interactive, Ltd. System and method for close-range movement tracking
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9360929B2 (en) * 2011-07-06 2016-06-07 Sony Corporation Display control apparatus, display control method, and program
US20130009863A1 (en) * 2011-07-06 2013-01-10 Sony Corporation Display control apparatus, display control method, and program
US10074346B2 (en) 2011-07-06 2018-09-11 Sony Corporation Display control apparatus and method to control a transparent display
US20140161319A1 (en) * 2011-07-19 2014-06-12 Nec Corporation Information processing apparatus, method for tracking object and program storage medium
US9911053B2 (en) * 2011-07-19 2018-03-06 Nec Corporation Information processing apparatus, method for tracking object and program storage medium
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US20130050076A1 (en) * 2011-08-22 2013-02-28 Research & Business Foundation Sungkyunkwan University Method of recognizing a control command based on finger motion and mobile device using the same
US20130050202A1 (en) * 2011-08-23 2013-02-28 Kyocera Corporation Display device
JP2013045255A (en) * 2011-08-23 2013-03-04 Kyocera Corp Display device
US9467683B2 (en) * 2011-08-23 2016-10-11 Kyocera Corporation Display device having three-dimensional display function
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US20130050414A1 (en) * 2011-08-24 2013-02-28 Ati Technologies Ulc Method and system for navigating and selecting objects within a three-dimensional video image
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9918681B2 (en) 2011-09-16 2018-03-20 Auris Surgical Robotics, Inc. System and method for virtually tracking a surgical tool on a movable display
US20140055386A1 (en) * 2011-09-27 2014-02-27 Elo Touch Solutions, Inc. Touch and non touch based interaction of a user with a device
CN103502923A (en) * 2011-09-27 2014-01-08 电子触控产品解决方案公司 Touch and non touch based interaction of a user with a device
CN103403661A (en) * 2011-09-27 2013-11-20 电子触控产品解决方案公司 Scaling of gesture based input
US9448714B2 (en) * 2011-09-27 2016-09-20 Elo Touch Solutions, Inc. Touch and non touch based interaction of a user with a device
US20140055385A1 (en) * 2011-09-27 2014-02-27 Elo Touch Solutions, Inc. Scaling of gesture based input
KR20140082760A (en) * 2011-09-30 2014-07-02 마이크로소프트 코포레이션 Omni-spatial gesture input
US9423876B2 (en) 2011-09-30 2016-08-23 Microsoft Technology Licensing, Llc Omni-spatial gesture input
WO2013049753A1 (en) * 2011-09-30 2013-04-04 Microsoft Corporation Omni-spatial gesture input
CN102929388A (en) * 2011-09-30 2013-02-13 微软公司 Full space posture input
KR101981822B1 (en) * 2011-09-30 2019-05-23 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Omni-spatial gesture input
JP2014531688A (en) * 2011-09-30 2014-11-27 マイクロソフト コーポレーション Omni-directional gesture input
US20130100008A1 (en) * 2011-10-19 2013-04-25 Stefan J. Marti Haptic Response Module
US20130293547A1 (en) * 2011-12-07 2013-11-07 Yangzhou Du Graphics rendering technique for autostereoscopic three dimensional display
US9223138B2 (en) 2011-12-23 2015-12-29 Microsoft Technology Licensing, Llc Pixel opacity for augmented reality
CN104094193A (en) * 2011-12-27 2014-10-08 英特尔公司 Full 3d interaction on mobile devices
WO2013100900A1 (en) 2011-12-27 2013-07-04 Intel Corporation Full 3d interaction on mobile devices
US9335888B2 (en) * 2011-12-27 2016-05-10 Intel Corporation Full 3D interaction on mobile devices
EP2798440A4 (en) * 2011-12-27 2015-12-09 Intel Corp Full 3d interaction on mobile devices
KR101608423B1 (en) * 2011-12-27 2016-04-01 인텔 코포레이션 Full 3d interaction on mobile devices
TWI493388B (en) * 2011-12-27 2015-07-21 Intel Corp Apparatus and method for full 3d interaction on a mobile device, mobile device, and non-transitory computer readable storage medium
US20140245230A1 (en) * 2011-12-27 2014-08-28 Lenitra M. Durham Full 3d interaction on mobile devices
JP2015506033A (en) * 2011-12-27 2015-02-26 インテル コーポレイション Full 3D interaction on mobile devices
US9298012B2 (en) 2012-01-04 2016-03-29 Microsoft Technology Licensing, Llc Eyebox adjustment for interpupillary distance
US9600078B2 (en) 2012-02-03 2017-03-21 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9368546B2 (en) 2012-02-15 2016-06-14 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9726887B2 (en) 2012-02-15 2017-08-08 Microsoft Technology Licensing, Llc Imaging structure color conversion
US9684174B2 (en) 2012-02-15 2017-06-20 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9297996B2 (en) 2012-02-15 2016-03-29 Microsoft Technology Licensing, Llc Laser illumination scanning
US9779643B2 (en) 2012-02-15 2017-10-03 Microsoft Technology Licensing, Llc Imaging structure emitter configurations
CN103294173A (en) * 2012-02-24 2013-09-11 冠捷投资有限公司 Remote control system based on user actions and method thereof
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9807381B2 (en) 2012-03-14 2017-10-31 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US9578318B2 (en) 2012-03-14 2017-02-21 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US11169611B2 (en) 2012-03-26 2021-11-09 Apple Inc. Enhanced virtual touchpad
US20130258701A1 (en) * 2012-03-28 2013-10-03 Microsoft Corporation Mobile Device Light Guide Display
US9558590B2 (en) 2012-03-28 2017-01-31 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10191515B2 (en) * 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US10388073B2 (en) 2012-03-28 2019-08-20 Microsoft Technology Licensing, Llc Augmented reality light guide display
US8933912B2 (en) 2012-04-02 2015-01-13 Microsoft Corporation Touch sensitive user interface with three dimensional input sensor
US10478717B2 (en) 2012-04-05 2019-11-19 Microsoft Technology Licensing, Llc Augmented reality and physical games
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US20130265232A1 (en) * 2012-04-08 2013-10-10 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
US9958957B2 (en) * 2012-04-08 2018-05-01 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
US10732729B2 (en) 2012-04-08 2020-08-04 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
US9477303B2 (en) 2012-04-09 2016-10-25 Intel Corporation System and method for combining three-dimensional tracking with a three-dimensional display for a user interface
US20130283214A1 (en) * 2012-04-18 2013-10-24 Electronics And Telecommunications Research Institute Apparatus and method for providing user interface for recognizing gesture
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
US9581820B2 (en) 2012-06-04 2017-02-28 Microsoft Technology Licensing, Llc Multiple waveguide imaging structure
US8837780B2 (en) * 2012-06-22 2014-09-16 Hewlett-Packard Development Company, L.P. Gesture based human interfaces
US20130343601A1 (en) * 2012-06-22 2013-12-26 Charles Jia Gesture based human interfaces
US8830312B2 (en) 2012-06-25 2014-09-09 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching within bounded regions
US9098739B2 (en) 2012-06-25 2015-08-04 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching
US8934675B2 (en) 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US8655021B2 (en) 2012-06-25 2014-02-18 Imimtek, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US20140002336A1 (en) * 2012-06-27 2014-01-02 Greg D. Kaine Peripheral device for visual and/or tactile feedback
US20140002339A1 (en) * 2012-06-28 2014-01-02 David Brent GUARD Surface With Touch Sensors for Detecting Proximity
US20150084866A1 (en) * 2012-06-30 2015-03-26 Fred Thomas Virtual hand based on combined data
US10048779B2 (en) * 2012-06-30 2018-08-14 Hewlett-Packard Development Company, L.P. Virtual hand based on combined data
US20170351331A1 (en) * 2012-08-02 2017-12-07 Immersion Corporation Systems and Methods for Haptic Remote Control Gaming
CN104603718A (en) * 2012-08-28 2015-05-06 Nec卡西欧移动通信株式会社 Electronic apparatus, control method thereof, and program
EP2891948A4 (en) * 2012-08-28 2016-04-27 Nec Corp Electronic apparatus, control method thereof, and program
US9310891B2 (en) 2012-09-04 2016-04-12 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US20150312559A1 (en) * 2012-09-27 2015-10-29 Kyocera Corporation Display device, control method, and control program
US10341642B2 (en) * 2012-09-27 2019-07-02 Kyocera Corporation Display device, control method, and control program for stereoscopically displaying objects
US10108271B2 (en) 2012-11-08 2018-10-23 Cuesta Technology Holdings, Llc Multi-modal input control of touch-based devices
US11237638B2 (en) 2012-11-08 2022-02-01 Cuesta Technology Holdings, Llc Systems and methods for extensions to alternative control of touch-based devices
US9671874B2 (en) 2012-11-08 2017-06-06 Cuesta Technology Holdings, Llc Systems and methods for extensions to alternative control of touch-based devices
US9658695B2 (en) * 2012-11-08 2017-05-23 Cuesta Technology Holdings, Llc Systems and methods for alternative control of touch-based devices
US20140145933A1 (en) * 2012-11-27 2014-05-29 Hyundai Motor Company Display and method capable of moving image
US9471151B2 (en) * 2012-11-27 2016-10-18 Hyundai Motor Company Display and method capable of moving image
JP2014109802A (en) * 2012-11-30 2014-06-12 Casio Comput Co Ltd Image processor, image processing method and program
US9760166B2 (en) * 2012-12-17 2017-09-12 Centre National De La Recherche Scientifique Haptic system for establishing a contact free interaction between at least one part of a user's body and a virtual environment
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
WO2014103167A1 (en) * 2012-12-27 2014-07-03 Sony Corporation Information processing apparatus, information processing method, and program
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US8615108B1 (en) 2013-01-30 2013-12-24 Imimtek, Inc. Systems and methods for initializing motion tracking of human hands
US20160065936A1 (en) * 2013-04-01 2016-03-03 Lg Electronics Inc. Image display device for providing function of changing screen display direction and method thereof
US9930313B2 (en) * 2013-04-01 2018-03-27 Lg Electronics Inc. Image display device for providing function of changing screen display direction and method thereof
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US10203765B2 (en) 2013-04-12 2019-02-12 Usens, Inc. Interactive input system and method
CN105264460A (en) * 2013-04-12 2016-01-20 微软技术许可有限责任公司 Holographic object feedback
US9367136B2 (en) * 2013-04-12 2016-06-14 Microsoft Technology Licensing, Llc Holographic object feedback
US20140354602A1 (en) * 2013-04-12 2014-12-04 Impression.Pi, Inc. Interactive input system and method
US20140306891A1 (en) * 2013-04-12 2014-10-16 Stephen G. Latta Holographic object feedback
WO2014168901A1 (en) * 2013-04-12 2014-10-16 Microsoft Corporation Holographic object feedback
US11020016B2 (en) 2013-05-30 2021-06-01 Auris Health, Inc. System and method for displaying anatomy and devices on a movable display
JP2015026286A (en) * 2013-07-26 2015-02-05 セイコーエプソン株式会社 Display device, display system and control method of display device
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
US20160189430A1 (en) * 2013-08-16 2016-06-30 Audi Ag Method for operating electronic data glasses, and electronic data glasses
US10972631B2 (en) 2013-08-23 2021-04-06 Preemadonna, Inc. Apparatus for applying coating to nails
US9687059B2 (en) * 2013-08-23 2017-06-27 Preemadonna Inc. Nail decorating apparatus
US11290615B2 (en) 2013-08-23 2022-03-29 Preemadonna Inc. Systems and methods to initiate and perform the painting of an area of interest on a finger
US11265444B2 (en) 2013-08-23 2022-03-01 Preemadonna Inc. Apparatus for applying coating to nails
US10972600B2 (en) 2013-10-30 2021-04-06 Apple Inc. Displaying relevant user interface objects
US11316968B2 (en) 2013-10-30 2022-04-26 Apple Inc. Displaying relevant user interface objects
US10250735B2 (en) 2013-10-30 2019-04-02 Apple Inc. Displaying relevant user interface objects
US20150128061A1 (en) * 2013-11-05 2015-05-07 Intuit Inc. Remote control of a desktop application via a mobile device
US10635180B2 (en) 2013-11-05 2020-04-28 Intuit, Inc. Remote control of a desktop application via a mobile device
US10048762B2 (en) * 2013-11-05 2018-08-14 Intuit Inc. Remote control of a desktop application via a mobile device
US10635181B2 (en) 2013-11-05 2020-04-28 Intuit, Inc. Remote control of a desktop application via a mobile device
US9939925B2 (en) * 2013-11-26 2018-04-10 Adobe Systems Incorporated Behind-display user interface
US20150145773A1 (en) * 2013-11-26 2015-05-28 Adobe Systems Incorporated Behind-display user interface
US10175780B2 (en) 2013-11-26 2019-01-08 Adobe Inc. Behind-display user interface
FR3014228A1 (en) * 2013-12-04 2015-06-05 Renault Sa VIRTUAL DETECTION SYSTEM
WO2015102527A1 (en) 2014-01-05 2015-07-09 Yousefi Shahrouz Real-time 3d gesture recognition and tracking system for mobile devices
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US20150205372A1 (en) * 2014-01-22 2015-07-23 Samsung Electronics Co., Ltd. Method and apparatus for providing input interface for mobile terminal
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
GB2525304B (en) * 2014-03-10 2019-06-19 Bae Systems Plc Interactive information display
EP3117290B1 (en) * 2014-03-10 2022-03-09 BAE Systems PLC Interactive information display
EP3117290A1 (en) * 2014-03-10 2017-01-18 BAE Systems PLC Interactive information display
US9753549B2 (en) * 2014-03-14 2017-09-05 Sony Interactive Entertainment Inc. Gaming device with rotatably placed cameras
US20150301606A1 (en) * 2014-04-18 2015-10-22 Valentin Andrei Techniques for improved wearable computing device gesture based interactions
US9946336B2 (en) 2014-05-05 2018-04-17 Immersion Corporation Systems and methods for viewport-based augmented reality haptic effects
US9690370B2 (en) 2014-05-05 2017-06-27 Immersion Corporation Systems and methods for viewport-based augmented reality haptic effects
US10444829B2 (en) 2014-05-05 2019-10-15 Immersion Corporation Systems and methods for viewport-based augmented reality haptic effects
EP2942693A1 (en) * 2014-05-05 2015-11-11 Immersion Corporation Systems and methods for viewport-based augmented reality haptic effects
CN111338479A (en) * 2014-05-19 2020-06-26 意美森公司 Non-co-located haptic cues in immersive environments
JP2015219910A (en) * 2014-05-19 2015-12-07 Immersion Corporation Non-collocated haptic cues in immersive environments
CN105094417A (en) * 2014-05-19 2015-11-25 意美森公司 Non-collocated haptic cues in immersive environments
EP2947548A1 (en) * 2014-05-19 2015-11-25 Immersion Corporation Non-collocated haptic cues in immersive environments
US10379614B2 (en) 2014-05-19 2019-08-13 Immersion Corporation Non-collocated haptic cues in immersive environments
US10564730B2 (en) * 2014-05-19 2020-02-18 Immersion Corporation Non-collocated haptic cues in immersive environments
US9740338B2 (en) 2014-05-22 2017-08-22 Ubi interactive inc. System and methods for providing a three-dimensional touch screen
US10948996B2 (en) 2014-06-03 2021-03-16 Google Llc Radar-based gesture-recognition at a surface of an object
US10509478B2 (en) 2014-06-03 2019-12-17 Google Llc Radar-based gesture-recognition from a surface radar field on which an interaction is sensed
US9304235B2 (en) 2014-07-30 2016-04-05 Microsoft Technology Licensing, Llc Microfabrication
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10642367B2 (en) 2014-08-07 2020-05-05 Google Llc Radar-based gesture sensing and data transmission
US10268321B2 (en) * 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US10409385B2 (en) 2014-08-22 2019-09-10 Google Llc Occluded gesture recognition
US10936081B2 (en) 2014-08-22 2021-03-02 Google Llc Occluded gesture recognition
US11221682B2 (en) 2014-08-22 2022-01-11 Google Llc Occluded gesture recognition
US11816101B2 (en) 2014-08-22 2023-11-14 Google Llc Radar recognition-aided search
US20170201721A1 (en) * 2014-09-30 2017-07-13 Hewlett Packard Enterprise Development Lp Artifact projection
US11163371B2 (en) 2014-10-02 2021-11-02 Google Llc Non-line-of-sight radar-based gesture recognition
US10664059B2 (en) 2014-10-02 2020-05-26 Google Llc Non-line-of-sight radar-based gesture recognition
US10523993B2 (en) 2014-10-16 2019-12-31 Disney Enterprises, Inc. Displaying custom positioned overlays to a viewer
JP2015084261A (en) * 2015-02-05 2015-04-30 Canon Inc. Imaging apparatus, control method and program thereof, and recording medium
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US11086216B2 (en) 2015-02-09 2021-08-10 Microsoft Technology Licensing, Llc Generating electronic components
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
US11709552B2 (en) 2015-04-30 2023-07-25 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10139916B2 (en) 2015-04-30 2018-11-27 Google Llc Wide-field radar-based gesture recognition
US10664061B2 (en) 2015-04-30 2020-05-26 Google Llc Wide-field radar-based gesture recognition
US10310620B2 (en) 2015-04-30 2019-06-04 Google Llc Type-agnostic RF signal representations
US10241581B2 (en) 2015-04-30 2019-03-26 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10496182B2 (en) 2015-04-30 2019-12-03 Google Llc Type-agnostic RF signal representations
US10817070B2 (en) 2015-04-30 2020-10-27 Google Llc RF-based micro-motion tracking for gesture tracking and recognition
US10572027B2 (en) 2015-05-27 2020-02-25 Google Llc Gesture detection and interactions
US10936085B2 (en) 2015-05-27 2021-03-02 Google Llc Gesture detection and interactions
US10203763B1 (en) 2015-05-27 2019-02-12 Google Inc. Gesture detection and interactions
US10155274B2 (en) 2015-05-27 2018-12-18 Google Llc Attaching electronic components to interactive textiles
US10180720B2 (en) * 2015-06-11 2019-01-15 Facebook Technologies, Llc Hand-held controller with pressure-sensing switch for virtual-reality systems
US20180164883A1 (en) * 2015-06-11 2018-06-14 Oculus Vr, Llc Hand-Held Controller with Pressure-Sensing Switch for Virtual-Reality Systems
US10874936B2 (en) 2015-06-11 2020-12-29 Facebook Technologies, Llc Hand-held controller with pressure-sensing switch for virtual-reality systems
US20170038830A1 (en) * 2015-08-04 2017-02-09 Google Inc. Context sensitive hand collisions in virtual reality
US10635161B2 (en) * 2015-08-04 2020-04-28 Google Llc Context sensitive hand collisions in virtual reality
US10620803B2 (en) * 2015-09-29 2020-04-14 Microsoft Technology Licensing, Llc Selecting at least one graphical user interface item
US10503883B1 (en) 2015-10-06 2019-12-10 Google Llc Radar-based authentication
US11698438B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US10300370B1 (en) 2015-10-06 2019-05-28 Google Llc Advanced gaming and virtual reality control using radar
US10908696B2 (en) 2015-10-06 2021-02-02 Google Llc Advanced gaming and virtual reality control using radar
US10768712B2 (en) 2015-10-06 2020-09-08 Google Llc Gesture component with gesture library
US11693092B2 (en) 2015-10-06 2023-07-04 Google Llc Gesture recognition using multiple antenna
US10222469B1 (en) 2015-10-06 2019-03-05 Google Llc Radar-based contextual sensing
US11656336B2 (en) 2015-10-06 2023-05-23 Google Llc Advanced gaming and virtual reality control using radar
US10705185B1 (en) 2015-10-06 2020-07-07 Google Llc Application-based signal processing parameters in radar-based detection
US11256335B2 (en) 2015-10-06 2022-02-22 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US11698439B2 (en) 2015-10-06 2023-07-11 Google Llc Gesture recognition using multiple antenna
US11592909B2 (en) 2015-10-06 2023-02-28 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US10823841B1 (en) 2015-10-06 2020-11-03 Google Llc Radar imaging on a mobile computing device
US11175743B2 (en) 2015-10-06 2021-11-16 Google Llc Gesture recognition using multiple antenna
US10401490B2 (en) 2015-10-06 2019-09-03 Google Llc Radar-enabled sensor fusion
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
US10379621B2 (en) 2015-10-06 2019-08-13 Google Llc Gesture component with gesture library
US11481040B2 (en) 2015-10-06 2022-10-25 Google Llc User-customizable machine-learning in radar-based gesture detection
US11385721B2 (en) 2015-10-06 2022-07-12 Google Llc Application-based signal processing parameters in radar-based detection
US11132065B2 (en) 2015-10-06 2021-09-28 Google Llc Radar-enabled sensor fusion
US10310621B1 (en) 2015-10-06 2019-06-04 Google Llc Radar gesture sensing using existing data protocols
US10540001B1 (en) 2015-10-06 2020-01-21 Google Llc Fine-motion virtual-reality or augmented-reality control using radar
US10459080B1 (en) 2015-10-06 2019-10-29 Google Llc Radar-based object detection for vehicles
US10049493B1 (en) * 2015-10-22 2018-08-14 Hoyt Architecture Lab, Inc System and methods for providing interaction with elements in a virtual architectural visualization
US10754422B1 (en) 2015-10-22 2020-08-25 Hoyt Architecture Lab, Inc. Systems and methods for providing interaction with elements in a virtual architectural visualization
US20170131775A1 (en) * 2015-11-10 2017-05-11 Castar, Inc. System and method of haptic feedback by referral of sensation
US10089681B2 (en) 2015-12-04 2018-10-02 Nimbus Visulization, Inc. Augmented reality commercial platform and method
US20170177077A1 (en) * 2015-12-09 2017-06-22 National Taiwan University Three-dimension interactive system and method for virtual reality
WO2017116813A3 (en) * 2015-12-28 2017-09-14 Microsoft Technology Licensing, Llc Haptic feedback for non-touch surface interaction
US10976819B2 (en) 2015-12-28 2021-04-13 Microsoft Technology Licensing, Llc Haptic feedback for non-touch surface interaction
US10186081B2 (en) 2015-12-29 2019-01-22 Microsoft Technology Licensing, Llc Tracking rigged smooth-surface models of articulated objects
US10565791B2 (en) 2015-12-29 2020-02-18 Microsoft Technology Licensing, Llc Tracking rigged polygon-mesh models of articulated objects
US20170185141A1 (en) * 2015-12-29 2017-06-29 Microsoft Technology Licensing, Llc Hand tracking for interaction feedback
US10949056B2 (en) * 2016-02-03 2021-03-16 Salesforce.Com, Inc. System and method to navigate 3D data on mobile and desktop
US10459597B2 (en) * 2016-02-03 2019-10-29 Salesforce.Com, Inc. System and method to navigate 3D data on mobile and desktop
WO2017179786A1 (en) * 2016-04-11 2017-10-19 Smartmob Co., Ltd. Three-dimensional input device, method and system using motion recognition sensor
US11140787B2 (en) 2016-05-03 2021-10-05 Google Llc Connecting an electronic component to an interactive textile
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
US10285456B2 (en) 2016-05-16 2019-05-14 Google Llc Interactive fabric
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
US10739974B2 (en) 2016-06-11 2020-08-11 Apple Inc. Configuring context-specific user interfaces
US11733656B2 (en) 2016-06-11 2023-08-22 Apple Inc. Configuring context-specific user interfaces
US11073799B2 (en) 2016-06-11 2021-07-27 Apple Inc. Configuring context-specific user interfaces
US11816325B2 (en) 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
US20180126268A1 (en) * 2016-11-09 2018-05-10 Zynga Inc. Interactions between one or more mobile devices and a vr/ar headset
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
TWI637363B (en) * 2017-07-26 2018-10-01 銘傳大學 Augmented reality human–machine interaction system
US10782793B2 (en) 2017-08-10 2020-09-22 Google Llc Context-sensitive hand interaction
US11181986B2 (en) 2017-08-10 2021-11-23 Google Llc Context-sensitive hand interaction
US11277584B2 (en) * 2017-09-26 2022-03-15 Audi Ag Method and system for carrying out a virtual meeting between at least a first person and a second person
US11103041B2 (en) 2017-10-04 2021-08-31 Preemadonna Inc. Systems and methods of adaptive nail printing and collaborative beauty platform hosting
US11717070B2 (en) 2017-10-04 2023-08-08 Preemadonna Inc. Systems and methods of adaptive nail printing and collaborative beauty platform hosting
US10849532B1 (en) * 2017-12-08 2020-12-01 Arizona Board Of Regents On Behalf Of Arizona State University Computer-vision-based clinical assessment of upper extremity function
US10497179B2 (en) 2018-02-23 2019-12-03 Hong Kong Applied Science and Technology Research Institute Company Limited Apparatus and method for performing real object detection and control using a virtual reality head mounted display system
US10877645B2 (en) * 2018-04-30 2020-12-29 Samsung Electronics Co., Ltd. Electronic device and operating method thereof
CN109739358A (en) * 2019-01-03 2019-05-10 BOE Technology Group Co., Ltd. Gesture collision detection method and device based on naked-eye 3D
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets
US11341569B2 (en) * 2019-10-25 2022-05-24 7-Eleven, Inc. System and method for populating a virtual shopping cart based on video of a customer's shopping session at a physical store
US20220180424A1 (en) * 2019-10-25 2022-06-09 7-Eleven, Inc. System and method for populating a virtual shopping cart based on video of a customer's shopping session at a physical store
US11285674B1 (en) * 2020-02-02 2022-03-29 Robert Edwin Douglas Method and apparatus for a geo-registered 3D virtual hand
US11833761B1 (en) * 2020-02-02 2023-12-05 Robert Edwin Douglas Optimizing interaction of tangible tools with tangible objects via registration of virtual objects to tangible tools
US20210351241A1 (en) * 2020-05-08 2021-11-11 Samsung Display Co., Ltd. Display device
US11797048B2 (en) * 2020-05-08 2023-10-24 Samsung Display Co., Ltd. Display device
US11340707B2 (en) * 2020-05-29 2022-05-24 Microsoft Technology Licensing, Llc Hand gesture-based emojis
US20230273711A1 (en) * 2020-09-11 2023-08-31 Beijing Bytedance Network Technology Co., Ltd. Electronic device control method and apparatus, and terminal and storage medium
WO2022074791A1 (en) * 2020-10-08 2022-04-14 Maxell, Ltd. Three-dimensional augmented reality processing system, three-dimensional augmented reality processing method, and user interface device for three-dimensional augmented reality processing system
CN112354179A (en) * 2020-11-23 2021-02-12 Zhejiang Supcon Information Technology Co., Ltd. Three-dimensional geographic information content display and interaction method

Also Published As

Publication number Publication date
KR101637990B1 (en) 2016-07-11
US20100053322A1 (en) 2010-03-04
US8456524B2 (en) 2013-06-04
US20100134618A1 (en) 2010-06-03
WO2010027193A3 (en) 2012-06-21
KR20100027976A (en) 2010-03-11
US20100053164A1 (en) 2010-03-04
US20100053324A1 (en) 2010-03-04
US8253795B2 (en) 2012-08-28
US8253649B2 (en) 2012-08-28
WO2010027193A2 (en) 2010-03-11
KR20110082636A (en) 2011-07-20
US8310537B2 (en) 2012-11-13

Similar Documents

Publication Publication Date Title
US20100053151A1 (en) In-line mediation for manipulating three-dimensional content on a display device
US11340756B2 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
US20100128112A1 (en) Immersive display system for interacting with three-dimensional content
CN116324677A (en) Non-contact photo capture in response to detected gestures
US11599239B2 (en) Devices, methods, and graphical user interfaces for providing computer-generated experiences
US20230325004A1 (en) Method of interacting with objects in an environment
US11714540B2 (en) Remote touch detection enabled by peripheral device
KR20140100547A (en) Full 3d interaction on mobile devices
US20230092282A1 (en) Methods for moving objects in a three-dimensional environment
US11803233B2 (en) IMU for touch detection
US11675198B2 (en) Eyewear including virtual scene with 3D frames
US11620790B2 (en) Generating a 3D model of a fingertip for visual touch detection
US11836871B2 (en) Indicating a position of an occluded physical object
US11641460B1 (en) Generating a volumetric representation of a capture region
US20240103712A1 (en) Devices, Methods, and Graphical User Interfaces For Interacting with Three-Dimensional Environments
WO2024064231A1 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
WO2023196344A1 (en) Devices, methods, and graphical user interfaces for modifying avatars in three-dimensional environments
KR20220115886A (en) Measurement based on point selection

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION