CN101810003B - Enhanced camera-based input - Google Patents

Enhanced camera-based input

Info

Publication number
CN101810003B
Authority
CN
China
Prior art keywords
user
item
image
detection region
object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200880109208XA
Other languages
Chinese (zh)
Other versions
CN101810003A
Inventor
Evan Hildreth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Priority to CN201810420206.8A priority Critical patent/CN108399010B/en
Publication of CN101810003A publication Critical patent/CN101810003A/en
Application granted granted Critical
Publication of CN101810003B publication Critical patent/CN101810003B/en


Classifications

    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0236 — Character input methods using selection techniques to select from displayed items
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 — Detection arrangements using opto-electronic means
    • G06F3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F3/04842 — Selection of displayed objects or displayed text elements

Abstract

Enhanced camera-based input, in which a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (such as a hand) within the detection region is detected. Additionally, a control (such as a key of a virtual keyboard) in a user interface is interacted with based on the detected position of the object.

Description

Enhanced camera-based input
Cross-reference to related applications
This application claims priority from U.S. Patent Application No. 12/124,375, filed May 21, 2008, and U.S. Patent Application No. 12/102,587, filed April 14, 2008, both of which claim the benefit of U.S. Provisional Patent Application No. 60/952,448, filed July 27, 2007. The disclosure of each of these applications is incorporated herein by reference.
Technical field
The present disclosure relates generally to computer input and, according to one implementation, to a camera-based human-machine interface.
Background
In the field of user interfaces, a control (or widget) is an interface element that a computer user interacts with, such as a window or a text box. In some cases, a control (for example, a virtual button) may have a function and appearance similar to its physical counterpart. Typically, a user interacts with a control using a computer mouse or keyboard.
Summary of the invention
According to one general implementation, the enhanced control described here may be used to facilitate the selection of an item from a group of items, for example selecting a letter from a group of letters making up an alphabet, or selecting a function from among a large number of functions. By orienting or aligning these items along a guide line, the items may be displayed so that they do not overlap, obstruct, or otherwise obscure a portion of an object that is also displayed in the user interface, thereby allowing the user to select an item intuitively.
A selection may occur by hovering a cursor over an item for a set period of time, by selecting a physical control such as a mouse button or keyboard key while the cursor is positioned over the item, or by other approaches. The selection or potential selection of an item may cause the item to change its appearance (that is, to be 'activated' or 'highlighted'), thereby distinguishing the items along the guide line from one another and reducing selection errors. The object that provides the basis for defining or positioning the guide line may itself interact with the items, for example where the motion of an input device or other user input is mapped to the arm motion of an avatar surrounded by the items aligned along the guide line.
According to one general implementation, a computer-implemented process includes defining a guide line relative to an object in a user interface, and displaying items aligned with the guide line without obscuring the object. The process also includes outputting a selected item based on receiving a selection of one of the displayed items.
Implementations may include one or more of the following features. For example, the selection may be a cursor-based user selection. A position of the object in the user interface may be determined, where the guide line may be dynamically defined to surround at least a portion of the determined position. A second object may be detected in the user interface, where the guide line may be dynamically defined on a side of the object opposite the second object. A change in the determined position may be detected, and the guide line may be redefined relative to the object based on the determined change.
In another example, the guide line may be a straight, curved, circular, polygonal, or zigzag guide line, and each item may be an alphanumeric character, a symbol, a background, or a name. Outputting the selected item may further include highlighting the selected item, for example by changing its color, opacity, or size. Changing the color, opacity, or size of the selected item may further include changing the color, opacity, or size of the selected item to a first degree, and changing the color, opacity, or size of items adjacent to the selected item to a second degree. Displaying the items may also include distributing the items evenly along the guide line. The object may be a blank region in the user interface or an avatar. The starting point and ending point of the guide line may be aligned horizontally or vertically.
In a further example, defining the guide line may also include determining a range of motion of a control portion of the avatar, and defining the guide line within the range of motion of the control portion, where the guide line may be defined along an outer edge of the range of motion of the control portion. A first or second potential control portion of the avatar may be designated as the control portion, the designated control portion may be animated, and the designation of the control portion may be swapped from the first potential control portion to the second potential control portion, or from the second potential control portion to the first.
According to another general implementation, a computer program product is tangibly embodied in a machine-readable medium. The computer program product includes instructions that, when read by a machine, operate to cause a data processing apparatus to define a guide line relative to an object in a user interface, to display items aligned with the guide line without obscuring the object, and to output a selected item based on receiving a selection of one of the displayed items.
According to a further general implementation, a device includes a processor and a user interface. The processor defines a guide line relative to an object in the user interface. The user interface displays items aligned with the guide line without obscuring the object, and outputs a selected item based on receiving a selection of one of the displayed items.
In yet another general implementation, a user gestures, for example by waving or deliberately positioning one or both arms in the space surrounding their body. A camera captures an image of the user, and the user's position is determined from the image. Using this position, a detection region is defined around the user such that, if the user intends to perform an input gesture, the user's hand or arm is likely to be within that detection region. The user's hand or arm is detected by examining the defined detection region, and the position or motion of the detected control object is mapped to an input of a computer application.
In one example, and based on an anatomical model that defines or models the capabilities and limitations of the user's body, the detection region is defined as a region of the image above the user's head and extending outward to the user's sides. The position or motion of the detected control object may be mapped to a cursor, an avatar, a representation, a mouse event, or another element in order to interact with a control (for example, a scroll bar, a button, a virtual keyboard, a drop-down menu, or any other control) in the user interface.
According to yet another general implementation, a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (for example, a hand) within the detection region is detected. In addition, a control in a user interface (for example, a key of a virtual keyboard) is interacted with based on the detected position of the object. Interacting with the control may also include selecting a character.
Implementations may include one or more of the following features. For example, an engagement gesture of the user may be detected, where the position of the object may be detected based on detecting the engagement gesture. The user interface may also include a representation of the user, where the control may include items aligned with a guide line defined relative to the representation, with the items displayed without obscuring the representation. The representation of the user may be an avatar or a cursor, where the avatar is animated to mimic the user's motion based on the detected position of the object. The control may be displayed above or beside the representation in the user interface.
In a further example, defining the detection region may also include determining the position of the user's head or torso, and defining the detection region, based on the determined position of the head or torso, so as to exclude the torso. The position of the user's torso may be determined using image segmentation, and the position of the user's head may be determined using face detection. Alternatively, defining the detection region may also include determining the reach of the user's arms, and defining the detection region, based on the determined reach, so as to exclude at least a portion of the image region that the arms cannot reach. Conversely, defining the detection region may also include determining, based on the determined position of the torso, an unreachable region of the image that the object cannot reach, and defining the detection region so as to exclude the determined unreachable region. The reach of the user's arms may be based on the determined position of the user's torso, head, or shoulders.
In yet another example, the image may be cropped and displayed in the user interface. The image of the user may be centered, and the image may be cropped based on the centering. A magnification factor may be determined that makes the control reachable by the user in the centered image, and the image may be cropped based on the magnification factor. The magnification factor may be determined using an anatomical model. The user's face may be detected, the user's identity may be determined based on the detected face, and the control may be adjusted based on the user's identity. A position of a second object within the detection region may be detected, and a second control in the user interface may be interacted with based on the detected position of the second object, or the control may be adjusted based on the position of the second object.
In yet another example, the image may be captured with a camera. The detection region may be shaped as an arc-shaped detection subregion above the user, adjoining two linear detection subregions, each linear detection subregion located at one side of the user. The position of the object may be detected relative to a guide line mapped within the detection region. The segmented image may be displayed as the representation of the user.
According to yet another general implementation, a computer-readable medium is encoded with a computer program product. The computer program product includes instructions that, when executed, operate to cause a computer to define a detection region surrounding a user in an image of the user within a scene, and to detect a position of an object within the detection region. The computer program product also includes instructions that, when read by a machine, operate to cause a data processing apparatus to interact with a control in a user interface based on the detected position of the object.
According to yet another general implementation, a device includes a processor. The processor is configured, adapted, or operable to define a detection region surrounding a user in an image of the user within a scene, and to detect a position of an object within the detection region. The processor is also configured to interact with a control in a user interface based on the detected position of the object.
This brief summary is provided so that the various concepts and implementations described by this document may be quickly understood. A more complete understanding may be obtained by reference to the following detailed description and the accompanying drawings. It should be understood that other implementations may be used and that changes may be made.
Brief description of the drawings
Fig. 1 is a conceptual diagram illustrating enhanced camera-based input.
Fig. 2 is a block diagram of a device for performing enhanced camera-based input.
Fig. 3 shows an exemplary process that uses enhanced camera-based input.
Fig. 4 shows an exemplary system for capturing an image of a user.
Fig. 5 depicts an exemplary image of the user captured using the system of Fig. 4.
Fig. 6 depicts an exemplary detection region defined in the image shown in Fig. 5.
Fig. 7 shows a smaller image produced by centering and cropping a larger image.
Fig. 8 shows a detection region defined around a controlling user in a scene that includes two candidate users.
Fig. 9 shows an exemplary mapping of gestures to control inputs.
Fig. 10 is a conceptual diagram of a user interface that uses an exemplary enhanced control.
Fig. 11 shows a user interface including an object and an exemplary enhanced control that includes a guide line.
Figs. 12 to 19 show exemplary guide line arrangements and exemplary relationships between guide lines and objects.
Figs. 20 to 38 show exemplary selections of items aligned along a guide line.
Fig. 39 shows a user interface including one or more exemplary enhanced controls.
Fig. 40 shows other exemplary controls that use camera-based input.
Fig. 41 shows an example of the exterior appearance of a computing device.
Fig. 42 is a block diagram illustrating the internal architecture of the computer shown in Fig. 41.
Detailed description
According to one general implementation, the enhanced control described here may be used to facilitate the selection of an item from a group of items, for example selecting a letter from a group of letters making up an alphabet, or selecting a function from among a large number of functions. By positioning or aligning these items along a guide line, the items may be displayed so that they do not overlap, obstruct, or otherwise obscure a portion of an object that is also displayed in the user interface, allowing the user to select an item intuitively.
In addition, using the enhanced approach described here, a user may gesture by waving or otherwise positioning one or both arms in the space surrounding their body. A camera captures an image of the user, and the user's position is determined from the image. Using this position, a detection region is defined around the user such that, if the user intends to input a command to a computer application, the user's hand or arm is likely to be found within the detection region. The user's hand or arm is detected by examining the defined region, and the position or motion of the detected control object is used as an input to the computer application.
In one example, and based on an anatomical model that defines or simulates the capabilities and limitations of the user's body, the detection region is defined in a region above the user's head and extending outward to the user's sides. The position or motion of the detected control object may be mapped to a cursor, an avatar, a representation, a mouse event, or another element in order to interact with a control (for example, a scroll bar, a button, a virtual keyboard, a drop-down menu, or any other widget) in the user interface.
Fig. 1 is a conceptual diagram illustrating enhanced camera-based input. A scene 100 includes a user 101 sitting on a sofa 102 in front of a set-top box 104, watching a television program or listening to music. The set-top box 104 is connected to a television 105 and a camera 106. As one method of interacting with the set-top box 104, the user may extend his left hand 107 above his head 109, and an image 110 of the scene 100 is generated using the camera 106. The image may be a single image taken with a still camera, or one or more images in a sequence of images taken with a video camera.
In the image 110 of the scene 100, a detection region 111 is defined around the user 101. In this example, the detection region 111 includes an arc-shaped detection subregion 111a above the user 101, which adjoins linear detection subregions 111b and 111c at either side of the user 101. The detection region 111 may represent a portion of the image 110 in which motion other than a command-input gesture is unlikely to occur.
For example, the detection region 111 may exclude the image regions associated with the user's torso or head, because these regions experience a large amount of motion from normal body movement that is unrelated to input gestures. Similarly, regions close to the torso that the arms or head may occupy when the user 101 moves subconsciously or merely adjusts their position may also be excluded. In addition, image regions that the body of the user 101 cannot reach may also be excluded from the detection region 111.
The detection region 111 therefore occupies a portion of the image in which a detected control object is likely to represent an intentional command-input gesture of the user 101. In this regard, subsequent processing operations that examine the detection region 111 to determine whether a control object is located within it may be performed only on the detection region 111, or on image portions that are less than the whole image but include the detection region 111.
The detection region 111 may therefore be defined as a region within the field of view of the camera 106 in which the user 101 can indicate, through the motion of a body part, that they are performing a command input, signaling, or gesturing. Similarly, however, the detection region 111 may be defined to exclude portions of the field of view of the camera 106 in which motion occurs so frequently that it is difficult to distinguish motions not intended as command inputs from gestures that are intended as command inputs.
In this regard, during a setup or calibration operation, a pixel map may quantify, for each pixel in the image, the amount of motion that each individual pixel experiences in the field of view during operation of the device. Using this pixel map, the detection region may be defined to exclude those pixels that experience an amount of motion below a particular minimum threshold, since these pixels likely represent background or pixels outside the user's reach. Similarly, the detection region may be defined to exclude those pixels that experience an amount of motion above a particular maximum threshold, since these pixels likely experience frequent motion that does not represent command inputs. Examples of gestures or motions that do not represent command inputs include motion of the user's chest caused by breathing, motion of the user's eyes or face while reacting to content or to body language, or subconscious body movements, twitches, or tics.
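As a concrete illustration of the calibration step just described, the following sketch (not taken from the patent; the function and threshold names are assumptions) accumulates per-pixel frame differences over a calibration sequence and keeps only pixels whose average motion falls between a minimum and a maximum threshold.
```python
import numpy as np

def build_detection_mask(frames, min_motion=2.0, max_motion=40.0):
    """Hypothetical sketch: derive a detection-region mask from per-pixel motion.

    frames: list of grayscale images (2-D numpy arrays) captured during a
    setup/calibration period. Pixels with accumulated motion below min_motion
    (static background or out of reach) or above max_motion (breathing torso,
    face, etc.) are excluded from the detection region.
    """
    frames = [f.astype(np.float32) for f in frames]
    motion = np.zeros_like(frames[0])
    for prev, curr in zip(frames, frames[1:]):
        motion += np.abs(curr - prev)          # accumulate frame differences
    motion /= max(len(frames) - 1, 1)          # average motion per pixel
    return (motion >= min_motion) & (motion <= max_motion)
```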
The detection region 111 may also be defined based on an anatomical model. For example, the height or width of the user's head may be estimated or measured, and the normal reach of the user's hand may be estimated to be three times that measurement. Using this estimate, an arc-shaped detection region may be defined at a distance of two to three times that measurement away from the user's chest. With this estimate, motion occurring closer than this distance, representing body or torso motion, may be ignored, as may motion occurring beyond this distance, representing motion that cannot have been caused by movement of the user's arms.
In this respect, the image 110 includes an unreachable region 113, which includes those portions of the image 110 that are determined, based on the anatomical model and assuming that the user's seated position remains relatively fixed, to be unreachable by the hand 107. The image 110 also includes a torso region 114 occupied by the torso; this torso region 114 is expected to include background objects or user motions that are less relevant to the camera-based input, or that interfere with it, compared with the motion or position of other detected objects (for example, the hand 107 detected within the detection region 111). Because the camera 106 may remain fixed, the detection region 111 may be mapped onto another image or other images, for example subsequent images, in a straightforward manner.
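The anatomical rule of thumb above (an arc two to three head-measurements from the chest) could be approximated as in the following hypothetical sketch; the geometry and parameter names are assumptions rather than the patent's method.
```python
import numpy as np

def arc_detection_mask(shape, chest_xy, head_width):
    """Hypothetical sketch: keep pixels whose distance from the chest lies
    between two and three head widths, restricted to rows above the chest
    (an assumed simplification of the arc-shaped region)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = chest_xy
    dist = np.hypot(xs - cx, ys - cy)
    in_band = (dist >= 2 * head_width) & (dist <= 3 * head_width)
    above_chest = ys <= cy                     # image rows grow downward
    return in_band & above_chest
```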
Once the detection region 111 has been mapped onto another image, the absolute or relative position of the hand 107 within the detection region 111 may be detected from that other image. Alternatively, because motion may be easier to detect than position, the motion of the hand 107 within the detection region 111 may be detected. Furthermore, if the detection region 111 is divided into, or defined as, a number of discrete blocks, for example blocks 112a to 112g, an indication may be output of which block is closest to the hand 107 or is most obscured by the hand 107. For example, if the hand 107 obscures block 112e more than any other block, an indication that block 112e has been selected may be output. The indication may include a unique identifier that identifies the block.
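A minimal sketch of this block-based indication, assuming a segmented foreground mask and a dictionary of block rectangles (both hypothetical names), might look like this:
```python
def most_obscured_block(foreground_mask, blocks):
    """Hypothetical sketch: 'blocks' maps an identifier (e.g. '112e') to a
    rectangle (x0, y0, x1, y1). Returns the identifier of the block containing
    the most foreground (hand) pixels, or None if all blocks are empty."""
    best_id, best_count = None, 0
    for block_id, (x0, y0, x1, y1) in blocks.items():
        count = int(foreground_mask[y0:y1, x0:x1].sum())
        if count > best_count:
            best_id, best_count = block_id, count
    return best_id
```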
In a user interface 114, the position of the hand 107 within the detection region 111 is mapped to a cursor position, a representation, or a virtual control object that interacts with a control 117, for example the virtual hand 115 of an avatar 116. In controlling or interacting with the control 117, the motion of the hand 107 within the detection region 111 may animate the avatar 116, making the interaction with the control 117 more intuitive and visually pleasing for the user. Through interaction with the control 117, the user 101 selects an item 119 (the letter 'R'), which appears in an output region 120 of the user interface 114.
The avatar may be skinned with the portion of the image 110 that captures the user's physical appearance. In this way, it appears to the user 101 as though they are operating the device, or physically interacting with the device, within the user interface 114. Put another way, the camera 106 may capture an appearance image of the user 101 and overlay that appearance image on a three-dimensional wireframe making up the avatar, giving the avatar the appearance of the user 101.
The mapping of the motion of the user 101 to the motion of the avatar 116 may occur in a 1:1 relationship, the motion may be mirrored, acceleration may be applied, or the motion of the avatar may be forced to 'snap to' specific points. Additionally, in a velocity mode, the position of the control object within the detection region may instead represent a velocity: the avatar should continue moving at that velocity until the position of the control object changes. Depending on the particular configuration desired, for example, the motion of the user 101 and the motion of the avatar 116 may mirror each other, so that the user raising the left hand 107 causes the avatar's right hand 121 to be raised.
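One way the mirrored mapping and the velocity mode might be expressed, under assumed normalized coordinates and parameter names, is sketched below; this is illustrative rather than the patent's implementation.
```python
def map_position(norm_x, norm_y, mode="mirror", avatar_xy=(0.5, 0.5), dt=1 / 30):
    """Hypothetical sketch of two mapping styles described above.

    norm_x, norm_y: control-object position normalized to [0, 1] within the
    detection region. In 'mirror' mode the x axis is flipped so the user's left
    hand drives the avatar's right hand; in 'velocity' mode the offset from the
    region center is treated as a velocity applied to the avatar each frame."""
    if mode == "mirror":
        return 1.0 - norm_x, norm_y
    if mode == "velocity":
        ax, ay = avatar_xy
        vx, vy = norm_x - 0.5, norm_y - 0.5    # offset acts as speed
        return ax + vx * dt, ay + vy * dt
    return norm_x, norm_y
```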
Fig. 2 is a block diagram of a device 200 for implementing enhanced camera-based input. Briefly, among other components, the device 200 includes a user interface 201, a storage medium 202, an input device 204, and a processor 205. Using the device 200, enhanced camera-based input may be used to facilitate interaction with controls in a user interface, for example to allow a user to select items. The user is not required to hold any special object, such as a retro-reflector in their hand, a gyroscopic device, or a remote control, in order to interact with the controls; instead, arm or hand motions are intuitively mapped to a cursor or other input.
The user interface 201 is a mechanism that allows the user to interact with the device, or with an application invoked by the device. The user interface 201 may provide both input and output, allowing the user to operate the device or causing the device to produce the effects of the user's operation. The device 200 may employ any type of user interface 201, for example a graphical user interface (GUI), a voice user interface, or a touch or tactile user interface.
The user interface 201 may be configured to render a visual display image. For example, the user interface 201 may be a monitor, a television, a liquid crystal display (LCD), a plasma display device, a projector with a projection screen, an autostereoscopic display, a cathode ray tube (CRT) display, a digital light processing (DLP) display, or any other type of display device configured to render a display image. The user interface 201 may include one or more display devices. In some configurations, the user interface 201 may be configured to display images associated with an application, such as display images generated by the application, including controls and objects such as an avatar. The storage medium 202 stores and records information or data, and may be an optical storage medium, a magnetic storage medium, flash memory, or any other type of storage medium.
The camera 204 is a device for capturing images, either as still photographs or as a sequence of moving images. The camera 204 may use light in the visible spectrum or other portions of the electromagnetic spectrum, such as infrared. For example, the camera 204 may be a digital camera, a digital video camera, or any other type of device configured to capture images. The camera 204 may include one or more cameras. In some examples, the camera 204 may be configured to capture images of an object or user interacting with an application. For example, the camera 204 may be configured to capture images of a user or person physically interacting with an application within the field of view of the camera 204.
The camera may be a stereo camera, a time-of-flight camera, or any other camera. For example, the camera 204 may be an image detector capable of sampling a background image in order to detect the user's motions and, similarly, the user's gestures. The camera 204 may produce a grayscale image, a color image, or a distance image; for example, a stereo camera or a time-of-flight camera is capable of producing a distance image. A stereo camera may include two image sensors that acquire images from slightly different viewpoints, and a processor in the stereo camera compares the images acquired from the different viewpoints to calculate the distance of image portions. A time-of-flight camera may include an emitter that generates a pulse of light, which may be infrared light, and a sensor; the time taken for the pulse of light to travel from the emitter to the object and back to the sensor is measured in order to calculate the distance of image portions.
The device 200 is electrically connected to the camera 204 and the user interface 201 over a wired or wireless path, and is configured to control the operation of the processor 205 to provide camera-based input. In one configuration, the device 200 uses the processor 205 or other control circuitry to execute an application that provides the enhanced camera-based input. Specifically, the device receives input from the camera 204 and processes the received input to calculate the position and motion of a representation of the user in the user interface 201, and interacts with controls based on these motions.
In one example implementation, input is produced by the camera detecting an image of the user performing a gesture. For example, a mobile phone may be placed on a table and may be operable to generate images of the user using a face-forward camera. Alternatively, a gesture may be detected or recognized using the camera 204, for example by detecting a 'tilt left' gesture using optical flow or some other approach, and using the detected gesture to move the user's representation left and to select an item positioned on the left side of a control, or by detecting a 'tilt forward and right' gesture to move the user's representation up and to the right of center, thereby selecting an item positioned on the upper right of the control.
Thus, the camera 204 may also include, or be replaced by, some other type of input device or module capable of detecting the angular position of the device 200, such as a gyroscope, an accelerometer, or an optical flow tracker. In this regard, the camera may be supplemented with, or replaced by, a tilt sensor input to perform functions or commands desired by the user. Similarly, detection of the user's gesture may occur without using a camera. For example, by moving the device in the same type of stroke pattern as is visualized on a control of the user interface, the user may control the same interface or application in a direct, intuitive, visually pleasing, and stimulating manner.
Although the device 200 has been described as a personal computer (PC) or a set-top box, such a description is made merely for brevity, and other implementations or manifestations are also contemplated. For example, the device 200 may be implemented as a television, an ultra-mobile personal computer (UMPC), a mobile internet device (MID), a digital picture frame (DPF), a portable media player (PMP), a general-purpose computer (for example, a desktop computer, a workstation, or a laptop), a server, a gaming device or console, or any other type of electronic device that includes a processor or other control circuitry configured to execute instructions, or any other apparatus that includes a user interface.
Fig. 3 shows a process 300 that uses enhanced camera-based input in real time or near real time. The detection region around the user may or may not overlap the user, including portions of the user that move frequently. Briefly, the process 300 includes defining a detection region surrounding a user in an image of the user within a scene, and detecting a position of an object within the detection region. Additionally, the process 300 includes interacting with a control in a user interface based on the detected position of the object.
In more detail, when the process 300 begins (S301), a detection region surrounding a user is defined in an image of the user within a scene (S302). Referring briefly ahead, Fig. 4 shows an exemplary system 400 for capturing an image of a user 410 (shown from the side), and Fig. 5 depicts an exemplary image 500 of the user 410 (shown from the front) captured using the same exemplary system 400.
Specifically, the system 400 includes a set-top box 401, a camera 402, and a television 404. The camera 402 captures an image of a scene within its field of view 403, where the scene includes, for example, the extended right hand 405, head 406, torso 407, and part of the two legs 409 of the user 410, as well as the sofa 411 on which the user 410 is sitting and the background behind the sofa 411, such as a wall.
The image 500, which is a still image or one image from a sequence of moving images, may be compressed or stored using one or more of any number of image compression techniques. The image 500 may be a segmented camera image used for touch detection, and also used for determining and displaying the representation of the user in the user interface. After the image is captured with the camera 402, it may be transmitted over a network.
The detection region around the user may be defined after the user performs an engagement gesture. The engagement gesture activates or invokes functionality that monitors other images for gesture-based command inputs, and that ignores, filters, or excludes random or unintentional body motions, other body motions that do not define command inputs, or background motions.
As used throughout this document, a 'gesture' is intended to refer to a form of non-verbal communication made with part or all of a human body or multiple human bodies, in contrast to verbal communication such as speech. For instance, a gesture may be defined by a movement, change, or transformation between a first position, pose, or expression and a second position, pose, or expression. Exemplary gestures include, for example: an 'air quote' gesture, a bowing gesture, a curtsey, a cheek kiss, a finger or hand motion, a genuflection, a head shake or movement, a clap, a nod, a sad face, a raised fist, a salute, a swiping or waving motion, a thumbs-up motion, or a finger-pointing gesture.
The engagement gesture may be a specific hand pose or hand motion sequence, gesticulated within a camera tracking region in front of a display and held for a predetermined amount of time. One exemplary gesture is a hand pose held in an upright position with all fingers and the thumb spread far apart. Another example is a circular hand motion made by extending the user's arm in front of their face and moving the arm in a circle in front of the user's head. In essence, the engagement gesture indicates to the device generating the user interface that the user is ready for further camera-based input to occur. To reduce errors, the engagement gesture may be an atypical gesture, such as a gesture that would not subconsciously be made with body language during normal conversation, or a gesture that would not be made in the ordinary performance of normal human activity.
Thus, a gesture that defines an idea, opinion, emotion, communication, command, demonstration, or expression of the user may be derived from the two images. For example, the user's gesture may be a single or multiple finger gesture; a single hand gesture; a single hand and arm gesture; a single hand, arm, and body gesture; a bimanual gesture; a change in head pose or posture; a change in eye position; a change in facial expression; a change in body pose or posture; or a transformation of any other expressive body state.
For brevity, the body part or parts used to perform the relevant gesture are generally referred to as the 'control object'. For instance, the user may express a command using their entire body or with another physical object, in which case their entire body or the other physical object may be the control object. The user may express a command more subtly by blinking an eye, by flaring their nostrils, or by wiggling a finger, in which case the eyelid, nose, or finger may be the control object. The user's gesture in a single image or between two images may express an enabling or 'engagement' gesture. The control object may also be a physical device, such as an infrared finger light, a retro-reflector, or a remote control.
There are many ways of determining a user's gesture from a camera image. For instance, a gesture of 'drawing a circle in the air' or 'swiping the hand to one side' may be detected using gesture analysis and detection processes that use hand, arm, body, head, or other object position information. Although a gesture may involve a two- or three-dimensional position displacement, such as when a swiping gesture is made, in other cases a gesture includes a transformation without an accompanying position displacement. For instance, if a hand signals 'stop' with five outstretched fingers and the palm forward, and then all five fingers retract into a ball while the palm remains forward, the user's pose changes and a gesture is thereby expressed, even if the overall position of the hand or arm remains static.
Gestures may be detected using heuristic techniques, for example by determining whether hand position information passes an explicit set of rules. For example, the gesture of 'swiping the hand to one side' may be identified if the following gesture detection rules are satisfied: (1) the change in horizontal position is greater than a predefined distance over a time span that is less than a predefined limit; (2) the horizontal position changes monotonically over that time span; (3) the change in vertical position is less than a predefined distance over that time span; and (4) the position at the end of the time span is nearer to (or on) a border of the detection region than the position at the start of the time span.
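The four numbered rules translate naturally into code. The sketch below is one possible reading, with assumed thresholds and an assumed (t, x, y) track format; it is illustrative, not the patent's implementation.
```python
def is_swipe(track, max_duration=0.5, min_dx=0.3, max_dy=0.1, region_edge=1.0):
    """Hypothetical sketch of the four-rule swipe heuristic above.

    track: list of (t, x, y) hand samples, with x and y normalized to the
    detection region so that x == region_edge lies on the region border."""
    if len(track) < 2:
        return False
    (t0, x0, y0), (t1, x1, y1) = track[0], track[-1]
    xs = [x for _, x, _ in track]
    rule1 = (t1 - t0) < max_duration and abs(x1 - x0) > min_dx
    rule2 = all(b >= a for a, b in zip(xs, xs[1:])) or \
            all(b <= a for a, b in zip(xs, xs[1:]))           # monotonic x
    rule3 = abs(y1 - y0) < max_dy
    rule4 = abs(region_edge - x1) < abs(region_edge - x0)     # ends nearer border
    return rule1 and rule2 and rule3 and rule4
```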
Some gestures utilize multiple rule sets that are executed and satisfied in an explicit order, where the satisfaction of a rule set causes the system to change to a state in which a different rule set is applied. Such a system may be unable to detect subtle gestures, in which case Hidden Markov Models may be used, since these models allow a chain of specific motions to be detected while also considering the overall probability that the motions sufficiently fit a gesture.
In addition to body, arm, or hand gestures, finger-pointing gestures may be recognized from one or more images. For example, a 'point left' gesture may be made with the tip of a user's finger and detected by analyzing an image of the finger. Fingertip analysis or other approaches may be used to determine the direction in which the fingertip is pointing. In other exemplary implementations, and as noted above, a gesture may be detected without using a camera, such as where the gesture is a verbal gesture or is detected using a tilt sensor or accelerometer.
In the images of the user captured by the camera, the user's arm may be recognized as making a gesture that transitions from a first, downward position to a second, outstretched, palm-forward position. The user performing this engagement gesture may cause the user's representation to be displayed in the user interface together with the control. Because the user may not be at the center of the camera's field of view when performing this gesture, realignment, cropping, recentering, or magnification processes may be invoked following engagement.
In one illustrative implementation, the image 500 is a chroma-keyed camera image, for example an image acquired of the user 410 positioned in front of a colored background. The chroma-keying process identifies those portions of the image 500 whose color matches the background color and classifies those portions as background. Portions of the image 500 whose color deviates from the background color are likely to be parts of the user 410 and are classified as foreground.
In other illustrative implementations, the image 500 is a depth-keyed camera image. Typically, a depth-keyed camera image is acquired by a camera capable of determining depth or distance, such as a time-of-flight camera or a stereo camera.
For an exemplary stereo camera, multiple optical sensors capture images of the scene from multiple viewpoints. The stereo camera may then compare the multiple images to determine the disparity in an object's position between the viewpoints, and calculate the distance of the object based on that disparity. Image portions are classified as foreground or background based on this distance. The classification process may include comparing the distance to a threshold value, such that portions are classified as foreground if the distance is shorter than the threshold and as background if the distance is longer than the threshold. The classification process may also include comparing the distance to a background model, where the background model represents the depth of parts of the scene (for example, the floor and furniture) without the user. Portions may be classified as foreground if the distance is shorter than the corresponding portion of the model, and as background if the distance is equal to or longer than the corresponding portion of the model.
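A minimal sketch of this distance-based classification, covering both the fixed-threshold and background-model variants under assumed parameter names:
```python
import numpy as np

def segment_by_depth(depth, threshold=None, background_depth=None):
    """Hypothetical sketch of the depth classification above.

    depth: 2-D array of per-pixel distances from a time-of-flight or stereo
    camera. Returns a boolean foreground mask. Either a fixed threshold or a
    per-pixel background depth model (scene without the user) may be given."""
    if background_depth is not None:
        return depth < background_depth           # nearer than the empty scene
    if threshold is not None:
        return depth < threshold                   # nearer than a fixed cutoff
    raise ValueError("provide a threshold or a background depth model")
```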
In another example, the image 500 is segmented using a background color model. For example, the image 500 may be compared with a background color model representing the expected appearance of the background. The background color model may be acquired by storing an image of the scene without the user. If the color of a part of the camera image is similar to the color of the corresponding part of the background color model, that part is classified as background. If the color of a part of the camera image is not similar to the color of the corresponding part of the background color model, that part is classified as foreground.
In a further example, the image 500 is segmented using a skin color model. For example, the image 500 may be compared with a skin color model representing the expected appearance of the user's skin. The skin color model may be predefined and based on skin colors sampled from images of a large number of people. The skin color model may also be defined based on an image of the user's skin. If the color of a part of the camera image is similar to a color in the skin color model, that part is classified as foreground. If the color is not similar to any color appearing in the skin color model, that part is classified as background.
Additionally, the image 500 may be a motion image, whereby image portions containing motion are classified as foreground and image portions that appear static are classified as background. Segmenting a camera image by motion may include acquiring a sequence of camera images, computing the difference between consecutive images, and comparing that difference with a threshold. Other techniques for detecting motion in one or more images may be used, such as optical flow.
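The frame-differencing segmentation described here might be sketched as follows (the threshold value and names are assumptions):
```python
import numpy as np

def segment_by_motion(prev_frame, curr_frame, threshold=15.0):
    """Hypothetical sketch: classify as foreground the pixels whose change
    between consecutive grayscale frames exceeds a threshold."""
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    return diff > threshold
```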
Although several illustrative approaches for producing a segmented camera image have been described above, other approaches or techniques are contemplated, and these other approaches or techniques may be used in place of, in addition to, or in combination with the described approaches.
Because the position of the user 410 (the 'body position') may subsequently be used, among other things, to define the position of the detection region, to define the position of a control in the user interface, or to crop or magnify the image, the body position is detected within the image 500. The body position may be the position of the head 406, determined using face detection. Face detection may scan parts of the camera image to detect features corresponding to the appearance of a human face.
The body position may also be: the position of the user's arm, so that all items are within comfortable reach of the arm; the position of the torso 407, the legs 409, the center of gravity, or the center of the torso; or it may relate to another part or aspect of the user 410, such as a shoulder position. The body position may be defined as a point, a line, or as a region or area encompassing all or part of the user's body. For computational efficiency, the body position may be defined as a simple shape, such as a square or a circle. For example, in Fig. 5, a position 501 (depicted as a plus sign (+)) may be determined, calculated, or estimated as the body position.
As shown in Fig. 6, a detection region 600 is defined around the user 410 in the image 500 of the user 410. In general, the detection region represents the portion of the image in which the control object (i.e., a hand or arm) is detected; it therefore generally excludes portions of the image 500 in which the control object cannot be located, as well as background, and image portions in which motion or other factors make finding the control object difficult or computationally expensive. The image 500 is thus divided, for example, into the detection region 600, an unreachable region 601, and a torso region 602.
Because body parts such as the user's torso or head move frequently, the detection region 600 may exclude the torso region 602 associated with the user's torso or head. The torso region 602 also includes regions of the image 500 near the torso that the arms or hands may occupy when the user 410 moves subconsciously or merely adjusts their position. In addition, the unreachable region 601 that the body of the user 410 cannot reach is excluded from the detection region 600. The remaining detection region 600 occupies a portion of the image in which a detected control object is more likely to indicate an intentional body movement or command input of the user 410. In this regard, subsequent processing operations that examine the detection region 600 to determine whether a control object is located within it may be performed only on the detection region 600, or on image portions that are less than the whole image but include the detection region 600.
The detection region 600 may be positioned according to the user's range of motion. The range of motion may be determined from a complex anatomical model, for example one that simulates bio-kinetic limitations or capabilities of the human body and applies those capabilities to the user. In some cases, simpler rules of thumb may also be used, such as estimating the reachable distance of the user's arm to be the size of the user's head multiplied by some factor. In other cases, the reach of the user's arm may be estimated based on the user's identity (determined, for example, using face recognition) and information about the user stored in the device (for example, the user's age and gender). In any case, the specific placement of the detection region 600 is defined by some aspect of the user. As such, the user may be thought of as being surrounded by the detection region 600.
The detection region 600 may therefore be defined as a region within the field of view of the camera in which it is feasible or probable for the user 410 to indicate, through the motion of a body part, that they are performing a command input, signaling, or gesturing. Similarly, however, the detection region 600 may be defined to lie outside those portions of the camera's field of view in which motion occurs so frequently that intentional command-input gestures are difficult to distinguish from non-command-input motions.
A guide line 604 may be positioned within the detection region 600. For example, the guide line 604 may be defined as a line parallel to (or having some other spatial relationship with) a border 603 of the detection region 600. As described in more detail below, selectable items may be arranged along the guide line 604. For example, in Fig. 10, selectable items 1010a to 1010z are arranged along a guide line 1009 in a user interface 1000.
Fig. 7 shows a smaller image produced by centering and cropping a larger image. Specifically, an image 700 similar to the image 500 is centered about the head position 703 of the user's head 701 and magnified, to produce a cropped image 702. Centering or magnifying part of the camera image may allow the motion of the control object (for example, a hand 704) to be mapped more intuitively or more easily to the user's representation in the resulting user interface.
The cropped image 702, or a magnified view of the cropped image 702, may be displayed in the user interface. The image of the user may be centered, and the image may be cropped based on the centering. A magnification factor may be determined that makes the control reachable by the user in the centered image, and the image may be further cropped based on that magnification factor. The magnification factor may be determined using an anatomical model.
In Fig. 7, the image 700 is panned or zoomed so that the body position is positioned relative to certain items, for example components of the control shown in the user interface. Panning or zooming may be simulated by a process of capturing the full camera image 700 and selecting from the full camera image 700 the cropped image 702, zoomed in or out as appropriate for display. The cropped image 702 is scaled to fill the display image of the user interface, and the remainder of the full camera image is stored or discarded. As described in further detail below, coordinates, for example the coordinates of the body position 703 or of the hand position 704, may be correspondingly mapped to features of the user's representation displayed in the user interface.
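A hypothetical sketch of the simulated pan/zoom and the accompanying coordinate mapping, under assumed names and a simple axis-aligned crop:
```python
import numpy as np

def crop_about(image, center_xy, magnification):
    """Hypothetical sketch: simulate pan/zoom by cropping the full camera image
    around center_xy; a larger magnification keeps a smaller window."""
    h, w = image.shape[:2]
    win_w, win_h = int(w / magnification), int(h / magnification)
    cx, cy = center_xy
    x0 = int(np.clip(cx - win_w // 2, 0, w - win_w))
    y0 = int(np.clip(cy - win_h // 2, 0, h - win_h))
    crop = image[y0:y0 + win_h, x0:x0 + win_w]

    def to_display(px, py, disp_w, disp_h):
        # map a full-image coordinate (e.g. the hand) into display coordinates
        return (px - x0) * disp_w / win_w, (py - y0) * disp_h / win_h

    return crop, to_display
```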
Surveyed area can comprise that the arc of user top detects subregion, and this arc detects subregion and two linears detect the subregion adjacency, and each linear detects subregion and is positioned at user's one side.Although above surveyed area has been described and has been depicted as the shape of a hoof, can also based on the setting in advance or other shape of user's setup and use of manufacturer, perhaps can dynamically determine the shape of surveyed area.For example, surveyed area can be round-shaped, only rectilinear form or comprise symmetry or any other shape of asymmetrical shape.Surveyed area can also define a plurality of non-adjacent zones, for example for the user with dummy keyboard control (for example referring to Figure 40).
Fig. 8 shows in the scene that comprises two candidate user, the surveyed area of definition around the control user.Particularly, in case detect two candidate user 801 and at 802 o'clock in image 800, surveyed area 804 can be defined as the actuating range 805 of the arm of considering non-control user 802.Rephrase the statement, if in certain part of surveyed area 804, can detect non-control user 802 (or control target of non-control user 802), so just can this part not foreclosed from surveyed area.In addition, can define surveyed area for each user in the user interface 800, wherein at those otherwise the overlapping image-region place of surveyed area will occur, the border of each surveyed area of cutting (perhaps for concrete processing operation, mark).
The result is that surveyed area 804 can adopt shape more complicated, Dynamic Definition.The rule application that replacedly, less calculating can be spent is in image, so that image-based theme or scenario definition surveyed area.For example, if the user so at wall or other people opposition side, can define close to the user surveyed area of rectilinear or other shape near wall or other people.
Interactive items may be arranged within the detection region 804. The positions and sizes of the items may be computed so that all items are at least partially disposed within the detection region 804, and may be recomputed when the definition of the detection region changes, for example to avoid the arm of the non-controlling user 802, so that all items remain within the detection region. The items may be arranged along a guide line within the detection region 804, and the item positions may be animated to move along the guide line so that all items appear within the cropped detection region.
The position of an object (for example a hand) is detected within the detection region (S304). By applying the previously defined detection region to a new image, the position of the object may be detected in the same image, in another image, or in other images such as subsequent images. For example, the detection region may be defined within a region of the scene or within a region of the image. If the camera or the user is moving, the detection region may be adjusted from image to image to compensate for the motion. For interaction with a control in the user interface, the position of the user's control portion (the "user position") is determined so that it can be mapped to a cursor, a marker, an avatar, or another representation of the user in the user interface.
The position of the control object can be expressed in a number of ways. For example, as shown in the image 900 of Fig. 9, the hand position 901 of a user 902 may be expressed as absolute or relative coordinates (x, y), or as an angle θ, or as an angle θ together with an absolute or relative distance d from a known point 904 (for example the body position). Furthermore, if the detection region 905 is divided into multiple subregions (for example subregions 905a to 905h) or is associated with multiple interactive elements, the position of the control object may be expressed as a unique subregion identifier, or as the interactive element that overlaps or is closest to the hand position 901. In Fig. 9, for example, because the hand position 901 overlaps subregion 905e, the position of the control object may simply be expressed as the identifier associated with subregion 905e. Other methods of expressing the object position in one, two, or three dimensions may also be used.
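As an illustrative sketch only, the alternative position encodings described above could be computed as follows; the function, the dictionary of subregion centers, and the angle convention are assumptions introduced for the example:

    import math

    def describe_hand_position(hand, body, subregion_centers):
        """Express a detected hand position as coordinates, angle/distance, and a subregion identifier."""
        dx, dy = hand[0] - body[0], hand[1] - body[1]
        theta = math.degrees(math.atan2(-dy, dx))   # angle relative to the body position (image y grows downward)
        d = math.hypot(dx, dy)                      # distance from the body position
        # Identifier of the subregion whose center is closest to (or overlaps) the hand.
        nearest = min(subregion_centers,
                      key=lambda k: math.hypot(hand[0] - subregion_centers[k][0],
                                               hand[1] - subregion_centers[k][1]))
        return {"xy": hand, "theta": theta, "d": d, "subregion": nearest}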
Determining the user position may include analyzing a segmented camera image, in which the camera image is divided into a foreground portion and a background portion. For example, parts of the image (for example pixels) likely to represent the user are classified as foreground, and parts unlikely to represent the user are classified as background. Determining the user position using the segmented camera image accounts for the portion of the image above the guide line and, optionally, a short distance below the guide line. The user position may be calculated as the average position of all foreground parts within the region. Determining the user position using the segmented camera image may include counting the number of foreground parts within the region, comparing the count to a threshold, and classifying the user position as found if the count exceeds the threshold.
Determining the user position using the segmented camera image may also include clustering the foreground parts within the region, selecting a cluster, and calculating the user position as the average position of all foreground parts (for example pixels) within the cluster. Calculating the average position may include calculating a weighted average, in which a part is weighted more heavily the farther it lies above the guide line.
By weighting parts farther above the guide line more heavily, the average position is likely to correspond to a part of the user above the guide line. In some implementations, the system may be arranged so that most of the user remains below the guide line during interaction with the user interface. In these implementations, the system may be arranged so that the user selects characters by positioning a hand above the head, above the guide line. Further, in these implementations, by weighting parts farther above the guide line more heavily, the detected average position of the user is likely to represent the fingertip of the user's hand.
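A minimal sketch of the weighted-average calculation described above, assuming a boolean foreground mask and a horizontal guide line at a known image row (both assumptions for the example):

    import numpy as np

    def weighted_user_position(foreground_mask: np.ndarray, guide_line_y: int):
        """Average position of foreground pixels, weighting pixels farther above the guide line more heavily."""
        ys, xs = np.nonzero(foreground_mask)          # row (y) and column (x) indices of foreground pixels
        if xs.size == 0:
            return None                               # no foreground detected in the region
        # Image y grows downward, so (guide_line_y - y) is largest for pixels high above the line.
        weights = np.clip(guide_line_y - ys, 1, None).astype(float)
        return float(np.average(xs, weights=weights)), float(np.average(ys, weights=weights))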
In other implementations, determining the user position using the segmented camera image includes analyzing the shape of the foreground portion of the segmented camera image. Analyzing the shape of the foreground portion may include identifying the topmost foreground part and calculating the user position as the position of that topmost part. Alternatively, analyzing the shape of the foreground portion of the segmented camera image may include generating a contour of the foreground portion, identifying shape features of the contour, and determining whether a shape feature is a hand. The user position may be calculated as the hand position.
In addition, determining the user position may include analyzing the camera image to identify the user's hand and determining the position of the hand. Identifying a hand position in the camera image may include comparing portions of the camera image with a skin color model representing the expected appearance of the user's skin. If the color of a portion of the camera image is similar to a color within the skin color model, that portion is classified as skin. The portions classified as skin are grouped into clusters of parts, and clusters whose overall position and size satisfy one or more criteria are classified as hands. Various methods of identifying a hand position in a camera image may be used, and it should be appreciated that other hand-tracking techniques may be employed.
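By way of a hedged example (not the patented method itself), skin-color classification followed by clustering could be sketched as below; the HSV color space, the threshold bounds, and the area limits are assumptions chosen for illustration:

    import numpy as np
    from scipy import ndimage

    def find_hand_candidates(image_hsv: np.ndarray, skin_lo, skin_hi, min_area=50, max_area=5000):
        """Classify pixels against a skin-color range, then group skin pixels into hand-sized clusters."""
        skin = np.all((image_hsv >= skin_lo) & (image_hsv <= skin_hi), axis=-1)
        labels, n = ndimage.label(skin)                  # connected clusters of skin-colored pixels
        hands = []
        for i in range(1, n + 1):
            ys, xs = np.nonzero(labels == i)
            if min_area <= xs.size <= max_area:          # keep clusters whose size is plausible for a hand
                hands.append((float(xs.mean()), float(ys.mean())))
        return hands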
The user interface may employ a set of regions. The set of regions may include a region for each item in the set. Determining the touch state of each region may include determining whether a portion of the segmented camera image classified as foreground (for example the user) is within the region. In other examples, determining the touch state of each region may include determining whether a portion of the segmented camera image classified as the user's hand is within the region. The set of regions may include a region for each item of the subset. The set of regions may change based on the user's interaction (for example, movement within the displayed image).
Displaying the representation of the user may include displaying an avatar whose appearance is configurable by the user. For example, the user may configure the avatar's gender, size, facial features, and clothing. Displaying the representation of the user may include detecting the identity of the user and displaying a representation corresponding to the detected identity. Face recognition may be used to detect the user's identity. The representation corresponding to the detected identity may include an avatar whose appearance has been configured by that user.
Displaying the representation of the user may include displaying a camera image, for example a texture image of the user overlaid on an avatar. For example, a video image is obtained from the video camera and presented on the display. The video image may include an image of the user and of the room the user is in (and other things around the user). A set of foreground graphics, including the items of the subset and text, may be presented over the video camera image, partially covering the camera image.
Displaying the representation of the user may include displaying a segmented camera image, in which the camera image is divided into a foreground portion and a background portion. For example, parts of the image likely to represent the user are classified as foreground, and parts unlikely to represent the user are classified as background.
Displaying the segmented camera image may include rendering only the parts classified as foreground. Rendering only the parts classified as foreground may include generating an alpha channel representing opacity, and combining the camera image with a set of background graphics using the alpha channel, where the foreground parts are defined as opaque and the background parts are defined as transparent.
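A minimal sketch of the alpha-channel compositing described above, assuming a binary segmentation mask and same-sized RGB arrays (assumptions for the example):

    import numpy as np

    def composite_user_over_background(camera_rgb: np.ndarray,
                                       foreground_mask: np.ndarray,
                                       background_rgb: np.ndarray) -> np.ndarray:
        """Use the segmentation mask as an alpha channel: foreground opaque, background transparent."""
        alpha = foreground_mask.astype(float)[..., None]   # 1.0 where the user is, 0.0 elsewhere
        blended = alpha * camera_rgb + (1.0 - alpha) * background_rgb
        return blended.astype(camera_rgb.dtype)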
The user interface may therefore display foreground elements and background elements, with the foreground elements covering the background elements. The set of foreground graphics, including the items of the subset and text, is rendered over the foreground and background elements, partially overlapping them. Additional foreground and background elements or graphics may be generated by an application.
Based on the position of the detected object, the user interacts with a control (for example a button of a virtual keyboard) in the user interface (S305), and the process 300 ends (S306). In one example, the control is a virtual keyboard or mouse. In another example, the control may include items aligned with a guide line defined relative to the representation, where the items are displayed without obscuring the representation.
Returning to Fig. 9, the position of the control object is mapped to a control input. For example, a cursor position may be determined by supplying the hand position 901 (expressed as (x, y), θ, θ and d, a subregion or interactive-element identifier, or some other token) as input to a lookup table, a database, or a formula. For example, in the user interface 910, the cursor position 911 may be expressed as absolute or relative screen coordinates (x', y'), or as an angle θ', or as an angle θ' plus a distance d' from a known position (for example the avatar body position 912). Alternatively, if the user interface 910 defines multiple regions 914, the cursor position 911 may be identified as the unique identifier of the region corresponding to the input subregion or interactive-element identifier. In the illustrated example, because region 914d corresponds to subregion 905e, the cursor position 911 is displayed within region 914d.
Interaction with the control then occurs by invoking functionality associated with the cursor position. For example, the invoked functionality may select an item at or near the cursor position, launch an application, perform a media center function, trigger a mouse event, change the state of the control, or perform any other machine input.
An input position parameter (for example x, y, θ, or d) may be identical to the corresponding output position parameter, or may be in a one-to-one relationship with it. For example, the angle θ of the user's arm may be identical to, or a mirror image of, the angle θ' of the avatar's arm. For example, in the user interface 920, the cursor position 921 is placed at angle θ relative to the avatar body position 922. The avatar (or cursor) may thus represent the user, and may be positioned above, below, or beside the control.
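As an illustrative sketch of the one-to-one (optionally mirrored) mapping just described, assuming the angle/distance encoding from Fig. 9 and an avatar body position as the output anchor (the function name and mirroring convention are assumptions):

    import math

    def map_hand_to_cursor(theta_deg: float, d: float, avatar_body, mirror: bool = True, scale: float = 1.0):
        """Map the hand's angle/distance about the user's body to a cursor position about the avatar's body."""
        theta_out = (180.0 - theta_deg) if mirror else theta_deg   # mirror left/right if desired
        bx, by = avatar_body
        rad = math.radians(theta_out)
        # Screen y grows downward, so subtract the vertical component.
        return bx + scale * d * math.cos(rad), by - scale * d * math.sin(rad)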
Based on the position of the detected object, the avatar may be animated to mimic the user's motion. The user's face may be detected, the user's identity may be determined based on the detected face, and the control may be adjusted based on the user's identity. The position of a second object may be detected within the detection region, and a second control in the user interface may be interacted with based on the detected position of the second object.
Figs. 1 to 9 above illustrate that camera-based input may be applied to an enhanced control, for example by mapping the user's arm motion captured in the image to a cursor or another representation of the user in the user interface. In this regard, Figs. 10 to 39 below illustrate one non-limiting example of an enhanced control that may be used with, or adapted to accept, camera-based input. Other controls may also use camera-based input, for example the control shown in Fig. 40 or other graphics.
Fig. 10 is a conceptual diagram of a user interface 1000 using an exemplary enhanced control 1002, shown in states 1004a to 1004d, where the enhanced control 1002 uses camera-based input. In state 1004a, which occurs before an engagement gesture is detected, or after an engagement gesture is detected but before a control object is detected within the defined detection region, the user interface 1000 includes (among other things) an avatar 1006 and the control 1002 disposed above the avatar 1006. The control 1002 includes a guide line 1009 (shown as a dashed line in state 1004a) and items 1010a to 1010z, each representing a letter of the English alphabet. Although the items 1010a to 1010z are depicted as representing letters, they may also represent other things, for example icons, functions, images, objects, symbols, applications, characters, or groups or clusters of similar or dissimilar items.
In state 1004a, because no user input has yet been detected (that is, no hand or arm of the user has been detected within the defined region of the image around the user), the avatar 1006 is shown in a relaxed (neutral) or released pose, with the arms 1011a and 1011b of the avatar 1006 resting against the torso 1012 of the avatar 1006.
The avatar 1006 may have a human-like appearance and may be able to navigate within a virtual world. Examples of avatars include the playable characters in video games such as WORLD OF WARCRAFT produced by BLIZZARD ENTERTAINMENT, and in virtual worlds such as SECOND LIFE produced by LINDEN RESEARCH, INC. Camera-based input may be mapped to a control portion of the object, for example the hand of the avatar, so that the avatar mirrors or imitates the user's motion in the space around the user's body. By capturing a texture image of the user with the camera and overlaying this image on the avatar 1006, the avatar 1006 may take on the user's appearance.
" project " is intended to refer to the user may want any unit or the element selected from unit or element set, include but not limited to this item class like or similarly unit or element.This set can include as few as does not have project or as many as is hundreds of, thousands of or millions of projects.In a simple example, project is alphabetic character, and project set comprises 26 letters in the English alphabet, or 52 upper and lower case letters.This set could form or comprise sign character, for example symbol by sign character similarly! ,, #, $, %, ^, ﹠amp; , *, (),,,<,:; , ', ",? ,/,~, `, or generally select other unavailable symbol via keyboard or keypad.Other set can comprise the title that can select from the tabulation of title, icon, function or setting, for example name or place name, in this way media setting of this setting example (for example, " broadcast ", " volume tunes up ", " closing ", " record series " etc.).
In this, project can comprise numeral (0 to 9); Letter (for example, the alphabetical A to Z of the English alphabet, or the assumed name in the japanese type (kana) character); Textual character (for example, space " ", hyphen "-", fullstop ". "); Text phrases (for example, " .com ", " .org ", " frequently asked questions and corresponding answer ", " main menu ", " transmission ", " receiving calls ", " DVD playing back ", " shutdown " etc.) predetermined or that dynamically determine; Title; Title; Time; Date; Operation, event, processing or function (for example, " preservation game ", " being written into game ", " beginning to use ", " the transmission of data ", " media player ", " Photo Browser ", " painting brush ", " Email ", " map "); Config option (for example, " 640 * 480 resolution model ", " 800 * 600 resolution model ", " expert mode ", " new hand's pattern ", " training mode ", " cheat mode "); Or any out of Memory or information combination.
State 1004b occurs when a control object of the user, such as a hand, is detected within the defined region around the user. This position is mapped to the cursor position shown as a plus sign, which passes over the region of the user interface 1000 occupied by item 1010z, thereby defining a condition similar to a mouse-over condition. Once the mouse-over condition is detected, the arm 1011b of the avatar 1006 is animated so that the avatar 1006 appears to point at or grasp the item 1010z. In addition, the item 1010z becomes highlighted as its size increases, while the adjacent item 1010y diminishes by comparison. The mouse-over condition above item 1010z also causes the item 1010z to be output, for example by displaying the item 1010z in the output region 1014, or by using a text-to-speech converter to produce the sound of item 1010z through a loudspeaker. The mouse-over condition may thus place the associated item in a pre-selected, activated, or highlighted state, although pre-selection or highlighting does not strictly require the item to change its appearance.
When the user moves an outstretched hand through the space around the body toward a position within the defined region above the head, and this more vertical position is detected, the user interface 1000 transitions to state 1004c. This hand position is mapped to a new cursor position along the guide line 1009, above item 1010s of the control 1002. The arm 1011b of the avatar 1006 is animated and follows the motion of the user's hand through the space. As the user's representation approaches an item or crosses the guide line 1009, the approached items become highlighted by increasing in size, and the items decrease in size again as the representation travels away from a highlighted item.
When the user presses a physical button or performs another selection gesture (for example a gesture performed with the other hand, or a change of hand pose), the activated item 1010s may be output. A selection may also occur by hovering the representation over a letter for a predetermined period of time, or by positioning a tab cursor over the letter and selecting with a keyboard. The selection (or potential selection) of a letter may cause the item to become highlighted, thereby distinguishing that letter along the guide line. The avatar, or another object used to define the position of the guide line, may itself interact with the items, as in the case where the user's motion is mapped to the motion of the arm 1011b of the avatar 1006.
In state 1004d, the item 1010s is selected. Upon selection, the items return to their initial, non-highlighted condition. In addition, the arm 1011b of the avatar 1006 returns to its initial released position adjacent to the torso 1012. The selected item 1010s is output, for example, in the output region 1014.
As shown in the various states, the enhanced control may be used to select one or more letters from the group of letters making up the alphabet. By orienting or aligning these items along the guide line, the items are displayed without overlapping, blocking, or otherwise obscuring the avatar or other object, allowing the user to select items intuitively.
Using the device shown in Fig. 2, the enhanced control may be invoked, output, and used. Briefly, the processing may include defining a guide line relative to an object in the user interface and displaying items aligned with the guide line without obscuring the object. Further, based on receiving a selection of one of the displayed items, the selected item may be output.
More specifically, a guide line may be defined relative to an object in the user interface. This may begin when the user manually indicates that the enhanced control should be displayed, or when it is recognized that an item is to be input (for example when the user advances to a text input field). Because the enhanced control can display more items along a guide line by increasing the size of highlighted items, whereas a guide line on which all items have the same size would typically have to fit within the user interface, the definition of the guide line may also begin after determining that not all of the items could be effectively displayed in the user interface without the enhanced control. In doing so, the enhanced control displays a subset of the items at a full or effective size, so that the user can select a specific item easily and reliably.
With reference to the foregoing, Fig. 11 broadly illustrates a user interface 1101 that includes an object 1102 (depicted with dashed lines) and a control 1105 that includes a guide line 1104. Although the guide line 1104 is depicted as a dashed line in Fig. 11, in other exemplary implementations the guide line is depicted as a solid line or is not depicted at all.
The control 1105 enables the intuitive selection or output of items such as letters 1106, while at the same time displaying part or all of the object 1102 in a region adjacent to, near, partially or wholly bounded or surrounded by, or otherwise defined by the guide line 1104. When it is determined that an item is to be output or selected, for example when the user selects an input field requiring text data entry, the control is positioned at a location within the user interface 1101 that allows both the items 1106 and the object 1102 to be displayed, and the control is thus defined with respect to, in relation to, based on, or with regard to the object 1102. In other words, the position of the object 1102 may represent an anchor position about which the guide line 1104 and the other elements of the control 1105 may be oriented or aligned, or relative to which those other elements may be positioned.
To determine this anchor position, the object by which the guide line 1104 is to be defined, and the position of that object, are determined. In some cases the object may be predetermined or predefined, for example where the control 1105 is output together with an avatar, an image of the user, a knob, an icon, a list, a data table, a data graph, a text entry field, another control or widget, or a known blank region of the user interface 1101, or where the control 1105 includes such an object. In other cases, the object is determined dynamically at the time the control 1105 is to be output, for example where the user tabs to a text field and that text field is used as the object, where a blank region of the user interface 1101 is located based on the current state of the user interface 1101, where the largest, most prominent, most colorful, or least colorful object on the screen is dynamically determined to be the object, or where the element, region, or window in focus is dynamically determined to be the object.
In these or other cases, the size, shape, position, boundaries, or other context of the object 1102 is detected, and the guide line 1104 is defined in relation to the detected context. For example, the guide line 1104 may be defined to have an overlapping or non-overlapping relationship with the object or a portion of the object, a bisecting relationship, a dividing relationship, a relationship constrained or limited by space, or any other relationship, for example a relationship based on the size, shape, or proportions of the object 1102, or on an anatomical model.
In summary, the user interface 1101 includes, among other things, an object 1102, which may be a representation of the user, and items 1106 that make up an item set. The items 1106 are displayed dynamically, so that the size and position at which the items 1106 are displayed allow the user to select each item conveniently and reliably. Because it may not be possible for every item 1106 to fit within the user interface 1101, or to be aligned with the guide line 1104, at a large size, a subset of the items 1106 may be rendered at a larger size.
Figs. 12 and 13 show exemplary guide line arrangements and exemplary relationships between guide lines and objects. In Fig. 12, the guide line 1201 is straight and is defined to overlap the object 1202, or to bisect the upper third of the object 1202, or to separate the head 1204 of the object from the torso 1205 of the object 1202. In Fig. 13, the guide line 1301 is curved and is defined a distance of a certain number of pixels above the top of the object 1302, so that the object 1302 is not obscured, or within the reach of the arm 1304 (or other control portion) of the object 1302.
A guide line may be defined relative to the object so that items aligned with the guide line are within reach of the object. As shown in Fig. 13, the curved guide line 1301 forms an arc, so that an outstretched arm 1304 of the avatar 1302 can reach any individual item or subset of items. In some cases, the radius of curvature of the curved guide line 1301 may be determined based on the length of the arm 1304 of the avatar 1302, so that the position of the curved guide line 1301 corresponds to the natural swing of one of the arms 1304 when that arm is in an outstretched, overhead position. In other cases, the radius of curvature of the curved guide line 1301 may be determined based on the range of an input device (for example the range of tilt angles of a tilt sensor, or the range of motion of a directional pad on a game controller). In other cases, the radius of curvature of the curved guide line 1301 may be determined based on the user's range of motion (for example, according to the length of the user's arm). Determining the user's range of motion may include detecting the user's identity (using face detection) and estimating that user's range of motion based on the user's age, gender, or other information about the user.
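As a hedged sketch of how a reach-based arc might be constructed (the head-height multiplier, arc angles, and sample count are assumptions introduced for illustration, echoing the rule of thumb discussed further below):

    import math

    def estimate_reach(head_height_px: float, arm_multiplier: float = 2.5) -> float:
        """Rule-of-thumb reach: arm length approximated as a multiple of the avatar's head height."""
        return head_height_px * arm_multiplier

    def guide_line_arc(shoulder_xy, radius, start_deg=30.0, end_deg=150.0, n=64):
        """Sample an arc above the avatar so that every aligned item lies within the arm's natural swing."""
        sx, sy = shoulder_xy
        points = []
        for i in range(n):
            a = math.radians(start_deg + (end_deg - start_deg) * i / (n - 1))
            points.append((sx + radius * math.cos(a), sy - radius * math.sin(a)))
        return points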
Although the guide line has been described as positioned below the items, with the cursor moving above the guide line to select an item, other arrangements and configurations are possible. For example, the guide line may be positioned above or to the side of the selectable items, and may be oriented horizontally, vertically, or diagonally.
In addition to the straight and curved guide lines shown in Figs. 12 and 13, a guide line may take a zigzag, circular, polygonal, or any other shape. The start point and end point of the guide line may be aligned horizontally or vertically, or these points may not be aligned in any particular way. The guide line may be circular and continuous, so that the guide line has no definite beginning or end (or no apparent beginning or end), or so that the start point coincides with the end point. The guide line itself may be a moving or animated object, so that the items aligned along the guide line themselves move or are animated, continuously or intermittently.
The range of motion may be determined based on a complex anatomical model, for example one that mimics biomechanical limits or capabilities of the human body and applies those capabilities to the avatar or to a model of the user. Simpler rules of thumb may also be used, for example an estimation method in which the reach of the avatar's arm equals the size of the avatar's head multiplied by a multiplier. In any case, the specific position, orientation, alignment, or configuration of the guide line is defined by some aspect of the object. For this reason, the object 1302 may be said to be surrounded by the guide line 1301.
Based on the position of the object, the guide line may be dynamically defined to surround a portion of the object's position. If the object or a portion of the object lies within a circumference, and the guide line is defined as an arc along any portion of that circumference, then the guide line may be considered to surround the object. For example, as shown in Fig. 13, if extended, the guide line 1301 would form a circumference around most of the object 1302, except for a small region, representing the fingers of the object 1302, that would extend outside the circumference.
Because the definition of the guide line can be dynamic, the guide line may be redefined in real time or near real time, so that the user can perceive or observe the redefinition or repositioning of the guide line. For example, when no other object adjacent to the object is detected, the guide line may initially be defined above the object, as shown in Fig. 13. As shown in Fig. 14, if a second object 1401 enters the user interface 1402 on the left side of the controlling object 1404, the guide line 1405 may be redefined or repositioned away from the second object 1401, or dynamically defined on the side of the controlling object 1404 opposite the second object 1401. The second object may correspond to a second user in the camera image, an avatar of a second user controlled over a network, an avatar of a programmatically controlled non-player character, or another type of object. This feature may advantageously allow the enhanced control to be applied more effectively to non-static user interfaces with multiple moving objects.
Defining the guide line may also include determining the range of motion of a control portion of the avatar and defining the guide line within the range of motion of the control portion, for example along the outer edge of that range of motion. For example, the radius of curvature of a vertically oriented guide line 1405 may be determined based on the arm length of the avatar, so that when the arm is extended to one side, the position of the vertically oriented guide line 1405 corresponds to the natural swing of the avatar's arm.
In another example, for instance where the number of items is too large to be associated with a single guide line, or where the user interface is crowded with other elements, each item may in fact represent multiple items or a cluster of items, or the guide line may be divided into multiple portions so as to define discontinuous portions, or multiple guide lines may be defined. In Fig. 15, because the user interface 1500 includes an object 1501 occupying nearly its entire vertical length, the guide line 1502 is defined with a discontinuity or gap region 1504 (shown with dashed lines), effectively dividing the guide line 1502 into guide line portions 1502a and 1502b.
In Fig. 16, because the number of items to be selected or output is large, and displaying or selecting the items on a single guide line might be difficult or unintuitive, two guide lines 1601 and 1602 are defined in the user interface. Although the guide lines 1601 and 1602 are depicted with the same general shape and parallel to each other, neither of these two characteristics is required. For example, the guide line 1601 may be a zigzag guide line aligned generally along the left vertical edge of the object 1604, and the guide line 1602 may be a hexagonal guide line defined to overlap the torso of the object 1604 itself, or a circular guide line that completely surrounds the object 1604 on all sides.
The selection of an item on a first guide line may produce the display or definition of items aligned with a second guide line, for example where the first guide line is used for the selection of alphabetic characters; in that case, once an alphabetic character has been selected, names (or other item types) beginning with the alphabetic character shown on the first guide line are displayed or output on the second guide line.
Instead of defining the guide line relative to an object, the guide line may also be defined or positioned based on the boundaries of the user interface. For example, a guide line may be defined to extend from one edge of the user interface (or a point near one edge) to the opposite edge (or a point near the opposite edge). The guide line may also be defined or positioned based on positions or other portions of the user interface, for example relative to a desktop icon, a user interface button, or an object in a virtual world. The guide line may be symmetric or asymmetric with respect to the boundaries of the user interface.
The items aligned with the guide line may be displayed with or without obscuring the object. If the bottom, middle, right side, left side, center, or other portion of each item lies on the guide line, or is arranged on a line parallel to the guide line at points corresponding to the items, the items may be said to be aligned with the guide line. In Fig. 11, for example, because a portion of each item is parallel to the guide line 1104, the items 1106a and 1106z are each aligned with the guide line 1104, even though items 1106a and 1106z are not strictly aligned with each other. When displayed, the aligned items 1106, taken as a whole, take on the general shape or appearance of the guide line 1104 itself. Fig. 12 likewise shows items aligned with the guide line 1201; because the guide line 1201 is straight, the first item 1206a and the last item 1206z are also generally aligned with each other.
As shown in the user interface 1700 of Fig. 17, where the center or another interior point of each item is aligned with the guide line 1701, each item may be rotated about its center, so that the displayed items are less uniform or appear somewhat randomized. This randomization may be further emphasized by offsetting items above and below the guide line (or some other line parallel to the guide line), so that some items appear above the guide line and some appear below it, although the items as a group remain generally aligned with the guide line, or take on the general appearance, feel, orientation, configuration, or arrangement defined by the guide line.
The items may be distributed evenly or unevenly along the guide line. For example, the items may initially be distributed evenly and then be redistributed unevenly when the cursor hovers over a particular item and that item is resized. Alternatively, default or preferred items may be distributed along the guide line so as to appear more prominent than less preferred items. Further, the spacing between items may be determined by a template or an arithmetic function, or the spacing may be entirely randomized.
The object is considered unobscured because the object, or at least a portion of the object, is visible when displayed. In other words, by displaying the items without obscuring the object, the items do not completely overlap the object or block its visibility. Certain characteristics of the items, for example their transparency, color, or line thickness, may be altered (or selectively altered, for example where an item overlaps the object) so that the object remains at least partially visible.
The items may be displayed without obscuring the object at a first point in time, for example the time at which the items are initially displayed, and may blur, obscure, dim, block, or overlap the object or a portion of the object at a second point in time earlier or later than the first point in time. By way of example, Figs. 12 and 13 each show items displayed without obscuring the respective object, because at least a portion of each object remains visible in the user interface. In another example, Fig. 16 shows items displayed without obscuring the associated object, because no item overlaps, blocks, or interferes with the visibility of any portion of the associated object in the user interface.
A selection of one of the displayed items may be received, for example using mouse or camera-based input. In one implementation, the user interface may define a region around each item such that an item is selected if the cursor or the control portion of the object is detected within the region associated with that item. The regions may be dynamically redefined based on the user's interaction, for example increased in size based on a mouse-over condition or when the cursor is detected to have crossed the guide line, or reduced in size when the cursor moves away, a different item is highlighted, or the guide line is crossed again.
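A minimal sketch of such a per-item hit region with hover enlargement, assuming circular regions around item centers (the radii and data layout are assumptions for the example):

    import math

    def hit_test(cursor, item_positions, hovered=None, base_radius=20.0, hover_radius=40.0):
        """Return the index of the item whose (possibly enlarged) region contains the cursor, else None."""
        for i, (ix, iy) in enumerate(item_positions):
            r = hover_radius if i == hovered else base_radius   # the hovered item's region is enlarged
            if math.hypot(cursor[0] - ix, cursor[1] - iy) <= r:
                return i
        return None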
As described in more detail below, the selection of an item may occur via several methods, depending on the type of input device used in the user's implementation. For example, the selection may be a cursor-based user selection using a mouse, in which the user positions the mouse cursor to hover over the item to be selected (causing a mouse-over event), depresses a mouse button (causing a mouse-down event), and releases the mouse button (causing a mouse-up event). Other mouse events may also be used to select an item with the mouse, for example a click event, a double-click event, or a mouse-over event alone.
Using a keyboard, the user may tab from one item to another and select the highlighted item with another key of the keyboard (for example the space bar or the return key), or the user may begin typing characters to narrow down or identify the desired item for selection. Using a game controller or a hand-held remote control, the user may press directional keys to change which item is highlighted, and press a button to select the highlighted item. Using a mobile device with a tilt sensor, the user may tilt the device left, right, up, or down to move the cursor or other indicator left, right, up, or down until the desired item is highlighted, and then press a button or shake the mobile device to register the selection. Using a touch screen device, the user may directly touch the user interface X and Y coordinates at which the desired item is output. Using a speech interface, the user may speak commands, for example "tab", "left", "right", "select", or other similar voice commands, to move the cursor between items and select the desired item.
The user may directly control the avatar using a mouse or a game controller. For example, using a game controller, the user may move an analog stick to move the avatar's arm, so that the angle of the analog stick is mapped to the angle of the avatar's arm. The user may also directly control the avatar using a motion-capture camera device, so that the avatar's arm imitates the user's real arm motion.
The enhanced control described here is compatible with video-based control systems. In particular, a camera may detect an image, for example an image of the user, and motion within a portion of the image, or a gesture recognized from the image, may be dynamically mapped to a cursor in real time or near real time. For example, a detection region surrounding the user may be defined within the image of the user in the scene, and the position of an object (for example a hand) may be detected within the detection region. The enhanced control is interacted with based on the detected position of the object.
Although further descriptions of other input mechanisms, methods, or techniques are omitted for brevity, it suffices to say that an item may be selected automatically or manually using any desired method, technique, or mechanism.
The object may be a representation of the user, for example an avatar. By displaying a representation of the user in the user interface, the training burden is reduced, because the user can easily recognize the position of the object relative to the positions of the items, and can quickly move the cursor or the control portion of the avatar (for example a hand) to coincide with the position of the desired item. In this regard, the position of the cursor or of the control portion of the avatar is used to detect the selection or "touching" of an item displayed in the user interface.
Where the object is an avatar, a potential control portion of the avatar may be designated as the control portion to be animated. For example, the control portion may be one or more arms, legs, elbows, knees, hands, or fingers of the avatar; or the head or torso of the avatar; or a physical feature of the avatar, for example the nose, eyes, ears, navel, neck, or hair; or the avatar's clothing, accessories, or other attire, for example garments, jewelry, or other personal items. As user input is received, the designated control portion is animated so as to interact with the items, or to provide the appearance of interacting with the items.
Where the object has the appearance of a knob or switch, the control portion may be an arrow extending from the body of the object, and the object may rotate so that the arrow points toward an item. Where the object includes a particle system effect, for example a simulation of flame, plasma, lightning, or liquid, the particles may form an extension pointing toward an item, for example a distortion of the lightning or a water droplet, thereby providing the appearance of interaction with the item.
For example, in Fig. 13, the object 1302 is an avatar, and the arm 1304a of the avatar is designated as the control portion. As the user moves an arm to the left or right in the space around the body, the arm 1304a is animated to move to the left or right, respectively, and items are selected based on the proximity of the arm, or more specifically of the hand or fingers on the arm, to the items. Similarly, the hand 1307 may also be designated as the control portion. In a similar configuration, upward or downward motion of one or both of the user's hands in the space around the body may cause the arm 1304a or the hand 1305 to move up or down, for example in a manner consistent with human motion.
In some cases where the arm 1304a or the hand 1305 is not near an item or the guide line (or is not interacting with it), these upward or downward motions may move the arm 1304a or the hand 1305 away from the items, so that an apparent "selection" at the surface of an item does not cause the item to be output. By defining the guide line so that most of the avatar lies below the guide line, the selection of a character or item by the object may occur by positioning the avatar's hand above the avatar's head, above the guide line.
The designation of a potential control portion of the avatar as the control portion may be dynamically switched to another control portion. For example, and referring again to Fig. 13, if the user swaps which arm serves as the control object, or moves a single arm to the right past the position associated with item 1306n, the control portion may switch from the left arm 1304a to the right arm 1304b, so that the left arm 1304a can interact with and select all items along the guide line 1301 to the left of item 1306n, and the right arm 1304b can interact with and select all items along the guide line 1301 to the right of item 1306o. Such a feature may increase the visual appeal of the user experience, because instead of merely selecting items along a line with a simple mouse cursor, the user appears to be controlling an avatar that reacts in a realistic and intuitive manner.
If such a swap occurs and the left arm 1304a is no longer designated as the control object, the left arm 1304a is animated to return to a relaxed, resting, or released position, for example a position along the side of the object 1302. Conversely, such a swap causes the right arm 1304b to be seamlessly animated, moving from its released position along the side of the object 1302 to the position previously occupied by the arm 1304a, or a position adjacent to it. Continuing this example, if the user quickly moves an arm, or both arms, back and forth along the guide line through the space bounded by items 1306n and 1306o, the avatar's arms will alternately swing over the head of the object 1302 from the sides of the object 1302 in a pleasing, humorous, or visually stimulating way.
In another example, the eyes of the avatar may be designated as the control object, and they may appear to gaze at or follow the items along the guide line, which may surround the object. Portions of the avatar not designated as the control object may also be animated, for example to appear to react to a highlighted item, or to place the avatar in a desired or more realistic pose based on the highlighted item. In one implementation, the mouth of the avatar may be animated so that the avatar appears to speak or verbalize the selected or highlighted items, or the neck or head of the avatar may be animated to crane forward, reinforcing the appearance that the avatar is looking at or considering the highlighted item.
In a similar manner, if multiple objects (for example multiple avatars) are displayed in the user interface, a second object may become the controlling object based on a user selection, on proximity to the highlighted letters or to the enhanced control, or on any other factor. Thus, the designation of the control portion or control object may switch from a first potential control portion or object to a second, or from the second potential control portion or object back to the first.
The selected item may be output. Outputting the selected item may also include displaying the selected item, outputting an electronic indication of the selected item, or highlighting the selected item by changing its color, opacity, or size within the user interface. The selected item, or an indication of the selected item, may be output to another device or to a storage medium for later use.
In Fig. 11, the color, opacity, or size of the selected item 1106r has been changed to a first level or degree, so that the selected item 1106r appears largest among all of the items displayed along the guide line 1104, and the color, opacity, or size of the items 1106p and 1106t, which are equidistant from and adjacent to the selected item 1106r, has been changed to a second, lower level or degree. By also changing the color, opacity, size, or other characteristics of adjacent items, the user can more easily recognize the position of the cursor or of the currently selected item, increasing the intuitiveness and ease of use of the control. Even where no explicit cursor image is rendered, the color, opacity, size, or other characteristics of the items can indicate the cursor position.
Because the enhanced control allows only a small number of items to be selected at any given time, with most items remaining unselected, the sizes of the unselected items can be reduced so that a large number of items can be displayed in the user interface. Increasing the size of the items under consideration for selection, however, may increase the overall recognition or readability of those items, thereby improving the reliability of the control. In other words, by reducing the size of the unselected items, more selectable items can be presented to the user than could otherwise be individually selected or touched.
As described in more detail below, the selected item 1106r may be output in the output region 1107 of the user interface 1101 together with a previously selected item 1106f and a predicted item 1109, where the predicted item 1109 is determined dynamically using a prediction or forecasting heuristic based on the currently selected item and previously selected items.
The enhanced control provides on-screen selection of items from an item set, for example selecting a letter from an alphabet. A representation of the user is displayed so that the representation can conveniently and reliably touch selectable items aligned along a guide line or arc positioned relative to the representation's range of motion. A touch may be determined where a portion of the representation interacts with an item region along the guide line. When a portion of the representation enters a touch region, the items near that touch region may be magnified or highlighted.
Although the guide line has been described above as defined in two dimensions relative to the boundaries of the user interface or relative to the object, the guide line may also be defined in three dimensions. For example, and as shown in Fig. 18, a guide line 1801 may have a first guide line portion 1802 defined generally in the same X-Y plane as the user interface, and a second guide line portion 1804 defined generally in a Z plane that appears to extend orthogonally to the X-Y plane of the user interface. Other planes or axes may be used.
Using a three-dimensional guide line, the number of items that can be displayed with the control can increase exponentially. For example, in addition to selecting the item "O" on the guide line 1801 by moving the arm of the avatar 1806 to the left or right, the user may move the arm forward or backward in space to select other items, such as the character 1807, which is displayed in the output region 1809 when selected. Both guide line portions 1802 and 1804 are defined within the three-dimensional reach of the arm of the avatar 1806.
In addition, although the guide line has been described above as including discrete start and end points, in other implementations the guide line may have no start or end point, may have the appearance of having no start or end point, or the start point may coincide with the end point. For example, the user interface 1900 in Fig. 19 includes a circular guide line 1901 for selecting items 1902a to 1902g. The circular guide line includes a start point 1905 and an end point 1906, but these points coincide and are no more prominent or visible on the guide line 1901 than any other point.
Furthermore, and unlike some of the other exemplary controls, the items 1902a to 1902g each correspond to a function to be invoked by an application, as indicated by an icon. For example, when the avatar 1904 selects the globe icon 1902a, a mapping application may be invoked. The enhanced control described here may likewise be used, in an intuitive and visually pleasing manner, both for the selection of characters and for invoking more complex functionality. Other icons may represent other functions, including media functions such as volume up and volume down, a send-mail function, a control-disable function, or an image browsing function.
Figs. 20 to 24 illustrate the exemplary selection of a particular item from an item set 2001 aligned along a guide line 2002. In Fig. 20, a cursor 2004, which is mapped to and depicted as the hand of an avatar, moves toward the item of interest 2001r. In Fig. 21, the position of the cursor 2004 crosses the guide line 2002. Crossing the guide line 2002 may initiate a further detection process that selects or identifies a subset 2101 of the items 2001 based on the distance between each item and the cursor 2004. The position of the cursor 2004 is determined based on the position of the user's hand in the space around the user's body, as detected in the camera image.
The subset 2101 of the items 2001 is displayed at a larger size or font scale, making it easier for the user to select an item. The selection of the subset 2101 of the items 2001, and the enlarged display of the subset 2101, may occur in response to detecting that the cursor 2004 has crossed the guide line 2002, or may occur regardless of the position of the cursor 2004 relative to the guide line 2002. User feedback, including for example sound, imagery, and/or tactile output such as vibration, may be generated when the cursor 2004 is detected crossing the guide line 2002 at the base position 2102, or when an item is detected as selected.
In Fig. 22, the user selects the item of interest 2001r. Highlighting the subset 2101 allows the user to select the general region containing the item of interest 2001r, "magnify" the items in that region, and then reliably and conveniently select the item of interest 2001r, which represents the letter "R". The selection may occur using mouse events, keyboard or keypad strokes, gesture recognition, camera-based input, or many other approaches.
The location of the base position 2102, which represents the position where the cursor 2004 crossed the guide line 2002, may be detected in many ways. For example, and as shown in Fig. 23, the location of the base position 2102 may be determined as the position on the guide line 2002 observed to be nearest the cursor 2004 at the moment the cursor 2004 is first detected above the guide line 2002, or as the position nearest a highlighted item such as item 2001r.
Other methods may also be used to detect the base position. For example, the base position 2102 may be detected at the time the cursor 2004 crosses the guide line 2002, or using positions of the cursor 2004 observed before and after the guide line 2002 is crossed. For example, Fig. 24 shows an endpoint 2401, representing the position of the cursor 2004 observed at a time before the guide line 2002 is crossed, and an endpoint 2402, representing the position of the cursor 2004 observed at a time after the guide line 2002 is crossed. The base position may be determined as the intersection of the line segment 2404, defined by the endpoints 2401 and 2402, with the guide line 2002.
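By way of illustration, the intersection-based base position could be interpolated as follows for a horizontal guide line; the function name and the horizontal-line assumption are introduced only for the sketch:

    def base_position_x(p_before, p_after, guide_line_y):
        """Interpolate where the cursor path crossed a horizontal guide line at height guide_line_y."""
        (x0, y0), (x1, y1) = p_before, p_after
        if y1 == y0:
            return x0                                   # degenerate case: path parallel to the guide line
        t = (guide_line_y - y0) / (y1 - y0)
        return x0 + t * (x1 - x0)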
Highlighting the subset 2101 of the items 2001 may include determining the positions, along the guide line 2002, of the items that make up the subset 2101. In some implementations, the items nearest the base position 2102 are selected as the portion of the subset 2101 to be highlighted (for example, displayed at a large size), so that items near the base position 2102 remain at or near their initial, non-highlighted positions, while items farther from the base position 2102 move outward to accommodate the increased size of the subset 2101 of items.
The following equation (1) may be used to determine the position of an item along the guide line after the subset 2101 has been highlighted.
X'_i = X_b + (X_i − X_b) · (S'_i / S_i)     (1)
In equation (1), X_i represents the position of item i along the guide line in the initial state; X'_i represents the position of item i along the guide line in the magnified state; X_b represents the base position along the guide line; S_i represents the base size of item i in the initial state; and S'_i represents the size of item i in the magnified state.
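A direct rendering of equation (1) with a worked example (the numeric values are hypothetical and in pixels):

    def scaled_position(x_i, x_b, s_i, s_prime_i):
        """Equation (1): item position along the guide line after the subset is enlarged."""
        return x_b + (x_i - x_b) * (s_prime_i / s_i)

    # Example: an item 40 px to the right of the base position, doubled in size,
    # moves to 80 px right of the base position.
    # scaled_position(x_i=140.0, x_b=100.0, s_i=20.0, s_prime_i=40.0) -> 180.0

Items at the base position stay in place, while items farther away are pushed outward in proportion to how much they are enlarged.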
Fig. 25 shows a subset of items in a first state 2501, before the subset is highlighted, and in a second state 2502, after the subset is highlighted. For example, if the cursor initially crosses the guide line 2505 below the item "S" (so that the base position 2506 coincides with the item "S"), the item "S" remains at its initial position, while the item "R" is moved to the left by a distance 2507 relative to its initial position. The second state 2502 thus shows the resulting scaled sizes and positions of the items after highlighting.
Fig. 26 shows a subset of items in a first state 2601, before the subset is highlighted, and in a second state 2602, after the subset is highlighted. For example, if the cursor 2604 crosses the guide line 2605 below the item "Q" (so that the base position 2606 coincides with the item "Q"), the item "Q" remains at its initial position, while the item "R" is moved to the right by a distance 2607 relative to its initial position. The second state 2602 thus shows the items scaled in the highlighted state. Accordingly, the position of the cursor 2604 used to select a particular item can depend on where the cursor 2604 originally crossed the guide line 2605.
Fig. 27 shows subsets of items in states 2701 to 2704 associated with highlighting subsets of items. Specifically, Fig. 27 gives an overview of the selection and highlighting of a first subset 2706 through a third subset 2708 of the items 2710. The second subset 2707 and the third subset 2708 may be positioned according to the position of the cursor 2711 relative to the first subset 2706 and the second subset 2707, respectively.
In the first state 2701, the items 2710 are shown at their initial, non-highlighted sizes and positions. In state 2702, a first subset 2706 of the items 2710 has been selected and highlighted. In state 2703, a second subset 2707 of the items 2710 has been selected and highlighted. In state 2704, a third subset 2708 of the items 2710 has been selected and highlighted.
In state 2702, the cursor 2711 originally crosses the guide line 2712 below the item "S" (so that the base position coincides with the item "S"); the item "S" remains at its initial position, and the surrounding letters move outward from their initial positions. Transitioning from state 2702 to state 2703, if the cursor 2711 moves to the right, a second subset 2707 of items 2710 within a distance of the cursor 2711 is selected. In state 2703, if the cursor 2711 moves along the guide line 2712 to coincide with the enlarged item "T", the item "T" remains at its enlarged position, and the item "V" is shifted further toward the right by a displacement 2715 along the guide line 2712 as it is highlighted.
If there is not enough room on the guide line 2712 for some items, for example the items "W" through "Z", those items are "pushed off" the end of the guide line and are not displayed. Transitioning from state 2703 to state 2704, if the cursor 2711 continues moving toward the right end of the guide line 2712, there may likewise not be enough room to display additional items as part of the second subset 2707, and a third subset 2708 (a subset of the second subset 2707) may be formed.
To select an item in the right portion of the second subset 2707, for example the item "U", or an item that has been "pushed off" the end of the guide line, for example the items "W" through "Z", the user may re-cross the guide line 2712 with the cursor 2711 and then cross the guide line 2712 a third time to establish a new base position closer to the desired item. In addition, instead of "pushing" items off the right end of the guide line 2712, items may be "pushed off" the left end of the guide line to accommodate the display of items that would otherwise be "pushed off" the right end. In some implementations, instead of "pushing off" items, the item sizes may be reduced according to the space available on the guide line, so that all items are displayed on the guide line.
In other implementations, items "pushed off" the right end of the guide line may reappear at the left end of the guide line. In implementations using a continuous (for example circular) guide line, the items may advance around the guide line. Thus, in an example of a circular guide line in which the user moves the cursor clockwise continuously around the guide line, items excluded from the current subset of items may flow clockwise at an angular velocity lower than that of the cursor (speeding up to make room as items are added to the subset). In this example, the cursor may travel several revolutions around the guide line for each revolution made by the items.
Scrolling may be used so that items that would otherwise be pushed off the end of the guideline can still be selected. Scrolling can include detecting whether the cursor 2711 is within a predetermined distance of an end of the guideline 2712 and, if so, applying a velocity to the item positions. Where item positions are computed relative to the base position (see equation (1) above), the velocity can be applied to the base position, thereby moving the items.
Figure 28 shows an exemplary velocity function, in which the horizontal axis 2801 represents position along the guideline and the vertical axis 2802 represents velocity. Using this velocity function, when the cursor is near an end of the guideline, the applied velocity moves or shifts the items; when the cursor is near the middle of the guideline, the items are not moved or shifted (because the velocity is zero).
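As one illustration of such a velocity function, the Python sketch below uses a dead zone in the middle of the guideline and a linear ramp near each end, and applies the resulting velocity to the base position each frame; the ramp shape, names, and frame interval are assumptions rather than details taken from the patent.

```python
def scroll_velocity(cursor_x, line_start, line_end, edge_zone, max_speed):
    """Zero velocity in the middle of the guideline, ramping up near either end
    (the general shape of Figure 28); positive values scroll items to the left."""
    if cursor_x < line_start + edge_zone:
        return -max_speed * (line_start + edge_zone - cursor_x) / edge_zone
    if cursor_x > line_end - edge_zone:
        return max_speed * (cursor_x - (line_end - edge_zone)) / edge_zone
    return 0.0

# Because item positions are computed relative to the base position (equation (1)),
# shifting the base position each frame scrolls every item.
x_b = 50.0
dt = 1.0 / 30.0  # assumed frame interval
x_b -= scroll_velocity(cursor_x=95.0, line_start=0.0, line_end=100.0,
                       edge_zone=10.0, max_speed=40.0) * dt
```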
Figure 29 shows an exemplary scaling function used to scale the size of the items in the subset of items (reflected by the vertical axis 2901) based on the distance of the cursor position from the guideline (reflected by the horizontal axis 2902). As shown by curve 2904, the size (S'_i) of an item can therefore be a function of the current cursor position. A base size (S_i), associated with non-highlighted items or with all items when the cursor has not crossed the guideline, is determined by the reference point 2905, and a maximum size associated with highlighted items is defined by the line 2906.
In one exemplary implementation, the slope of the curve 2904 near the point 2905 is approximately 1:1, so that item size appears to increase linearly and proportionally as the distance to the cursor decreases. As the cursor moves closer still to the highlighted items, the item size begins to approach the maximum size, producing an aesthetically pleasing transition.
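A curve with roughly unit slope near the base size that levels off at the maximum, as described above, can be sketched as follows; the tanh easing is only one plausible shape for the curve and is an assumption, as are the names.

```python
import math

def item_size(distance_past_guideline, base_size, max_size):
    """Approximate the scaling curve of Figure 29: size grows roughly 1:1 with
    the cursor's distance past the guideline at first, then levels off near
    the maximum size."""
    span = max_size - base_size
    d = max(distance_past_guideline, 0.0)
    return base_size + span * math.tanh(d / span)

print(item_size(0.0, 10.0, 30.0))   # 10.0  (base size at the guideline)
print(item_size(5.0, 10.0, 30.0))   # ~14.9 (near-linear growth at first)
print(item_size(60.0, 10.0, 30.0))  # ~30.0 (saturates at the maximum size)
```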
Figures 30 and 31 show the use of the scaling feature of the enhanced control, particularly as it appears when the cursor 3001 crosses the guideline 3002. The height 3004 represents the distance between the position of the cursor 3001 and the guideline 3002, and the height 3005 represents the size of the items in the subset 3007, for example the item 3006r. The height 3005 of the items in the subset 3007 is scaled based on the height 3004, so that the items appear larger in Figure 30, where the cursor 3001 has crossed the guideline 3002 by a small amount, than in Figure 31, where the cursor 3001 has crossed the guideline 3002 by a smaller amount. As described above, the position of the cursor 3001 can be determined using camera-based input.
Figures 32 and 33 show exemplary guidelines on which a subset of items has been highlighted. In particular, Figure 32 shows an exemplary guideline 3201 on which a subset 3202 of items 3204 is displayed in a magnified manner. Selecting the subset 3202 of the items 3204 can include selecting a predetermined number of items, or dynamically selecting the number of items to include in the subset 3202 based on item size.
The number of items included in the subset 3202 can be selected dynamically so that the items of the subset 3202 span the entire length 3207 of the guideline 3201, or so that they span only a portion of the guideline 3201. As shown in Figure 32, items 3205a and 3205b that are not included in the subset 3202 can also be displayed along the guideline 3201. The number of items in the subset 3202 can also vary based on the average size of the items in the subset; for example, the subset 3007 (in Figure 30) displays five items while the subset 3007 (in Figure 31) displays three items, even though the width of the subset 3007 remains the same.
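One possible way to choose the number of items dynamically is to divide the length of the guideline (or of the portion allotted to the subset) by the width an enlarged item occupies; this short sketch is illustrative only, and the names and capping policy are assumptions.

```python
def subset_count(guideline_length, highlighted_item_width, max_items):
    """How many enlarged items fit along the guideline, capped at max_items."""
    return max(1, min(max_items, int(guideline_length // highlighted_item_width)))

print(subset_count(guideline_length=300.0, highlighted_item_width=55.0, max_items=7))  # 5
```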
Highlighting items by displaying the items of the subset at a larger size can include displaying all of the highlighted items at the same enlarged size, as shown in Figure 32, or displaying each item in the subset at a size that depends on the item's position along the guideline relative to the cursor position, as shown in Figure 25.
Figure 33 shows an exemplary guideline 3301 on which a subset 3302 of items 3304 is displayed with varying item sizes. For example, the items 3305p and 3305t (representing the letters "P" and "T", respectively) at the ends of the subset 3302 can be smaller than the item or items at the center of the subset 3302, for example smaller than the item 3305r (representing the letter "R"). Displaying the items of the subset 3302 at varying sizes can produce a pleasing aesthetic appearance and can make use of the enhanced control more intuitive. As shown in Figure 33, items 3305a and 3305b that are not included in the subset 3302 can also be displayed along the guideline 3301.
Displaying the items of the subsets 3202 and 3302 at a larger size can include animating the items. Animating the items can include magnifying the items of the subset and translating them along the guideline over a short period of time (for example, while keeping the items perpendicular to the guideline). Items not included in the subsets 3202 and 3302 can be animated to shrink and to move outward along the guideline, thereby "making room" for the subsets 3202 and 3302.
Items "pushed off" the end of the guideline can simply disappear, or can be animated to fall off the edge of the guideline or be eliminated in a visually stimulating or humorous manner, for example by burning, imploding, vaporizing, exploding, liquefying, being crushed, or another technique. Similarly, previously "pushed-off" items that reappear because room has become available on the guideline can simply reappear, or can be animated to fall back onto the guideline from the top of the user interface, or to spontaneously regenerate in a visually stimulating or humorous manner.
Where the guideline includes a discontinuity, items can be animated to move across the discontinuity. Items can be animated to cross the gap at high speed, or can be "pushed off" and "reappear" using any of the visual effects described above. Similarly, an item "pushed off" one end of the guideline that reappears at the opposite end can be animated to move between the guideline endpoints at high speed, or can be "pushed off" and "reappear" using any of the visual effects described above.
Figure 34 illustrates activation of an item along the guideline, where "activating" or "highlighting" generally refers to identifying an item as the candidate for selection. The system that identifies the item for selection can use hysteresis. Selection can include determining an initially selected item when the cursor first crosses the guideline, where the initially selected item can be the item whose position is closest to the base position. Thereafter, to reduce unintentional flicker between items when the cursor lies between two adjacent items, selection can include determining a newly selected item whenever the cursor moves more than a predefined distance beyond the midpoint between two adjacent items.
For example, the positions of two items 3402r and 3402s along the guideline 3401 are indicated by lines 3404r and 3404s, the midpoint between the items 3402r and 3402s is indicated by line 3405, and a predefined distance 3406 is defined relative to that midpoint. For example, if the item 3402s (the letter "S") is the initially selected item, the user moves the cursor to the left of the line 3407, exceeding the predefined distance 3406 beyond the midpoint 3405 in the direction of the item 3402r (the letter "R"), in order to select the item 3402r. Conversely, if the item 3402r has been selected, then to reselect the item 3402s the user moves the cursor to the right of the line 3409, exceeding the predefined distance 3406 beyond the midpoint 3405 in the direction of the item 3402s.
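The hysteresis rule just described can be sketched in Python as follows; the function and parameter names are illustrative and are not taken from the patent.

```python
def update_selection(cursor_x, selected_idx, item_positions, hysteresis):
    """Switch the selection to a neighbouring item only when the cursor moves
    past the midpoint between the two items by more than the hysteresis
    distance. item_positions is assumed to be sorted along the guideline."""
    idx = selected_idx
    # Step left while the cursor is beyond the left midpoint minus the margin.
    while idx > 0 and cursor_x < (item_positions[idx - 1] + item_positions[idx]) / 2 - hysteresis:
        idx -= 1
    # Step right while the cursor is beyond the right midpoint plus the margin.
    while idx < len(item_positions) - 1 and cursor_x > (item_positions[idx] + item_positions[idx + 1]) / 2 + hysteresis:
        idx += 1
    return idx

positions = [10.0, 30.0, 50.0]       # items "R", "S", "T"
print(update_selection(19.0, 1, positions, hysteresis=3.0))  # 1: still "S" (within margin)
print(update_selection(16.0, 1, positions, hysteresis=3.0))  # 0: switched to "R"
```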
To account for unintentional user motion caused by body tremor or by limitations of the input device, the predefined distance can be defined based on the maximum distance the cursor can be expected to waver or shake while the user attempts to hold it still. Selection can be cancelled when the cursor position is below the guideline, or when the cursor is not found or is outside the particular user interface or outside the region of the interface occupied by the enhanced control. Selecting an item can also generate user feedback, including, for example, a sound, an image, and/or a haptic output such as a vibration.
Displaying the items of the subset can include displaying the items so that their appearance gives the user feedback about their selection state. For example, a selected item can be displayed in a unique color, or with a visual effect such as an enlarged appearance. An item can be activated or highlighted when it remains in a selected state for longer than a first predetermined duration threshold; in this regard, an item is activated when the cursor is held over it for a period of time. Activation can be repeated or cancelled if the selected item remains selected for longer than a second predetermined duration threshold.
In addition, an item can be activated or highlighted when the cursor position remains fixed for a period of time. The cursor can be classified as fixed when the component of its position parallel to the guideline changes by less than a predetermined distance threshold for longer than a predetermined time threshold; the item closest to the cursor position is then identified and activated. Activation can be repeated or cancelled if the cursor remains classified as fixed for longer than a second predetermined duration threshold. Additionally, an item can be activated based on the distance between the cursor position and the guideline; for example, an item can be activated when that distance exceeds a predetermined distance threshold.
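A dwell test of the kind just described, which classifies the cursor as fixed when its movement parallel to the guideline stays under a distance threshold for longer than a time threshold, might look like the following sketch; the class name and the default thresholds are assumptions.

```python
import time

class DwellActivator:
    """Report activation once the cursor has stayed within distance_threshold
    of an anchor position for at least time_threshold seconds."""

    def __init__(self, distance_threshold=5.0, time_threshold=0.8):
        self.distance_threshold = distance_threshold
        self.time_threshold = time_threshold
        self._anchor_x = None
        self._anchor_t = None

    def update(self, cursor_x, now=None):
        now = time.monotonic() if now is None else now
        if self._anchor_x is None or abs(cursor_x - self._anchor_x) > self.distance_threshold:
            # Cursor moved too far: restart the dwell timer at the new position.
            self._anchor_x, self._anchor_t = cursor_x, now
            return False
        return (now - self._anchor_t) >= self.time_threshold
```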
In other examples, an item can be activated after selection in response to another type of user input. For example, the user can provide a different form of user input to activate the selected item. In these examples, to activate a selected item the user may press a user input button (for example, on a controller), provide an audible input (for example, saying "activate"), perform another type of gesture (for example, moving the hand used to select the item toward the display, or moving the user's other hand over another part of the display image), or provide any other type of user input.
Figures 35 to 38 show exemplary item sets. In Figure 35, an item set 3501 containing characters from the English alphabet includes an item 3502 that, when activated or selected, opens a second item set. When it is selected, the second item set can appear along the guideline 3504 together with the items 3501 or with a portion of the items 3501, or the second item set can replace the items 3501 on the guideline 3504. From the symbol used to represent the item 3502, the user can intuitively determine that the second item set contains numbers.
Figure 36 shows items 3601 aligned with a guideline 3602. The items 3601 are displayed when the item 3502 of Figure 35 is selected. Once the second item set has been selected, moving the cursor below the guideline 3602, or removing the cursor from the user interface in which the items 3601 are displayed, can cause the items 3501 to be reselected or reactivated. The items 3601 include an item 3604 that, when activated, reopens, reactivates, or reselects the items 3501. From the symbol used to represent the item 3604, the user can intuitively recognize that the items 3501 contain characters from the English alphabet.
Figure 37 shows items 3701 aligned with a guideline, where each item represents the combination of numbers and letters associated with a key of a standard telephone keypad. Figure 38 shows items 3801, displayed when the item 3702 of the items 3701 (see Figure 37) is selected, which include the characters associated with the number key "7" of a standard telephone keypad together with the number "7" itself.
Using the items shown in Figures 35 to 38, a camera-based text entry mechanism can be provided, for example one that fills in a text field by sequentially entering the letters that make up a word. Once a number, letter, text character, or predefined text is activated, it can be appended to a character string that has already been entered. Alternatively, activating an item (for example, an item from the items 3501 of Figure 35) can display further items, and activating one of those further items can append the activated character to the string. One of the items can be a backspace item, allowing the most recent item to be removed from the string.
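A minimal sketch of maintaining the entered string, with a backspace item that removes the most recent character, might look as follows; the item labels are assumptions.

```python
def apply_item(text, item):
    """Append an activated character to the entry string, or handle backspace."""
    return text[:-1] if item == "BACKSPACE" else text + item

text = ""
for activated in ["C", "A", "S", "BACKSPACE", "T"]:
    text = apply_item(text, activated)
print(text)  # "CAT"
```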
A camera-based text entry mechanism can include combining characters to form composite characters. A text entry mechanism for Japanese text input can include combining kana characters to form kanji characters (in a manner familiar to users of Japanese personal computer keyboards). A first item set can include items representing kana. When a kana item is activated, the corresponding kana character is appended to the character string. The item set can include an item representing an operation that, when activated, starts a process that converts the most recent kana characters of the string into kanji. The process of converting the most recent kana characters of the string into kanji can include displaying a second set of candidate kanji items. Activating a kanji item starts a process in which the most recent kana characters of the string are replaced by the activated kanji. This camera-based text entry method can be extended to other languages.
An alternative method for Japanese text input can include displaying a first item set representing kana. When a kana item is activated, a second item set is displayed. The second item set can include the kanji of which the activated kana forms a part. This set may be larger than the second set described above. This text entry method can likewise be extended to other languages.
A text entry mechanism can include a confirmation item within the item set. When activated, the confirmation item starts a process that provides the character string to an application. In addition, the text entry mechanism can include predictive text completion. Predictive text completion can search a dictionary to find the most likely text that contains the characters of the string, and that most likely text can be displayed in an output field. The dictionary used for predictive text completion can be chosen based on the context in which the text entry mechanism is used; for example, when the text entry mechanism is used to enter names, the dictionary can contain names.
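Predictive completion against a context-specific dictionary can be sketched as a simple prefix lookup; the frequency-ordered word list and the name-field example below are assumptions.

```python
def predict_completions(entered, dictionary, limit=3):
    """Return the most likely dictionary entries that start with the characters
    entered so far (the dictionary is assumed to be ordered by likelihood)."""
    return [word for word in dictionary if word.startswith(entered)][:limit]

# When the control fills a name field, the dictionary can itself contain names.
name_dictionary = ["ANNA", "ANNE", "ANNABEL", "BEN", "BENJAMIN"]
print(predict_completions("ANN", name_dictionary))  # ['ANNA', 'ANNE', 'ANNABEL']
```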
An application process can determine the items used for the user interface according to the state of the application. Activating an item can provide a message to the application process, and the application process can be controlled based on this message.
Figure 39 shows an exemplary user interface 3900 that includes camera-based input and several exemplary enhanced controls. Because the user can access various text entry fields in the user interface 3900, a text entry field acts as the target object for defining a guideline, and items appropriate to the text entry field are displayed in alignment with the guideline, obscuring the text entry field itself. The user can select items aligned with the guideline to fill in the associated text entry field. Although the user interface 3900 appears to show several visible enhanced controls at once, this simultaneous display is merely exemplary and serves to illustrate possible control positions, configurations, alignments, and item types; in other user interfaces, for example, a single enhanced control would be displayed at a time.
For example, if the user accesses the text entry field 3901, for example by tabbing to the field, by selecting it with a mouse cursor, or by performing an appropriate gesture, a guideline 3902 is defined relative to the text entry field 3901, and items 3904 are displayed in alignment with the guideline 3902, obscuring the text entry field 3901. Because the text entry field 3901 accepts text or character data, the enhanced control automatically determines that alphabetic character items 3904 are appropriate for the target type. When the user selects an item from the items 3904, the text entry field 3901 is filled in with the selected item. Instead of selecting items from the guideline 3902, the user can also use the enhanced control 3905, defined around the avatar 3906, to fill in the various fields with items.
As the user tabs to or gestures at other fields in the user interface 3900, further enhanced controls can be defined dynamically to output items for selection. For example, accessing the social security number field 3907 can cause numeric items 3909 to be displayed above the field 3907; accessing the gender field 3910 can cause gender items 3911 to be output dynamically above the field 3910; accessing the nationality field 3912 can cause nationality items 3913 to be displayed on two guidelines 3914 and 3915, located above and below the field 3912, respectively; accessing the marital status field 3917 can cause marital status indicator items 3919 to be displayed on a guideline to the right of the field 3917, where the guideline can be dynamically defined on the right side because of space constraints, user preference, or other reasons; accessing the street address field 3920 can cause items 3921 to be displayed on a guideline 3924 above the field 3920, where the items 3921 include numbers and an alphabetic character item 3922 for replacing the numbers along the guideline 3924 with alphabetic characters; and accessing the state field 3925 can cause items 3926 containing state names to be displayed on two guidelines 3927 and 3929 defined above the field 3925.
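A sketch of how an enhanced control might choose an item set from the type of the field being filled is given below; the field-type keys and the item sets are illustrative assumptions rather than details from the patent.

```python
# Map the type of the accessed field to the items output along its guideline.
ITEM_SETS = {
    "text": list("ABCDEFGHIJKLMNOPQRSTUVWXYZ"),
    "number": list("0123456789"),
    "gender": ["MALE", "FEMALE"],
    "state": ["ALABAMA", "ALASKA", "ARIZONA"],  # truncated for brevity
}

def items_for_field(field_type):
    """Return the item set appropriate to the target field type."""
    return ITEM_SETS.get(field_type, ITEM_SETS["text"])

print(items_for_field("number"))  # ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9']
```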
Although the enhanced camera-based input has been described above using particular types of controls, other types of controls can also be used. For example, as shown in Figure 40, camera-based input can be used to interact with a control 4001 that includes a guideline portion 4002 (as described above) and a virtual button portion 4004. The control 4005 includes a number of buttons 4006a to 4006h aligned on either side of an avatar 4007. The control 4008 is a gesture wheel control in which selecting an interactive element (for example, the interactive element 4009) with a user representation (in this case a hand 4010) causes the interactive element to further display the characters or other items it represents.
In addition, the control 4012 is a virtual keyboard that includes virtual keys 4014 arranged on the user interface in a QWERTY keyboard layout. The control 4012 need not require a representation of the user to appear on the user interface; for example, instead of showing a hand or an avatar as the user representation, each of the virtual keys 4014 can light up when the control object in the user's space occupies a position within the detection region corresponding to that virtual key. The control 4015 is a conventional windowed application desktop, including icons 4016a to 4016c and a functional element 4017 with which a representation 4019 interacts.
The control 4020 includes a representation 4021 of the user, such as a camera image of the user, which can be surrounded by interactive elements 4022a to 4022c representing scroll-left, scroll-right, and invoke functions, respectively. Using camera-based input, the user's representation 4021 causes the functions associated with the interactive elements 4022a to 4022c to be invoked. In this example, the scroll and invoke functions allow the user to select an application 4024 from an application bar displayed at the bottom of the user interface. In the example shown, the representation 4021 has been used to select and invoke a photo browsing application 4024d.
Figure 41 shows an example of the external appearance of a computing device 4101 that further includes a processor and a user interface. The processor is configured, adapted, or operable to define, within an image of a user in a scene, a detection region surrounding the user, and to detect the position of an object within the detection region. The processor is also configured to interact with a control in the user interface based on the detected position of the object.
In more detail, the hardware environment of the computing device 4101 includes: a monitor 4108 for displaying text and images to interact with the user, a keyboard 4109 for entering text data and user commands into the computing device 4101, a mouse 4110 for pointing to, selecting, and manipulating objects displayed on the monitor 4108, a fixed disk drive 4111, a removable disk drive 4112, a tape drive 4114, a hardcopy output device, a computer network connection, and a digital input device 4117.
The monitor 4108 displays the graphics, images, and text that make up the user interface for the software applications used by the computing device 4101, as well as for the operating system programs necessary to operate the computing device 4101. The user uses the keyboard 4109 to enter commands and data to operate and control the computer operating system programs and the application programs. The mouse 4110 can be any type of pointing device, including a joystick, a trackball, a touch pad, or another pointing device. Software used to display the user interface and enable the user to enter or select text, numbers, or selections from a menu of options is stored locally on a computer-readable storage medium such as the fixed disk drive 4111.
In other implementations, the fixed disk drive 4111 itself can comprise a number of physical drive units, such as a redundant array of independent disks ("RAID"), or can be a disk drive farm or a disk array physically located in a separate computing unit. Such computer-readable storage media allow the computing device 4101 to access computer-executable process steps, application programs, and the like, stored on removable and non-removable storage media.
The computer network connection can be a modem connection, a local area network ("LAN") connection including Ethernet, or a broadband wide area network ("WAN") connection such as a digital subscriber line ("DSL"), a cable high-speed Internet connection, a dial-up connection, a T-1 line, a T-3 line, a fiber optic connection, or a satellite connection. The network 4106 can be a LAN network, a corporate or government WAN network, the Internet, or another network.
The computer network connection can be a wired or wireless connector. Examples of wireless connectors include, for example, an INFRARED DATA ASSOCIATION ("IrDA") wireless connector, an optical wireless connector, an INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS ("IEEE") Standard 802.11 wireless connector, a BLUETOOTH wireless connector, an orthogonal frequency division multiplexing ("OFDM") ultra wide band ("UWB") wireless connector, a time-modulated ultra wide band ("TM-UWB") wireless connector, or another wireless connector. Examples of wired connectors include, for example, an IEEE-1394 FIREWIRE connector, a Universal Serial Bus ("USB") connector, a serial port connector, a parallel port connector, or another wired connector.
The removable disk drive 4112 is a removable storage device used to download data from the computing device 4101 or to upload data onto the computing device 4101. The removable disk drive 4112 can be a floppy disk drive, a compact disc read-only memory ("CD-ROM") drive, a CD-recordable ("CD-R") drive, a CD-rewritable ("CD-RW") drive, flash memory, a USB flash drive, a thumb drive, a pen drive, a key drive, a high-density digital versatile disc ("HD-DVD") optical disc drive, a Blu-ray optical disc drive, a holographic digital data storage ("HDDS") optical disc drive, or any of various recordable or rewritable digital versatile disc ("DVD") drives, such as DVD-recordable ("DVD-R" or "DVD+R"), DVD-rewritable ("DVD-RW" or "DVD+RW"), or DVD-RAM. Operating system programs, applications, and various data files are stored on disks, which are stored on the fixed disk drive 4111 or on removable media for the removable disk drive 4112.
The tape drive 4114 is a tape storage device used to download data from the computing device 4101 or to upload data onto the computing device 4101. The tape drive 4114 can be a quarter-inch cartridge ("QIC") drive, a 4 mm digital audio tape ("DAT") drive, an 8 mm digital linear tape ("DLT") drive, or another type of tape.
Furthermore, although the computing device 4101 is described above as a desktop PC, in other implementations the computing device 4101 can be a laptop computer, a workstation, a midrange computer, a mainframe, an embedded system, a telephone, a handheld or tablet computer, a PDA, a gaming device or console, a digital picture frame, a teleconferencing device, or another type of computer.
Figure 42 is a block diagram showing the internal architecture of the computer shown in Figure 41. An exemplary internal architecture of the computing device 4101 is now described. The computing environment includes: a computer central processing unit ("CPU") 4201, where the computer instructions that make up the operating system or applications are processed; a display interface 4202 that provides a communication interface and processing functions for rendering graphics, images, and text on the monitor 4108; a keyboard interface 4204 that provides a communication interface to the keyboard 4109; a pointing device interface 4205 that provides a communication interface to the mouse 4110 or an equivalent pointing device; a digital input interface 4206 that provides a communication interface to the digital input device 4117; a hardcopy output device interface that provides a communication interface to the hardcopy output device; a random access memory ("RAM") 4210, where computer instructions and data are stored in a volatile memory device for processing by the computer CPU 4201; a read-only memory ("ROM") 4211, where invariant low-level system code or data for basic system functions, such as basic input and output ("I/O"), startup, or reception of keystrokes from the keyboard 4109, are stored in a non-volatile memory device; a storage 4220 or other suitable type of memory (for example, random access memory ("RAM"), read-only memory ("ROM"), programmable read-only memory ("PROM"), erasable programmable read-only memory ("EPROM"), electrically erasable programmable read-only memory ("EEPROM"), magnetic disks, optical disks, floppy disks, hard disks, removable tape, flash drives), where the files that make up the operating system 4221, the application programs 4222 (including the enhanced camera-based input application 4223 and other applications 4224 as necessary), and the data files 4225 are stored; and a computer network interface that provides a communication interface to the network over the computer network connection. The constituent devices and the computer CPU 4201 communicate with one another over the computer bus 4227.
According to another general implementation, a computer-readable medium is encoded with a computer program product. The computer program product includes instructions that, when executed, operate to cause a computer to define, in an image of a user within a scene, a detection region surrounding the user, and to detect a position of an object within the detection region. The computer program product also includes instructions that, when read by a machine, operate to cause a data processing apparatus to interact with a control in a user interface based on the detected position of the object.
The RAM 4210 interfaces with the computer bus 4227 so as to provide quick RAM storage to the computer CPU 4201 during the execution of software programs such as operating system application programs and device drivers. More specifically, the computer CPU 4201 loads computer-executable process steps from the fixed disk drive 4111 or other storage media into a field of the RAM 4210 in order to execute software programs. Data is stored in the RAM 4210, where it is accessed by the computer CPU 4201 during execution.
The computing device 4101 stores computer-executable code for an operating system 4221 and application programs 4222 such as word processing, spreadsheet, presentation, gaming, or other applications. Although it is possible to provide the camera-based input using the implementations described above, it is also possible to implement the functions according to the present disclosure as a dynamic link library ("DLL") or as a plug-in to other application programs, such as an Internet web browser like the MICROSOFT Internet Explorer web browser.
The computer CPU 4201 is one of a number of high-performance computer processors, including reduced instruction set computer ("RISC") processors, ARM-architecture processors, HP processors, or a proprietary computer processor for a mainframe. In another arrangement, the computer CPU 4201 is more than one processing unit, including the multiple-CPU configurations found in high-performance workstations and servers, or the multiple scalable processing units found in mainframes.
The operating system 4221 can be MICROSOFT WINDOWS XP Workstation; WINDOWS XP Server; one of a variety of UNIX-flavored operating systems for workstations and servers; HP UX WORKLOAD MANAGER for HP workstations and servers; VAX/VMS for Digital Equipment Corporation computers; OPENVMS for HP-based computers; MAC OS X for workstations and servers; SYMBIAN OS, WINDOWS MOBILE or WINDOWS CE, or NOKIA OS ("NOS") for mobile devices; or a proprietary operating system for computers or embedded systems. The application development platform or framework for the operating system 4221 can be BINARY RUNTIME ENVIRONMENT FOR WIRELESS ("BREW"); Java Platform, Micro Edition ("Java ME") or Java 2 Platform, Micro Edition ("J2ME"); PYTHON; or .NET Compact.
Although Figures 41 and 42 show one possible implementation of a computing device that executes program code, or program or process steps, and that is configured to provide an enhanced control enabling a user to intuitively and easily enter text or numbers or to select from a large number of items, other types of computers or implementations can also be used.
A number of implementations have been described. Nevertheless, it will be understood that various modifications can be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.

Claims (28)

1. A computer apparatus, comprising:
a module for generating an image of a user within a scene;
a module for defining, in the image, a detection region surrounding the user, further comprising:
a module for determining a position of a torso of the user and a reach of an arm of the user;
a module for defining the detection region so as to exclude the torso and to exclude at least a portion of the image region that the arm cannot reach;
a module for determining a portion of the detection region in which a second user can be detected; and
a module for defining the detection region so as to exclude the portion in which the second user can be detected;
a module for detecting a position of a hand of the user within the detection region; and
a module for interacting, based on the detected position of the hand, with a control in a user interface, the control comprising items aligned with a guideline, wherein the guideline is defined relative to an avatar representation of the user on the user interface.
2. A computer-implemented method, comprising:
defining, in an image of a user within a scene, a detection region surrounding the user, the defining comprising:
determining an unreachable region of the image that an object associated with the user cannot reach;
defining the detection region so as to exclude at least a portion of the unreachable region;
determining a portion of the detection region in which a second user can be detected; and
defining the detection region so as to exclude the portion in which the second user can be detected;
detecting a position of the object within the detection region; and
interacting with a control in a user interface based on the detected position of the object.
3. The method according to claim 2, further comprising detecting an engagement gesture of the user, wherein the position of the object is detected based on detecting the engagement gesture.
4. The method according to claim 2, wherein the object is a hand of the user.
5. The method according to claim 2, wherein interacting with the control further comprises selecting a character.
6. The method according to claim 2, wherein the user interface further comprises a representation of the user.
7. The method according to claim 6, further comprising displaying a segmented image as the representation of the user.
8. The method according to claim 6, wherein the control comprises items aligned with a guideline defined relative to the representation, and wherein the items are displayed without obscuring the representation.
9. The method according to claim 6, wherein the representation of the user further comprises an avatar or a cursor.
10. The method according to claim 9, further comprising animating the avatar to mimic a motion of the user based on the detected position of the object.
11. The method according to claim 6, further comprising displaying the control above or beside the representation within the user interface.
12. The method according to claim 2, wherein defining the detection region further comprises: determining a position of a torso of the user; and
defining the detection region so as to exclude the torso, based on the determined position of the torso.
13. The method according to claim 12, wherein the position of the torso of the user is determined using image segmentation.
14. The method according to claim 12, wherein defining the detection region comprises: determining, based on the determined position of the torso, an unreachable region of the image that the object cannot reach; and
defining the detection region so as to exclude the determined unreachable region of the image.
15. The method according to claim 2, wherein the object comprises at least a portion of an arm of the user, and wherein defining the detection region further comprises:
determining a reach of the arm of the user; and
defining the detection region, based on the determined reach of the arm, so as to exclude at least a portion of the image region that the arm cannot reach.
16. The method according to claim 15, wherein the reach of the arm of the user is based on a determined position of the user's torso, head, or shoulder.
17. The method according to claim 2, wherein defining the detection region further comprises: determining a position of a head of the user; and
defining the detection region so as to exclude the head, based on the determined position of the head.
18. The method according to claim 17, wherein the position of the head of the user is determined using face detection.
19. The method according to claim 2, further comprising:
cropping the image; and
displaying the image in the user interface.
20. The method according to claim 19, further comprising centering the image on the user, wherein the image is cropped based on the centering.
21. The method according to claim 20, further comprising determining a magnification factor that enables the user to reach the control within the centered image, wherein the image is cropped based on the magnification factor.
22. The method according to claim 21, wherein the magnification factor is determined using an anatomical model.
23. The method according to claim 2, further comprising:
detecting a face of the user;
determining an identity of the user based on the detected face; and
adjusting the control based on the identity of the user.
24. The method according to claim 2, further comprising:
detecting a position of a second object within the detection region; and
adjusting the control based on the position of the second object.
25. The method according to claim 2, further comprising capturing the image using a camera.
26. The method according to claim 2, wherein the detection region is shaped as an arc-shaped detection subregion above the user, the arc-shaped detection subregion being connected to two linear detection subregion segments, one on each side of the user.
27. The method according to claim 2, wherein detecting the position of the object within the detection region further comprises:
detecting the position, relative to a guideline, of the object mapped into the detection region.
28. A computer apparatus, comprising:
a module for defining, in an image of a user within a scene, a detection region surrounding the user, the module for defining comprising:
a module for determining an unreachable region of the image that an object associated with the user cannot reach;
a module for defining the detection region so as to exclude at least a portion of the unreachable region;
a module for determining a portion of the detection region in which a second user can be detected; and
a module for defining the detection region so as to exclude the portion in which the second user can be detected;
a module for detecting a position of the object within the detection region; and
a module for interacting with a control in a user interface based on the detected position of the object.
CN200880109208XA 2007-07-27 2008-07-25 Enhanced camera-based input Active CN101810003B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810420206.8A CN108399010B (en) 2007-07-27 2008-07-25 Enhanced camera-based input

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US95244807P 2007-07-27 2007-07-27
US60/952,448 2007-07-27
US12/102,587 2008-04-14
US12/102,587 US8726194B2 (en) 2007-07-27 2008-04-14 Item selection using enhanced control
US12/124,375 2008-05-21
US12/124,375 US8659548B2 (en) 2007-07-27 2008-05-21 Enhanced camera-based input
PCT/US2008/071224 WO2009018161A1 (en) 2007-07-27 2008-07-25 Enhanced camera-based input

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN201810420206.8A Division CN108399010B (en) 2007-07-27 2008-07-25 Enhanced camera-based input
CN2013100725016A Division CN103218041A (en) 2007-07-27 2008-07-25 Enhanced camera-based input

Publications (2)

Publication Number Publication Date
CN101810003A CN101810003A (en) 2010-08-18
CN101810003B true CN101810003B (en) 2013-04-10

Family

ID=40294873

Family Applications (3)

Application Number Title Priority Date Filing Date
CN2013100725016A Pending CN103218041A (en) 2007-07-27 2008-07-25 Enhanced camera-based input
CN200880109208XA Active CN101810003B (en) 2007-07-27 2008-07-25 Enhanced camera-based input
CN201810420206.8A Active CN108399010B (en) 2007-07-27 2008-07-25 Enhanced camera-based input

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN2013100725016A Pending CN103218041A (en) 2007-07-27 2008-07-25 Enhanced camera-based input

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810420206.8A Active CN108399010B (en) 2007-07-27 2008-07-25 Enhanced camera-based input

Country Status (4)

Country Link
US (5) US8726194B2 (en)
JP (3) JP5575645B2 (en)
CN (3) CN103218041A (en)
WO (1) WO2009018161A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI474264B (en) * 2013-06-14 2015-02-21 Utechzone Co Ltd Warning method for driving vehicle and electronic apparatus for vehicle

Families Citing this family (733)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US7565008B2 (en) 2000-11-06 2009-07-21 Evryx Technologies, Inc. Data capture and identification system and process
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US7680324B2 (en) 2000-11-06 2010-03-16 Evryx Technologies, Inc. Use of image-derived information as search criteria for internet and other search engines
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US7899243B2 (en) 2000-11-06 2011-03-01 Evryx Technologies, Inc. Image capture and identification system and process
US8035612B2 (en) * 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US8300042B2 (en) * 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
US7159194B2 (en) * 2001-11-30 2007-01-02 Palm, Inc. Orientation dependent functionality of an electronic device
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
US20050122308A1 (en) * 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
AU2003301043A1 (en) * 2002-12-13 2004-07-09 Reactrix Systems Interactive directed light/sound system
CN102034197A (en) * 2003-10-24 2011-04-27 瑞克楚斯系统公司 Method and system for managing an interactive video display system
US7598942B2 (en) * 2005-02-08 2009-10-06 Oblong Industries, Inc. System and method for gesture based control system
WO2006106522A2 (en) 2005-04-07 2006-10-12 Visionsense Ltd. Method for reconstructing a three- dimensional surface of an object
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
US8370383B2 (en) 2006-02-08 2013-02-05 Oblong Industries, Inc. Multi-process interactive systems and methods
US9075441B2 (en) * 2006-02-08 2015-07-07 Oblong Industries, Inc. Gesture based control using three-dimensional information extracted over an extended depth of field
US8531396B2 (en) 2006-02-08 2013-09-10 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
US9910497B2 (en) * 2006-02-08 2018-03-06 Oblong Industries, Inc. Gestural control of autonomous and semi-autonomous systems
US8537111B2 (en) 2006-02-08 2013-09-17 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
US9823747B2 (en) 2006-02-08 2017-11-21 Oblong Industries, Inc. Spatial, multi-modal control device for use with spatial operating system
US8537112B2 (en) * 2006-02-08 2013-09-17 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
JP3920904B1 (en) * 2006-02-23 2007-05-30 株式会社コナミデジタルエンタテインメント Communication game system, communication game control method, and program
US20080252596A1 (en) * 2007-04-10 2008-10-16 Matthew Bell Display Using a Three-Dimensional vision System
WO2008134452A2 (en) * 2007-04-24 2008-11-06 Oblong Industries, Inc. Proteins, pools, and slawx in processing environments
US8726194B2 (en) 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
JP5430572B2 (en) * 2007-09-14 2014-03-05 インテレクチュアル ベンチャーズ ホールディング 67 エルエルシー Gesture-based user interaction processing
JP4636064B2 (en) 2007-09-18 2011-02-23 ソニー株式会社 Image processing apparatus, image processing method, and program
JP4569613B2 (en) * 2007-09-19 2010-10-27 ソニー株式会社 Image processing apparatus, image processing method, and program
JP4938617B2 (en) * 2007-10-18 2012-05-23 幸輝郎 村井 Object operating device and method for specifying marker from digital image frame data
US8159682B2 (en) * 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US20120204133A1 (en) * 2009-01-13 2012-08-09 Primesense Ltd. Gesture-Based User Interface
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US20090193361A1 (en) * 2008-01-30 2009-07-30 Research In Motion Limited Electronic device and method of controlling same
US20100039500A1 (en) * 2008-02-15 2010-02-18 Matthew Bell Self-Contained 3D Vision System Utilizing Stereo Camera and Patterned Illuminator
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
WO2009128064A2 (en) * 2008-04-14 2009-10-22 Pointgrab Ltd. Vision based pointing device emulation
KR20090110242A (en) * 2008-04-17 2009-10-21 삼성전자주식회사 Method and apparatus for processing audio signal
KR101599875B1 (en) * 2008-04-17 2016-03-14 삼성전자주식회사 Method and apparatus for multimedia encoding based on attribute of multimedia content, method and apparatus for multimedia decoding based on attributes of multimedia content
KR20090110244A (en) * 2008-04-17 2009-10-21 삼성전자주식회사 Method for encoding/decoding audio signals using audio semantic information and apparatus thereof
US9952673B2 (en) 2009-04-02 2018-04-24 Oblong Industries, Inc. Operating environment comprising multiple client devices, multiple displays, multiple users, and gestural control
US10642364B2 (en) 2009-04-02 2020-05-05 Oblong Industries, Inc. Processing tracking and recognition data in gestural recognition systems
US8723795B2 (en) 2008-04-24 2014-05-13 Oblong Industries, Inc. Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes
US9684380B2 (en) 2009-04-02 2017-06-20 Oblong Industries, Inc. Operating environment with gestural control and multiple client devices, displays, and users
US9495013B2 (en) 2008-04-24 2016-11-15 Oblong Industries, Inc. Multi-modal gestural interface
US9740922B2 (en) 2008-04-24 2017-08-22 Oblong Industries, Inc. Adaptive tracking system for spatial input devices
US9740293B2 (en) 2009-04-02 2017-08-22 Oblong Industries, Inc. Operating environment with gestural control and multiple client devices, displays, and users
US8405727B2 (en) * 2008-05-01 2013-03-26 Apple Inc. Apparatus and method for calibrating image capture devices
US20090305785A1 (en) * 2008-06-06 2009-12-10 Microsoft Corporation Gesture controlled game screen navigation
US8595218B2 (en) * 2008-06-12 2013-11-26 Intellectual Ventures Holding 67 Llc Interactive display management systems and methods
US20090312100A1 (en) * 2008-06-12 2009-12-17 Harris Scott C Face Simulation in Networking
KR101652535B1 (en) * 2008-06-18 2016-08-30 오블롱 인더스트리즈, 인크 Gesture-based control system for vehicle interfaces
MY168264A (en) * 2008-07-15 2018-10-16 Dtt holdings pty ltd Method for preparing a patent for a medical treatment
US9324173B2 (en) * 2008-07-17 2016-04-26 International Business Machines Corporation System and method for enabling multiple-state avatars
WO2010011923A1 (en) * 2008-07-24 2010-01-28 Gesturetek, Inc. Enhanced detection of circular engagement gesture
WO2010011929A1 (en) 2008-07-25 2010-01-28 Gesturetek, Inc. Enhanced detection of waving engagement gesture
US8957914B2 (en) * 2008-07-25 2015-02-17 International Business Machines Corporation Method for extending a virtual environment through registration
US10166470B2 (en) * 2008-08-01 2019-01-01 International Business Machines Corporation Method for providing a virtual world layer
US8508671B2 (en) 2008-09-08 2013-08-13 Apple Inc. Projection systems and methods
US8538084B2 (en) * 2008-09-08 2013-09-17 Apple Inc. Method and apparatus for depth sensing keystoning
JP4752887B2 (en) * 2008-09-12 2011-08-17 ソニー株式会社 Information processing apparatus, information processing method, and computer program
US8704832B2 (en) 2008-09-20 2014-04-22 Mixamo, Inc. Interactive design, synthesis and delivery of 3D character motion data through the web
US8610726B2 (en) * 2008-09-26 2013-12-17 Apple Inc. Computer systems and methods with projected display
US20100079426A1 (en) * 2008-09-26 2010-04-01 Apple Inc. Spatial ambient light profiling
US8527908B2 (en) * 2008-09-26 2013-09-03 Apple Inc. Computer user interface system and methods
US7881603B2 (en) * 2008-09-26 2011-02-01 Apple Inc. Dichroic aperture for electronic imaging device
US20100079653A1 (en) * 2008-09-26 2010-04-01 Apple Inc. Portable computing system with a secondary image output
US8869197B2 (en) 2008-10-01 2014-10-21 At&T Intellectual Property I, Lp Presentation of an avatar in a media communication system
JP4793422B2 (en) * 2008-10-10 2011-10-12 ソニー株式会社 Information processing apparatus, information processing method, information processing system, and information processing program
US8749556B2 (en) * 2008-10-14 2014-06-10 Mixamo, Inc. Data compression for real-time streaming of deformable 3D models for 3D animation
US8863212B2 (en) * 2008-10-16 2014-10-14 At&T Intellectual Property I, Lp Presentation of an adaptive avatar
US9586135B1 (en) 2008-11-12 2017-03-07 David G. Capper Video motion capture for wireless gaming
US9383814B1 (en) 2008-11-12 2016-07-05 David G. Capper Plug and play wireless video game
US10086262B1 (en) 2008-11-12 2018-10-02 David G. Capper Video motion capture for wireless gaming
KR101185589B1 (en) * 2008-11-14 2012-09-24 (주)마이크로인피니티 Method and Device for inputing user's commands based on motion sensing
US8954894B2 (en) * 2008-11-15 2015-02-10 Adobe Systems Incorporated Gesture-initiated symbol entry
US8788977B2 (en) * 2008-11-20 2014-07-22 Amazon Technologies, Inc. Movement recognition as input mechanism
EP2368168B1 (en) * 2008-11-21 2018-07-11 London Health Sciences Centre Research Inc. Hands-free pointer system
US8659596B2 (en) * 2008-11-24 2014-02-25 Mixamo, Inc. Real time generation of animation-ready 3D character models
US8982122B2 (en) 2008-11-24 2015-03-17 Mixamo, Inc. Real time concurrent design of shape, texture, and motion for 3D character animation
US8232989B2 (en) * 2008-12-28 2012-07-31 Avaya Inc. Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment
US8588465B2 (en) 2009-01-30 2013-11-19 Microsoft Corporation Visual target tracking
US7996793B2 (en) 2009-01-30 2011-08-09 Microsoft Corporation Gesture recognizer system architecture
US8565476B2 (en) * 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US8682028B2 (en) 2009-01-30 2014-03-25 Microsoft Corporation Visual target tracking
US8577084B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US8565477B2 (en) * 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US20100199231A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Predictive determination
US8267781B2 (en) 2009-01-30 2012-09-18 Microsoft Corporation Visual target tracking
US8577085B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US9105014B2 (en) 2009-02-03 2015-08-11 International Business Machines Corporation Interactive avatar in messaging environment
US9195317B2 (en) * 2009-02-05 2015-11-24 Opentv, Inc. System and method for generating a user interface for text and item selection
US20100259547A1 (en) 2009-02-12 2010-10-14 Mixamo, Inc. Web platform for interactive design, synthesis and delivery of 3d character motion data
US9436276B2 (en) * 2009-02-25 2016-09-06 Microsoft Technology Licensing, Llc Second-person avatars
US20100235786A1 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
US9256282B2 (en) * 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US10824238B2 (en) 2009-04-02 2020-11-03 Oblong Industries, Inc. Operating environment with gestural control and multiple client devices, displays, and users
US9317128B2 (en) 2009-04-02 2016-04-19 Oblong Industries, Inc. Remote devices used in a markerless installation of a spatial operating environment incorporating gestural control
JP5256109B2 (en) * 2009-04-23 2013-08-07 株式会社日立製作所 Display device
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
WO2010129721A2 (en) * 2009-05-05 2010-11-11 Mixamo, Inc. Distributed markerless motion capture
US20100289912A1 (en) * 2009-05-14 2010-11-18 Sony Ericsson Mobile Communications Ab Camera arrangement with image modification
US9417700B2 (en) 2009-05-21 2016-08-16 Edge3 Technologies Gesture recognition systems and related methods
KR101597553B1 (en) * 2009-05-25 2016-02-25 엘지전자 주식회사 Function execution method and apparatus thereof
US20100306685A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation User movement feedback via on-screen avatars
US8693724B2 (en) 2009-05-29 2014-04-08 Microsoft Corporation Method and system implementing user-centric gesture control
US8487871B2 (en) 2009-06-01 2013-07-16 Microsoft Corporation Virtual desktop coordinate transformation
US8615713B2 (en) * 2009-06-26 2013-12-24 Xerox Corporation Managing document interactions in collaborative document environments of virtual worlds
JP5291560B2 (en) * 2009-07-27 2013-09-18 パナソニック株式会社 Operating device
US8428368B2 (en) * 2009-07-31 2013-04-23 Echostar Technologies L.L.C. Systems and methods for hand gesture control of an electronic device
JP5614014B2 (en) * 2009-09-04 2014-10-29 ソニー株式会社 Information processing apparatus, display control method, and display control program
US9507411B2 (en) 2009-09-22 2016-11-29 Facebook, Inc. Hand tracker for device with display
US8619128B2 (en) 2009-09-30 2013-12-31 Apple Inc. Systems and methods for an imaging system using multiple image sensors
US8502926B2 (en) * 2009-09-30 2013-08-06 Apple Inc. Display system having coherent and incoherent light sources
GB2474536B (en) 2009-10-13 2011-11-02 Pointgrab Ltd Computer vision gesture based control of a device
US9971807B2 (en) 2009-10-14 2018-05-15 Oblong Industries, Inc. Multi-process interactive systems and methods
US9933852B2 (en) 2009-10-14 2018-04-03 Oblong Industries, Inc. Multi-process interactive systems and methods
US20110099476A1 (en) * 2009-10-23 2011-04-28 Microsoft Corporation Decorating a display environment
WO2011056657A2 (en) * 2009-10-27 2011-05-12 Harmonix Music Systems, Inc. Gesture-based user interface
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US8325136B2 (en) 2009-12-01 2012-12-04 Raytheon Company Computer display pointer device for a display
US8687070B2 (en) * 2009-12-22 2014-04-01 Apple Inc. Image capture device having tilt and/or perspective correction
US20120262488A1 (en) * 2009-12-23 2012-10-18 Nokia Corporation Method and Apparatus for Facilitating Text Editing and Related Computer Program Product and Computer Readable Medium
EP2521352B1 (en) * 2009-12-28 2014-03-26 Panasonic Corporation Operation sound guide device and operation sound guide method
US20110162004A1 (en) * 2009-12-30 2011-06-30 Cevat Yerli Sensor device for a computer-controlled video entertainment system
US9207765B2 (en) * 2009-12-31 2015-12-08 Microsoft Technology Licensing, Llc Recognizing interactive media input
US20110316877A1 (en) * 2010-01-08 2011-12-29 Hidehiko Shin Display area control apparatus, display area control method, and integrated circuit
US9019201B2 (en) * 2010-01-08 2015-04-28 Microsoft Technology Licensing, Llc Evolving universal gesture sets
US9542001B2 (en) 2010-01-14 2017-01-10 Brainlab Ag Controlling a surgical navigation system
US8499257B2 (en) 2010-02-09 2013-07-30 Microsoft Corporation Handles interactions for human-computer interface
GB2477959A (en) * 2010-02-19 2011-08-24 Sony Europe Navigation and display of an array of selectable items
US9170666B2 (en) 2010-02-25 2015-10-27 Hewlett-Packard Development Company, L.P. Representative image
JP5659510B2 (en) * 2010-03-10 2015-01-28 ソニー株式会社 Image processing apparatus, image processing method, and program
CN102792246B (en) * 2010-03-15 2016-06-01 皇家飞利浦电子股份有限公司 For controlling the method and system of at least one device
US8756522B2 (en) * 2010-03-19 2014-06-17 Blackberry Limited Portable electronic device and method of controlling same
JP5320332B2 (en) * 2010-03-19 2013-10-23 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US20110239115A1 (en) * 2010-03-26 2011-09-29 Motorola, Inc. Selecting an avatar on a display screen of a mobile device
US8818027B2 (en) * 2010-04-01 2014-08-26 Qualcomm Incorporated Computing device interface
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
JP5434767B2 (en) * 2010-04-16 2014-03-05 ソニー株式会社 Information processing apparatus, information processing method, and program thereof
US8810509B2 (en) * 2010-04-27 2014-08-19 Microsoft Corporation Interfacing with a computing application using a multi-digit sensor
US8928672B2 (en) 2010-04-28 2015-01-06 Mixamo, Inc. Real-time automatic concatenation of 3D animation sequences
US9539510B2 (en) * 2010-04-30 2017-01-10 Microsoft Technology Licensing, Llc Reshapable connector with variable rigidity
US20110289455A1 (en) * 2010-05-18 2011-11-24 Microsoft Corporation Gestures And Gesture Recognition For Manipulating A User-Interface
US8457353B2 (en) 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US8396252B2 (en) 2010-05-20 2013-03-12 Edge 3 Technologies Systems and related methods for three dimensional gesture recognition in vehicles
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
JP2011253292A (en) * 2010-06-01 2011-12-15 Sony Corp Information processing system, method and program
US20110304649A1 (en) * 2010-06-10 2011-12-15 Microsoft Corporation Character selection
RU2016121160A (en) * 2010-06-10 2018-11-15 Конинклейке Филипс Электроникс Н.В. METHOD AND DEVICE FOR PRESENTING A CHOICE OPTION
US8749557B2 (en) * 2010-06-11 2014-06-10 Microsoft Corporation Interacting with user interface via avatar
US20110304774A1 (en) * 2010-06-11 2011-12-15 Microsoft Corporation Contextual tagging of recorded data
USD667416S1 (en) * 2010-06-11 2012-09-18 Microsoft Corporation Display screen with graphical user interface
US8670029B2 (en) * 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
US20110310010A1 (en) * 2010-06-17 2011-12-22 Primesense Ltd. Gesture based user interface
US20110317871A1 (en) * 2010-06-29 2011-12-29 Microsoft Corporation Skeletal joint recognition and tracking system
JP5700963B2 (en) * 2010-06-29 2015-04-15 キヤノン株式会社 Information processing apparatus and control method thereof
EP2402838A1 (en) * 2010-07-03 2012-01-04 Fachhochschule Dortmund Methods and device for the determination and/or feedback and/or control of the effective measurement space in motion capturing systems
AT510176B1 (en) * 2010-07-07 2012-02-15 Roman Eugenio Mag Anderl METHOD FOR CONTROLLING AN INTERACTIVE DISPLAY
EP2593847A4 (en) * 2010-07-15 2017-03-15 Hewlett-Packard Development Company, L.P. First response and second response
CN102959616B (en) 2010-07-20 2015-06-10 苹果公司 Interactive reality augmentation for natural interaction
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
CN102906671B (en) * 2010-07-20 2016-03-02 松下电器(美国)知识产权公司 Gesture input device and gesture input method
US8797328B2 (en) 2010-07-23 2014-08-05 Mixamo, Inc. Automatic generation of 3D character animation from 3D meshes
JP5675196B2 (en) * 2010-07-24 2015-02-25 キヤノン株式会社 Information processing apparatus and control method thereof
WO2012014590A1 (en) * 2010-07-28 2012-02-02 パイオニア株式会社 Video processing device and method
CN102348041A (en) * 2010-08-06 2012-02-08 天津三星光电子有限公司 Digital camera with function of inputting photo label by virtual keyboard
US9118832B2 (en) 2010-08-17 2015-08-25 Nokia Technologies Oy Input method
US8497897B2 (en) 2010-08-17 2013-07-30 Apple Inc. Image capture using luminance and chrominance sensors
US8719066B2 (en) * 2010-08-17 2014-05-06 Edifice Technologies Inc. Systems and methods for capturing, managing, sharing, and visualising asset information of an organization
JP5609416B2 (en) 2010-08-19 2014-10-22 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5561031B2 (en) * 2010-08-30 2014-07-30 コニカミノルタ株式会社 Display processing apparatus, scroll display method, and computer program
US8655093B2 (en) 2010-09-02 2014-02-18 Edge 3 Technologies, Inc. Method and apparatus for performing segmentation of an image
US8582866B2 (en) 2011-02-10 2013-11-12 Edge 3 Technologies, Inc. Method and apparatus for disparity computation in stereo images
WO2012030872A1 (en) 2010-09-02 2012-03-08 Edge3 Technologies Inc. Method and apparatus for confusion learning
US8666144B2 (en) 2010-09-02 2014-03-04 Edge 3 Technologies, Inc. Method and apparatus for determining disparity of texture
JP5625643B2 (en) 2010-09-07 2014-11-19 ソニー株式会社 Information processing apparatus and information processing method
JP5829390B2 (en) 2010-09-07 2015-12-09 ソニー株式会社 Information processing apparatus and information processing method
US20120059647A1 (en) * 2010-09-08 2012-03-08 International Business Machines Corporation Touchless Texting Exercise
JP5178797B2 (en) * 2010-09-13 2013-04-10 キヤノン株式会社 Display control apparatus and display control method
JP5256265B2 (en) * 2010-09-17 2013-08-07 株式会社ソニー・コンピュータエンタテインメント Computer system, computer system control method, program, and information storage medium
US9870068B2 (en) 2010-09-19 2018-01-16 Facebook, Inc. Depth mapping with a head mounted display using stereo cameras and structured light
JP5167523B2 (en) * 2010-09-22 2013-03-21 島根県 Operation input device, operation determination method, and program
JP5515067B2 (en) * 2011-07-05 2014-06-11 島根県 Operation input device, operation determination method, and program
WO2012039140A1 (en) * 2010-09-22 2012-03-29 島根県 Operation input apparatus, operation input method, and program
US8538132B2 (en) 2010-09-24 2013-09-17 Apple Inc. Component concentricity
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
JP5598232B2 (en) 2010-10-04 2014-10-01 ソニー株式会社 Information processing apparatus, information processing system, and information processing method
KR20120046973A (en) * 2010-11-03 2012-05-11 삼성전자주식회사 Method and apparatus for generating motion information
US8861797B2 (en) * 2010-11-12 2014-10-14 At&T Intellectual Property I, L.P. Calibrating vision systems
EP2455841A3 (en) * 2010-11-22 2015-07-15 Samsung Electronics Co., Ltd. Apparatus and method for selecting item using movement of object
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8656279B2 (en) * 2010-12-14 2014-02-18 Sap Ag Global settings for the enablement of culture-based gestures
US9171264B2 (en) 2010-12-15 2015-10-27 Microsoft Technology Licensing, Llc Parallel processing machine learning decision tree training
US8920241B2 (en) * 2010-12-15 2014-12-30 Microsoft Corporation Gesture controlled persistent handles for interface guides
US20130154913A1 (en) * 2010-12-16 2013-06-20 Siemens Corporation Systems and methods for a gaze and gesture interface
US9821224B2 (en) * 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Driving simulator control with virtual skeleton
JP5653206B2 (en) * 2010-12-27 2015-01-14 日立マクセル株式会社 Video processing device
KR101430887B1 (en) 2010-12-29 2014-08-18 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Environment-dependent dynamic range control for gesture recognition
JP5809290B2 (en) * 2011-01-05 2015-11-10 グーグル・インコーポレーテッド Method and system for facilitating text entry
US9015641B2 (en) 2011-01-06 2015-04-21 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US9477311B2 (en) 2011-01-06 2016-10-25 Blackberry Limited Electronic device and method of displaying information in response to a gesture
EP2666070A4 (en) 2011-01-19 2016-10-12 Hewlett Packard Development Co Method and system for multimodal and gestural control
US8570320B2 (en) * 2011-01-31 2013-10-29 Microsoft Corporation Using a three-dimensional environment model in gameplay
US20130004928A1 (en) * 2011-01-31 2013-01-03 Jesse Ackerman Reaction training apparatus and methods of use
US20140317577A1 (en) * 2011-02-04 2014-10-23 Koninklijke Philips N.V. Gesture controllable system uses proprioception to create absolute frame of reference
CN103347437B (en) 2011-02-09 2016-06-08 苹果公司 Gaze detection in 3D mapping environment
US8970589B2 (en) 2011-02-10 2015-03-03 Edge 3 Technologies, Inc. Near-touch interaction with a stereo camera grid structured tessellations
CN106943742A (en) * 2011-02-11 2017-07-14 漳州市爵晟电子科技有限公司 Motion amplification system
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
EP2678764A4 (en) * 2011-02-22 2017-03-22 Hewlett-Packard Development Company, L.P. Control area for facilitating user input
KR101896947B1 (en) 2011-02-23 2018-10-31 엘지이노텍 주식회사 An apparatus and method for inputting command using gesture
US20120218395A1 (en) * 2011-02-25 2012-08-30 Microsoft Corporation User interface presentation and interactions
GB2488785A (en) * 2011-03-07 2012-09-12 Sharp Kk A method of user interaction with a device in which a cursor position is calculated using information from tracking part of the user (face) and an object
KR101852428B1 (en) * 2011-03-09 2018-04-26 엘지전자 주식회사 Mobile terminal and 3D object control method thereof
TW201237773A (en) * 2011-03-15 2012-09-16 Wistron Corp An electronic system, image adjusting method and computer program product thereof
US9079313B2 (en) * 2011-03-15 2015-07-14 Microsoft Technology Licensing, Llc Natural human to robot remote control
KR101781908B1 (en) * 2011-03-24 2017-09-26 엘지전자 주식회사 Mobile terminal and control method thereof
US9842168B2 (en) 2011-03-31 2017-12-12 Microsoft Technology Licensing, Llc Task driven user intents
US9244984B2 (en) 2011-03-31 2016-01-26 Microsoft Technology Licensing, Llc Location based conversational understanding
US9760566B2 (en) 2011-03-31 2017-09-12 Microsoft Technology Licensing, Llc Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof
US9858343B2 (en) 2011-03-31 2018-01-02 Microsoft Technology Licensing, Llc Personalization of queries, conversations, and searches
US9298287B2 (en) 2011-03-31 2016-03-29 Microsoft Technology Licensing, Llc Combined activation for natural user interface systems
US10642934B2 (en) 2011-03-31 2020-05-05 Microsoft Technology Licensing, Llc Augmented conversational understanding architecture
US20120249468A1 (en) * 2011-04-04 2012-10-04 Microsoft Corporation Virtual Touchpad Using a Depth Camera
WO2012144666A1 (en) * 2011-04-19 2012-10-26 Lg Electronics Inc. Display device and control method therof
US8928589B2 (en) * 2011-04-20 2015-01-06 Qualcomm Incorporated Virtual keyboards and methods of providing the same
US9440144B2 (en) * 2011-04-21 2016-09-13 Sony Interactive Entertainment Inc. User identified to a controller
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US8702507B2 (en) 2011-04-28 2014-04-22 Microsoft Corporation Manual and camera-based avatar control
US10671841B2 (en) * 2011-05-02 2020-06-02 Microsoft Technology Licensing, Llc Attribute state classification
US9064006B2 (en) 2012-08-23 2015-06-23 Microsoft Technology Licensing, Llc Translating natural language utterances to keyword search queries
US9454962B2 (en) 2011-05-12 2016-09-27 Microsoft Technology Licensing, Llc Sentence simplification for spoken language understanding
US9123272B1 (en) 2011-05-13 2015-09-01 Amazon Technologies, Inc. Realistic image lighting and shading
US8619049B2 (en) 2011-05-17 2013-12-31 Microsoft Corporation Monitoring interactions between two or more objects within an environment
KR20120130466A (en) * 2011-05-23 2012-12-03 삼성전자주식회사 Device and method for controlling data of external device in wireless terminal
CN102810239A (en) * 2011-05-31 2012-12-05 鸿富锦精密工业(深圳)有限公司 Accident prevention system and method
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US20120311503A1 (en) * 2011-06-06 2012-12-06 Microsoft Corporation Gesture to trigger application-pertinent information
JP2013008365A (en) * 2011-06-23 2013-01-10 Ailive Inc Remote control with motion sensitive devices
JP5840399B2 (en) * 2011-06-24 2016-01-06 株式会社東芝 Information processing device
US9176608B1 (en) 2011-06-27 2015-11-03 Amazon Technologies, Inc. Camera based sensor for motion detection
CN103635240B (en) 2011-07-01 2015-12-16 英派尔科技开发有限公司 Safety scheme for a gesture-based game
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9041734B2 (en) 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
US10049482B2 (en) 2011-07-22 2018-08-14 Adobe Systems Incorporated Systems and methods for animation recommendations
US10229538B2 (en) * 2011-07-29 2019-03-12 Hewlett-Packard Development Company, L.P. System and method of visual layering
US9030487B2 (en) * 2011-08-01 2015-05-12 Lg Electronics Inc. Electronic device for displaying three-dimensional image and method of using the same
US10088924B1 (en) 2011-08-04 2018-10-02 Amazon Technologies, Inc. Overcoming motion effects in gesture recognition
US8891868B1 (en) 2011-08-04 2014-11-18 Amazon Technologies, Inc. Recognizing gestures captured by video
JP5649535B2 (en) * 2011-08-05 2015-01-07 株式会社東芝 Command issuing device, command issuing method and program
US8798362B2 (en) * 2011-08-15 2014-08-05 Hewlett-Packard Development Company, L.P. Clothing search in images
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9390318B2 (en) 2011-08-31 2016-07-12 Empire Technology Development Llc Position-setup for gesture-based game system
US9596398B2 (en) * 2011-09-02 2017-03-14 Microsoft Technology Licensing, Llc Automatic image capture
DE102011112618A1 (en) * 2011-09-08 2013-03-14 Eads Deutschland Gmbh Interaction with a three-dimensional virtual scenario
EP3328088A1 (en) * 2011-09-12 2018-05-30 INTEL Corporation Cooperative provision of personalized user functions using shared and personal devices
US8947351B1 (en) 2011-09-27 2015-02-03 Amazon Technologies, Inc. Point of view determinations for finger tracking
US20130080976A1 (en) * 2011-09-28 2013-03-28 Microsoft Corporation Motion controlled list scrolling
US8711091B2 (en) * 2011-10-14 2014-04-29 Lenovo (Singapore) Pte. Ltd. Automatic logical position adjustment of multiple screens
JP5202712B2 (en) * 2011-10-28 2013-06-05 株式会社東芝 Display device and information transmission method
US9672609B1 (en) 2011-11-11 2017-06-06 Edge 3 Technologies, Inc. Method and apparatus for improved depth-map estimation
CN103105924B (en) * 2011-11-15 2015-09-09 中国科学院深圳先进技术研究院 Man-machine interaction method and device
US10748325B2 (en) 2011-11-17 2020-08-18 Adobe Inc. System and method for automatic rigging of three dimensional characters for facial animation
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US8657681B2 (en) 2011-12-02 2014-02-25 Empire Technology Development Llc Safety scheme for gesture-based game system
CN103135755B (en) * 2011-12-02 2016-04-06 深圳泰山在线科技有限公司 Interactive system and method
EP2602692A1 (en) * 2011-12-05 2013-06-12 Alcatel Lucent Method for recognizing gestures and gesture detector
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
CN103974752B (en) * 2011-12-19 2016-05-18 英派尔科技开发有限公司 Pause and resume scheme for a gesture-based game
US9207852B1 (en) * 2011-12-20 2015-12-08 Amazon Technologies, Inc. Input mechanisms for electronic devices
CN102495675A (en) * 2011-12-20 2012-06-13 陈岚婕 Interactive multimedia advertising control system
US20120092248A1 (en) * 2011-12-23 2012-04-19 Sasanka Prabhala Method, apparatus, and system for energy efficiency and energy conservation including dynamic user interface based on viewing conditions
CN104137173B (en) 2011-12-28 2017-12-19 株式会社尼康 Display device and projection arrangement
JP2013152711A (en) * 2011-12-28 2013-08-08 Nikon Corp Projector and display device
KR101237472B1 (en) * 2011-12-30 2013-02-28 삼성전자주식회사 Electronic apparatus and method for controlling electronic apparatus thereof
KR20130081580A (en) * 2012-01-09 2013-07-17 삼성전자주식회사 Display apparatus and controlling method thereof
US20130181892A1 (en) * 2012-01-13 2013-07-18 Nokia Corporation Image Adjusting
US9223415B1 (en) 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
EP2618316B1 (en) * 2012-01-23 2018-08-15 Novomatic AG Wheel of fortune with gesture control
US8884928B1 (en) 2012-01-26 2014-11-11 Amazon Technologies, Inc. Correcting for parallax in electronic displays
US20130198690A1 (en) * 2012-02-01 2013-08-01 Microsoft Corporation Visual indication of graphical user interface relationship
JP2013165366A (en) * 2012-02-10 2013-08-22 Sony Corp Image processing device, image processing method, and program
US9055027B2 (en) * 2012-02-13 2015-06-09 Lenovo (Beijing) Co., Ltd. Transmission method and electronic device
US20150220149A1 (en) * 2012-02-14 2015-08-06 Google Inc. Systems and methods for a virtual grasping user interface
US9791932B2 (en) * 2012-02-27 2017-10-17 Microsoft Technology Licensing, Llc Semaphore gesture for human-machine interface
WO2013130682A1 (en) * 2012-02-27 2013-09-06 5 Examples, Inc. Data entry system controllers for receiving user input line traces relative to user interfaces to determine ordered actions, and related systems and methods
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9747495B2 (en) 2012-03-06 2017-08-29 Adobe Systems Incorporated Systems and methods for creating and distributing modifiable animated video messages
US9063574B1 (en) 2012-03-14 2015-06-23 Amazon Technologies, Inc. Motion detection systems for electronic devices
US10503373B2 (en) * 2012-03-14 2019-12-10 Sony Interactive Entertainment LLC Visual feedback for highlight-driven gesture user interfaces
US8654076B2 (en) * 2012-03-15 2014-02-18 Nokia Corporation Touch screen hover input handling
EP2828768A4 (en) * 2012-03-20 2015-10-14 A9 Com Inc Structured lighting-based content interactions in multiple environments
US9373025B2 (en) 2012-03-20 2016-06-21 A9.Com, Inc. Structured lighting-based content interactions in multiple environments
JP5646532B2 (en) * 2012-03-26 2014-12-24 ヤフー株式会社 Operation input device, operation input method, and program
CN104246682B (en) 2012-03-26 2017-08-25 苹果公司 Enhanced virtual touchpad and touch-screen
US9285895B1 (en) 2012-03-28 2016-03-15 Amazon Technologies, Inc. Integrated near field sensor for display devices
JP6068741B2 (en) * 2012-03-30 2017-01-25 シャープ株式会社 Display system
US8928590B1 (en) * 2012-04-03 2015-01-06 Edge 3 Technologies, Inc. Gesture keyboard method and apparatus
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
TWI454966B (en) * 2012-04-24 2014-10-01 Wistron Corp Gesture control method and gesture control device
US8994650B2 (en) 2012-04-27 2015-03-31 Qualcomm Incorporated Processing image input to communicate a command to a remote display device
TWI476706B (en) * 2012-04-30 2015-03-11 Pixart Imaging Inc Method for outputting command by detecting object movement and system thereof
TWI485577B (en) * 2012-05-03 2015-05-21 Compal Electronics Inc Electronic apparatus and operating method thereof
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
US10155168B2 (en) 2012-05-08 2018-12-18 Snap Inc. System and method for adaptable avatars
TWI590098B (en) * 2012-05-09 2017-07-01 劉鴻達 Control system using facial expressions as inputs
US8938124B2 (en) 2012-05-10 2015-01-20 Pointgrab Ltd. Computer vision based tracking of a hand
US9619036B2 (en) 2012-05-11 2017-04-11 Comcast Cable Communications, Llc System and methods for controlling a user experience
JP5713959B2 (en) * 2012-05-23 2015-05-07 株式会社東芝 Electronic device, method, and program
US9747306B2 (en) * 2012-05-25 2017-08-29 Atheer, Inc. Method and apparatus for identifying input features for later recognition
US9262068B2 (en) * 2012-05-31 2016-02-16 Opportunity Partners Inc. Interactive surface
US10114609B2 (en) 2012-05-31 2018-10-30 Opportunity Partners Inc. Computing interface for users with disabilities
US8957973B2 (en) 2012-06-11 2015-02-17 Omnivision Technologies, Inc. Shutter release using secondary camera
WO2014002803A1 (en) * 2012-06-25 2014-01-03 オムロン株式会社 Motion sensor, method for detecting object action, and game device
CN102799273B (en) * 2012-07-11 2015-04-15 华南理工大学 Interaction control system and method
TW201405443A (en) * 2012-07-17 2014-02-01 Wistron Corp Gesture input systems and methods
CN102854981A (en) * 2012-07-30 2013-01-02 成都西可科技有限公司 Virtual keyboard character input method based on motion-sensing technology
USD733729S1 (en) * 2012-09-04 2015-07-07 Samsung Electronics Co., Ltd. TV display screen with graphical user interface
KR102035134B1 (en) 2012-09-24 2019-10-22 엘지전자 주식회사 Image display apparatus and method for operating the same
US9423886B1 (en) 2012-10-02 2016-08-23 Amazon Technologies, Inc. Sensor connectivity approaches
US10591998B2 (en) 2012-10-03 2020-03-17 Rakuten, Inc. User interface device, user interface method, program, and computer-readable information storage medium
WO2014054718A1 (en) * 2012-10-03 2014-04-10 楽天株式会社 User interface device, user interface method, program, and computer-readable information storage medium
JP6103875B2 (en) * 2012-10-16 2017-03-29 キヤノン株式会社 Hand gesture recognition device and control method thereof
TWI475496B (en) * 2012-10-16 2015-03-01 Wistron Corp Gesture control device and method for setting and cancelling gesture operating region in gesture control device
CN102945078A (en) * 2012-11-13 2013-02-27 深圳先进技术研究院 Human-computer interaction equipment and human-computer interaction method
US9064168B2 (en) 2012-12-14 2015-06-23 Hand Held Products, Inc. Selective output of decoded message data
US20140139556A1 (en) * 2012-11-22 2014-05-22 Shanghai Powermo Information Tech. Co. Ltd. Apparatus and method for displaying software keyboards thereof
KR101450586B1 (en) * 2012-11-28 2014-10-15 (주) 미디어인터랙티브 Method, system and computer-readable recording media for motion recognition
US9977492B2 (en) * 2012-12-06 2018-05-22 Microsoft Technology Licensing, Llc Mixed reality presentation
JP5950806B2 (en) * 2012-12-06 2016-07-13 三菱電機株式会社 Input device, information processing method, and information processing program
US10051329B2 (en) * 2012-12-10 2018-08-14 DISH Technologies L.L.C. Apparatus, systems, and methods for selecting and presenting information about program content
US10474342B2 (en) * 2012-12-17 2019-11-12 Microsoft Technology Licensing, Llc Scrollable user interface control
US20140340498A1 (en) * 2012-12-20 2014-11-20 Google Inc. Using distance between objects in touchless gestural interfaces
TW201426434A (en) * 2012-12-21 2014-07-01 Ind Tech Res Inst Non-touch control system
DE102012224321B4 (en) * 2012-12-21 2022-12-15 Applejack 199 L.P. Measuring device for detecting a hitting movement of a racket, training device and method for training a hitting movement
US20140176690A1 (en) * 2012-12-21 2014-06-26 Technologies Humanware Inc. Magnification system
JP2014127124A (en) * 2012-12-27 2014-07-07 Sony Corp Information processing apparatus, information processing method, and program
CN103902192A (en) * 2012-12-28 2014-07-02 腾讯科技(北京)有限公司 Trigger control method and trigger control device for man-machine interactive operation
KR20140087787A (en) * 2012-12-31 2014-07-09 삼성전자주식회사 Display apparatus and method for controlling the display apparatus thereof
JP6171353B2 (en) * 2013-01-18 2017-08-02 株式会社リコー Information processing apparatus, system, information processing method, and program
JP6070211B2 (en) * 2013-01-22 2017-02-01 株式会社リコー Information processing apparatus, system, image projection apparatus, information processing method, and program
SE536989C2 (en) * 2013-01-22 2014-11-25 Crunchfish Ab Improved feedback in a touchless user interface
CA2900425C (en) 2013-02-07 2023-06-13 Dizmo Ag System for organizing and displaying information on a display device
JP5950845B2 (en) * 2013-02-07 2016-07-13 三菱電機株式会社 Input device, information processing method, and information processing program
US20140258942A1 (en) * 2013-03-05 2014-09-11 Intel Corporation Interaction of multiple perceptual sensing inputs
US9035874B1 (en) 2013-03-08 2015-05-19 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
DE102013004246A1 (en) 2013-03-12 2014-09-18 Audi Ag A device associated with a vehicle with spelling means - completion mark
DE102013004244A1 (en) * 2013-03-12 2014-09-18 Audi Ag A device associated with a vehicle with spelling means - erase button and / or list selection button
US10120540B2 (en) * 2013-03-14 2018-11-06 Samsung Electronics Co., Ltd. Visual feedback for user interface navigation on television system
US10220303B1 (en) 2013-03-15 2019-03-05 Harmonix Music Systems, Inc. Gesture-based music game
US10721448B2 (en) 2013-03-15 2020-07-21 Edge 3 Technologies, Inc. Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
EP2797308A3 (en) * 2013-04-22 2015-01-07 Technologies Humanware Inc Live panning system and method
US9665595B2 (en) 2013-05-01 2017-05-30 Cloudsight, Inc. Image processing client
US9639867B2 (en) 2013-05-01 2017-05-02 Cloudsight, Inc. Image processing system including image priority
US9438947B2 (en) 2013-05-01 2016-09-06 Google Inc. Content annotation tool
US9575995B2 (en) 2013-05-01 2017-02-21 Cloudsight, Inc. Image processing methods
US9569465B2 (en) 2013-05-01 2017-02-14 Cloudsight, Inc. Image processing
US10140631B2 (en) 2013-05-01 2018-11-27 Cloudsight, Inc. Image processing server
US10223454B2 (en) 2013-05-01 2019-03-05 Cloudsight, Inc. Image directed search
US9830522B2 (en) * 2013-05-01 2017-11-28 Cloudsight, Inc. Image processing including object selection
US9563955B1 (en) 2013-05-15 2017-02-07 Amazon Technologies, Inc. Object tracking techniques
US9829984B2 (en) * 2013-05-23 2017-11-28 Fastvdo Llc Motion-assisted visual language for human computer interfaces
CN103279188A (en) * 2013-05-29 2013-09-04 山东大学 Method for operating and controlling PPT in non-contact mode based on Kinect
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US20140368434A1 (en) * 2013-06-13 2014-12-18 Microsoft Corporation Generation of text by way of a touchless interface
CN103336579A (en) * 2013-07-05 2013-10-02 百度在线网络技术(北京)有限公司 Input method of wearable device and wearable device
WO2015017304A1 (en) * 2013-07-30 2015-02-05 Kodak Alaris Inc. System and method for creating navigable views of ordered images
US9356061B2 (en) 2013-08-05 2016-05-31 Apple Inc. Image sensor with buried light shield and vertical gate
TWI508027B (en) * 2013-08-08 2015-11-11 Huper Lab Co Ltd Three dimensional detecting device and method for detecting images thereof
JP6123562B2 (en) * 2013-08-08 2017-05-10 株式会社ニコン Imaging device
US9177410B2 (en) * 2013-08-09 2015-11-03 Ayla Mandel System and method for creating avatars or animated sequences using human body features extracted from a still image
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US10042422B2 (en) 2013-11-12 2018-08-07 Thalmic Labs Inc. Systems, articles, and methods for capacitive electromyography sensors
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
CN103489107B (en) * 2013-08-16 2015-11-25 北京京东尚科信息技术有限公司 Method and apparatus for making a virtual fitting model image
TW201508609A (en) * 2013-08-19 2015-03-01 Wistron Corp Method of interacting with large display device and related interaction system
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
CN103428551A (en) * 2013-08-24 2013-12-04 渭南高新区金石为开咨询有限公司 Gesture remote control system
US11199906B1 (en) 2013-09-04 2021-12-14 Amazon Technologies, Inc. Global user input management
WO2015040020A1 (en) * 2013-09-17 2015-03-26 Koninklijke Philips N.V. Gesture enabled simultaneous selection of range and value
US10055013B2 (en) 2013-09-17 2018-08-21 Amazon Technologies, Inc. Dynamic object tracking for user interfaces
US20150091841A1 (en) * 2013-09-30 2015-04-02 Kobo Incorporated Multi-part gesture for operating an electronic personal display
US9367203B1 (en) 2013-10-04 2016-06-14 Amazon Technologies, Inc. User interface techniques for simulating three-dimensional depth
US10318100B2 (en) * 2013-10-16 2019-06-11 Atheer, Inc. Method and apparatus for addressing obstruction in an interface
US10152136B2 (en) * 2013-10-16 2018-12-11 Leap Motion, Inc. Velocity field interaction for free space gesture interface and control
MX2016005338A (en) 2013-10-23 2017-03-20 Facebook Inc Three dimensional depth mapping using dynamic structured light.
WO2015063901A1 (en) * 2013-10-30 2015-05-07 株式会社東芝 Electronic device, operation control method, and program
WO2015072150A1 (en) * 2013-11-15 2015-05-21 パナソニックIpマネジメント株式会社 Information display device and information display method
US9451434B2 (en) 2013-11-27 2016-09-20 At&T Intellectual Property I, L.P. Direct interaction between a user and a communication network
KR20150062317A (en) * 2013-11-29 2015-06-08 현대모비스 주식회사 Multimedia apparatus of an automobile
JP5770251B2 (en) * 2013-12-02 2015-08-26 日立マクセル株式会社 Operation control device
US9891712B2 (en) 2013-12-16 2018-02-13 Leap Motion, Inc. User-defined virtual interaction space and manipulation of virtual cameras with vectors
US20150169153A1 (en) * 2013-12-17 2015-06-18 Lenovo (Singapore) Pte, Ltd. Enhancing a viewing area around a cursor
CN105027031A (en) * 2013-12-19 2015-11-04 谷歌公司 Using distance between objects in touchless gestural interfaces
US9538072B2 (en) * 2013-12-23 2017-01-03 Lenovo (Singapore) Pte. Ltd. Gesture invoked image capture
US9607409B2 (en) 2013-12-23 2017-03-28 Empire Technology Development Llc Suppression of real features in see-through display
US20150185858A1 (en) * 2013-12-26 2015-07-02 Wes A. Nagara System and method of plane field activation for a gesture-based control system
CN103713823B (en) * 2013-12-30 2017-12-26 深圳泰山体育科技股份有限公司 Method and system for updating the position of an operating zone in real time
EP3090322A4 (en) * 2013-12-31 2017-07-19 Eyefluence, Inc. Systems and methods for gaze-based media selection and editing
US9529445B2 (en) 2014-01-08 2016-12-27 Fujitsu Limited Input device and input method
JP6269316B2 (en) 2014-01-08 2018-01-31 富士通株式会社 Input device, input method, and input program
WO2015105044A1 (en) * 2014-01-10 2015-07-16 日本電気株式会社 Interface device, portable device, control device, module, control method, and program storage medium
US9720506B2 (en) * 2014-01-14 2017-08-01 Microsoft Technology Licensing, Llc 3D silhouette sensing system
US10438631B2 (en) 2014-02-05 2019-10-08 Snap Inc. Method for real-time video processing involving retouching of an object in the video
US10191536B2 (en) * 2014-02-07 2019-01-29 Koninklijke Philips N.V. Method of operating a control system and control system therefor
US9443031B2 (en) * 2014-02-13 2016-09-13 Apteryx, Inc. System and method to capture an image over the web
CN103997668B (en) * 2014-02-25 2017-12-22 华为技术有限公司 Selection display method for a mobile device, and terminal device
WO2015129152A1 (en) * 2014-02-26 2015-09-03 株式会社ソシオネクスト Image recognition system and semiconductor integrated circuit
US9990046B2 (en) 2014-03-17 2018-06-05 Oblong Industries, Inc. Visual collaboration interface
US10222866B2 (en) * 2014-03-24 2019-03-05 Beijing Lenovo Software Ltd. Information processing method and electronic device
JP2015196091A (en) * 2014-04-02 2015-11-09 アップルジャック 199 エル.ピー. Sensor-based gaming system for avatar to represent player in virtual environment
CA2885880C (en) * 2014-04-04 2018-07-31 Image Searcher, Inc. Image processing including object selection
US20150301606A1 (en) * 2014-04-18 2015-10-22 Valentin Andrei Techniques for improved wearable computing device gesture based interactions
WO2015170641A1 (en) * 2014-05-08 2015-11-12 Necソリューションイノベータ株式会社 Operation screen display device, operation screen display method, and non-transitory recording medium
US10845884B2 (en) * 2014-05-13 2020-11-24 Lenovo (Singapore) Pte. Ltd. Detecting inadvertent gesture controls
USD849763S1 (en) * 2014-05-30 2019-05-28 Maria Francisca Jones Electronic device with graphical user interface
USD903660S1 (en) 2014-05-30 2020-12-01 Maria Francisca Jones Electronic device with graphical user interface
USD813242S1 (en) 2014-05-30 2018-03-20 Maria Francisca Jones Display screen with graphical user interface
JP5963806B2 (en) * 2014-05-30 2016-08-03 京セラドキュメントソリューションズ株式会社 Character input system, information processing apparatus, and character input method
US9526983B2 (en) * 2014-06-04 2016-12-27 Chih-Feng Lin Virtual reality avatar traveling control system and virtual reality avatar traveling control method
US9971492B2 (en) * 2014-06-04 2018-05-15 Quantum Interface, Llc Dynamic environment for object and attribute display and interaction
US10474317B2 (en) * 2014-06-25 2019-11-12 Oracle International Corporation Dynamic node grouping in grid-based visualizations
JP6282188B2 (en) 2014-07-04 2018-02-21 クラリオン株式会社 Information processing device
KR101567469B1 (en) 2014-07-09 2015-11-20 주식회사 버추어패브릭스 Apparatus and method for controlling virtual input device for augmented reality device
WO2016009016A1 (en) * 2014-07-17 2016-01-21 Koninklijke Philips N.V. Method of obtaining gesture zone definition data for a control system based on user input
JP6428020B2 (en) * 2014-07-24 2018-11-28 セイコーエプソン株式会社 GUI device
US9645641B2 (en) * 2014-08-01 2017-05-09 Microsoft Technology Licensing, Llc Reflection-based control activation
USD724606S1 (en) * 2014-08-29 2015-03-17 Nike, Inc. Display screen with emoticon
USD724098S1 (en) * 2014-08-29 2015-03-10 Nike, Inc. Display screen with emoticon
USD723046S1 (en) * 2014-08-29 2015-02-24 Nike, Inc. Display screen with emoticon
USD724099S1 (en) * 2014-08-29 2015-03-10 Nike, Inc. Display screen with emoticon
USD725130S1 (en) * 2014-08-29 2015-03-24 Nike, Inc. Display screen with emoticon
USD725131S1 (en) * 2014-08-29 2015-03-24 Nike, Inc. Display screen with emoticon
USD725129S1 (en) * 2014-08-29 2015-03-24 Nike, Inc. Display screen with emoticon
USD723579S1 (en) * 2014-08-29 2015-03-03 Nike, Inc. Display screen with emoticon
USD723578S1 (en) * 2014-08-29 2015-03-03 Nike, Inc. Display screen with emoticon
USD726199S1 (en) * 2014-08-29 2015-04-07 Nike, Inc. Display screen with emoticon
USD723577S1 (en) * 2014-08-29 2015-03-03 Nike, Inc. Display screen with emoticon
JP6519074B2 (en) * 2014-09-08 2019-05-29 任天堂株式会社 Electronics
US10788948B2 (en) 2018-03-07 2020-09-29 Quantum Interface, Llc Systems, apparatuses, interfaces and implementing methods for displaying and manipulating temporal or sequential objects
US9792957B2 (en) 2014-10-08 2017-10-17 JBF Interlude 2009 LTD Systems and methods for dynamic video bookmarking
TWI591514B (en) * 2014-11-07 2017-07-11 鴻海精密工業股份有限公司 System and method for generating gestures
CN107079126A (en) 2014-11-13 2017-08-18 惠普发展公司,有限责任合伙企业 Image projection
KR20210099163A (en) * 2014-12-18 2021-08-11 페이스북, 인크. Method, system and device for navigating in a virtual reality environment
KR102329124B1 (en) * 2015-01-05 2021-11-19 삼성전자주식회사 Image display apparatus and method for displaying image
USD778320S1 (en) * 2015-01-05 2017-02-07 Nike, Inc. Display screen with icon
WO2016118098A1 (en) * 2015-01-20 2016-07-28 Ozturk Gurkan A method for layout and selection of the menu elements in man-machine interface
JP6494305B2 (en) * 2015-01-29 2019-04-03 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus, display apparatus, and information processing method
JP6603024B2 (en) 2015-02-10 2019-11-06 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
JP6519075B2 (en) * 2015-02-10 2019-05-29 任天堂株式会社 INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
JP6534011B2 (en) 2015-02-10 2019-06-26 任天堂株式会社 INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
JP6561400B2 (en) * 2015-02-10 2019-08-21 任天堂株式会社 Information processing apparatus, information processing program, information processing system, and information processing method
JP2016157587A (en) * 2015-02-24 2016-09-01 サンケン電気株式会社 Lighting device
MX364878B (en) 2015-02-25 2019-05-09 Facebook Inc Identifying an object in a volume based on characteristics of light reflected by the object.
KR20160109304A (en) * 2015-03-10 2016-09-21 삼성전자주식회사 Remote controller and method for controlling a screen of a display apparatus
WO2016154218A1 (en) 2015-03-22 2016-09-29 Oculus Vr, Llc Depth mapping with a head mounted display using stereo cameras and structured light
US9536136B2 (en) * 2015-03-24 2017-01-03 Intel Corporation Multi-layer skin detection and fused hand pose matching
CN105045373B (en) * 2015-03-26 2018-01-09 济南大学 Three-dimensional gesture interaction method oriented to user mental model expression
CN104714649A (en) * 2015-03-31 2015-06-17 王子强 Kinect-based naked-eye 3D UI interaction method
KR20160123879A (en) * 2015-04-17 2016-10-26 삼성전자주식회사 Electronic apparatus and method for displaying screen thereof
CN104866096B (en) * 2015-05-18 2018-01-05 中国科学院软件研究所 Method for command selection using upper-arm extension information
WO2016185634A1 (en) * 2015-05-21 2016-11-24 株式会社ソニー・インタラクティブエンタテインメント Information processing device
DE102015006613A1 (en) * 2015-05-21 2016-11-24 Audi Ag Operating system and method for operating an operating system for a motor vehicle
JP6553418B2 (en) * 2015-06-12 2019-07-31 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Display control method, display control device and control program
US10503392B2 (en) * 2015-06-25 2019-12-10 Oath Inc. User interface adjustment methods and systems
WO2017006426A1 (en) * 2015-07-07 2017-01-12 日立マクセル株式会社 Display system, wearable device, and video display device
WO2017024142A1 (en) * 2015-08-04 2017-02-09 Google Inc. Input via context sensitive collisions of hands with objects in virtual reality
US10460765B2 (en) 2015-08-26 2019-10-29 JBF Interlude 2009 LTD Systems and methods for adaptive and responsive video
US10503360B2 (en) * 2015-11-30 2019-12-10 Unisys Corporation System and method for adaptive control and annotation interface
US10459597B2 (en) * 2016-02-03 2019-10-29 Salesforce.Com, Inc. System and method to navigate 3D data on mobile and desktop
US10157484B2 (en) * 2016-03-11 2018-12-18 International Business Machines Corporation Schema-driven object alignment
US10339365B2 (en) 2016-03-31 2019-07-02 Snap Inc. Automated avatar generation
CN105892665B (en) * 2016-03-31 2019-02-05 联想(北京)有限公司 Information processing method and electronic equipment
CN105912102B (en) * 2016-03-31 2019-02-05 联想(北京)有限公司 Information processing method and electronic equipment
US11856271B2 (en) 2016-04-12 2023-12-26 JBF Interlude 2009 LTD Symbiotic interactive video
US11003345B2 (en) 2016-05-16 2021-05-11 Google Llc Control-article-based control of a user interface
US10474353B2 (en) 2016-05-31 2019-11-12 Snap Inc. Application control using a gesture based trigger
US10559111B2 (en) 2016-06-23 2020-02-11 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
EP3475920A4 (en) 2016-06-23 2020-01-15 Loomai, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10102423B2 (en) 2016-06-30 2018-10-16 Snap Inc. Object modeling and replacement in a video stream
US10360708B2 (en) 2016-06-30 2019-07-23 Snap Inc. Avatar based ideogram generation
JP6230666B2 (en) * 2016-06-30 2017-11-15 シャープ株式会社 Data input device, data input method, and data input program
US10529302B2 (en) 2016-07-07 2020-01-07 Oblong Industries, Inc. Spatially mediated augmentations of and interactions among distinct devices and applications via extended pixel manifold
US10348662B2 (en) 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10409371B2 (en) 2016-07-25 2019-09-10 Ctrl-Labs Corporation Methods and apparatus for inferring user intent based on neuromuscular signals
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US20200073483A1 (en) 2018-08-31 2020-03-05 Ctrl-Labs Corporation Camera-guided interpretation of neuromuscular signals
WO2018033137A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Method, apparatus, and electronic device for displaying service object in video image
US20210290148A1 (en) * 2016-09-19 2021-09-23 Baylor College Of Medicine Instrumented trail making task (itmt)
WO2018057272A1 (en) 2016-09-23 2018-03-29 Apple Inc. Avatar creation and editing
DK179978B1 (en) 2016-09-23 2019-11-27 Apple Inc. Image data for enhanced user interactions
US10609036B1 (en) 2016-10-10 2020-03-31 Snap Inc. Social media post subscribe requests for buffer user accounts
US10198626B2 (en) 2016-10-19 2019-02-05 Snap Inc. Neural networks for facial modeling
US10593116B2 (en) 2016-10-24 2020-03-17 Snap Inc. Augmented reality object manipulation
US10432559B2 (en) 2016-10-24 2019-10-01 Snap Inc. Generating and displaying customized avatars in electronic messages
US10699461B2 (en) * 2016-12-20 2020-06-30 Sony Interactive Entertainment LLC Telepresence of multiple users in interactive virtual space
KR20180074400A (en) * 2016-12-23 2018-07-03 삼성전자주식회사 Display apparatus and method for displaying
US11050809B2 (en) 2016-12-30 2021-06-29 JBF Interlude 2009 LTD Systems and methods for dynamic weighting of branched video paths
CN108268227B (en) * 2017-01-04 2020-12-01 京东方科技集团股份有限公司 Display device
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US10242503B2 (en) 2017-01-09 2019-03-26 Snap Inc. Surface aware lens
US10242477B1 (en) 2017-01-16 2019-03-26 Snap Inc. Coded vision system
US10951562B2 (en) 2017-01-18 2021-03-16 Snap Inc. Customized contextual media content item generation
US10454857B1 (en) 2017-01-23 2019-10-22 Snap Inc. Customized digital avatar accessories
US10558749B2 (en) * 2017-01-30 2020-02-11 International Business Machines Corporation Text prediction using captured image from an image capture device
US10255268B2 (en) 2017-01-30 2019-04-09 International Business Machines Corporation Text prediction using multiple devices
JP6809267B2 (en) * 2017-02-10 2021-01-06 富士ゼロックス株式会社 Information processing equipment, information processing systems and programs
US20180267615A1 (en) * 2017-03-20 2018-09-20 Daqri, Llc Gesture-based graphical keyboard for computing devices
USD815120S1 (en) * 2017-03-27 2018-04-10 Sony Corporation Display panel or screen with animated graphical user interface
USD868080S1 (en) 2017-03-27 2019-11-26 Sony Corporation Display panel or screen with an animated graphical user interface
US11069103B1 (en) 2017-04-20 2021-07-20 Snap Inc. Customized user interface for electronic communications
KR102455041B1 (en) 2017-04-27 2022-10-14 스냅 인코포레이티드 Location privacy management on map-based social media platforms
US10212541B1 (en) 2017-04-27 2019-02-19 Snap Inc. Selective location-based identity communication
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
WO2018201334A1 (en) * 2017-05-03 2018-11-08 深圳市智晟达科技有限公司 Digital television system
JP6719418B2 (en) * 2017-05-08 2020-07-08 株式会社ニコン Electronics
DK180007B1 (en) 2017-05-16 2020-01-16 Apple Inc. RECORDING AND SENDING EMOJI
DK179948B1 (en) 2017-05-16 2019-10-22 Apple Inc. Recording and sending Emoji
JP6922410B2 (en) 2017-05-19 2021-08-18 富士通株式会社 Posture judgment program, posture judgment device and posture judgment method
US10679428B1 (en) 2017-05-26 2020-06-09 Snap Inc. Neural network-based image stream modification
USD857739S1 (en) * 2017-06-05 2019-08-27 Apple Inc. Display screen or portion thereof with animated graphical user interface
CN107273869B (en) * 2017-06-29 2020-04-24 联想(北京)有限公司 Gesture recognition control method and electronic equipment
US10411798B2 (en) * 2017-07-13 2019-09-10 Qualcomm Incorporated Power optimized VLC signal processing with efficient handling of ISP/VFE
US11122094B2 (en) 2017-07-28 2021-09-14 Snap Inc. Software application manager for messaging applications
US10782793B2 (en) 2017-08-10 2020-09-22 Google Llc Context-sensitive hand interaction
CN107493495B (en) * 2017-08-14 2019-12-13 深圳市国华识别科技开发有限公司 Interactive position determining method, system, storage medium and intelligent terminal
US10904615B2 (en) * 2017-09-07 2021-01-26 International Business Machines Corporation Accessing and analyzing data to select an optimal line-of-sight and determine how media content is distributed and displayed
US10586368B2 (en) 2017-10-26 2020-03-10 Snap Inc. Joint audio-video facial animation system
US10657695B2 (en) 2017-10-30 2020-05-19 Snap Inc. Animated chat presence
KR20190054397A (en) * 2017-11-13 2019-05-22 삼성전자주식회사 Display apparatus and the control method thereof
JP6463826B1 (en) * 2017-11-27 2019-02-06 株式会社ドワンゴ Video distribution server, video distribution method, and video distribution program
US11460974B1 (en) 2017-11-28 2022-10-04 Snap Inc. Content discovery refresh
WO2019108700A1 (en) 2017-11-29 2019-06-06 Snap Inc. Group stories in an electronic messaging application
KR102387861B1 (en) 2017-11-29 2022-04-18 스냅 인코포레이티드 Graphic rendering for electronic messaging applications
AU2017276290A1 (en) 2017-12-14 2019-07-04 Canon Kabushiki Kaisha Method, system and apparatus for selecting items in a graphical user interface
US10257578B1 (en) 2018-01-05 2019-04-09 JBF Interlude 2009 LTD Dynamic library display for interactive videos
US10949648B1 (en) 2018-01-23 2021-03-16 Snap Inc. Region-based stabilized face tracking
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
EP3743892A4 (en) 2018-01-25 2021-03-24 Facebook Technologies, Inc. Visualization of reconstructed handstate information
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US20190253751A1 (en) * 2018-02-13 2019-08-15 Perfect Corp. Systems and Methods for Providing Product Information During a Live Broadcast
US10726603B1 (en) 2018-02-28 2020-07-28 Snap Inc. Animated expressive icon
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US11310176B2 (en) 2018-04-13 2022-04-19 Snap Inc. Content suggestion system
WO2019204464A1 (en) 2018-04-18 2019-10-24 Snap Inc. Augmented expression system
KR102524586B1 (en) * 2018-04-30 2023-04-21 삼성전자주식회사 Image display device and operating method for the same
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
DK201870378A1 (en) 2018-05-07 2020-01-13 Apple Inc. Displaying user interfaces associated with physical activities
DK201870374A1 (en) 2018-05-07 2019-12-04 Apple Inc. Avatar creation user interface
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
CN112261907A (en) 2018-05-29 2021-01-22 脸谱科技有限责任公司 Noise reduction shielding technology in surface electromyogram signal measurement and related system and method
US10198845B1 (en) 2018-05-29 2019-02-05 LoomAi, Inc. Methods and systems for animating facial expressions
US11601721B2 (en) * 2018-06-04 2023-03-07 JBF Interlude 2009 LTD Interactive video dynamic adaptation and user profiling
EP3807795A4 (en) 2018-06-14 2021-08-11 Facebook Technologies, LLC. User identification and authentication with neuromuscular signatures
WO2020018892A1 (en) 2018-07-19 2020-01-23 Ctrl-Labs Corporation Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
CN112424736A (en) 2018-07-19 2021-02-26 索美智能有限公司 Machine interaction
US11074675B2 (en) 2018-07-31 2021-07-27 Snap Inc. Eye texture inpainting
US11030813B2 (en) 2018-08-30 2021-06-08 Snap Inc. Video clip object tracking
CN110908568B (en) * 2018-09-18 2022-11-04 网易(杭州)网络有限公司 Control method and device for virtual object
US10896534B1 (en) 2018-09-19 2021-01-19 Snap Inc. Avatar style transformation using neural networks
CN112789577B (en) * 2018-09-20 2024-04-05 元平台技术有限公司 Neuromuscular text input, writing and drawing in augmented reality systems
US10895964B1 (en) 2018-09-25 2021-01-19 Snap Inc. Interface to display shared user groups
US10732725B2 (en) * 2018-09-25 2020-08-04 XRSpace CO., LTD. Method and apparatus of interactive display based on gesture recognition
CN112771478A (en) 2018-09-26 2021-05-07 脸谱科技有限责任公司 Neuromuscular control of physical objects in an environment
US11189070B2 (en) 2018-09-28 2021-11-30 Snap Inc. System and method of generating targeted user lists using customizable avatar characteristics
US11245658B2 (en) 2018-09-28 2022-02-08 Snap Inc. System and method of generating private notifications between users in a communication session
US10904181B2 (en) 2018-09-28 2021-01-26 Snap Inc. Generating customized graphics having reactions to electronic message content
US10698583B2 (en) 2018-09-28 2020-06-30 Snap Inc. Collaborative achievement interface
CN112822992A (en) 2018-10-05 2021-05-18 脸谱科技有限责任公司 Providing enhanced interaction with physical objects using neuromuscular signals in augmented reality environments
US10872451B2 (en) 2018-10-31 2020-12-22 Snap Inc. 3D avatar rendering
US11103795B1 (en) 2018-10-31 2021-08-31 Snap Inc. Game drawer
US11176737B2 (en) 2018-11-27 2021-11-16 Snap Inc. Textured mesh building
CN113423341A (en) 2018-11-27 2021-09-21 脸谱科技有限责任公司 Method and apparatus for automatic calibration of wearable electrode sensor system
US10902661B1 (en) 2018-11-28 2021-01-26 Snap Inc. Dynamic composite user identifier
US10861170B1 (en) 2018-11-30 2020-12-08 Snap Inc. Efficient human pose tracking in videos
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
EP3667460A1 (en) * 2018-12-14 2020-06-17 InterDigital CE Patent Holdings Methods and apparatus for user-device interaction
US11055514B1 (en) 2018-12-14 2021-07-06 Snap Inc. Image face manipulation
US11516173B1 (en) 2018-12-26 2022-11-29 Snap Inc. Message composition interface
JP6717393B2 (en) * 2019-01-08 2020-07-01 Nikon Corporation Electronic device
US11032670B1 (en) 2019-01-14 2021-06-08 Snap Inc. Destination sharing in location sharing system
US10939246B1 (en) 2019-01-16 2021-03-02 Snap Inc. Location-based context information sharing in a messaging system
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US10984575B2 (en) 2019-02-06 2021-04-20 Snap Inc. Body pose estimation
US10656797B1 (en) 2019-02-06 2020-05-19 Snap Inc. Global event-based avatar
US10936066B1 (en) 2019-02-13 2021-03-02 Snap Inc. Sleep detection in a location sharing system
WO2020171108A1 (en) * 2019-02-19 2020-08-27 NTT Docomo, Inc. Information processing device
US10964082B2 (en) 2019-02-26 2021-03-30 Snap Inc. Avatar based on weather
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
JP7105984B2 (en) * 2019-03-05 2022-07-25 NTT Docomo, Inc. Information processing equipment
US10852918B1 (en) 2019-03-08 2020-12-01 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US10674311B1 (en) 2019-03-28 2020-06-02 Snap Inc. Points of interest in a location sharing system
US11166123B1 (en) 2019-03-28 2021-11-02 Snap Inc. Grouped transmission of location data in a location sharing system
US10992619B2 (en) 2019-04-30 2021-04-27 Snap Inc. Messaging system with avatar generation
DK201970531A1 (en) 2019-05-06 2021-07-09 Apple Inc Avatar integration with multiple applications
USD916872S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
USD916809S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916811S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916871S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916810S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
US10893385B1 (en) 2019-06-07 2021-01-12 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
USD917534S1 (en) * 2019-06-10 2021-04-27 Aristocrat Technologies Australia Pty Limited Display screen or portion thereof with transitional graphical user interface
CN113924568A (en) 2019-06-26 2022-01-11 Google LLC Radar-based authentication status feedback
US11676199B2 (en) 2019-06-28 2023-06-13 Snap Inc. Generating customizable avatar outfits
US11189098B2 (en) 2019-06-28 2021-11-30 Snap Inc. 3D object camera customization system
US11188190B2 (en) 2019-06-28 2021-11-30 Snap Inc. Generating animation overlays in a communication session
US11307747B2 (en) 2019-07-11 2022-04-19 Snap Inc. Edge gesture interface with smart interactions
US11589094B2 (en) * 2019-07-22 2023-02-21 At&T Intellectual Property I, L.P. System and method for recommending media content based on actual viewers
US11551393B2 (en) 2019-07-23 2023-01-10 LoomAi, Inc. Systems and methods for animation generation
US11868537B2 (en) 2019-07-26 2024-01-09 Google Llc Robust radar-based gesture-recognition by user equipment
US11385722B2 (en) 2019-07-26 2022-07-12 Google Llc Robust radar-based gesture-recognition by user equipment
JP7316383B2 (en) 2019-07-26 2023-07-27 Google LLC Authentication management via IMU and radar
KR20220005081A (en) 2019-07-26 2022-01-12 Google LLC State reduction based on IMU and radar
US11163914B2 (en) * 2019-08-01 2021-11-02 Bank Of America Corporation Managing enterprise security by utilizing a smart keyboard and a smart mouse device
US11455081B2 (en) 2019-08-05 2022-09-27 Snap Inc. Message thread prioritization interface
US10911387B1 (en) 2019-08-12 2021-02-02 Snap Inc. Message reminder interface
WO2021040742A1 (en) 2019-08-30 2021-03-04 Google Llc Input-mode notification for a multi-input node
EP3811187B1 (en) * 2019-08-30 2021-10-06 Google LLC Input methods for mobile devices
US11467672B2 (en) 2019-08-30 2022-10-11 Google Llc Context-sensitive control of radar-based gesture-recognition
WO2021040748A1 (en) 2019-08-30 2021-03-04 Google Llc Visual indicator for paused radar gestures
US11320969B2 (en) 2019-09-16 2022-05-03 Snap Inc. Messaging system with battery level sharing
US11425062B2 (en) 2019-09-27 2022-08-23 Snap Inc. Recommended content viewed by friends
US11080917B2 (en) 2019-09-30 2021-08-03 Snap Inc. Dynamic parameterized user avatar stories
US11341569B2 (en) * 2019-10-25 2022-05-24 7-Eleven, Inc. System and method for populating a virtual shopping cart based on video of a customer's shopping session at a physical store
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
JP7339604B2 (en) * 2019-11-12 2023-09-06 Omron Corporation Motion recognition device, motion recognition method, motion recognition program, and motion recognition system
US11063891B2 (en) 2019-12-03 2021-07-13 Snap Inc. Personalized avatar notification
US11128586B2 (en) 2019-12-09 2021-09-21 Snap Inc. Context sensitive avatar captions
US11036989B1 (en) 2019-12-11 2021-06-15 Snap Inc. Skeletal tracking using previous frames
US11227442B1 (en) 2019-12-19 2022-01-18 Snap Inc. 3D captions with semantic graphical elements
US11263817B1 (en) 2019-12-19 2022-03-01 Snap Inc. 3D captions with face tracking
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11140515B1 (en) 2019-12-30 2021-10-05 Snap Inc. Interfaces for relative device positioning
US11169658B2 (en) 2019-12-31 2021-11-09 Snap Inc. Combined map icon with action indicator
US11356720B2 (en) 2020-01-30 2022-06-07 Snap Inc. Video generation system to render frames on demand
US11036781B1 (en) 2020-01-30 2021-06-15 Snap Inc. Video generation system to render frames on demand using a fleet of servers
US11284144B2 (en) 2020-01-30 2022-03-22 Snap Inc. Video generation system to render frames on demand using a fleet of GPUs
WO2021155249A1 (en) 2020-01-30 2021-08-05 Snap Inc. System for generating media content items on demand
JP7392512B2 (en) 2020-02-20 2023-12-06 Oki Electric Industry Co., Ltd. Information processing device, information processing method, program, and information processing system
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11217020B2 (en) 2020-03-16 2022-01-04 Snap Inc. 3D cutout image modification
US11625873B2 (en) 2020-03-30 2023-04-11 Snap Inc. Personalized media overlay recommendation
US11818286B2 (en) 2020-03-30 2023-11-14 Snap Inc. Avatar recommendation and reply
JP2021174314A (en) * 2020-04-27 2021-11-01 Yokogawa Electric Corporation Software development environment providing system, software development environment providing method, and software development environment providing program
CN112639689A (en) * 2020-04-30 2021-04-09 Huawei Technologies Co., Ltd. Control method, apparatus and system based on air gestures
US11956190B2 (en) 2020-05-08 2024-04-09 Snap Inc. Messaging system with a carousel of related entities
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
CN111625179B (en) * 2020-06-02 2021-11-16 BOE Technology Group Co., Ltd. Graph drawing method, electronic device, and computer storage medium
US11543939B2 (en) 2020-06-08 2023-01-03 Snap Inc. Encoded image based messaging system
US11922010B2 (en) 2020-06-08 2024-03-05 Snap Inc. Providing contextual information with keyboard interface for messaging system
US11356392B2 (en) 2020-06-10 2022-06-07 Snap Inc. Messaging system including an external-resource dock and drawer
US11580682B1 (en) 2020-06-30 2023-02-14 Snap Inc. Messaging system with augmented reality makeup
CN111988522B (en) * 2020-07-28 2022-04-22 Beijing Dajia Internet Information Technology Co., Ltd. Shooting control method and device, electronic equipment and storage medium
US20220066605A1 (en) * 2020-08-26 2022-03-03 BlueStack Systems, Inc. Methods, Systems and Computer Program Products for Enabling Scrolling Within a Software Application
US11863513B2 (en) 2020-08-31 2024-01-02 Snap Inc. Media content playback and comments management
US11360733B2 (en) 2020-09-10 2022-06-14 Snap Inc. Colocated shared augmented reality without shared backend
US11452939B2 (en) 2020-09-21 2022-09-27 Snap Inc. Graphical marker generation system for synchronizing users
US11470025B2 (en) 2020-09-21 2022-10-11 Snap Inc. Chats with micro sound clips
US11910269B2 (en) 2020-09-25 2024-02-20 Snap Inc. Augmented reality content items including user avatar to share location
US11660022B2 (en) 2020-10-27 2023-05-30 Snap Inc. Adaptive skeletal joint smoothing
US11615592B2 (en) 2020-10-27 2023-03-28 Snap Inc. Side-by-side character animation from realtime 3D body motion capture
US11450051B2 (en) 2020-11-18 2022-09-20 Snap Inc. Personalized avatar real-time motion capture
US11734894B2 (en) 2020-11-18 2023-08-22 Snap Inc. Real-time motion transfer for prosthetic limbs
US11748931B2 (en) 2020-11-18 2023-09-05 Snap Inc. Body animation sharing and remixing
US11503090B2 (en) 2020-11-30 2022-11-15 At&T Intellectual Property I, L.P. Remote audience feedback mechanism
US20220198764A1 (en) * 2020-12-18 2022-06-23 Arkh, Inc. Spatially Aware Environment Relocalization
US11790531B2 (en) 2021-02-24 2023-10-17 Snap Inc. Whole body segmentation
US11809633B2 (en) 2021-03-16 2023-11-07 Snap Inc. Mirroring device with pointing based navigation
US11734959B2 (en) 2021-03-16 2023-08-22 Snap Inc. Activating hands-free mode on mirroring device
US11908243B2 (en) 2021-03-16 2024-02-20 Snap Inc. Menu hierarchy navigation on electronic mirroring devices
US11798201B2 (en) 2021-03-16 2023-10-24 Snap Inc. Mirroring device with whole-body outfits
US11544885B2 (en) 2021-03-19 2023-01-03 Snap Inc. Augmented reality experience based on physical items
US11562548B2 (en) 2021-03-22 2023-01-24 Snap Inc. True size eyewear in real time
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11636654B2 (en) 2021-05-19 2023-04-25 Snap Inc. AR-based connected portal shopping
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
US11882337B2 (en) 2021-05-28 2024-01-23 JBF Interlude 2009 LTD Automated platform for generating interactive videos
US11776190B2 (en) * 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
JPWO2023276216A1 (en) * 2021-06-29 2023-01-05
US11941227B2 (en) 2021-06-30 2024-03-26 Snap Inc. Hybrid search system for customizable media
US11854069B2 (en) 2021-07-16 2023-12-26 Snap Inc. Personalized try-on ads
EP4141619A1 (en) * 2021-08-26 2023-03-01 TK Elevator Innovation and Operations GmbH Touchless visualisation and command system and corresponding method and use and computer program
US11908083B2 (en) 2021-08-31 2024-02-20 Snap Inc. Deforming custom mesh based on body mesh
US11670059B2 (en) 2021-09-01 2023-06-06 Snap Inc. Controlling interactive fashion based on body gestures
US11673054B2 (en) 2021-09-07 2023-06-13 Snap Inc. Controlling AR games on fashion items
US11663792B2 (en) 2021-09-08 2023-05-30 Snap Inc. Body fitted accessory with physics simulation
US11900506B2 (en) 2021-09-09 2024-02-13 Snap Inc. Controlling interactive fashion based on facial expressions
US11734866B2 (en) 2021-09-13 2023-08-22 Snap Inc. Controlling interactive fashion based on voice
US11798238B2 (en) 2021-09-14 2023-10-24 Snap Inc. Blending body mesh into external mesh
US11836866B2 (en) 2021-09-20 2023-12-05 Snap Inc. Deforming real-world object using an external mesh
US11934477B2 (en) 2021-09-24 2024-03-19 JBF Interlude 2009 LTD Video player integration within websites
US20230116341A1 (en) * 2021-09-30 2023-04-13 Futian ZHANG Methods and apparatuses for hand gesture-based control of selection focus
US11636662B2 (en) 2021-09-30 2023-04-25 Snap Inc. Body normal network light and rendering control
US20240096033A1 (en) * 2021-10-11 2024-03-21 Meta Platforms Technologies, Llc Technology for creating, replicating and/or controlling avatars in extended reality
US11651572B2 (en) 2021-10-11 2023-05-16 Snap Inc. Light and rendering of garments
US11836862B2 (en) 2021-10-11 2023-12-05 Snap Inc. External mesh with vertex attributes
US11790614B2 (en) 2021-10-11 2023-10-17 Snap Inc. Inferring intent from pose and speech input
US11763481B2 (en) 2021-10-20 2023-09-19 Snap Inc. Mirror-based augmented reality experience
US11748958B2 (en) 2021-12-07 2023-09-05 Snap Inc. Augmented reality unboxing experience
US11880947B2 (en) 2021-12-21 2024-01-23 Snap Inc. Real-time upper-body garment exchange
US11928783B2 (en) 2021-12-30 2024-03-12 Snap Inc. AR position and orientation along a plane
US11887260B2 (en) 2021-12-30 2024-01-30 Snap Inc. AR position indicator
US11823346B2 (en) 2022-01-17 2023-11-21 Snap Inc. AR body part tracking system
US11954762B2 (en) 2022-01-19 2024-04-09 Snap Inc. Object replacement system
JP7336553B2 (en) 2022-02-07 2023-08-31 Mitsubishi Electric IT Solutions Corporation Process execution device, process execution method and process execution program
US20230377223A1 (en) * 2022-05-18 2023-11-23 Snap Inc. Hand-tracked text selection and modification
US11870745B1 (en) 2022-06-28 2024-01-09 Snap Inc. Media gallery sharing and management
WO2024049574A1 (en) * 2022-08-29 2024-03-07 Travertine Design Engine Llc Video game environment and avatars
US11893166B1 (en) 2022-11-08 2024-02-06 Snap Inc. User avatar movement control using an augmented reality eyewear device

Family Cites Families (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5565888A (en) * 1995-02-17 1996-10-15 International Business Machines Corporation Method and apparatus for improving visibility and selectability of icons
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
JPH08332170A (en) 1995-06-08 1996-12-17 Matsushita Electric Ind Co Ltd Video-scope
US6335927B1 (en) 1996-11-18 2002-01-01 Mci Communications Corporation System and method for providing requested quality of service in a hybrid network
US6118888A (en) * 1997-02-28 2000-09-12 Kabushiki Kaisha Toshiba Multi-modal interface apparatus and method
JPH1139132A (en) 1997-07-15 1999-02-12 Sharp Corp Interface system
KR19990011180A (en) 1997-07-22 1999-02-18 Koo Ja-hong Method for selecting a menu using image recognition
US6750848B1 (en) * 1998-11-09 2004-06-15 Timothy R. Pryor More useful man machine interfaces and applications
US6130677A (en) 1997-10-15 2000-10-10 Electric Planet, Inc. Interactive computer vision system
US6272231B1 (en) 1998-11-06 2001-08-07 Eyematic Interfaces, Inc. Wavelet-based facial motion capture for avatar animation
US6950534B2 (en) * 1998-08-10 2005-09-27 Cybernet Systems Corporation Gesture-controlled interfaces for self-service machines and other applications
DE19839638C2 (en) * 1998-08-31 2000-06-21 Siemens Ag System for enabling self-control of the body movement sequences to be carried out by the moving person
JP2000194469A (en) 1998-12-28 2000-07-14 Nec Corp Item display controller
JP2000284879A (en) 1999-01-29 2000-10-13 Square Co Ltd Game device, command input method in video game and computer readable recording medium for recording program for providing the same method
DE19917660A1 (en) 1999-04-19 2000-11-02 Deutsch Zentr Luft & Raumfahrt Method and input device for controlling the position of an object to be graphically represented in a virtual reality
US7124374B1 (en) 2000-03-06 2006-10-17 Carl Herman Haken Graphical interface control system
EP1148411A3 (en) * 2000-04-21 2005-09-14 Sony Corporation Information processing apparatus and method for recognising user gesture
JP2001312743A (en) 2000-04-28 2001-11-09 Tomohiro Kuroda Automatic human body copy model preparing device
JP3725460B2 (en) * 2000-10-06 2005-12-14 Sony Computer Entertainment Inc. Image processing apparatus, image processing method, recording medium, computer program, semiconductor device
JP2002116859A (en) 2000-10-11 2002-04-19 Sony Corp Information processor, instruction recognition processing method and program storage medium
US6827579B2 (en) 2000-11-16 2004-12-07 Rutgers, The State University of New Jersey Method and apparatus for rehabilitation of neuromotor disorders
JP2002157606A (en) 2000-11-17 2002-05-31 Canon Inc Image display controller, composite reality presentation system, image display control method, and medium providing processing program
US6690354B2 (en) 2000-11-19 2004-02-10 Canesta, Inc. Method for enhancing performance in a system utilizing an array of sensors that sense at least two-dimensions
JP4095768B2 (en) * 2000-11-24 2008-06-04 Hitachi, Ltd. Image processing method and non-contact image input apparatus using the same
DE10100615A1 (en) * 2001-01-09 2002-07-18 Siemens Ag Hand recognition with position determination
US6600475B2 (en) * 2001-01-22 2003-07-29 Koninklijke Philips Electronics N.V. Single camera system for gesture-based input and target indication
US6697072B2 (en) * 2001-03-26 2004-02-24 Intel Corporation Method and system for controlling an avatar using computer vision
US20030095154A1 (en) * 2001-11-19 2003-05-22 Koninklijke Philips Electronics N.V. Method and apparatus for a gesture-based user interface
US6907576B2 (en) * 2002-03-04 2005-06-14 Microsoft Corporation Legibility of selected content
EP2357836B1 (en) * 2002-03-27 2015-05-13 Sanyo Electric Co., Ltd. Method and apparatus for processing three-dimensional images
KR100941948B1 (en) * 2002-05-21 2010-02-11 Koninklijke Philips Electronics N.V. A system for selecting and entering objects and a method for entering objects from a set of objects and computer readable medium for storing software code for implementing the method
US7348963B2 (en) * 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US7298871B2 (en) 2002-06-07 2007-11-20 Koninklijke Philips Electronics N.V. System and method for adapting the ambience of a local environment according to the location and personal preferences of people in the local environment
US7883415B2 (en) 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
US7815507B2 (en) * 2004-06-18 2010-10-19 Igt Game machine user interface using a non-contact eye motion recognition device
US7151530B2 (en) 2002-08-20 2006-12-19 Canesta, Inc. System and method for determining an input selected by a user through a virtual interface
US6839417B2 (en) 2002-09-10 2005-01-04 Myriad Entertainment, Inc. Method and apparatus for improved conference call management
JP4035610B2 (en) * 2002-12-18 2008-01-23 National Institute of Advanced Industrial Science and Technology Interface device
US6920942B2 (en) * 2003-01-29 2005-07-26 Varco I/P, Inc. Method and apparatus for directly controlling pressure and position associated with an adjustable choke apparatus
JP2004258766A (en) 2003-02-24 2004-09-16 Nippon Telegraph and Telephone Corp (NTT) Menu display method, device and program in interface using self-image display
JP4286556B2 (en) 2003-02-24 2009-07-01 Toshiba Corporation Image display device
US7665041B2 (en) * 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
JP4906227B2 (en) 2003-05-19 2012-03-28 Sony Corporation Imaging device
US7411575B2 (en) * 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US20050080849A1 (en) 2003-10-09 2005-04-14 Wee Susie J. Management system for rich media environments
JP4824409B2 (en) * 2004-01-06 2011-11-30 Sony Computer Entertainment Inc. Information processing system, entertainment system, and information receiving method for information processing system
EP1553764A1 (en) 2004-01-07 2005-07-13 Thomson Licensing S.A. System and process for selecting an item in a list of items and associated products
JP3847753B2 (en) 2004-01-30 2006-11-22 Sony Computer Entertainment Inc. Image processing apparatus, image processing method, recording medium, computer program, semiconductor device
US20050239028A1 (en) 2004-04-03 2005-10-27 Wu Chang J R Stance guide and method of use
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US20060017654A1 (en) * 2004-07-23 2006-01-26 Romo Justin R Virtual reality interactivity system and method
JP4419768B2 (en) 2004-09-21 2010-02-24 Victor Company of Japan, Ltd. Control device for electronic equipment
EP1645944B1 (en) 2004-10-05 2012-08-15 Sony France S.A. A content-management interface
JP2006130221A (en) 2004-11-09 2006-05-25 Konica Minolta Medical & Graphic Inc Medical image transmission apparatus, program and storage medium
US7598942B2 (en) * 2005-02-08 2009-10-06 Oblong Industries, Inc. System and method for gesture based control system
KR100687737B1 (en) * 2005-03-19 2007-02-27 Electronics and Telecommunications Research Institute Apparatus and method for a virtual mouse based on two-hands gesture
JP4276640B2 (en) 2005-06-17 2009-06-10 Sony Computer Entertainment Inc. Information processing apparatus, information processing apparatus control method, and information processing program
KR20070006477A (en) * 2005-07-08 2007-01-11 Samsung Electronics Co., Ltd. Method for arranging contents menu variably and display device using the same
US20070009139A1 (en) 2005-07-11 2007-01-11 Agere Systems Inc. Facial recognition device for a handheld electronic device and a method of using the same
JP4785457B2 (en) 2005-08-08 2011-10-05 Canon Inc. Display control apparatus and display control method
JP2007108941A (en) 2005-10-12 2007-04-26 Sharp Corp Apparatus control system, remote controller, apparatus, apparatus control method, apparatus control program, and computer readable recording medium recording the same apparatus control program
US7788607B2 (en) * 2005-12-01 2010-08-31 Navisense Method and system for mapping virtual coordinates
US7429988B2 (en) * 2006-03-06 2008-09-30 At&T Intellectual Property I, L.P. Methods and apparatus for convenient change of display characters on a handheld device
US8766983B2 (en) * 2006-05-07 2014-07-01 Sony Computer Entertainment Inc. Methods and systems for processing an interchange of real time effects during video communication
KR100776801B1 (en) * 2006-07-19 2007-11-19 Electronics and Telecommunications Research Institute Gesture recognition method and system in picture process system
KR100783552B1 (en) 2006-10-11 2007-12-07 Samsung Electronics Co., Ltd. Input control method and device for mobile phone
US7877707B2 (en) * 2007-01-06 2011-01-25 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US10437459B2 (en) 2007-01-07 2019-10-08 Apple Inc. Multitouch data fusion
CN101689244B (en) 2007-05-04 2015-07-22 高通股份有限公司 Camera-based user input for compact devices
US8726194B2 (en) * 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
US20100235786A1 (en) 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
EP2409478B1 (en) * 2009-03-16 2018-11-21 Sony Mobile Communications Inc. Personalized user interface based on picture analysis
US8659658B2 (en) * 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
US8702507B2 (en) 2011-04-28 2014-04-22 Microsoft Corporation Manual and camera-based avatar control
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060202953A1 (en) * 1997-08-22 2006-09-14 Pryor Timothy R Novel man machine interfaces and applications
US6301370B1 (en) * 1998-04-13 2001-10-09 Eyematic Interfaces, Inc. Face recognition from video images
US20020041327A1 (en) * 2000-07-24 2002-04-11 Evan Hildreth Video-based image control system
US6775014B2 (en) * 2001-01-17 2004-08-10 Fuji Xerox Co., Ltd. System and method for determining the location of a target in a room or small area

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI474264B (en) * 2013-06-14 2015-02-21 Utechzone Co Ltd Warning method for driving vehicle and electronic apparatus for vehicle

Also Published As

Publication number Publication date
US20200117322A1 (en) 2020-04-16
US20140118249A1 (en) 2014-05-01
WO2009018161A1 (en) 2009-02-05
US20140331181A1 (en) 2014-11-06
CN108399010B (en) 2021-02-19
JP6382261B2 (en) 2018-08-29
JP2010534895A (en) 2010-11-11
CN103218041A (en) 2013-07-24
US11500514B2 (en) 2022-11-15
US20090027337A1 (en) 2009-01-29
US20090031240A1 (en) 2009-01-29
US8726194B2 (en) 2014-05-13
JP5575645B2 (en) 2014-08-20
JP6037344B2 (en) 2016-12-07
US10268339B2 (en) 2019-04-23
JP2016194948A (en) 2016-11-17
CN101810003A (en) 2010-08-18
CN108399010A (en) 2018-08-14
US8659548B2 (en) 2014-02-25
US10509536B2 (en) 2019-12-17
JP2014149856A (en) 2014-08-21
US20230110688A1 (en) 2023-04-13

Similar Documents

Publication Publication Date Title
CN101810003B (en) Enhanced camera-based input
US11237625B2 (en) Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
JP6031071B2 (en) User interface method and system based on natural gestures
US7519223B2 (en) Recognizing gestures and using gestures for interacting with software applications
CN104246682B (en) Enhanced virtual touchpad and touch-screen
US8514251B2 (en) Enhanced character input using recognized gestures
CN108052202A 3D interaction method, device, computer equipment and storage medium
CN105308536A (en) Dynamic user interactions for display control and customized gesture interpretation
CN102915112A (en) System and method for close-range movement tracking
Quigley, From GUI to UUI: Interfaces for Ubiquitous Computing
US11709593B2 (en) Electronic apparatus for providing a virtual keyboard and controlling method thereof
CN109074209A (en) The details pane of user interface
US11960706B2 (en) Item selection using enhanced control
Lang et al. A multimodal smartwatch-based interaction concept for immersive environments
Ahmad et al. 3D gesture-based control system using processing open source software
Mosquera et al. Identifying facial gestures to emulate a mouse: Control application in a web browser
Iacolina Interactive Spaces Natural interfaces supporting gestures and manipulations in interactive spaces
Szeghalmy et al. Comfortable mouse control using 3D depth sensor
Damaraju Sriranga An Exploration of Multi-touch Interaction Techniques
Wong Gesture based interactions for augmented virtual mirrors

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: QUALCOMM INC.

Free format text: FORMER OWNER: GESTURE TEK INC.

Effective date: 20120118

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20120118

Address after: California, United States

Applicant after: Qualcomm Inc.

Address before: California, United States

Applicant before: Gesturetek Inc.

C14 Grant of patent or utility model
GR01 Patent grant