CN104063037A - Operating command recognition method and device as well as wearable electronic equipment

Publication number: CN104063037A (application CN201310085514.7A; granted as CN104063037B)
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 高歌, 张超
Applicant and current assignee: Lenovo Beijing Ltd
Legal status: Granted; active
Abstract

The invention discloses an operating command recognition method and device, as well as wearable electronic equipment. The method is applied to wearable electronic equipment that has a sensing unit and a display unit, the sensing unit corresponding to a sensing area. When the equipment is worn on the user's body, the user's eyes have a visible area; when the display unit displays a graphical interface, the graphical interface corresponds to a display area. The visible area contains the display area; the part of the visible area that does not belong to the display area is the detection area, and the sensing area contains the detection area. In the method, an input operation in the detection area is acquired through the sensing unit; when the input operation is judged to cross a contact surface between the detection area and the display area, the operating command corresponding to the input operation is determined according to the contact surface crossed; the operating command is then responded to, and the graphical interface is controlled to transform accordingly. The method can reduce the complexity of the user's input operations.

Description

Operating command recognition method and device as well as wearable electronic equipment
Technical field
The present invention relates to the technical field of input control for electronic equipment, and in particular to an operating command recognition method and device, and to wearable electronic equipment.
Background technology
With the widespread use of electronic equipment, the ways of performing input operations on such equipment have diversified. For example, an electronic device can receive a corresponding operating command through a key press, or an operation can be carried out by touching the device's display interface.
Gesture-based input makes flexible input operations possible. At present, when input is performed by gesture, the electronic equipment captures images of a designated area and determines the input command by analyzing and recognizing the images that contain the user's gesture. In practice, during gesture input the user regularly needs to perform certain specific operations, such as returning to a menu or returning to the previous interface. If such a specific operation is needed in the middle of gesture-based input, the user has to interrupt the current input and press a physical button at a particular location to trigger the operation, so the operation complexity is high.
Summary of the invention
In view of this, the invention provides an operating command recognition method and device, and wearable electronic equipment; the method can reduce the complexity of the user's input operations.
To achieve the above object, the invention provides the following technical scheme. An operating command recognition method is applied to wearable electronic equipment that has a sensing unit and a display unit, the sensing unit corresponding to a sensing area. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, the graphical interface corresponds to a display area. The visible area contains the display area; the sub-area of the visible area that does not belong to the display area is the detection area, and the sensing area contains the detection area. The method comprises:
acquiring, through the sensing unit, an input operation in the detection area;
judging whether the input operation crosses a contact surface between the detection area and the display area;
when the input operation crosses a contact surface between the detection area and the display area, determining, according to the contact surface crossed by the input operation, the operating command corresponding to the input operation;
responding to the operating command, and controlling the graphical interface to transform accordingly.
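The four claimed steps (acquire, judge the crossing, map the crossed surface to a command, respond) can be sketched as a small decision pipeline. This is only an illustrative sketch: the sampled-point representation, the rectangle region test, and the command names are assumptions, not part of the patent.

```python
# Hypothetical sketch of the claimed pipeline:
# acquire input -> judge boundary crossing -> map to command -> respond.

DISPLAY, DETECTION = "display", "detection"

def region_of(point, display_rect):
    """Classify a 2D point as lying in the display area or the detection area."""
    x, y = point
    x0, y0, x1, y1 = display_rect
    return DISPLAY if x0 <= x <= x1 and y0 <= y <= y1 else DETECTION

def crossed_boundary(prev_point, cur_point, display_rect):
    """Step 2: an input crosses a contact surface when consecutive samples
    fall in different regions."""
    return region_of(prev_point, display_rect) != region_of(cur_point, display_rect)

def recognize(prev_point, cur_point, display_rect, commands):
    """Steps 3 and 4: pick the command keyed by the crossing direction."""
    if not crossed_boundary(prev_point, cur_point, display_rect):
        return None
    direction = (region_of(prev_point, display_rect),
                 region_of(cur_point, display_rect))
    return commands.get(direction)

# Illustrative command table (assumed): entering the display area returns to
# the previous interface, leaving it shows the main menu.
COMMANDS = {
    (DETECTION, DISPLAY): "return_to_previous_interface",
    (DISPLAY, DETECTION): "show_main_menu",
}

print(recognize((5, 5), (15, 15), (10, 10, 30, 30), COMMANDS))
# -> return_to_previous_interface
```

The crossing direction, rather than the touched position alone, selects the command; this mirrors the claim's requirement that the command be determined "according to the contact surface crossed".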
Preferably, judging whether the input operation crosses a contact surface between the detection area and the display area comprises:
judging whether the input operation crosses the contact surface from the detection area into the display area;
and/or judging whether the input operation crosses the contact surface from the display area into the detection area;
and/or judging whether the input operation moves into the detection area across one contact surface between the detection area and the display area, and moves out of the detection area across another contact surface between the two areas.
Preferably, the contact surfaces between the detection area and the display area comprise at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface.
Determining the operating command corresponding to the input operation according to the contact surface crossed then comprises:
when the input operation crosses the first contact surface between the detection area and the display area, determining, according to the first contact surface crossed, a first operating command corresponding to the first contact surface;
when the input operation crosses the second contact surface between the detection area and the display area, determining, according to the second contact surface crossed, a second operating command corresponding to the second contact surface, wherein the second operating command is different from the first operating command.
Preferably, determining the operating command corresponding to the input operation according to the contact surface crossed comprises:
when the input operation crosses a contact surface between the detection area and the display area, determining the current contact surface crossed by the input operation, and identifying the crossing mode in which that contact surface is crossed, wherein the detection area and the display area have at least two contact surfaces;
determining the operating command corresponding to the input operation according to the current contact surface crossed and the crossing mode in which it is crossed;
wherein the crossing modes of an input operation comprise: a first crossing mode of striding from the detection area into the display area; and/or a second crossing mode of striding from the display area into the detection area; and/or a third crossing mode of striding into the detection area across one contact surface between the detection area and the display area, and crossing out of the detection area across another contact surface between the two areas.
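The three crossing ("override") modes can be sketched as a classifier over the boundary crossings detected in one gesture. The `(from_region, to_region, surface_id)` representation is a hypothetical pre-processing of the sensing unit's output, and the numbering is illustrative.

```python
# Hypothetical crossing-mode classifier. `crossings` is an assumed
# pre-computed list of (from_region, to_region, surface_id) tuples, one per
# boundary crossing detected in the gesture; surface_id names which contact
# surface was crossed.

MODE_INTO_DISPLAY = 1    # first mode: detection -> display
MODE_INTO_DETECTION = 2  # second mode: display -> detection
MODE_THROUGH = 3         # third mode: in via one surface, out via another

def crossing_mode(crossings):
    if not crossings:
        return None
    if len(crossings) >= 2:
        first, last = crossings[0], crossings[-1]
        # entered the detection area via one surface and left it via another
        if (first[1] == "detection" and last[0] == "detection"
                and first[2] != last[2]):
            return MODE_THROUGH
    _frm, to, _surface = crossings[-1]
    return MODE_INTO_DISPLAY if to == "display" else MODE_INTO_DETECTION

print(crossing_mode([("display", "detection", "left")]))  # -> 2
```

Keying the command on both the surface and the mode gives (surfaces × modes) distinct triggers from the same small set of boundaries, which is how one gesture vocabulary can cover several preset commands.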
Preferably, determining the operating command corresponding to the input operation according to the contact surface crossed comprises:
determining, according to the contact surface crossed by the input operation, the operating command corresponding to the current input operation from preset operating commands that correspond to the different contact surfaces, wherein the preset operating commands at least comprise returning to the previous graphical interface and/or displaying the main menu.
Preferably, the sensing area may further contain the display area;
the method then further comprises: acquiring, through the sensing unit, an input operation on the graphical interface.
In another aspect, the invention also provides an operating command recognition device. The device is applied to wearable electronic equipment that has a sensing unit and a display unit, the sensing unit corresponding to a sensing area. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, the graphical interface corresponds to a display area. The visible area contains the display area; the sub-area of the visible area that does not belong to the display area is the detection area, and the sensing area contains the detection area. The device comprises:
a first operation acquiring unit, configured to acquire, through the sensing unit, an input operation in the detection area;
a judging unit, configured to judge whether the input operation crosses a contact surface between the detection area and the display area;
a command determining unit, configured to determine, when the input operation crosses a contact surface between the detection area and the display area, the operating command corresponding to the input operation according to the contact surface crossed;
a command response unit, configured to respond to the operating command and control the graphical interface to transform accordingly.
Preferably, the judging unit is specifically configured to: judge whether the input operation crosses the contact surface from the detection area into the display area; and/or judge whether the input operation crosses the contact surface from the display area into the detection area; and/or judge whether the input operation moves into the detection area across one contact surface between the detection area and the display area, and moves out of the detection area across another contact surface between the two areas.
Preferably, the contact surfaces between the detection area and the display area comprise at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface.
The command determining unit then comprises: a first command determining unit, configured to determine, when the input operation crosses the first contact surface between the detection area and the display area, a first operating command corresponding to the first contact surface crossed;
a second command determining unit, configured to determine, when the input operation crosses the second contact surface between the detection area and the display area, a second operating command corresponding to the second contact surface crossed, wherein the second operating command is different from the first operating command.
Preferably, the command determining unit comprises:
a recognition unit, configured to determine, when the input operation crosses a contact surface between the detection area and the display area, the current contact surface crossed by the input operation, and to identify the crossing mode in which that contact surface is crossed, wherein the detection area and the display area have at least two contact surfaces, and the crossing modes of an input operation comprise: a first crossing mode of striding from the detection area into the display area; and/or a second crossing mode of striding from the display area into the detection area; and/or a third crossing mode of striding into the detection area across one contact surface between the detection area and the display area, and crossing out of the detection area across another contact surface between the two areas;
a determining subunit, configured to determine the operating command corresponding to the input operation according to the current contact surface crossed and the crossing mode in which it is crossed.
Preferably, the determining subunit is specifically configured to: when the input operation crosses a contact surface between the detection area and the display area, determine, according to the contact surface crossed by the input operation, the operating command corresponding to the current input operation from preset operating commands that correspond to the different contact surfaces, wherein the preset operating commands at least comprise returning to the previous graphical interface and/or displaying the main menu.
Preferably, the sensing area may further contain the display area;
the device then further comprises: a second operation acquiring unit, configured to acquire, through the sensing unit, an input operation on the graphical interface.
In another aspect, the invention also provides wearable electronic equipment. The wearable electronic equipment has a processor, and a sensing unit and a display unit that are both connected with the processor, the sensing unit corresponding to a sensing area. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, the graphical interface corresponds to a display area. The visible area contains the display area; the sub-area of the visible area that does not belong to the display area is the detection area, and the sensing area contains the detection area. The processor has built into it an operating command recognition device as described in any of the above.
As can be seen from the above technical scheme, compared with the prior art, the invention discloses an operating command recognition method and device, and electronic equipment. The method judges the input operation detected by the sensing unit; when the input operation is judged to cross a contact surface between the detection area and the display area, the operating command corresponding to the input operation is determined according to the contact surface crossed, and the graphical interface presented by the wearable electronic equipment is then controlled, in response to the command, to transform accordingly. Because the display area and the detection area have large surface areas and are therefore easy to locate, triggering a command by having the input operation cross a contact surface between the display area and the detection area, with the command determined according to the surface crossed, reduces erroneous operations during input, improves the accuracy of input operations, and also improves input speed and the convenience of input.
Brief description of the drawings
In order to explain the embodiments of the present invention or the technical schemes in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only embodiments of the invention; those of ordinary skill in the art can obtain other drawings from the drawings provided without creative effort.
Fig. 1 is a schematic flowchart of one embodiment of an operating command recognition method of the present invention;
Fig. 2 is a schematic top view of the detection area, the display area and the visible area when the user wears smart glasses;
Fig. 3 is a schematic longitudinal cross-section of the detection area, the display area and the visible area when the user wears smart glasses;
Fig. 4 is a schematic flowchart of another embodiment of an operating command recognition method of the present invention;
Fig. 5 is a schematic flowchart of yet another embodiment of an operating command recognition method of the present invention;
Fig. 6 is a schematic structural diagram of one embodiment of an operating command recognition device of the present invention;
Fig. 7 is a schematic structural diagram of the command determining unit in another embodiment of an operating command recognition device of the present invention.
Embodiment
The technical schemes in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort fall within the scope of protection of the invention.
The embodiments of the invention disclose an operating command recognition method. The method can improve the convenience of input operations and thereby enable fast switching to the interface required for a specific operation.
Referring to Fig. 1, which is a schematic flowchart of one embodiment of an operating command recognition method of the present invention: the method of this embodiment is applied to wearable electronic equipment that has a sensing unit and a display unit, the sensing unit having a sensing area. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit displays a graphical interface, the graphical interface corresponds to a display area. The visible area contains the display area, and the region of the visible area that does not belong to the display area is the detection area, so that the display area and the detection area have contact surfaces where they meet. The sensing area contains the detection area. The method of this embodiment comprises:
Step 101: acquire an input operation in the detection area through the sensing unit.
The sensing unit can acquire input operations in the sensing area, and the sensing area contains the detection area; therefore, the sensing unit can acquire input operations in the detection area.
The input operation can be a gesture operation performed by the user in space, or a touch operation on the electronic equipment. Accordingly, the sensing unit can be any existing device that captures the user's input behavior, such as a camera or an infrared sensing unit.
Step 102: judge whether the input operation crosses a contact surface between the detection area and the display area.
When the user wears the wearable electronic equipment, the user has a visible area through the equipment, so that the user can see any object or action within that visible area.
The display unit of the equipment displays a graphical interface, and the graphical interface corresponds to a display area; the visible area the user has while wearing the equipment contains this display area, so the user can see the graphical interface.
The user can see the graphical interface in several ways. One way is for the display unit to output the graphical interface to a display interface on the equipment (for example, a sub-region of the lens of wearable glasses serves as the display interface); the display interface lies within the visible area, so the user can see the graphical interface. Another way is for the display unit to project the image into the user's eyes so that the user can see the graphical interface; in that case, the spatial region within which the user can see the graphical interface is the display area.
Because the visible area contains the display area, and the sub-area of the visible area that does not belong to the display area is the detection area, the display area and the detection area share an interface in space, namely the contact surface between the display area and the detection area.
From the input operation acquired by the sensing unit, it can be analyzed whether the current input operation has crossed a contact surface between the detection area and the display area. An input operation that crosses the contact surface may be one in which the operating body crosses the surface from the detection area into the display area, or one in which the operating body crosses the surface from the display area into the detection area, and so on; it suffices that the operating body has crossed a contact surface between the detection area and the display area at least once.
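The at-least-one-crossing criterion just described can be sketched as a check over per-frame region labels. The labels themselves are assumed to come from whatever tracking the sensing unit performs (e.g. hand position per camera frame); that pre-processing is not specified by the patent.

```python
# Hypothetical check that a tracked operating body (e.g. the user's hand)
# crossed the detection/display contact surface at least once. Each element
# is the region label of the operating body in one frame.

def crossed_at_least_once(region_per_frame):
    """True if consecutive frames ever place the operating body in
    different regions, i.e. the contact surface was crossed."""
    return any(a != b for a, b in zip(region_per_frame, region_per_frame[1:]))

print(crossed_at_least_once(["detection", "detection", "display"]))  # -> True
```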
Step 103: when the input operation crosses a contact surface between the detection area and the display area, determine the operating command corresponding to the input operation according to the contact surface crossed.
If the current input operation is judged to have crossed a contact surface between the detection area and the display area, this indicates that the current input operation is intended to input a command that triggers a change of the graphical interface; therefore, the operating command corresponding to the current input operation needs to be determined according to the contact surface crossed.
As described above, the visible area contains both the display area and the detection area, so the user can clearly see the region in which the current input operation is located. Thus, if a certain operating command needs to be input, the user can simply make the input operation move between the display area and the detection area, crossing the contact surface between them, and thereby trigger the corresponding operating command.
The areas of the display area and the detection area are large; they are obviously larger than the area of any key arranged or displayed on the wearable electronic equipment, so the area covered by the contact surface between the display area and the detection area is correspondingly large as well. According to Fitts' law, the larger a target is, the easier it is to locate and the shorter the time needed to point at it; that is, the time to acquire a target depends on the distance from the current position to the target and on the size of the target. Therefore, triggering the corresponding operating command by having the input operation cross the contact surface between the detection area and the display area improves the convenience and efficiency of input operations.
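The dependence the passage appeals to (pointing time growing with distance and shrinking with target size) is commonly written in the Shannon formulation of Fitts' law, where the constants a and b are fitted empirically for a given pointing device:

```latex
% Shannon formulation of Fitts' law: mean movement time T to acquire a
% target of width W at distance D from the start position.
T = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

A wide contact surface corresponds to a large W, lowering the logarithmic "index of difficulty" and hence the predicted movement time, which is the quantitative version of the argument made above.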
Step 104: respond to the operating command, and control the graphical interface to transform accordingly.
After the operating command is determined, the graphical interface shown by the display unit is controlled, according to the determined command, to change accordingly.
The way the graphical interface transforms may differ: the whole graphical interface may be switched, only part of the content of the currently displayed interface may be changed, or a sub-interface may be overlaid on the currently displayed interface, and so on.
Naturally, the change applied to the graphical interface can also differ according to the determined operating command. In practice, some commonly used input commands generally need to be easy to issue, so the determined operating command can include returning to the previous graphical interface, displaying the main menu, performing the operation of a home function button in the current interface, and the like.
Accordingly, when determining the operating command, the command corresponding to the current input operation can also be determined, according to the contact surface crossed by the input operation, from preset operating commands that correspond to the different contact surfaces. The preset operating commands can include the more common commands, for example at least returning to the previous graphical interface and/or displaying the main menu.
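The per-surface preset-command lookup described above can be sketched as a simple table keyed by contact surface. The surface names and the third entry are illustrative assumptions; the first two commands are the presets the text itself names.

```python
# Hypothetical table of preset commands, one per contact surface of the
# display area. Surface identifiers are assumed to be assigned by the
# crossing-detection stage.

PRESET_COMMANDS = {
    "left_surface": "return_to_previous_interface",
    "right_surface": "show_main_menu",
    "top_surface": "home_button",  # assumed extra preset, not named in the text
}

def command_for(surface_id):
    """Look up the preset command for the contact surface that was crossed."""
    return PRESET_COMMANDS.get(surface_id)

print(command_for("left_surface"))  # -> return_to_previous_interface
```

Returning `None` for an unmapped surface lets the caller simply ignore crossings that have no preset command bound to them.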
In this embodiment, the input operation detected by the sensing unit is judged; when the input operation is judged to cross a contact surface between the detection area and the display area, the operating command corresponding to the input operation is determined according to the contact surface crossed, and the graphical interface presented by the wearable electronic equipment is then controlled, in response to the command, to transform accordingly. Because the display area and the detection area have large surface areas and are easy to locate, triggering the command by crossing the contact surface between the display area and the detection area, with the command determined according to the surface crossed, reduces erroneous operations during input, improves the accuracy of input operations, and also improves input speed and the convenience of input.
The wearable electronic equipment in the embodiments of the present application can be smart glasses, arm-worn electronic equipment, or a watch-style computer worn on the wrist. Such wearable electronic equipment is generally easy to carry, but when certain control operations are performed to transform the interface, existing approaches generally require input through specific keys, which is inconvenient to operate.
To aid understanding of the scheme of the embodiments of the invention, the relationship between the visible area, the display area and the detection area that the user has after putting on the wearable electronic equipment is introduced below, taking smart-glasses electronic equipment as the example. It is assumed that the display unit of the smart glasses presents the image on the lens rather than projecting the figure directly into the user's eyes; that is, a display interface is provided on the lens, so the user can see the graphical interface presented on that display interface.
As shown in Figure 2, for user dresses the plan structure schematic diagram of this intelligent glasses, this intelligent glasses 20 has two spectacle frames (or being called ear frame or ear holder), these two spectacle frames are connected with picture frame 201, on this picture frame, be provided with eyeglass, on this eyeglass, have display unit 202, display unit can be exported a graphical interfaces to this display unit.The corresponding viewing area 211 of this graphical interfaces, as the region between two black thick dashed line (also thinking viewing area by the area of space of two graphical interfaces) in Fig. 2 in this Fig. 2.When user operates the object of this graphical interfaces at present, can in this viewing area, carry out gesture input and also can on the display unit being contained in this viewing area, touch operation.
Under normal circumstances, when showing on the eyeglass at intelligent glasses that, after this graphical interfaces, the lens area at graphical interfaces place still has certain transmittance, user still can see through this graphical interfaces and see outside things.Therefore, the region that whole eyeglass becomes that sees through this electronic equipment from the sight line of user's eyes is viewing area 212, and as shown in Figure 2, this viewing area 212 is two regions between solid line.This viewing area 212 comprises this viewing area 211, and this viewing area also includes surveyed area 213 simultaneously.Because Fig. 2 is vertical view, only demonstrate the surveyed area of both sides, viewing area, the surface of contact of this viewing area and surveyed area is straight line in this vertical view, as bold dashed lines in figure.
On this intelligent glasses, be provided with sensing unit, this sensing unit can be responded to the operation in this surveyed area, when user's input action is crossed over any one surface of contact of this surveyed area and viewing area like this, this sensing unit all can detect corresponding input operation.
Of course, Fig. 2 is only a schematic top view of the areas when a user wears the smart glasses. In real space the graphical interface may be located in the middle region of the lens, in which case the spatial region surrounding the display area is the detection area. To show the positional relationship of the detection area, the display area and the visible area of Fig. 2 more clearly, Fig. 3 shows a schematic longitudinal section of the areas when the user wears the smart glasses. Fig. 3 is a vertical section taken at an arbitrary position along the line-of-sight direction of Fig. 2; any such section is a rectangle. The inner region enclosed by the thick solid line is the visible area, which comprises the display area 211 enclosed by the dashed line; the region between the solid line and the dashed line is the detection area 213. When the user performs an input operation in space, the user's eyes can see his or her hand moving within the visible area. If the user needs to trigger a corresponding operation command, the hand can be made to cross the contact surface between the display area and the detection area, which is the dashed portion in this section; the smart glasses then determine, according to the contact surface crossed by the user's current input operation, the operation command corresponding to the current input operation.
Of course, Fig. 3 can also be regarded as a plan view of the smart glasses shown in Fig. 2, in which the visible area is the whole lens, the display area is the graphical interface (the region between the two lenses, i.e. the space of the nose-bridge part, is not considered in the figure, but this sub-region is also regarded as part of the display area), and the detection area is the region of the lens other than the display area. Thus, if the user performs a sliding operation on the plane where the lens is located, it can likewise be determined, according to the moving position of the finger, whether the boundary line between the display area and the detection area has been crossed; if the input operation has crossed the boundary line, this indicates that in space the input operation has crossed the contact surface between the display area and the detection area.
In practical applications, there are multiple ways for an input operation to cross a contact surface, and correspondingly there are multiple ways of judging whether the input operation crosses a contact surface between the detection area and the display area. Specifically, one way comprises: judging whether the input operation crosses the contact surface between the detection area and the display area from the detection area into the display area. If the sensing unit senses that the input operation moves from the detection area to the contact surface between the detection area and the display area, and then moves out of the contact surface into the display area, it can be judged that the input operation has crossed the contact surface. Taking Fig. 3 as an example, this way of crossing the contact surface is shown by the direction of arrow 301.
Judging whether the input operation has crossed a contact surface can also be done by judging whether the input operation crosses the contact surface between the display area and the detection area from the display area into the detection area. The sensing unit can sense that the operating body crosses the contact surface between the display area and the detection area and then enters the detection area, thereby determining that the input operation has crossed the contact surface between the display area and the detection area from the display area into the detection area.
In addition, judging whether the input operation has crossed a contact surface can also be: judging whether the input operation moves into the detection area through one contact surface between the detection area and the display area, and moves out of the detection area through another contact surface between the detection area and the display area. In practical applications, one continuous gesture of the user may involve multiple contact surfaces; accordingly, the sensing unit may detect a continuous action in which the input operation first crosses one contact surface between the detection area and the display area and then crosses another contact surface between the detection area and the display area. Taking Fig. 3 as an example, this way of crossing contact surfaces is shown by the continuous action in the direction of arrow 302. There is also the case in which a detection area exists between two display areas. For example, in the smart glasses of Fig. 2, when the region between the two graphical interfaces is regarded as a detection area, a detection area exists in the middle of the display area 211 of Fig. 3; two parallel contact surfaces then exist between the display area and this middle detection area, and the input operation can successively cross the two parallel contact surfaces and enter the display area on the other side from the display area on one side.
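As an illustrative sketch (not part of the patent disclosure), the crossing judgments above can be modeled by sampling the position of the operating body and checking whether consecutive samples fall on opposite sides of the display-area boundary. All names, the rectangular geometry of the display area, and the coordinate values below are assumptions made for illustration only.

```python
# Hypothetical model: the display area is an axis-aligned rectangle inside
# the visible area; any point in the visible area outside the rectangle is
# in the detection area. A crossing is detected when two consecutive
# sampled positions of the operating body lie on opposite sides of the
# rectangle's boundary.

DISPLAY = {"left": 2.0, "right": 8.0, "bottom": 2.0, "top": 6.0}

def in_display(x, y, r=DISPLAY):
    return r["left"] <= x <= r["right"] and r["bottom"] <= y <= r["top"]

def crossed_surface(p0, p1, r=DISPLAY):
    """Return the contact surface crossed between samples p0 -> p1,
    or None if both samples lie on the same side of the boundary."""
    if in_display(*p0) == in_display(*p1):
        return None
    # Decide which edge of the display rectangle was crossed; for
    # simplicity, use the dominant direction of motion and the midpoint.
    (x0, y0), (x1, y1) = p0, p1
    if abs(x1 - x0) >= abs(y1 - y0):
        mid_x = (x0 + x1) / 2
        return "left_surface" if mid_x < (r["left"] + r["right"]) / 2 else "right_surface"
    mid_y = (y0 + y1) / 2
    return "bottom_surface" if mid_y < (r["bottom"] + r["top"]) / 2 else "top_surface"

# A hand moving rightward from the detection area into the display area
# crosses the left contact surface:
print(crossed_surface((1.0, 4.0), (3.0, 4.0)))  # left_surface
print(crossed_surface((3.0, 4.0), (4.0, 4.0)))  # None (entirely inside)
```

A continuous gesture that crosses two contact surfaces (arrow 302) would simply yield two successive non-`None` results from `crossed_surface` for different edges.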
Of course, judging whether the input operation crosses a contact surface between the detection area and the display area may also be carried out by performing any one or several of the above ways of judging a crossing simultaneously.
Referring to Fig. 4, a schematic flowchart of another embodiment of the operation command recognition method of the present invention is shown. The method of this embodiment is applied to a wearable electronic equipment having a sensing unit and a display unit, the sensing unit corresponding to a sensing region. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, a display area corresponds to the graphical interface. The visible area comprises the display area, and the region of the visible area that does not belong to the display area is the detection area, so that a contact surface exists where the display area and the detection area meet. The sensing region comprises the detection area. The method of this embodiment comprises:
Step 401: obtaining the input operation in the detection area by means of the sensing unit.
Step 402: judging whether the input operation crosses a contact surface between the detection area and the display area.
The above two steps may be similar to the corresponding steps of the command recognition method in the foregoing embodiment and are not repeated here.
Step 403: when the input operation crosses a contact surface between the detection area and the display area, determining, according to the contact surface currently crossed by the input operation, the operation command corresponding to the currently crossed contact surface.
Since there are at least two contact surfaces between the detection area and the display area, the correspondence between the different contact surfaces and operation commands can be preset, so that the operation command can be determined according to the crossed contact surface; when different contact surfaces are crossed, the determined operation commands can also differ. Specifically, the operation command corresponding to the currently crossed contact surface can be determined from the preset correspondence between contact surfaces and operation commands.
Suppose the contact surfaces between the detection area and the display area comprise at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface. When the input operation crosses the first contact surface between the detection area and the display area, a first operation command corresponding to the crossed first contact surface is determined according to the first contact surface crossed by the input operation; when the input operation crosses the second contact surface between the detection area and the display area, a second operation command corresponding to the crossed second contact surface is determined according to the second contact surface crossed by the input operation, wherein the second operation command is different from the first operation command.
It should be noted that the first contact surface and the second contact surface are only used to distinguish the contact surfaces crossed by different input operations; the relationship between the contact surfaces and their number are not limited. The operation command determined when an input operation crosses only one contact surface is different from the operation command determined when an input operation crosses two contact surfaces.
For ease of understanding, the structural schematic of the areas when the user wears the smart glasses shown in Fig. 2 and Fig. 3 is still taken as an example. The detection area and the display area of Fig. 3 have four boundary lines, namely the upper, lower, left and right boundary lines in the figure, which correspond in space to four contact surfaces. Suppose the operation command corresponding to the contact surface at the upper boundary line is returning to the previous interface, the command corresponding to the contact surface at the lower boundary line is displaying the main menu, the command corresponding to the contact surface at the left boundary line is displaying the desktop, and the command corresponding to the contact surface at the right boundary line is exiting the current interface. When the user needs to display the main menu, a gesture operation can be performed in space so that the gesture input crosses the contact surface at the lower boundary line between the display area and the detection area, for example by moving the hand from the detection area across the contact surface at the lower boundary line into the display area. Of course, the user may also cross the contact surface at the lower boundary line through a touch operation on the lens; no limitation is imposed here. When the smart glasses detect that the user's input operation has crossed the contact surface at the lower boundary line, the operation command corresponding to the input operation is determined to be displaying the main menu.
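The preset correspondence between contact surfaces and operation commands described above amounts to a simple lookup table. The following is a minimal sketch of such a table; the dictionary keys and command names are illustrative assumptions, not identifiers from the patent.

```python
# Hypothetical preset correspondence between contact surfaces and
# operation commands, mirroring the four-boundary example above.
SURFACE_COMMANDS = {
    "upper_surface": "return_to_previous_interface",
    "lower_surface": "show_main_menu",
    "left_surface":  "show_desktop",
    "right_surface": "exit_current_interface",
}

def command_for_crossing(surface):
    """Look up the operation command preset for the crossed contact
    surface; returns None if no command is preset for it."""
    return SURFACE_COMMANDS.get(surface)

print(command_for_crossing("lower_surface"))  # show_main_menu
```

Because the table is keyed only by the crossed surface, crossing the same surface in either direction triggers the same command; distinguishing directions requires the crossing-mode refinement of the later embodiment.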
Step 404: responding to the operation command, and controlling the displayed graphical interface to transform accordingly.
The operation command is responded to, and the change to be made to the current graphical interface is then determined according to the operation command. For example, when the operation command is displaying the main menu, the main menu can be added to the current graphical interface so that the graphical interface comprises the main menu; when the operation command is returning to the previous interface, the whole graphical interface currently displayed may need to be switched to the graphical interface presented before the current one.
Referring to Fig. 5, a schematic flowchart of another embodiment of the operation command recognition method of the present invention is shown. The method of this embodiment is applied to a wearable electronic equipment having a sensing unit and a display unit, the sensing unit corresponding to a sensing region. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, a display area corresponds to the graphical interface. The visible area comprises the display area, and the region of the visible area that does not belong to the display area is the detection area, so that a contact surface exists where the display area and the detection area meet. The sensing region comprises the detection area. The method of this embodiment comprises:
Step 501: obtaining the input operation in the detection area by means of the sensing unit.
Step 502: judging whether the input operation crosses a contact surface between the detection area and the display area.
The operating processes of step 501 and step 502 are the same as those of the corresponding steps in the embodiment shown in Fig. 1 and are not repeated here.
Step 503: when the input operation crosses a contact surface between the detection area and the display area, determining the contact surface between the detection area and the display area currently crossed by the input operation, and identifying the crossing mode with which the current contact surface is crossed.
In this embodiment, the detection area and the display area have at least two contact surfaces. When it is detected that the input operation crosses a contact surface between the display area and the detection area, in addition to determining the contact surface crossed by the current input operation, this embodiment also needs to identify the crossing mode with which the current contact surface is crossed.
The crossing modes of an input operation, in the process of crossing a contact surface between the detection area and the display area, comprise: a first crossing mode of stepping from the detection area into the display area; and/or a second crossing mode of stepping from the display area into the detection area; and/or a third crossing mode of stepping into the detection area through one contact surface between the detection area and the display area and crossing out of the detection area through another contact surface between the detection area and the display area.
Step 504: determining the operation command corresponding to the input operation according to the contact surface currently crossed by the input operation and the crossing mode with which the current contact surface is crossed.
In this embodiment the operation command is determined according to both the contact surface crossed by the input operation and the crossing mode with which that contact surface is crossed. When different input operations cross the same contact surface but with different crossing modes, the operation commands corresponding to the input operations are also different.
Still taking the smart glasses of Fig. 3 as an example, and taking crossing the contact surface at the upper boundary line as an example: suppose that when the contact surface at the upper boundary line is crossed from the detection area into the display area, the triggered operation command is displaying the main menu, and when the contact surface at the upper boundary line is crossed from the display area into the detection area, the corresponding operation command is exiting the main menu. Thus, if the user needs to display the main menu, the user can cross the contact surface at the upper boundary line from the detection area into the display area; when the user no longer needs to display the main menu, the user can cross the contact surface at the upper boundary line in the opposite direction, from the display area into the detection area, thereby triggering the operation command of exiting the main menu.
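The upper-boundary example above can be sketched as a lookup keyed by both the crossed surface and the crossing mode. This is an illustrative sketch only; the mode labels, surface names and command names are assumptions, not identifiers from the patent.

```python
# Hypothetical sketch: the same contact surface triggers different
# commands depending on the crossing mode (direction of the crossing).
ENTER_DISPLAY = "first_mode"   # detection area -> display area
LEAVE_DISPLAY = "second_mode"  # display area -> detection area

MODE_COMMANDS = {
    ("upper_surface", ENTER_DISPLAY): "show_main_menu",
    ("upper_surface", LEAVE_DISPLAY): "exit_main_menu",
}

def crossing_mode(start_in_display, end_in_display):
    """Classify the crossing direction from where the operation started
    and ended relative to the display area."""
    if not start_in_display and end_in_display:
        return ENTER_DISPLAY
    if start_in_display and not end_in_display:
        return LEAVE_DISPLAY
    return None  # no contact surface was crossed

def command_for(surface, mode):
    return MODE_COMMANDS.get((surface, mode))

mode = crossing_mode(start_in_display=False, end_in_display=True)
print(command_for("upper_surface", mode))  # show_main_menu
```

The third crossing mode (in through one surface, out through another) would be keyed by an ordered pair of surfaces instead of a single surface, but the table-lookup structure stays the same.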
Step 505: responding to the operation command, and controlling the displayed graphical interface to transform accordingly.
In any of the above embodiments, the sensing region corresponding to the sensing unit may also comprise the display area; correspondingly, the input operation in the graphical interface may also be obtained by the sensing unit, so as to trigger the corresponding operation on the graphical interface.
Meanwhile, because the sensing region may comprise the display area, sensing the input operation in the detection area by the sensing unit in order to judge whether the input operation crosses a contact surface between the detection area and the display area can also be understood as sensing the input operation on the graphical interface by the sensing unit and then judging whether the input operation crosses a contact surface between the detection area and the display area. Of course, since the detection area and the display area together form the visible area, the sensing unit can likewise sense the input operation in the whole visible area and analyze the input operation to judge whether it crosses a contact surface between the detection area and the display area.
Corresponding to the operation command recognition method of the present invention, the present invention also provides an operation command recognition device. Referring to Fig. 6, a structural schematic diagram of an embodiment of an operation command recognition device of the present invention is shown. The device of this embodiment is applied to a wearable electronic equipment having a sensing unit and a display unit, the sensing unit corresponding to a sensing region. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, a display area corresponds to the graphical interface. The visible area comprises the display area, the sub-region of the visible area that does not belong to the display area is the detection area, and the sensing region comprises the detection area. The device of this embodiment comprises: a first operation acquiring unit 601, a judging unit 602, a command determining unit 603 and a command responding unit 604.
The first operation acquiring unit 601 is configured to obtain the input operation in the detection area by means of the sensing unit.
The judging unit 602 is configured to judge whether the input operation crosses a contact surface between the detection area and the display area.
The command determining unit 603 is configured to determine, when the input operation crosses a contact surface between the detection area and the display area, the operation command corresponding to the input operation according to the contact surface crossed by the input operation.
The command responding unit 604 is configured to respond to the operation command and control the graphical interface to transform accordingly.
In this embodiment the input operation detected by the sensing unit is judged; when it is judged that the input operation crosses a contact surface between the detection area and the display area, the operation command corresponding to the input operation is determined according to the crossed contact surface, and the graphical interface presented by the wearable electronic equipment is then controlled to transform accordingly in response to the operation command. Because the contact surface between the display area and the detection area has a relatively large area and is therefore easier to locate, triggering the determination of the operation command by having the input operation cross a contact surface between the display area and the detection area reduces erroneous operations during input, improves the accuracy of the input operation, and also improves the input speed and the convenience of the input operation.
When judging whether the input operation crosses a contact surface between the display area and the detection area, the judging unit may operate in several ways. Accordingly, the judging unit is specifically configured to: judge whether the input operation crosses the contact surface between the display area and the detection area from the detection area into the display area; and/or judge whether the input operation crosses the contact surface between the display area and the detection area from the display area into the detection area; and/or judge whether the input operation moves into the detection area through one contact surface between the detection area and the display area, and moves out of the detection area through another contact surface between the detection area and the display area.
In practice, the display area and the detection area have at least two contact surfaces. In order to determine the operation command corresponding to the input operation, the operation command corresponding to a contact surface can be determined according to which contact surface the input operation crosses.
Specifically, the contact surfaces between the detection area and the display area comprise at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface.
Correspondingly, the command determining unit comprises at least: a first command determining unit, configured to determine, when the input operation crosses the first contact surface between the detection area and the display area, a first operation command corresponding to the crossed first contact surface according to the first contact surface crossed by the input operation;
and a second command determining unit, configured to determine, when the input operation crosses the second contact surface between the detection area and the display area, a second operation command corresponding to the crossed second contact surface according to the second contact surface crossed by the input operation, wherein the second operation command is different from the first operation command.
The first contact surface and the second contact surface are only used to distinguish the different contact surfaces between the display area and the detection area crossed by different input operations.
In practical applications there may also be other ways of determining the operation command corresponding to the input operation. Referring to Fig. 7, a structural schematic diagram of the command determining unit in another embodiment of the operation command recognition device of the present invention is shown. The operation command recognition device of this embodiment differs from the previous embodiment in that the command determining unit 603 comprises:
a recognition unit 6031, configured to determine, when the input operation crosses a contact surface between the detection area and the display area, the contact surface between the detection area and the display area currently crossed by the input operation, and to identify the crossing mode with which the current contact surface is crossed, wherein the detection area and the display area have at least two contact surfaces, and the crossing modes of an input operation comprise: a first crossing mode of stepping from the detection area into the display area; and/or a second crossing mode of stepping from the display area into the detection area; and/or a third crossing mode of stepping into the detection area through one contact surface between the detection area and the display area, and crossing out of the detection area through another contact surface between the detection area and the display area;
and a command determining subunit 6032, configured to determine the operation command corresponding to the input operation according to the contact surface currently crossed by the input operation and the crossing mode with which the current contact surface is crossed.
In any of the above embodiments, operation commands can be preset so that the operation command satisfied by the current input operation is determined from the preset operation commands. Correspondingly, the command determining unit is specifically configured to: when the input operation crosses a contact surface between the detection area and the display area, determine, according to the contact surface crossed by the input operation, the operation command corresponding to the current input operation from the preset operation commands corresponding to the different contact surfaces, wherein the preset operation commands comprise at least returning to the previous graphical interface and/or displaying the main menu.
Further, in any of the above embodiments, the sensing region may also comprise the display area.
Correspondingly, the device also comprises: a second operation acquiring unit, configured to obtain the input operation in the graphical interface by means of the sensing unit.
In another aspect, the present invention also provides a wearable electronic equipment having a processor, and a sensing unit and a display unit both connected with the processor, the sensing unit corresponding to a sensing region. When the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, a display area corresponds to the graphical interface. The visible area comprises the display area, the sub-region of the visible area that does not belong to the display area is the detection area, and the sensing region comprises the detection area. The processor is provided with the operation command recognition device described in any of the above embodiments.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the identical or similar parts of the embodiments can be referred to among one another. Since the device disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively simple, and the relevant parts can be found in the description of the method.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein can be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention shall not be restricted to the embodiments shown herein, but shall conform to the widest scope consistent with the principles and novel features disclosed herein.

Claims (13)

1. An operation command recognition method, characterized in that the method is applied to a wearable electronic equipment, the wearable electronic equipment has a sensing unit and a display unit, the sensing unit corresponds to a sensing region, when the wearable electronic equipment is worn on the user's body, the user's eyes have a visible area; when the display unit of the wearable electronic equipment displays a graphical interface, a display area corresponds to the graphical interface; wherein the visible area comprises the display area, the sub-region of the visible area that does not belong to the display area is a detection area, and the sensing region comprises the detection area; the method comprises:
obtaining the input operation in the detection area by means of the sensing unit;
judging whether the input operation crosses a contact surface between the detection area and the display area;
when the input operation crosses a contact surface between the detection area and the display area, determining, according to the contact surface crossed by the input operation, the operation command corresponding to the input operation;
responding to the operation command, and controlling the graphical interface to transform accordingly.
2. The method according to claim 1, characterized in that the judging whether the input operation crosses a contact surface between the detection area and the display area comprises:
judging whether the input operation crosses the contact surface from the detection area into the display area;
and/or, judging whether the input operation crosses the contact surface from the display area into the detection area;
and/or, judging whether the input operation moves into the detection area through one contact surface between the detection area and the display area, and moves out of the detection area through another contact surface between the detection area and the display area.
3. The method according to claim 1 or 2, characterized in that the contact surfaces between the detection area and the display area comprise at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface;
the determining, when the input operation crosses a contact surface between the detection area and the display area, the operation command corresponding to the input operation according to the contact surface crossed by the input operation comprises:
when the input operation crosses the first contact surface between the detection area and the display area, determining, according to the first contact surface crossed by the input operation, a first operation command corresponding to the crossed first contact surface;
when the input operation crosses the second contact surface between the detection area and the display area, determining, according to the second contact surface crossed by the input operation, a second operation command corresponding to the crossed second contact surface, wherein the second operation command is different from the first operation command.
4. The method according to claim 1 or 2, characterized in that the determining, when the input operation crosses a contact surface between the detection area and the display area, the operation command corresponding to the input operation according to the contact surface crossed by the input operation comprises:
when the input operation crosses a contact surface between the detection area and the display area, determining the contact surface between the detection area and the display area crossed by the input operation, and identifying the crossing mode with which the current contact surface is crossed, wherein the detection area and the display area have at least two contact surfaces;
determining the operation command corresponding to the input operation according to the contact surface currently crossed by the input operation and the crossing mode with which the current contact surface is crossed;
wherein the crossing modes of the input operation comprise: a first crossing mode of stepping from the detection area into the display area; and/or a second crossing mode of stepping from the display area into the detection area; and/or a third crossing mode of stepping into the detection area through one contact surface between the detection area and the display area, and crossing out of the detection area through another contact surface between the detection area and the display area.
5. The method according to claim 1 or 2, characterized in that the determining the operation command corresponding to the input operation according to the contact surface crossed by the input operation comprises:
determining, according to the contact surface crossed by the input operation, the operation command corresponding to the current input operation from the preset operation commands corresponding to the different contact surfaces, wherein the preset operation commands comprise at least returning to the previous graphical interface and/or displaying the main menu.
6. The method according to claim 1, characterized in that the sensing area may further comprise the display area;
the method further comprising: acquiring an input operation on the graphical interface through the sensing unit.
7. An operation instruction recognition device, characterized in that the device is applied to a wearable electronic device having a sensing unit and a display unit, the sensing unit corresponding to a sensing area; when the wearable electronic device is worn on a user's body, the user's eyes have a visible area; when the display unit of the wearable electronic device displays a graphical interface, the graphical interface corresponds to a display area; wherein the visible area comprises the display area, the part of the visible area not belonging to the display area is a detection area, and the sensing area comprises the detection area; the device comprising:
a first operation acquiring unit, configured to acquire an input operation in the detection area through the sensing unit;
a judging unit, configured to judge whether the input operation crosses a contact surface of the detection area and the display area;
an instruction determining unit, configured to, when the input operation crosses a contact surface of the detection area and the display area, determine the operation instruction corresponding to the input operation according to the contact surface crossed by the input operation;
an instruction response unit, configured to respond to the operation instruction and control the graphical interface to perform a corresponding transformation.
8. The device according to claim 7, characterized in that the judging unit is specifically configured to: judge whether the input operation crosses the contact surface from the detection area into the display area; and/or judge whether the input operation crosses the contact surface from the display area into the detection area; and/or judge whether the input operation moves into the detection area through one contact surface of the detection area and the display area and moves out of the detection area through another contact surface of the detection area and the display area.
9. The device according to claim 7 or 8, characterized in that the contact surfaces of the detection area and the display area comprise at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface;
the instruction determining unit comprising:
a first instruction determining unit, configured to, when the input operation crosses the first contact surface of the detection area and the display area, determine a first operation instruction corresponding to the crossed first contact surface according to the first contact surface crossed by the input operation;
a second instruction determining unit, configured to, when the input operation crosses the second contact surface of the detection area and the display area, determine a second operation instruction corresponding to the crossed second contact surface according to the second contact surface crossed by the input operation, wherein the second operation instruction is different from the first operation instruction.
10. The device according to claim 7 or 8, characterized in that the instruction determining unit comprises:
a recognition unit, configured to, when the input operation crosses a contact surface of the detection area and the display area, determine the current contact surface of the detection area and the display area crossed by the input operation, and identify the manner in which the current contact surface is crossed, wherein the detection area and the display area have at least two contact surfaces, and the crossing manner of the input operation comprises: a first crossing manner of moving from the detection area into the display area; and/or a second crossing manner of moving from the display area into the detection area; and/or a third crossing manner of moving into the detection area through one contact surface of the detection area and the display area, and moving out of the detection area through another contact surface of the detection area and the display area;
an instruction determining subunit, configured to determine the operation instruction corresponding to the input operation according to the current contact surface crossed by the input operation and the manner in which the current contact surface is crossed.
11. The device according to claim 7 or 8, characterized in that the instruction determining unit is specifically configured to: when the input operation crosses a contact surface of the detection area and the display area, determine, according to the contact surface crossed by the input operation, the operation instruction corresponding to the current input operation from preset operation instructions corresponding to the different contact surfaces, wherein the preset operation instructions comprise at least returning to a previous graphical interface and/or displaying a main menu.
12. The device according to claim 7, characterized in that the sensing area may further comprise the display area;
the device further comprising: a second operation acquiring unit, configured to acquire an input operation on the graphical interface through the sensing unit.
13. A wearable electronic device, characterized in that the wearable electronic device has a processor, and a sensing unit and a display unit both connected to the processor, the sensing unit corresponding to a sensing area; when the wearable electronic device is worn on a user's body, the user's eyes have a visible area; when the display unit of the wearable electronic device displays a graphical interface, the graphical interface corresponds to a display area; wherein the visible area comprises the display area, the part of the visible area not belonging to the display area is a detection area, and the sensing area comprises the detection area; and the processor incorporates the operation instruction recognition device according to any one of claims 7 to 12.
CN201310085514.7A 2013-03-18 2013-03-18 Operation instruction recognition method and device, and wearable electronic device Active CN104063037B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310085514.7A CN104063037B (en) 2013-03-18 2013-03-18 Operation instruction recognition method and device, and wearable electronic device


Publications (2)

Publication Number Publication Date
CN104063037A true CN104063037A (en) 2014-09-24
CN104063037B CN104063037B (en) 2017-03-29

Family

ID=51550791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310085514.7A Active CN104063037B (en) Operation instruction recognition method and device, and wearable electronic device

Country Status (1)

Country Link
CN (1) CN104063037B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104750253A (en) * 2015-03-11 2015-07-01 苏州佳世达电通有限公司 Electronic device for motion-sensing input by a user
CN104750253B (en) * 2015-03-11 2018-10-12 苏州佳世达电通有限公司 Electronic device for motion-sensing input by a user
CN107466396A (en) * 2016-03-22 2017-12-12 深圳市柔宇科技有限公司 Head-mounted display device and control method thereof
CN107728923A (en) * 2017-10-20 2018-02-23 维沃移动通信有限公司 Operation processing method and mobile terminal
CN107728923B (en) * 2017-10-20 2020-11-03 维沃移动通信有限公司 Operation processing method and mobile terminal
CN108008873A (en) * 2017-11-10 2018-05-08 亮风台(上海)信息科技有限公司 User interface operation method for a head-mounted display device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102122229A (en) * 2010-02-19 2011-07-13 Microsoft Corp. Use of bezel as an input mechanism
US20110221669A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Gesture control in an augmented reality eyepiece
US20120050140A1 (en) * 2010-08-25 2012-03-01 Border John N Head-mounted display control
US8316319B1 (en) * 2011-05-16 2012-11-20 Google Inc. Efficient selection of characters and commands based on movement-inputs at a user-interface
US20130002724A1 (en) * 2011-06-30 2013-01-03 Google Inc. Wearable computer with curved display and navigation tool
CN102884498A (en) * 2010-02-19 2013-01-16 Microsoft Corp. Off-screen gestures to create on-screen input



Also Published As

Publication number Publication date
CN104063037B (en) 2017-03-29

Similar Documents

Publication Publication Date Title
KR101812227B1 (en) Smart glass based on gesture recognition
CN102591450B (en) Information processing apparatus and operation method thereof
EP3538975B1 (en) Electronic device and methods for determining orientation of the device
KR20150014083A (en) Method For Sensing Inputs of Electrical Device And Electrical Device Thereof
JP2014071812A (en) Information processing device, display control method, and program
CN102880304A (en) Character inputting method and device for portable device
CN104866225A (en) Electronic device having touch display screen and control method therefor
CN103809792A (en) Touch display
CN108073432B (en) User interface display method of head-mounted display equipment
CN105094675B (en) Man-machine interaction method and touch-screen wearable device
CN104076907A (en) Control method, control device and wearable electronic equipment
CN104298340A (en) Control method and electronic equipment
CN104199547A (en) Man-machine interactive type virtual touch device, system and method
CN103176605A (en) Control device of gesture recognition and control method of gesture recognition
CN104063037A (en) Operating command recognition method and device as well as wearable electronic equipment
CN104808906A (en) Electronic equipment with touch display screen and touch display screen control method
US20160041616A1 (en) Display device and control method thereof, and gesture recognition method
CN103902174A (en) Display method and equipment
CN103530060B (en) Display device and control method, gesture identification method
CN104199548A (en) Man-machine interactive type virtual touch device, system and method
CN104866103A (en) Relative position determining method, wearable electronic equipment and terminal equipment
CN104866786A (en) Display method and electronic equipment
CN105117090A (en) Mobile communication equipment with capacitive type sliding key and operation method thereof
US10437415B2 (en) System, method, and device for controlling a display
CN105446598A (en) Icon position switching method and system, and electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant