CN102799273B - Interaction control system and method - Google Patents

Interaction control system and method

Info

Publication number
CN102799273B
CN102799273B · CN201210240378.XA · CN 102799273 B
Authority
CN
China
Prior art keywords
control
range
information
gesture
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210240378.XA
Other languages
Chinese (zh)
Other versions
CN102799273A (en)
Inventor
徐向民
梁卓锐
黄彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201210240378.XA priority Critical patent/CN102799273B/en
Publication of CN102799273A publication Critical patent/CN102799273A/en
Application granted granted Critical
Publication of CN102799273B publication Critical patent/CN102799273B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an interaction control system comprising a front-end information acquisition unit, an information storage unit and a control range prediction unit. The front-end information acquisition unit acquires gesture control information in real time; the information storage unit stores the gesture control information of the current time period; and the control range prediction unit judges the control state at the current moment from the stored information and generates gesture control range prediction information from that control state. The invention also provides an interaction control method. The system and method predict and prompt the gesture control range, so that gesture control can be performed more accurately.

Description

Interaction control system and interaction control method thereof
Technical field
The present invention relates to interaction control technology, and in particular to an interaction control system capable of predicting and prompting a gesture control range, and an interaction control method thereof.
Background technology
With the development of computer vision technology, gesture interaction has gradually come into wide use. At the core of vision-based gesture interaction, techniques such as computer vision and image processing are applied to the video sequence collected by an image capture device in order to understand the gesture control behavior and respond accordingly.
Mid-air gesture control is an emerging means of human-computer interaction that brings a more natural interactive experience and is widely used for operating smart televisions. Compared with contact-based input such as a keyboard and mouse, however, mid-air gesture control is inherently imprecise. To overcome this imprecision, intuitive interaction prompts and intelligent prediction of the interaction behavior are needed, so that the computing power of the computer and its coordination with the user can be fully exploited, achieving convenient, intelligent control and improving the interactive experience.
Summary of the invention
Accordingly, an interaction control system capable of predicting and prompting a gesture control range, and an interaction control method thereof, are provided, so that gesture control can be performed more accurately.
An interaction control system comprises: a front-end information acquisition unit for acquiring gesture control information; an information storage unit for storing the gesture control information of the current time period; and a control range prediction unit for judging the control state at the current moment from the information stored in the information storage unit and generating predicted gesture control range information from that control state.
In one embodiment, the front-end information acquisition unit also acquires interactive interface information and stores it in the information storage unit.
In one embodiment, the interaction control system further comprises a control range prompt unit for comparing the interactive interface information with the predicted gesture control range information produced by the control range prediction unit, determining which interface elements lie within the predicted gesture control range, and prompting the predicted gesture control range and any control errors.
In one embodiment, the front-end information acquisition unit analyzes the captured gesture control images to obtain the relative position of the gesture at each moment, and compares the gesture positions between adjacent images to obtain the corresponding hand movement speed.
In one embodiment, the control range prediction unit determines the control state from the image information obtained by the front-end information acquisition unit and the hand movement speed.
In one embodiment, the control state is one of: starting to move, in motion, motion arrived, and motion turning. When the control range prediction unit judges that the hand is currently in motion, it produces a corresponding sector-shaped range; when it judges that the hand is starting to move, has arrived, or is turning, it produces a corresponding elliptical control range.
In one embodiment, the control range prompt unit performs erroneous-operation detection and correction according to the number of interface icons in the prediction region and their degree of match: if no interface icon in the prediction region matches the hand action, an operation error is prompted; if exactly one icon lies in the prediction region, the cursor is automatically adsorbed onto that target icon.
An interaction control method comprises: acquiring and storing gesture control information in real time; judging the control state at the current moment from the stored gesture control information of the current time period; and generating predicted gesture control range information from that control state.
In one embodiment, the method further comprises: acquiring and storing interactive interface information; comparing the interactive interface information with the predicted gesture control range information produced by the control range prediction unit; determining which interface elements lie within the predicted gesture control range; and prompting the predicted gesture control range and any control errors.
In one embodiment, the step of prompting the predicted gesture control range and control errors comprises: performing erroneous-operation detection and correction according to the number of interface icons in the prediction region and their degree of match; if no interface icon in the prediction region matches the hand action, prompting an operation error; and if exactly one icon lies in the prediction region, automatically adsorbing onto that target icon.
The interaction control system and interaction method of the present invention have the following beneficial effects:
(1) they solve the problem of predicting and prompting the control range during mid-air gesture interaction, giving the user a clear perception of the current control state; (2) because the control range at the next moment can be predicted, a way of detecting and correcting error situations can be provided, improving the user experience.
Accompanying drawing explanation
Fig. 1 is the functional block diagram of the interaction control system provided by one embodiment of the invention;
Fig. 2 is a schematic diagram of the relation between gesture movement speed and control state over time;
Figs. 3a and 3b are schematic diagrams of gesture control range prediction;
Fig. 4 is a schematic diagram of a gesture control range prompt;
Fig. 5 is the flow chart of the interaction control method of one embodiment of the invention;
Description of reference numerals:
100, interaction control system; 10, front-end information acquisition unit; 20, information storage unit; 30, control range prediction unit; 40, control range prompt unit.
Embodiment
The technical scheme is described further below with reference to the accompanying drawings.
Referring to Fig. 1, one embodiment of the invention provides an interaction control system 100 that can predict and prompt the gesture control range. The interaction control system 100 comprises a front-end information acquisition unit 10, an information storage unit 20, a control range prediction unit 30 and a control range prompt unit 40.
In one embodiment, the front-end information acquisition unit 10 acquires gesture control information and interactive interface information. The information storage unit 20 stores the gesture control information of the current time period and the interactive interface information. The control range prediction unit 30 judges the control state at the current moment from the gesture control information stored in the information storage unit 20, and produces predicted gesture control range information from that control state. The control range prompt unit 40 compares the interactive interface information stored in the information storage unit 20 with the predicted gesture control range information produced by the control range prediction unit, determines which interface elements lie within the predicted gesture control range, and prompts the predicted gesture control range and any control errors.
Referring also to Fig. 2: the front-end information acquisition unit 10 acts as a front-end sensor, continuously capturing gesture motion images during the interaction through an image input device (not shown). In the present embodiment, the image input device is an ordinary camera or a depth camera, and the computer continuously obtains the video image sequence from the image capture device. More specifically, the front-end information acquisition unit 10 analyzes the captured image sequence to obtain the relative position [x(t), y(t)] of the gesture at each moment t, and compares the gesture positions between adjacent images to obtain the corresponding hand movement speed v(t); this movement speed v(t) is one kind of gesture control information.
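The velocity step described above can be sketched as a finite difference over adjacent frame positions. The patent does not give a concrete formula, so the function below is a minimal illustration; the frame interval `dt` and the function name are assumptions.

```python
import math

def motion_speed(positions, dt=1.0):
    """Estimate the hand movement speed v(t) from the relative gesture
    positions [x(t), y(t)] of successive frames. Hypothetical sketch:
    Euclidean displacement between adjacent images divided by the
    (assumed) frame interval dt."""
    speeds = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        # displacement between two adjacent images, over dt seconds
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    return speeds
```

A stationary hand between two frames yields a speed of zero, which the prediction unit later interprets as one of the slow states.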
The information storage unit 20 stores the gesture control information of the current and past time periods. The stored gesture control information can represent a single control fragment or a group of them, and the information storage unit 20 has a certain capacity to memorize and learn from these control fragments.
The information storage unit 20 also stores the interactive interface information, which comprises the task context and the sizes and position distribution of the interface icons. In the present embodiment, the task context refers to one or more operation instructions that can be executed in succession. For example, completing a "search engine" task requires the steps: enter the query in the search box, click the search button, drag down to browse the result links, click the next page (optional), click a target link, and enter the new page; the ordering relation among these steps is called the context of the task. The interactive interface information also comprises the sizes and position distribution of all the icons on the interface.
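The interactive interface information just described might be recorded as below. The field names and the concrete "search engine" steps are illustrative assumptions, not a structure mandated by the patent.

```python
# Hypothetical record of the interactive interface information held by
# the information storage unit: task context plus icon sizes/positions.
interface_info = {
    "task_context": [  # ordered steps of the example "search engine" task
        "enter query in search box",
        "click search button",
        "drag down to browse result links",
        "click next page (optional)",
        "click target link",
    ],
    "icons": [  # size and position distribution of every interface icon
        {"name": "search_button", "size": (80, 32), "pos": (400, 60)},
        {"name": "next_page", "size": (40, 20), "pos": (500, 700)},
    ],
}
```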
The control range prediction unit 30 judges the control state of the gesture at the current moment from the gesture control information stored in the information storage unit 20, and from that control state produces the gesture control range information for the next moment. Specifically, the control range prediction unit 30 determines the state of the control behavior, i.e. the control state, from the image information obtained by the front-end information acquisition unit 10 and the computed hand movement speed, and then predicts the control range from the determined control state. In the present embodiment, the control state is divided into four states: starting to move, in motion, motion arrived, and motion turning. The control range prediction unit 30 judges the control state on the following principles. During gesture control, the movement speed of the hand obeys general physical laws with respect to the control target: (1) as the hand moves from the start point toward the target position, its speed accelerates from 0 to a peak and then decelerates from the peak back to 0; (2) near the target, or when the motion direction changes sharply, the movement slows down; (3) the farther the target is from the current position, the higher the average movement speed of the hand. According to these laws, the control range prediction unit 30 classifies the control behavior into the four states above according to the movement speed, and predicts the control range of the hand gesture accordingly. As shown in Figs. 3a and 3b, the prediction region is defined with the hand position as the reference point and is represented by a sector or an ellipse. Fig. 3a shows the relatively fast state (in motion), in which the predicted range is wide and reaches far, in the shape of a sector; Fig. 3b shows the relatively slow states (starting to move, motion arrived, motion turning), in which the predicted range is small and near, in the shape of an ellipse. In the present embodiment, when the control range prediction unit 30 judges from the gesture movement speed that the hand is currently in motion, it produces a corresponding sector range; otherwise, when it judges that the hand is starting to move, has arrived, or is turning, it produces a corresponding elliptical control range.
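The four-state classification and the sector/ellipse choice above can be sketched as follows. The speed threshold `v_fast` and the boolean `turned` test are illustrative assumptions; the patent states the physical laws but no concrete values.

```python
def classify_state(v_prev, v_curr, turned, v_fast=0.5):
    """Map movement-speed samples to the four control states named in
    the text. Thresholds are hypothetical, not from the patent."""
    if turned:
        return "turning"      # sharp direction change: motion slows
    if v_curr >= v_fast:
        return "in motion"    # relatively fast: near the speed peak
    if v_curr > v_prev:
        return "starting"     # accelerating from rest
    return "arrived"          # decelerating toward the target

def region_shape(state):
    # Fig. 3a: fast motion -> wide, far sector;
    # Fig. 3b: the three slow states -> small, near ellipse
    return "sector" if state == "in motion" else "ellipse"
```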
Referring to Fig. 4, the control range prompt unit 40 compares the interactive interface information stored in the information storage unit 20 with the predicted gesture control range information produced by the control range prediction unit, determines which interface elements lie within the predicted gesture control range, and prompts the predicted gesture control range and whether a control error has occurred, feeding the result back to the user. Specifically, the control range prompt unit 40 first reads the interactive interface information from the information storage unit 20, then compares it with the corresponding prediction region information obtained by the control range prediction unit 30, thereby determining the interface elements located within the prediction region. After identifying the interface icons within the prediction region, the unit prompts the prediction region on the one hand, and on the other hand performs erroneous-operation detection and correction according to the number of interface icons in the prediction region and their degree of match. For example, if no interface icon in the prediction region matches the hand action, an operation error is prompted; if exactly one icon lies in the prediction region, the cursor is automatically adsorbed onto that target icon, enabling fast and convenient operation.
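The three prompt cases described above reduce to a simple decision on the icons found inside the prediction region. This is a minimal sketch; the returned labels are illustrative, not terms from the patent.

```python
def prompt_decision(icons_in_region):
    """Decide the prompt from the interface icons inside the prediction
    region, following the three cases in the text."""
    if not icons_in_region:
        return ("error", None)               # no icon matches the hand action
    if len(icons_in_region) == 1:
        return ("snap", icons_in_region[0])  # adsorb onto the unique target
    return ("highlight", icons_in_region)    # several candidates: prompt only
```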
In addition, during long interactions the interaction control system 100 uses the information storage unit 20 to automatically compare the existing operation history with the current operation; this similarity, given a certain weight, provides auxiliary information for the prediction of the control area.
Referring to Fig. 5, one embodiment of the invention also provides an interaction control method comprising the following steps:
Step S501: acquire gesture control information and interactive interface information.
Step S503: store the gesture control information of the current and past time periods, and store the interactive interface information. In the present embodiment, the interactive interface information comprises the task context, the interface icon sizes and the interface icon distribution.
Step S505: judge the control state at the current moment from the stored gesture control information of the current time period, and produce predicted gesture control range information from that control state, i.e. the gesture control range information for the next moment.
Step S507: compare the stored interactive interface information with the predicted gesture control range information produced in step S505, determine which interface elements lie within the predicted gesture control range, and prompt accordingly. The prompt operation comprises prompting the predicted gesture control range, and performing erroneous-operation detection and correction according to the number of interface icons in the prediction region and their degree of match: if no interface icon in the prediction region matches the hand action, an operation error is prompted; if exactly one icon lies in the prediction region, the cursor is automatically adsorbed onto that target icon, enabling fast and convenient operation.
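Steps S501 through S507 can be strung together as one pass of a processing loop. Everything here is a simplified assumption: the Manhattan-distance speed, the reach heuristic for the prediction region, and all names are hypothetical stand-ins for the units described above.

```python
def interaction_step(history, position, interface_icons):
    """One pass of steps S501-S507, sketched with placeholder logic.
    history: list of past (x, y) hand positions (the storage unit);
    position: newly acquired hand position;
    interface_icons: list of {"name", "pos"} dicts (interface info)."""
    history.append(position)           # S501/S503: acquire and store
    if len(history) < 3:
        return ("waiting", None)       # not enough samples to judge a state
    # S505: judge the control state from the two most recent speeds
    (x0, y0), (x1, y1), (x2, y2) = history[-3:]
    v_prev = abs(x1 - x0) + abs(y1 - y0)
    v_curr = abs(x2 - x1) + abs(y2 - y1)
    state = "in motion" if v_curr >= v_prev and v_curr > 0 else "slow"
    # S507: intersect a crude prediction region with the interface icons
    reach = 2 * v_curr if state == "in motion" else v_curr  # sector reaches farther
    hits = [i for i in interface_icons
            if abs(i["pos"][0] - x2) + abs(i["pos"][1] - y2) <= reach]
    if not hits:
        return ("error", None)                    # no matching icon: prompt error
    if len(hits) == 1:
        return ("snap", hits[0]["name"])          # unique icon: auto-adsorb
    return ("highlight", [i["name"] for i in hits])
```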
The interaction control system 100 and interaction method of the present invention have the following beneficial effects:
(1) they solve the problem of predicting and prompting the control range during mid-air gesture interaction, giving the user a clear perception of the current control state; (2) because the control range at the next moment can be predicted, a way of detecting and correcting error situations can be provided, improving the user experience.
The embodiments above express only several implementations of the present invention, and their description is rather specific and detailed, but they should not therefore be construed as limiting the scope of the invention claims. It should be noted that persons of ordinary skill in the art can make further variations and improvements without departing from the concept of the invention, and these all fall within the protection scope of the present invention.

Claims (7)

1. An interaction control system, comprising:
a front-end information acquisition unit for acquiring gesture control information and interactive interface information in real time;
an information storage unit for storing the gesture control information and interactive interface information of the current time period;
a control range prediction unit for judging the control state at the current moment from the information stored in the information storage unit, and producing predicted gesture control range information from that control state; and
a control range prompt unit for comparing the interactive interface information with the predicted gesture control range information produced by the control range prediction unit, determining which interface elements lie within the predicted gesture control range, and prompting the predicted gesture control range and any control errors.
2. The interaction control system of claim 1, wherein the front-end information acquisition unit analyzes the captured gesture control images to obtain the relative position of the gesture at each moment, and compares the gesture positions between adjacent images to obtain the corresponding hand movement speed.
3. The interaction control system of claim 2, wherein the control range prediction unit determines the control state from the image information obtained by the front-end information acquisition unit and the hand movement speed.
4. The interaction control system of claim 3, wherein the control state is one of: starting to move, in motion, motion arrived, and motion turning; when the control range prediction unit judges that the hand is currently in motion, it produces a corresponding sector range; and when it judges that the hand is starting to move, has arrived, or is turning, it produces a corresponding elliptical control range.
5. The interaction control system of claim 1, wherein the control range prompt unit performs erroneous-operation detection and correction according to the number of interface icons in the prediction region and their degree of match: if no interface icon in the prediction region matches the hand action, an operation error is prompted; and if exactly one icon lies in the prediction region, the cursor is automatically adsorbed onto that target icon.
6. An interaction control method, comprising:
acquiring and storing gesture control information and interactive interface information in real time;
judging the control state at the current moment from the stored gesture control information of the current time period, and producing predicted gesture control range information from that control state; and
comparing the interactive interface information with the predicted gesture control range information produced by the control range prediction unit, determining which interface elements lie within the predicted gesture control range, and prompting the predicted gesture control range and any control errors.
7. The interaction control method of claim 6, wherein the step of prompting the predicted gesture control range and control errors comprises:
performing erroneous-operation detection and correction according to the number of interface icons in the prediction region and their degree of match: if no interface icon in the prediction region matches the hand action, prompting an operation error; and if exactly one icon lies in the prediction region, automatically adsorbing onto that target icon.
CN201210240378.XA 2012-07-11 2012-07-11 Interaction control system and method Expired - Fee Related CN102799273B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210240378.XA CN102799273B (en) 2012-07-11 2012-07-11 Interaction control system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210240378.XA CN102799273B (en) 2012-07-11 2012-07-11 Interaction control system and method

Publications (2)

Publication Number Publication Date
CN102799273A CN102799273A (en) 2012-11-28
CN102799273B true CN102799273B (en) 2015-04-15

Family

ID=47198397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210240378.XA Expired - Fee Related CN102799273B (en) 2012-07-11 2012-07-11 Interaction control system and method

Country Status (1)

Country Link
CN (1) CN102799273B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103558958B (en) * 2013-10-29 2017-04-12 宇龙计算机通信科技(深圳)有限公司 Application program function calling method and terminal
KR101534742B1 (en) * 2013-12-10 2015-07-07 현대자동차 주식회사 System and method for gesture recognition of vehicle
CN103885597B (en) * 2014-03-27 2017-06-13 广东威创视讯科技股份有限公司 Space input recognition method and system
CN104486654A (en) * 2014-12-15 2015-04-01 四川长虹电器股份有限公司 Method for providing guidance and television
EP3243120A4 (en) * 2015-01-09 2018-08-22 Razer (Asia-Pacific) Pte Ltd. Gesture recognition devices and gesture recognition methods
CN104777900A (en) * 2015-03-12 2015-07-15 广东威法科技发展有限公司 Gesture trend-based graphical interface response method
CN107967058B (en) * 2017-12-07 2021-04-13 联想(北京)有限公司 Information processing method, electronic device, and computer-readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1885343A (en) * 2005-06-21 2006-12-27 三星电子株式会社 Apparatus and method for displaying 3-dimensional graphics
WO2009018161A1 (en) * 2007-07-27 2009-02-05 Gesturetek, Inc. Enhanced camera-based input
CN102193626A (en) * 2010-03-15 2011-09-21 欧姆龙株式会社 Gesture recognition apparatus, method for controlling gesture recognition apparatus, and control program
CN102272697A (en) * 2008-12-31 2011-12-07 惠普开发有限公司 Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis
US8166421B2 (en) * 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A new video moving-target tracking method fusing Kalman prediction and Mean-shift search; Liu Zongpu et al.; Optoelectronic Technology; 2009-03-31; Vol. 29, No. 1; pp. 30-33 *

Also Published As

Publication number Publication date
CN102799273A (en) 2012-11-28

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150415

Termination date: 20210711