WO2016048262A1 - Cursor control using images

Cursor control using images

Info

Publication number
WO2016048262A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
movement
interest
point
cursor
Application number
PCT/US2014/056734
Other languages
French (fr)
Inventor
Bohuan CHEN
Original Assignee
Hewlett-Packard Development Company, L.P.
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2014/056734
Publication of WO2016048262A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

One example implementation controls a cursor by analyzing a first image and a second image, calculating movement of a first device based on the first image and the second image, and controlling a cursor of a second device to move based on the movement.

Description

BACKGROUND
[0001] Users use cursors to select icons and/or control a computing device (e.g., a computer, a server, etc.). In many instances, users control cursors using a cursor device, such as a mouse, a trackball, a touchpad, etc. The example cursor devices may use optical sensors, mechanical sensors, haptic sensors, etc. to detect movement and/or user interaction to control the cursors. The example devices send information corresponding to the control, location, and/or movement of the device to a computer and/or display to present the cursor and its corresponding location on the display.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 illustrates an example environment of use in which a cursor controller constructed in accordance with an aspect of this disclosure may be implemented.
[0003] FIG. 2 illustrates a block diagram of an example mobile device that may be used to implement the mobile device of FIG. 1, including the example cursor controller of FIG. 1, in accordance with an aspect of the disclosure.
[0004] FIG. 3 is a block diagram of an example cursor controller that may be implemented by the cursor controller of FIGS. 1 and/or 2 in accordance with an aspect of the disclosure.
[0005] FIG. 4 is a flow diagram of an example image analysis and control output performed by the cursor controller of FIG. 3 in accordance with an aspect of the disclosure.
[0006] FIG. 5 is a flowchart representative of example machine readable instructions that may be executed to implement the cursor controller of FIG. 3 in accordance with an aspect of the disclosure.
[0007] FIG. 6 is a flowchart representative of an example portion of the example machine readable instructions of FIG. 5 to implement the cursor controller of FIG. 3 in accordance with an aspect of the disclosure.
[0008] FIG. 7 is a flowchart representative of another example portion of the example machine readable instructions of FIG. 5 to implement the cursor controller of FIG. 3 in accordance with an aspect of the disclosure.
[0009] FIG. 8 is a block diagram of an example processor platform capable of executing the instructions of FIGS. 5, 6, and/or 7 to implement the cursor controller of FIG. 3 in accordance with an aspect of the disclosure.
[0010] Wherever possible, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts. As used in this patent, stating that any part (e.g., a layer, film, area, or plate) is in any way positioned on (e.g., positioned on, located on, disposed on, or formed on, etc.) another part, means that the referenced part is either in contact with the other part, or that the referenced part is above the other part with at least one intermediate part located therebetween. Stating that any part is in contact with another part means that there is no intermediate part between the two parts.
DETAILED DESCRIPTION
[0011] Examples disclosed herein involve analyzing images to determine movement of a device and controlling a cursor based on the determined movement. Examples disclosed herein enable a camera and/or images from a camera of a mobile device to be used to control a cursor of a computer when the mobile device is moved. Accordingly, examples disclosed herein allow a user to use his or her mobile device (e.g., a smartphone, a tablet computer, a personal digital assistant (PDA), a digital music player (e.g., an mp3 player), a smart watch, smart band, or any other smart apparel, etc.) to control a cursor of a second device, such as a laptop computer and/or desktop computer (e.g., in a similar fashion as a mouse).
[0012] As computing and mobile technology continue to advance, control device implementations for computers have stayed relatively similar for decades. That is largely due to the intuitiveness of a mouse, which is a device that can be physically moved by a user to cause corresponding movements of a cursor on a display. Touchpads and/or touchscreens, which involve users dragging and/or touching the pads/screens to control the cursor, are commonly used in connection with laptop computers and/or tablet computers due to their compact nature and/or convenience, though for many applications (e.g., drawing, gaming, software/web navigation, etc.) a mouse may be preferred over the touchpad and/or touchscreen. Accordingly, when a user seeks to control his or her computer (e.g., tablet, laptop, desktop, etc.) using a mouse, and a mouse is not readily available, it may be useful to use another device to control the computer in a similar fashion as a mouse. Examples disclosed herein may calculate movement of a first device (e.g., a mobile device) using images to move and/or control a cursor of a second device (e.g., a computer).
[0013] An example method disclosed herein includes analyzing a first image and a second image, calculating movement of a first device based on the first image and the second image, and causing a cursor on a second device to move based on the movement. An example apparatus includes an image analyzer to analyze images, a movement calculator to determine movement based on the images, and a control output to cause a cursor to move based on the determined movement.
[0014] FIG. 1 illustrates an example environment of use 100 for a cursor controller 110 constructed in accordance with the teachings of this disclosure. In the example environment 100 of FIG. 1, a mobile device 120 may communicate with a computer 130 via communication link 140. The computer 130 in the illustrated example of FIG. 1 includes a display 132. The mobile device 120 and the computer 130 are placed on a surface 134 (e.g., a desktop, a counter top, etc.) in the example of FIG. 1. Example communication link 140 may be any wired (e.g., universal serial bus (USB)) and/or wireless communication link (e.g., Bluetooth™, Wi-Fi, etc.).
[0015] The example mobile device 120 of FIG. 1 includes a camera 122 and a touchscreen 124. In the illustrated example, the camera 122 is a front facing camera that captures images on a same side of the mobile device 120 on which the touchscreen 124 is located. Accordingly, the camera 122 in the illustrated example of FIG. 1 faces away from the surface 134. In some examples, the camera 122 and/or another camera on a backside of the mobile device 120 (i.e., the side of the mobile device opposite the touchscreen 124) faces toward the surface 134 and may be used by the cursor controller 110 in accordance with the teachings of this disclosure. In some examples, the mobile device 120 includes a sensor and/or system of sensors (e.g., an accelerometer, a light detector, a motion detector, a gyroscope, etc.). In examples disclosed herein, the cursor controller 110 uses the features of the mobile device 120 (e.g., the camera 122, the touchscreen 124, sensors, etc.) to control the cursor 150 on the computer 130.
[0016] In examples disclosed herein, the cursor controller 110 of FIG. 1 controls a cursor 150 of the computer 130. The example cursor controller 110 uses images received from the camera 122 to determine movement of the mobile device 120 along the surface 134 of FIG. 1. The cursor controller 110 may analyze such images and instruct the computer 130 to move the cursor 150 based on the images captured by the camera 122. The example cursor controller 110 may further cause control of the computer 130 via navigation instructions (e.g., back/forward, scroll up/down, etc.) and/or click instructions (e.g., right-click select, left-click select, etc.) received via the mobile device 120 (e.g., via a user interface of the mobile device 120).
[0017] FIG. 2 is a block diagram of an example mobile device 120 including a cursor controller 110. The mobile device 120 and/or cursor controller 110 of FIG. 2 may be used to implement the mobile device 120 and/or cursor controller 110 of FIG. 1. In the illustrated example of FIG. 2, the cursor controller 110 communicates with a camera 210, sensors 220, a communication interface 230, and/or a user interface 240 of the mobile device 120 via a communication bus 250.
[0018] The example cursor controller 110 of FIG. 2 may receive images and/or a stream of images (e.g., a video stream) from the camera 210 of the mobile device 120. The example camera 210 may be implemented by a front facing camera of the mobile device 120 (e.g., the camera 122 of FIG. 1) and/or a rear facing camera of the mobile device 120 of FIG. 2. The example cursor controller 110 may use the images in accordance with the teachings of this disclosure to determine a movement of the mobile device 120 of FIG. 2, and forward such information to the communication interface 230. The example communication interface 230 may use Bluetooth™, Wi-Fi, and/or any other suitable communication protocol and/or communication technique (e.g., infra-red (IR), near field communication (NFC), radio frequency (RF), etc.) to communicate with a computer (e.g., the computer 130 of FIG. 1) to control a cursor (e.g., the cursor 150 of FIG. 1) of the computer.
[0019] In some examples, the sensors 220 of the mobile device 120 may be used to determine a movement of the mobile device 120. For example, an accelerometer may be used to determine a change in velocity of the mobile device 120 while movement is calculated from one point to another. In another example, a gyroscope may be used to determine a rotation of the mobile device 120 about an axis. In some examples, the sensors 220 may be used to identify when the mobile device 120 has initially been moved. Accordingly, in response to detecting the movement of the mobile device 120 (e.g., via the accelerometer, via the gyroscope, etc.), the mobile device 120 may instruct the camera 210 to begin capturing a series of images or to stream video to be used in accordance with the teachings of this disclosure. In some examples, a timer may be implemented to determine a length of time that the mobile device 120 has been still. In such examples, the camera 210 may be deactivated so as to save battery power of the mobile device 120. Furthermore, in similar examples, the camera 122 may pause streaming images and/or video to the cursor controller 110 when the mobile device 120 is still and/or until movement is detected (e.g., to save battery power).
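For illustration only (the disclosure itself specifies no code), a minimal Python sketch of this sensor-gated capture logic is shown below; the camera object with start()/stop() methods and the accelerometer callback are assumptions of the sketch, not elements of the disclosure.
```python
import time

MOVE_THRESHOLD = 0.3   # assumed motion threshold (m/s^2 above rest)
IDLE_TIMEOUT_S = 5.0   # assumed still time before deactivating the camera

class CaptureGate:
    """Start the camera when motion is sensed; stop it after the device
    has been still for IDLE_TIMEOUT_S to save battery power."""

    def __init__(self, camera):
        self.camera = camera      # hypothetical object with start()/stop()
        self.last_motion = None   # timestamp of the most recent motion

    def on_accel_sample(self, magnitude, now=None):
        now = time.monotonic() if now is None else now
        if magnitude > MOVE_THRESHOLD:
            if self.last_motion is None:
                self.camera.start()   # motion detected: begin streaming images
            self.last_motion = now
        elif self.last_motion is not None and now - self.last_motion > IDLE_TIMEOUT_S:
            self.camera.stop()        # device still too long: pause streaming
            self.last_motion = None
```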
[0020] The example user interface 240 may be implemented by any input device (e.g., button(s), a keyboard, a touchscreen, etc.) and/or output devices (e.g., a display, a light emitter (e.g., a light emitting diode (LED)), etc.). The example user interface 240 of the mobile device 120 may be used to implement control of the cursor 150 of FIG. 1. For example, buttons of the mobile device 120 and/or a touchscreen of the mobile device 120 may be used to implement buttons to control the cursor 150 to select items (e.g., icons, buttons, etc.) and/or navigate software (e.g., a browser, an application, etc.) running on the computer 130. More specifically, a touchscreen of the mobile device 120 may display a "left-click" button, a "right-click" button, a "back" button, a scroll wheel, or any other control that may be implemented by a cursor controller such as a mouse or trackball to control the cursor 150 of FIG. 1. In some examples, buttons of the mobile device 120 (e.g., a camera button, a volume button, etc.) may be used to control the cursor 150 and/or the computer 130 of FIG. 1 (e.g., to click or select an icon, window, etc.). In such examples, the user interface 240 may provide user input information (e.g., when and/or which buttons are "clicked" or used) to the cursor controller 110 to pass on as control instructions to the computer 130.
[0021] The example user interface 240 may be used to facilitate user interaction with the cursor controller 110. For example, a user may adjust settings (e.g., sensitivity, ratio of movement, speed of cursor, etc.) of the cursor controller 110 and/or mobile device 120 (e.g., frequency at which images are to be captured, period of time before deactivating camera after being still, etc.). Accordingly, the user interface 240 of the mobile device 120 may be used in association with the cursor controller 110 to control the cursor 150 of FIG. 1.
[0022] In the illustrated examples of FIGS. 1 and 2, the cursor controller 110 is located on the mobile device 120. In some examples, the cursor controller 110 may be located on the computer 130. In such examples, a mobile device may forward images from a camera of the mobile device to the computer 130 of FIG. 1 for analysis. For example, any suitable techniques and/or communication may be used to forward images and/or video captured by the camera 122 to the computer 130. For example, a dedicated communication link may be established between the mobile device 120 and the computer 130 to forward images from the camera 122. Accordingly, the cursor controller 110 may perform operations in accordance with the teachings of this disclosure on the mobile device 120 and/or the computer 130 to control the cursor 150.
[0023] FIG. 3 is a block diagram of an example cursor controller 110 that may be used to implement the cursor controller 110 of FIGS. 1 and/or 2. The example cursor controller 110 of FIG. 3 may be partially or entirely located on a mobile device (e.g., the mobile device 120) and/or a computer (e.g., the computer 130) to control a cursor (e.g., the cursor 150). The example cursor controller 110 of FIG. 3 includes an image analyzer 310, a movement calculator 320, and a control output 330. A communication bus 340 facilitates communication between the image analyzer 310, the movement calculator 320, and/or the control output 330. The example cursor controller 110 of FIG. 3 may use the image analyzer 310 to identify points of interest in a series of images received from a camera, the movement calculator 320 to determine a movement of a mobile device (e.g., a mobile device including the camera), and the control output 330 to send instructions to control a cursor based on the movement, as disclosed herein.
[0024] The example image analyzer 310 of FIG. 3 analyzes images received from a camera (e.g., a camera of a mobile device, such as the camera 122) of a device (e.g., the mobile device 120 of FIG. 1). The image analyzer 310 may be used to identify points of interest in the received images that may be used by the movement calculator 320 to determine movement of a device (e.g., the mobile device 120). Accordingly, the image analyzer 310 identifies a point of interest in a series of images and provides information associated with the point of interest to the movement calculator 320. Such information may include coordinates of the point of interest within the captured images. For example, the point of interest may have x-y coordinates relative to an x-axis of the images and a y-axis of the images. In such an example, the images may have corresponding reference points and/or a same reference point (e.g., a center of the images and/or a corresponding corner of the images) used to determine the coordinates. In such an example, the image analyzer 310 identifies the point of interest in the images and corresponding coordinates of the point of interest.
[0025] An example point of interest identified by the image analyzer 310 may be any identifiable object or mark in an image. For example, a point of interest may contrast with surrounding content in the image. More specifically, point(s) of interest may include a light fixture on a ceiling, a pattern on a ceiling (e.g., drop ceiling tiles, lighting patterns), or patterns on a desktop (e.g., a mouse pad may include points of interest for use with the cursor controller). In some examples, a point of interest or multiple points of interest may be identified using edge detection, image segmentation, motion detection, and/or any other image processing technique. An example image analysis is further described below in connection with FIG. 4.
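As a hedged illustration of this step, Shi-Tomasi corner detection is one concrete stand-in for the edge-detection/segmentation techniques named above; the use of OpenCV and NumPy is an assumption of this sketch, not a requirement of the disclosure.
```python
import cv2
import numpy as np

def find_points_of_interest(gray_image, max_points=25):
    """Return an (N, 2) array of x-y coordinates of high-contrast,
    corner-like marks (e.g., the edge of a ceiling light fixture)."""
    corners = cv2.goodFeaturesToTrack(
        gray_image, maxCorners=max_points, qualityLevel=0.05, minDistance=10)
    if corners is None:                # no identifiable marks in this frame
        return np.empty((0, 2), dtype=np.float32)
    return corners.reshape(-1, 2)      # coordinates in the image x-y axes
```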
[0026] The example movement calculator 320 of FIG. 3 calculates a movement of a device (e.g., the mobile device 120) that provides images to the cursor controller 110. Example movement may be caused by a user and may be similar to how a user would move a mouse to control the cursor 150 of the computer 130 of FIG. 1. The movement calculator 320 calculates a difference between coordinates of a point of interest (or multiple points of interest) to determine movement of the mobile device 120. In some examples, when multiple points of interest are identified in a plurality of images, the movement calculator 320 may average or calculate a weighted average of differences between coordinates of corresponding points of interest. After calculating the movement of the mobile device 120, the movement calculator 320 provides movement information to the control output 330.
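A minimal sketch of the (optionally weighted) averaging just described, assuming corresponding points have already been matched between the two images (in practice via tracking such as optical flow, which the disclosure does not mandate):
```python
import numpy as np

def average_displacement(points_first, points_second, weights=None):
    """points_* are (N, 2) arrays holding the same points of interest in
    the first and second images; returns the mean (dx, dy) displacement."""
    deltas = np.asarray(points_second, float) - np.asarray(points_first, float)
    return np.average(deltas, axis=0, weights=weights)
```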
[0027] In some examples, the movement calculator 320 may analyze coordinates of multiple points of interest to determine rotational movement of the mobile device 120. For example, the movement calculator 320 may calculate a difference between multiple sets of coordinates corresponding to the multiple points of interest to determine an axis of rotation between and/or around the multiple points of interest. As a more specific example, if a light in a ceiling and points of a ceiling tile pattern are used as points of interest, and if coordinates of the light are the same or similar in two images but an angle of the pattern (and corresponding coordinates of the pattern) differs, the movement calculator 320 may determine that the mobile device was rotated with a rotational axis corresponding to the location of the light (e.g., the mobile device 120 was rotated about the light and/or the camera 122).
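One way to compute that rotation, sketched under the assumption that the fixed point (the light) and one pattern point have been located in both images; the signed-angle formula is ordinary plane geometry rather than language from the disclosure.
```python
import math

def rotation_about_point(fixed, pattern_first, pattern_second):
    """Signed rotation (radians) of the device about `fixed`, inferred
    from how a second point of interest sweeps around it between images."""
    ax, ay = pattern_first[0] - fixed[0], pattern_first[1] - fixed[1]
    bx, by = pattern_second[0] - fixed[0], pattern_second[1] - fixed[1]
    # signed angle between the two radius vectors (cross and dot products)
    return math.atan2(ax * by - ay * bx, ax * bx + ay * by)
```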
[0028] The example control output 330 of FIG. 3 provides control information to a device (e.g., the computer 130) based on movement of another device (e.g., the mobile device 120) calculated by the movement calculator 320. Accordingly, the control output 330 may instruct the computer 130 to move the cursor 150 based on the calculated movement of the mobile device 120. The example control output 330 of FIG. 3 may also provide navigation instructions and/or click instructions to the computer 130 based on user input received via a user interface (e.g., the user interface 240) of the mobile device 120. Navigation instructions may include scroll instructions (e.g., to scroll up or down a page) and back or forward instructions (e.g., to advance to a next application, item, window, and/or web page, return to a previous application, item, window, or web page, etc.). Click instructions may include instructions to select items, icons, buttons, etc. and/or navigate software running on the computer 130. Accordingly, a user may move the mobile device 120 in a similar fashion as a computer mouse and activate buttons and/or a touchscreen of the mobile device 120 to control the cursor 150 of FIG. 1.
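The disclosure leaves the wire format to the implementer; purely as an assumed example, movement and click/navigation instructions could be serialized as JSON lines over any of the links mentioned above, with a TCP socket standing in for Bluetooth™ or Wi-Fi.
```python
import json
import socket

def send_control(sock, dx=0, dy=0, clicks=(), navigation=()):
    """Send one control message to the second device; the field names
    here are illustrative and do not come from the disclosure."""
    message = {"dx": float(dx), "dy": float(dy),
               "clicks": list(clicks),          # e.g., ["left", "right"]
               "navigation": list(navigation)}  # e.g., ["scroll-up", "back"]
    sock.sendall((json.dumps(message) + "\n").encode("utf-8"))

# Example use over a hypothetical link to the computer:
# sock = socket.create_connection(("192.168.0.10", 9999))
# send_control(sock, dx=12, dy=-4, clicks=["left"])
```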
[0029] While an example manner of implementing the cursor controller 110 of FIGS. 1 and/or 2 is illustrated in FIG. 3, at least one of the elements, processes and/or devices illustrated in FIG. 3 may be combined, divided, rearranged, omitted, eliminated and/or implemented in any other way. Further, the image analyzer 310, the movement calculator 320, the control output 330, and/or, more generally, the example cursor controller 110 of FIG. 3 may be implemented by hardware and/or any combination of hardware, software and/or firmware. Thus, for example, any of the image analyzer 310, the movement calculator 320, the control output 330, and/or, more generally, the example cursor controller 110 of FIG. 3 could be implemented by at least one of an analog or digital circuit, a logic circuit, a programmable processor, an application specific integrated circuit (ASIC), a programmable logic device (PLD) and/or a field programmable logic device (FPLD). When reading any of the apparatus or system claims of this patent to cover a purely software and/or firmware implementation, at least one of the image analyzer 310, the movement calculator 320, and/or the control output 330 is/are hereby expressly defined to include a tangible computer readable storage device or storage disk such as a memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc. storing the software and/or firmware. Further still, the example cursor controller 110 of FIG. 3 may include at least one element, process, and/or device in addition to, or instead of, those illustrated in FIG. 3, and/or may include more than one of any or all of the illustrated elements, processes and devices.
[0030] FIG. 4 is a flow diagram of an example image analysis 400 and control output 450 performed by the cursor controller of FIG. 3. In the illustrated example of FIG. 4, a cursor control analysis 400 is performed by the image analyzer 310 (frames 401, 402) and/or the movement calculator 320 (frame 403), and a cursor control instruction (frame 451) to control a cursor 150 is provided by the control output 330. In the illustrated example of FIG. 4, frame 401 represents a first image captured by a camera (e.g., the camera 122) including a point of interest 410. The example point of interest 410 may be any mark, anomaly, feature, etc. identified in the image of frame 401. Frame 402 of FIG. 4 represents a second image captured by the camera including the point of interest 410. The example images of the frames 401, 402 may be images received in an image stream and/or video stream received from the mobile device 120 of FIGS. 1 and/or 2. In the illustrated example of FIG. 4, a location of the point of interest 410 moves from point A to point B, as shown in frame 403. The example movement calculator 320 calculates the movement (e.g., distance, direction, rotation, etc.) of the mobile device 120 based on the difference of the location of the point of interest 410 (i.e., a difference in the coordinates of point A and point B in frames 401, 402).
[0031] Although the point of interest 410 in FIG. 4 is illustrated as a "dot," the point of interest 410 may take any shape, form, and/or pattern. For example, the point of interest 410 may include a plurality of points of interest (e.g., several points defining an edge identified by edge detection, several points of a pattern, etc.). As a more specific example, the image analyzer 310 may identify a particular object or portion of an object (e.g., a light fixture, a ceiling fan, a pattern on a ceiling or wall, a wall hanging, a person, etc.), and use that particular object as a point of interest in determining movement of the mobile device 120. The point of interest may be used along with reference points (e.g., a center of the images, a corresponding same corner of the images, etc.) of the captured images and/or stream of images (video). Accordingly, in such an example, rotation of the mobile device 120 may be determined based on a configuration of the object (e.g., the point of interest 410) in a series of images (e.g., at least two images).
[0032] The example cursor control instruction 450 is illustrated in the frame 451 of FIG. 4. The frame 451 of FIG. 4 represents images of a display of the computer 130 showing movement (represented by a dotted line) of the cursor 150 from point C to point D. In the illustrated example, point C represents a location of the cursor 150 when the first image of frame 401 was captured and point D represents a location of the cursor 150 after the second image of frame 402 was captured and image analysis/movement calculation was performed by the image analyzer 310 and/or movement calculator 320. Accordingly, based on the change in location of the point of interest 410 (from point A to point B), the cursor 150 in frame 451 is moved from point C to point D on a display of the computer 130 based on instructions from the control output 330. As such, the cursor 150 moves a distance and/or direction corresponding to the movement of the mobile device 120 as identified by the movement of the point of interest in the images of the frames 401, 402.
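A small sketch of the A-to-B to C-to-D mapping, assuming a user-adjustable sensitivity (the "ratio of movement" setting of paragraph [0021]); the sign handling reflects that an upward-facing camera sees the scene move opposite to the device, and both scale and mirroring are assumptions here.
```python
def cursor_delta(dx_image, dy_image, sensitivity=2.0, mirrored=True):
    """Map point-of-interest motion in image pixels (A to B) to cursor
    motion in screen pixels (C to D)."""
    sign = -1.0 if mirrored else 1.0
    return sign * sensitivity * dx_image, sign * sensitivity * dy_image
```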
[0033] Flowcharts representative of example machine readable instructions for implementing the cursor controller 110 of FIG. 3 are shown in FIGS. 5, 6, and/or 7. In this example, the machine readable instructions comprise a program(s)/process(es) for execution by a processor such as the processor 812 shown in the example processor platform 800 discussed below in connection with FIG. 8. The program(s)/process(es) may be embodied in software stored on a tangible computer readable storage medium such as a CD-ROM, a floppy disk, a hard drive, a digital versatile disk (DVD), a Blu-ray disk, or a memory associated with the processor 812, but the entire program(s)/process(es) and/or parts thereof could alternatively be executed by a device other than the processor 812 and/or embodied in firmware or dedicated hardware. Further, although the example program(s)/process(es) is/are described with reference to the flowcharts illustrated in FIGS. 5, 6, and/or 7, many other methods of implementing the example cursor controller 110 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, or combined.
[0034] The process 500 of FIG. 5 begins with an initiation of the cursor controller 110 (e.g., upon startup, upon instructions from a user, upon startup of a device implementing the cursor controller 110 (e.g., the mobile device 120 and/or the computer 130), etc.). At block 510 of the example process 500 of FIG. 5, the image analyzer 310 analyzes a first and second image. The example first image and second image may be images from a video stream and/or image stream received from a mobile device, such as the mobile device 120 of FIG. 1. At block 520, the example movement calculator 320 calculates movement of a first device (e.g., the mobile device 120) based on the first image and the second image. For example, at block 520, the movement calculator 320 may measure a difference between locations of a point of interest within the first image and the second image.
[0035] At block 530 of the example process 500 of FIG. 5, the control output 330 may control a cursor of a second device (e.g., the computer 130) to move based on the movement. For example, the control output 330 may instruct the computer 130 to cause the cursor 150 to move via a communication link, such as Bluetooth™, Wi-Fi, or any other suitable communication link. In some examples, after block 530, control may return to block 510 to continue to control a cursor. In such an example, the second image may be considered the first image of block 510 and a new image (e.g., a third image) may be considered the second image of block 510. Accordingly, the process 500 may be executed each time an image is received and/or captured by the camera 122 of the mobile device 120. Accordingly, execution of the process 500 of FIG. 5 may cause a cursor of a computer to move by using images captured by a camera of a mobile device (e.g., the camera 122 of the mobile device 120). As such, in such an example, the mobile device 120 may be used as a mouse for the example computer 130.
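Tying the blocks together, the following sketch of the process-500 loop reuses the helper functions from the earlier sketches; frame acquisition and point correspondence (in practice, tracking such as pyramidal Lucas-Kanade optical flow) are simplified assumptions.
```python
def run_process_500(next_frame, sock, sensitivity=2.0):
    """next_frame() is assumed to yield grayscale frames from the camera
    stream; sock is an open link to the second device (see earlier sketch)."""
    previous = find_points_of_interest(next_frame())        # first image
    while True:
        current = find_points_of_interest(next_frame())     # block 510
        n = min(len(previous), len(current))
        if n:
            # block 520: naive index pairing stands in for real tracking
            dx, dy = average_displacement(previous[:n], current[:n])
            cdx, cdy = cursor_delta(dx, dy, sensitivity)
            send_control(sock, dx=cdx, dy=cdy)              # block 530
        previous = current   # the second image becomes the next first image
```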
[0036] FIG. 6 is a flowchart representative of an example process 600 that may be executed to implement the block 510 of FIG. 5. The example process 600 may be executed to implement the image analyzer 310 of FIG. 3 to analyze images captured by the camera 122 of the mobile device 120 of FIG. 1. The example process 600 of FIG. 6 begins with an initiation of the cursor controller 110. At block 610, the image analyzer 310 receives a first image. The example image may be an image from a video stream and/or image stream captured by the camera 122. At block 620, the image analyzer 310 identifies a point of interest in the first image. As discussed herein, the point of interest may be any identifiable mark, object, etc. of the image. In some examples, at block 620, the image analyzer 310 may determine coordinates of the point of interest. For example, such coordinates may be x-y coordinates corresponding to an x-axis and y-axis of the first image.
[0037] At block 630, the image analyzer 310 receives a second image (e.g., a subsequent image received in the video stream and/or image stream). At block 640, the image analyzer 310 identifies the point of interest in the second image (and corresponding coordinates of the point of interest). After block 640, the process 600 ends. In some examples, after block 640, control may advance to block 520 of FIG. 5 and/or the process 700 of FIG. 7.
[0038] FIG. 7 is a flowchart representative of an example process 700 that may be executed to implement the block 520 of FIG. 5. The example process 700 may be executed to implement the movement calculator 320 of FIG. 3 to calculate movement of the mobile device 120 of FIG. 1. The example process 700 of FIG. 7 begins with an initiation of the cursor controller 110. At block 710, the movement calculator 320 identifies and/or determines coordinates of a point of interest in the first image. In some examples, the coordinates of the point of interest in the first image may be provided by the image analyzer 310. At block 720, the movement calculator 320 determines coordinates of the point of interest in a second image. Similar to block 710, the image analyzer 310 may provide the coordinates of the point of interest in the second image to the movement calculator 320.
[0039] In the example process 700 of FIG. 7, at block 730, the movement calculator 320 compares the coordinates of the point of interest in the first image and the second image. Accordingly, by comparing the coordinates of the images, the movement calculator 320 may determine a difference between the coordinates in the first image and the second image by using the x-axes and y-axes of the first and second images and points of reference. At block 740 of FIG. 7, the movement calculator 320 calculates a distance and direction of movement of a first device based on the compared coordinates of the point of interest in the first image and the second image. In some examples, when the point of interest includes multiple points of interest and/or an identifiable configuration, at block 740, the movement calculator 320 may further determine a rotation of the first device based on a rotated configuration of the multiple points of interest and/or a difference in the configuration of the point of interest in the first image and the second image. After block 740, the process 700 ends. In some examples, after block 740, control may advance to block 530 of FIG. 5.
[0040] As mentioned above, the example processes of FIGS. 5, 6, and/or 7 may be implemented using coded instructions (e.g., computer and/or machine readable instructions) stored on a tangible computer readable storage medium such as a hard disk drive, a flash memory, a read-only memory (ROM), a compact disk (CD), a digital versatile disk (DVD), a cache, a random-access memory (RAM) and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term tangible computer readable storage medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and to exclude transmission media. As used herein, "tangible computer readable storage medium" and "tangible machine readable storage medium" are used interchangeably. Additionally or alternatively, the example processes of FIGS. 5, 6, and/or 7 may be implemented using coded instructions (e.g., computer and/or machine readable instructions) stored on a non-transitory computer and/or machine readable medium such as a hard disk drive, a flash memory, a read-only memory, a compact disk, a digital versatile disk, a cache, a random-access memory and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and to exclude transmission media. As used herein, when the phrase "at least" is used as the transition term in a preamble of a claim, it is open-ended in the same manner as the term "comprising" is open ended. As used herein the term "a" or "an" may mean "at least one," and therefore, "a" or "an" do not necessarily limit a particular element to a single element when used to describe the element.
[0041] FIG. 8 is a block diagram of an example processor platform 800 capable of executing the instructions of FIGS. 5, 6, and/or 7 to implement the cursor controller 110 of FIG. 3. The example processor platform 800 may be or may be included in any type of apparatus, such as a server, a personal computer, a mobile device (e.g., a cell phone, a smart phone, a tablet, etc.), a personal digital assistant (PDA), an internet appliance, a DVD player, a CD player, a digital video recorder, a Blu-ray player, a gaming console, a personal video recorder, a set top box, or any other type of computing device.
[0042] The processor platform 800 of the illustrated example of FIG. 8 includes a processor 812. The processor 812 of the illustrated example is hardware. For example, the processor 812 can be implemented by at least one integrated circuit, logic circuit, microprocessor or controller from any desired family or manufacturer.
[0043] The processor 812 of the illustrated example includes a local memory 813 (e.g., a cache). The processor 812 of the illustrated example is in communication with a main memory including a volatile memory 814 and a non-volatile memory 816 via a bus 818. The volatile memory 814 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 816 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 814, 816 is controlled by a memory controller.
[0044] The processor platform 800 of the illustrated example also includes an interface circuit 820. The interface circuit 820 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a PCI express interface.
[0045] In the illustrated example, at least one input device 822 is connected to the interface circuit 820. The input device(s) 822 permit(s) a user to enter data and commands into the processor 812. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
[0046] At least one output device 824 is also connected to the interface circuit 820 of the illustrated example. The output device(s) 824 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display, a cathode ray tube display (CRT), a touchscreen, a tactile output device, a printer and/or speakers). The interface circuit 820 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip or a graphics driver processor.
[0047] The interface circuit 820 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem and/or network interface card to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 826 (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, etc.).
[0048] The processor platform 800 of the illustrated example also includes at least one mass storage device 828 for storing software and/or data. Examples of such mass storage device(s) 828 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, RAID systems, and digital versatile disk (DVD) drives.
[0049] The coded instructions 832 of FIGS. 5, 6, and/or 7 may be stored in the mass storage device 828, in the local memory 813, in the volatile memory 814, in the non-volatile memory 816, and/or on a removable tangible computer readable storage medium such as a CD or DVD.
[0050] From the foregoing, it will be appreciated that the above disclosed examples enable a mobile device to be used as a cursor control device (e.g., a mouse, a trackball, etc.) to control a cursor on a computer or other type of computing device (e.g., a server, a set top box, a game console, etc.). In examples disclosed herein, images captured by a camera of the mobile device may be used to calculate movement of the mobile device based on an analysis of the images. An example point of interest may be tracked through a series of images to determine a movement of the mobile device relative to the point of interest. Based on the calculated movement, instructions may be sent to the computer or other computing device (e.g., via Bluetooth™, Wi-Fi, or any other suitable communication technique) to control a cursor.
[0051] Although certain example methods, apparatus and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.

Claims

What is claimed is:
1. A method comprising:
analyzing, by at least one processor, a first image and a second image;
calculating movement of a first device based on the first image and the second image; and
controlling a cursor of a second device to move based on the movement.
2. The method as defined in claim 1, wherein the cursor of the second device is to move a distance and direction corresponding to the movement of the first device.
3. The method as defined in claim 1, wherein analyzing the first image and the second image comprises:
identifying a point of interest in the first image; and
identifying the point of interest in the second image.
4. The method as defined in claim 3, wherein the point of interest is a same identifiable object in the first image and the second image.
5. The method as defined in claim 3, wherein calculating the movement comprises:
identifying first coordinates of the point of interest in the first image and second coordinates of the point of interest in the second image; and
calculating a distance and a direction of the movement of the first device based on a difference between the first coordinates and the second coordinates.
6. The method as defined in claim 1, wherein the first image and the second image are from a stream of images from a camera of the first device.
7. The method as defined in claim 1, wherein causing the cursor of the second device to move comprises sending, from the first device, instructions to the second device to move the cursor.
8. The method as defined in claim 1, wherein analyzing the first image and the second image begins in response to detecting movement of the first device via a sensor of the first device.
9. An apparatus comprising:
an image analyzer to analyze a first image and a second image;
a movement calculator to determine movement of a first device based on the first image and the second image, the first image and the second image captured by a camera of the first device; and
a control output to cause a cursor of a second device to move based on the determined movement of the first device.
10. The apparatus as defined in claim 9, wherein the image analyzer is further to:
identify a point of interest in the first image and the same point of interest in the second image;
determine first coordinates of the point of interest in the first image and second coordinates of the point of interest in the second image; and
provide the first coordinates and the second coordinates to the movement calculator.
11. The apparatus as defined in claim 10, wherein the movement calculator is further to:
calculate a difference between the first coordinates and the second coordinates; and
determine a distance and direction of the movement based on the difference.
12. The apparatus as defined in claim 9, wherein the control output causes the cursor of the second device to move via a wireless communication link between the first device and the second device.
13. A non-transitory computer readable storage medium comprising instructions that, when executed, cause a machine to at least:
analyze a first image and a second image;
calculate movement of a first device based on a point of interest in the first image and the second image; and
instruct a cursor of a second device to move based on the calculated movement of the first device.
14. The non-transitory computer readable medium as defined in claim 13, further comprising instructions that, when executed, cause the machine to:
calculate a difference between a first location of the point of interest in the first image and a second location of the point of interest in the second image based on coordinates of the point of interest within the first image and the second image; and
determine a distance and direction of the movement from the difference.
15. The non-transitory computer readable medium as defined in claim 13, further comprising instructions that, when executed, cause the machine to:
determine a rotation of the movement based on a first configuration of the point of interest in the first image and a second configuration of the point of interest in the second image.
PCT/US2014/056734 (priority date 2014-09-22; filed 2014-09-22): Cursor control using images, published as WO2016048262A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2014/056734 WO2016048262A1 (en) 2014-09-22 2014-09-22 Cursor control using images

Publications (1)

Publication Number
WO2016048262A1 (en)

Family

ID=55581591

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/056734 WO2016048262A1 (en) 2014-09-22 2014-09-22 Cursor control using images

Country Status (1)

Country Link
WO (1) WO2016048262A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010042245A1 (en) * 1998-10-13 2001-11-15 Ryuichi Iwamura Remote control system
EP1005156A1 (en) * 1998-11-26 2000-05-31 STMicroelectronics S.r.l. Wireless pointing device for remote cursor control
US20010045940A1 (en) * 1999-07-06 2001-11-29 Hansen Karl C. Computer presentation system and method with optical tracking of wireless pointer
US20120235908A1 (en) * 2009-03-10 2012-09-20 Henty David L Multi-directional remote control system and method with automatic cursor speed control
US20120320198A1 (en) * 2011-06-17 2012-12-20 Primax Electronics Ltd. Imaging sensor based multi-dimensional remote controller with multiple input mode

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14902396

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14902396

Country of ref document: EP

Kind code of ref document: A1