WO2009042399A2 - Method and device for associating objects

Method and device for associating objects

Info

Publication number
WO2009042399A2
Authority
WO
WIPO (PCT)
Prior art keywords
touch sensitive
user interface
area
display screen
location
Application number
PCT/US2008/075784
Other languages
French (fr)
Other versions
WO2009042399A3 (en)
Inventor
Jian Sun
Original Assignee
Motorola, Inc.
Application filed by Motorola, Inc. filed Critical Motorola, Inc.
Priority to MX2010003243A priority Critical patent/MX2010003243A/en
Priority to EP08799388A priority patent/EP2203807A2/en
Priority to KR1020117030347A priority patent/KR20120013439A/en
Priority to BRPI0818011A priority patent/BRPI0818011A8/en
Priority to CN200880108327A priority patent/CN101809532A/en
Publication of WO2009042399A2 publication Critical patent/WO2009042399A2/en
Publication of WO2009042399A3 publication Critical patent/WO2009042399A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 - Drag-and-drop
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B1/00 - Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 - Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/3827 - Portable transceivers
    • H04B1/3833 - Hand-held transceivers
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B1/00 - Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 - Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 - Circuits
    • H04B1/401 - Circuits for selecting or indicating operating mode
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 - Indexing scheme relating to G06F3/048
    • G06F2203/04808 - Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates generally to the field of user interfaces and user control of an electronic device.
  • Portable handheld electronic devices such as handheld wireless communications devices (e.g. cellphones) and personal digital assistants (PDA's) that are easy to transport are becoming commonplace.
  • handheld electronic devices come in a variety of different form factors and support many features and functions.
  • a problem with such devices is the restriction on user interfaces given their small size: for example, keypads with a limited number of keys and display screens with room for only a few icons, compared with personal computers that have a full keyboard, a large screen and sophisticated graphical user interfaces including the use of a mouse.
  • complicated tasks involving multiple applications must be performed using numerous menu driven operations that are time consuming and inconvenient for users.
  • a touch sensitive keypad may be used to receive scribed strokes of a user's finger in order to input data such as scribed letters which can then be displayed on a non-touch sensitive screen.
  • a full QWERTY keyboard may be temporarily connected to the electronic device for data entry or other user interface intensive tasks.
  • FIG. 1 is a schematic block diagram illustrating circuitry of an electronic device in accordance with the invention
  • FIG. 2A and 2B illustrate an electronic device comprising a touch sensitive keypad integrated into an array of user actuable input keys in exploded perspective and section views respectively;
  • FIG. 3A and 3B illustrate operation of an electronic device touch sensitive display screen and touch sensitive keypad according to an embodiment
  • FIG. 4 illustrates a flow chart for an algorithm according to an embodiment
  • Fig. 5 illustrates a flow chart for an algorithm according to another embodiment.
  • a method of associating objects in an electronic device comprises: identifying a first object in response to detecting an initial contact of a scribed stroke at a location of a first area of a touch sensitive user interface which corresponds with the first object; identifying a second object in response to detecting a final contact of the scribed stroke at a location of a second area of the touch sensitive user interface which corresponds with the second object; and associating the first object with the second object.
  • One of the first and second areas of the touch sensitive user interface is a touch sensitive display screen and the other area of the touch sensitive user interface is a touch sensitive keypad.
  • An object refers to an entity that represents some underlying data, state, function, operation or application.
  • one of the objects may be data such as an email address from a contacts database, and the other object may be a temporary storage location or another application such as an email client.
  • Associating one object with another refers to copying or moving the contents of one object to another and/or executing one of the objects using the contents of the other object; or to linking the first object to the second object for example as a short-cut key to an application.
  • For example, an email client (one object) may be executed and open a new email using the email address from the other object (a contact).
  • Whilst examples of objects and associations have been given above, the skilled person will recognise that these terms are not so limited and will be familiar with other examples of computing objects and associations.
  • the first area of the touch sensitive user interface is the touch sensitive display screen
  • the second area (or other area) is the touch sensitive keypad.
  • a user may drag-and-drop an email address from a contacts database open on the display screen to a temporary storage location associated with a key on the touch sensitive keypad. By touching the email address, and dragging this over the screen to the appropriate key of the touch sensitive keypad, the email address is stored and may be retrieved later; for example to copy into another application such as an email client newly displayed on the display screen.
  • the first area of the touch sensitive user interface is the touch sensitive keypad
  • the second area (or other area) is the touch sensitive display screen.
  • embodiments of the invention described herein may be comprised of one or more conventional processors and unique stored program instructions that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of associating objects in an electronic device described herein.
  • the non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and user input devices. As such, these functions may be interpreted as steps of a method to perform user function activation on an electronic device.
  • Referring to FIG. 1, there is a schematic diagram illustrating an electronic device 100, typically a wireless communications device, in the form of a mobile station or mobile telephone comprising a radio frequency communications unit 102 coupled to be in communication with a processor 103.
  • the electronic device 100 also has a touch sensitive user interface 170.
  • the first area of the touch sensitive user interface comprises a touch sensitive display screen 105
  • the second area (or other area) of the touch sensitive user interface comprises a touch sensitive keypad 165.
  • the first area of the touch sensitive user interface can be the touch sensitive keypad 165 and the second area (or other area) of the touch sensitive user interface can be the touch sensitive display screen 105.
  • an alert module 115 typically contains an alert speaker, vibrator motor and associated drivers.
  • the touch sensitive display screen 105, touch sensitive keypad 165 and alert module 115 are coupled to be in communication with the processor 103.
  • the touch sensitive display screen 105 and the touch sensitive keypad 165 of the touch sensitive user interface 170 will be located adjacent each other in order to facilitate user operation.
  • the processor 103 includes an encoder/decoder 111 with an associated code Read Only Memory (ROM) 112 for storing data for encoding and decoding voice or other signals that may be transmitted or received by the electronic device 100.
  • the processor 103 also includes a micro-processor with object association function 113 coupled, by a common data and address bus 117, to the encoder/decoder 111, a character Read Only Memory (ROM) 114, radio frequency communications unit 102, a Random Access Memory (RAM) 104, static programmable memory 116 and a Removable User Identity Module (RUIM) interface 118.
  • ROM: Read Only Memory
  • RAM: Random Access Memory
  • RUIM: Removable User Identity Module
  • the static programmable memory 116 and a RUIM card 119 (commonly referred to as a Subscriber Identity Module (SIM) card) operatively coupled to the RUIM interface 118 each can store, amongst other things, Preferred Roaming Lists (PRLs), subscriber authentication data, selected incoming text messages and a Telephone Number Database (TND phonebook) comprising a number field for telephone numbers and a name field for identifiers associated with one of the numbers in the name field.
  • PRLs: Preferred Roaming Lists
  • TND phonebook: Telephone Number Database
  • the RUIM card 119 and static memory 116 may also store passwords for allowing accessibility to password-protected functions on the electronic device 100.
  • the micro-processor with object association function 113 has ports for coupling to the display screen 105, the keypad, the alert module 115, microphone 135 and a communications speaker 140 that are integral with the device.
  • the character Read Only Memory 114 stores code for decoding or encoding text messages that may be received by the radio frequency communications unit 102.
  • the character Read Only Memory 114, RUIM card 119, and static memory 116 may also store Operating Code (OC) for the micro-processor with object association function 113 and code for performing functions associated with the electronic device 100.
  • OC: Operating Code
  • the radio frequency communications unit 102 is a combined receiver and transmitter having a common antenna 107.
  • the radio frequency communications unit 102 has a transceiver 108 coupled to the antenna 107 via a radio frequency amplifier 109.
  • the transceiver 108 is also coupled to a combined modulator/demodulator 110 that couples the radio frequency communications unit 102 to the processor 103.
  • the touch sensitive user interface 170 detects manual contact from a user's finger or stylus on either or both of the display screen 105 and the keypad 165.
  • the detected manual contacts are interpreted by the processor 103 as points or lines of contact or touch across an xy co-ordinate system of the first (105) and second (165) area of the touch sensitive user interface 170.
  • the interpretation of the detected manual contacts as points or lines of contact by the processor 103 will typically be implemented with the execution of program code as will be appreciated by those skilled in the art. In alternative embodiments, this function may be achieved using an ASIC or equivalent hardware.
  • FIG 2A and 2B illustrate in more detail an example touch sensitive keypad arrangement.
  • Touch sensitive display screens 105 are well known to those skilled in the art and are not further described here.
  • the touch sensitive keypad 165 comprises a number of user input keys 265 which are integrated in an overlaying relation with a capacitive sensor array 272 which detects changes in capacitance corresponding to the presence of a user's digit or other object such as a stylus.
  • the touch sensitive keypad 165 or second area of the touch sensitive user interface 170 allows for receiving user contact, touch points or lines of contact with the keypad 165.
  • Detection of a finger or stylus does not require pressure against the capacitive sensor array 272 or user input keys 265, but typically just a light touch or contact against the surface of the keypad 165; or even just close proximity. It is therefore possible to integrate the user input keys 265 and the capacitive sensor array 272, as the keys 265 require physical pressure or a tactile force for actuation whereas the capacitive sensors of the capacitive sensor array 272 do not. Thus it is possible to detect manual contact at the keypad 165 without actuating any of the user input keys 265.
  • An example of a touch sensitive keypad 165 is the finger writing recognition tablet on the A668 mobile phone available from Motorola Incorporated. As shown, the user input keys 265 each have a plunger that passes through apertures 275 in the capacitive sensor array 272 and contacts a respective dome switch 280 on a switch substrate 285.
  • Whilst capacitive sensors are typically used, other sensor arrays, such as ultrasound sensors, may alternatively be used to detect the user input object's position.
  • the "activation" of a sensor may be configured to correspond to contact between a user input object such as a finger and the surface of the tablet, or even close proximity of the distal end of a user input object with the sensor such that actual physical contact with the tablet surface may not be required.
  • the changes in capacitance detected at the capacitive sensor array 272 are translated into a contact location on an xy grid by the processor 103.
  • the points or strokes of contact may be captured by an ink trajectory processor as ink trajectories with respect to the co-ordinate system of the touch sensitive keypad 165.
  • These inks or manual contact locations are then forwarded to the processor 103 and interpreted as manual contact locations for further processing as described in more detail below.
  • a suitable ink trajectory processor may be that used in the MotorolaTM A688 mobile phone.
  • FIG 3A and 3B illustrate an electronic device 100 which comprises a touch sensitive display screen 105 and a touch sensitive keypad 165 having a number of user actuable keys for entering user data and controls.
  • the keypad 165 includes a send key (top left) 365 for sending messages or as a call key for voice communication, and a close key (top right) which can be used to close applications and terminate a voice call.
  • the display screen 105 includes a number of icons 320 corresponding to various applications or functions that the user of the electronic device 100 may use.
  • FIG 3A and 3B also illustrate a method of using the electronic device 100 (typically a mobile phone).
  • a user's finger 310 can be used to drag an icon from the touch sensitive display screen 105 to the touch sensitive keypad 165.
  • the icon is associated with a BluetoothTM application or first object. Movement of the (BluetoothTM) icon 320 across the touch sensitive display screen 105 is indicated by the partially drawn icon 325 which corresponds with the point of contact of the finger 310 across the display screen 105.
  • the user's finger 310 moves from the touch sensitive display screen 105 to the touch sensitive keypad 165 as shown in FIG 3B.
  • the user's finger 310 is touching the send key 365.
  • the send key 365 in this example is associated with a storage location or second object.
  • the BluetoothTM application or first object is associated with the storage location or second object.
  • the initial contact of a scribed stroke or user "drag" operation is detected which corresponds to the location of an icon 320 on the touch sensitive display screen 105.
  • a final contact of the scribed stroke or a user "drop" operation is detected which corresponds to the location of a key 365 on the touch sensitive keypad 165.
  • The final contact corresponds to the lifting off of the user's finger 310 from the keypad 165.
  • Thus a first object (BluetoothTM application) is associated, as a shortcut link, with a second object (storage location).
  • This shortcut to the BluetoothTM application may then be used subsequently, for example when a different application is open or displayed on the display screen 105.
  • When a user has completed an email, the BluetoothTM application may be dragged from the send key over to the email, causing the email to be sent via BluetoothTM.
  • the step of associating one object with another might be achieved by actuating a key (365) on the keypad 165 instead of simply terminating contact by lifting the user's finger 310 from the keypad 165.
  • a touch sensitive keypad 165 may not be needed, and instead the icon 325 may be dragged across to the edge of the touch sensitive screen 105 and then a key 265 may be actuated to associate the object represented by the icon (BluetoothTM application) with the object represented by the actuated key (storage location).
  • FIG 4 illustrates in more detail a method of associating objects in an electronic device 100.
  • This method 400 will typically be implemented by executing a software program from the static memory 116 on the microprocessor with object association function 113 which receives inputs from the touch sensitive user interface 170.
  • the method 400 is activated on the electronic device 100 by the user selecting an object association mode at step 405, for example by selecting a menu option.
  • the method then monitors the first area of the touch sensitive user interface or touch sensitive display screen 105 in this embodiment in order to detect an initial contact of a scribed stroke at a location corresponding to a first object at step 410.
  • the scribed stroke corresponds to the movement of the point of contact of a user's finger 310 or stylus across the first and second areas of the touch sensitive user interface 105 and 165.
  • the location corresponding to the first object may be indicated by an icon 320 as previously described; for example the BluetoothTM application icon of FIG 3A.
  • If no initial contact is detected (410N), for example after a predetermined time, the method terminates at step 415. If however an initial contact is detected (410Y), then in response the first object (BluetoothTM application) is identified according to the location of the detected initial contact at step 420. For example, if the initial contact is at the location of the BluetoothTM icon 320, then the BluetoothTM application is identified as the first object.
  • the method 400 determines whether the point of contact moves over the first area of the touch sensitive user interface at step 425. If not (425N), this means there is no scribed stroke, and in fact there is only stationary or temporary contact and the method then performs conventional object execution at step 430.
  • For example, if the BluetoothTM icon 320 is merely touched by the user's finger 310, the BluetoothTM application is launched or executed and the method then terminates. If however the point of contact moves (425Y), then the method displays on the touch sensitive screen movement of the icon 320 corresponding to or following movement of the point of contact of the scribed stroke over the display screen 105 at step 435. This movement of the icon was shown in FIG 3A by the partially drawn icon 325 following the user's finger across the display screen 105.
  • the method 400 determines whether the scribed stroke or point of contact extends or moves to the other area of the touch sensitive user interface, the touch sensitive keypad 165 in this embodiment, at step 440. This may be implemented by detecting touch at any location on the keypad 165. If the scribed stroke doesn't extend onto the touch sensitive keypad 165 (440N), then the method returns to the step of determining whether movement of the point of contact or the scribed stroke moves over the touch sensitive display screen 105 at step 425. If however the scribed stroke does extend onto the touch sensitive keypad 165 (440Y), then the method displays on the display screen 105 an indication of the key 265, 365 on the touch sensitive keypad 165 corresponding to the point of contact of the scribed stroke at step 445.
  • An example indication 330 is shown in FIG 3B which displays both a label for the first object, in this case BluetoothTM, together with a label for the key, in this case "Send".
  • Alternative indications may be used, for example simply displaying the symbol printed on the key 265 which is currently being touched by the user.
  • the method 400 then monitors the second area of the touch sensitive user interface or keypad 165 to detect a final contact of the scribed stroke at a location corresponding to a second object at step 450.
  • Detecting a final contact may comprise detecting lift off of the finger 310 or stylus from the keypad 165, and if this is at a key 265 which is associated with a second object (450Y), then in response the method identifies the second object at step 455.
  • The second object (e.g. a temporary storage location) is identified according to the location of the detected final contact (e.g. the send key).
  • If a final contact is not detected after a predetermined time, or a final contact is detected which does not correspond with a second object (450N), the method 400 returns to determine whether the scribed stroke still extends over the second area of the touch sensitive user interface at step 440. Whilst locations of the second area of the touch sensitive user interface 170 which correspond to a second object have been described as also corresponding to keys 265, 365, this need not be the case. For example, the second objects may be assigned simply to xy coordinates on the keypad 165 and can be identified solely using the indication 330 in the display screen 105.
  • the method identifies the second object in response to detecting actuation of a key on the keypad which corresponds with the second object.
  • actuation of the key follows termination of the scribed stroke on the touch sensitive display screen.
  • association of two objects can cover a variety of actions including moving or copying content from one object to another, storing the content of the first object (in the second object - a temporary storage location), or providing a shortcut or other link from one object to another.
  • Where one of the objects is an application, this application may be automatically executed upon associating the first and second objects. For example a BluetoothTM object may be started when associated with an email object in order to send the email over a BluetoothTM connection.
  • FIG 5 illustrates a method of associating objects in an electronic device 100 in accordance with an alternative embodiment, in which an object is dragged from the keypad 165 to the screen 105.
  • the method 500 is activated on the electronic device 100 by the user selecting an object association mode at step 505, for example by selecting a menu option.
  • the method attempts to detect an initial contact of a scribed stroke at a location of the first area of the touch sensitive user interface which corresponds with a first object at step 510.
  • the first area of the touch sensitive user interface in this embodiment is the touch sensitive keypad 165 instead of the touch sensitive display screen 105.
  • the scribed stroke corresponds to the movement of the point of contact of a user's finger 310 or stylus across the first and second areas of the touch sensitive user interface 165 and 105.
  • the location corresponding to the first object may be a key 265 as previously described; for example the send key 365 of FIG 3A.
  • If no initial contact is detected (510N), the method terminates at step 515. If however an initial contact is detected (510Y), then the first object is identified according to the location of the detected initial contact at step 520.
  • This first object may be the contents of a temporary storage location associated with the send key 365, for example a contacts email address.
  • the object may be an application such as BluetoothTM.
  • the method 500 determines whether the point of contact moves over the first area of the touch sensitive user interface (the keypad 165) at step 525. If not (525N), this means there is no scribed stroke, and in fact there is only stationary or temporary contact and the method then performs conventional object execution at step 530. For example if the send key is merely touched by the user's finger 310, then the BluetoothTM application may be launched or executed and the method then terminates. If the object associated with the send key is content, then no action is taken. If however the point of contact moves (525Y), then the method displays on the display screen 105 an indication of the key on the touch sensitive keypad 165 corresponding to the point of contact of the scribed stroke at step 535. An example indication 330 is shown in FIG 3B which displays both a label for the first object, in this case BluetoothTM, together with a label for the key 365, in this case Send.
  • the method 500 determines whether the scribed stroke or point of contact extends or moves to the second area of the touch sensitive user interface (the display screen 105) at step 540. This may be implemented by detecting touch at any location on the display screen 105; or within a limited region of the display screen 105 adjacent the keypad 165, for example. If the scribed stroke doesn't extend onto the touch sensitive display screen 105 (540N), then the method returns to the step of determining whether movement of the point of contact or the scribed stroke moves over the touch sensitive display screen 105.
  • the method displays movement of an icon 320 corresponding to the first object and following movement of the point of contact of the scribed stroke over the display screen 105 at step 545.
  • An example of this movement of the icon is shown in FIG 3A by the partially drawn icon 325 following the user's finger across the display screen 105.
  • the method 500 attempts to detect a final contact of the scribed stroke at a location of the second area of the touch sensitive user interface which corresponds to a second object at step 550.
  • the second area of the touch sensitive user interface is the display screen 105 in this embodiment.
  • Detecting a final contact may comprise detecting lift off of the finger 310 or stylus from the display screen 105, and if this is at an icon 320 which is associated with a second object (550Y), then the method identifies the second object at step 555.
  • The second object (e.g. a user application) is identified according to the location of the detected final contact.
  • If a final contact is not detected, or is detected at a location which does not correspond with a second object (550N), the method 500 returns to determine whether the scribed stroke still extends over the second area of the touch sensitive user interface at step 540.
  • association of two objects can cover a variety of actions including copying or moving the content from one object to another, storing the content of the first object (in the second object - a temporary storage location), or providing a shortcut or other link from one object to another. Where one of the objects is an application, this may be automatically executed upon associating the first and second objects. For example a BluetoothTM object may be started when associated with an email object in order to send the email over a BluetoothTM connection.
  • Embodiments of the invention may also be used to store objects, for example storing content or an application in a temporary storage location (second object). These stored objects may be persisted, for example in non-volatile memory. This might allow, for example, a draft SMS message to be saved even after switching the device off, or the device keys to be customised to perform a shortcut to an application.
  • The embodiment provides a number of advantages and functions: seamless drag and drop operations across the display and the keypad; object storage through a drag and drop operation from the mobile device display to its keypad; object transfer through a drag and drop operation from the mobile device keypad to its display; and the ability to persist the object storage across mobile device power cycles and/or to quickly switch applications.
  • the device can be configured such that a drag and drop operation from the display to the keypad effectively stores and assigns the (first) object to the destination key (second object), while a drag and drop operation from the keypad to the display effectively applies the stored (first) object to the dropped location (second object); a minimal sketch of such a per-key store is given in the code example following this list.
  • the semantics of applying a stored object is application and object specific. For example, in the above scenario, applying the stored "Bluetooth" object to any screen may be configured to launch the BluetoothTM application, and this serves as an easy way of customizing a shortcut key.
  • Example dropping operations include: dropping a contact onto the SMS screen to start editing a message to that person; dropping a URL onto a browser to navigate to the web page; and dropping a text fragment into any editor to paste the text.
  • Switching screens in mobile devices has always been troublesome. For example, where a user is editing an SMS and then wants to copy some web content into the message, he needs to launch the browser.
  • the problem with known solutions is that after the browser is launched, the user has no quick way to go back to the SMS editing screen.
  • the user can always close the browser screen; however, this is typically sub-optimal and may not always be what the user wants.
  • the screen can be treated as another type of object.
  • Prior to launching the browser, the user can drag the entire screen (through some designated area, such as the screen header) to a key. After launching the browser and copying the content, the user can drag the key onto the display, effectively restoring the SMS edit screen.
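As referenced in the list above, the following is a minimal sketch of such a per-key store: a drag from the display to the keypad stores an object under the destination key, and a drag from the keypad back to the display retrieves it so it can be applied or launched. The file-backed persistence (surviving power cycles) and all names here are assumptions made for illustration; the patent does not specify an implementation.

```kotlin
// Hypothetical per-key shortcut store: display-to-keypad drags store an object under a key,
// keypad-to-display drags retrieve it to be applied/launched. File persistence is an assumption.
import java.io.File
import java.util.Properties

class KeyShortcutStore(private val backing: File) {
    private val props = Properties().apply {
        if (backing.exists()) backing.inputStream().use { load(it) }
    }

    /** Drag from display to keypad: store/assign the (first) object to the destination key. */
    fun store(keyLabel: String, objectId: String) {
        props.setProperty(keyLabel, objectId)
        backing.outputStream().use { props.store(it, "key shortcuts") }  // persists across power cycles
    }

    /** Drag from keypad to display: retrieve the stored object so it can be applied to the drop location. */
    fun retrieve(keyLabel: String): String? = props.getProperty(keyLabel)
}

fun main() {
    val store = KeyShortcutStore(File("shortcuts.properties"))
    store.store("SEND", "bluetooth-app")   // e.g. the Bluetooth icon dropped on the Send key
    println(store.retrieve("SEND"))        // later: drag the Send key onto the display to apply it
}
```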

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

A method (400) of associating objects in an electronic device (100). The method (400) identifies a first object (420) in response to detecting an initial contact of a scribed stroke (410) at a location of a first area of a touch sensitive user interface (170) which corresponds with the first object. A second object is then identified (455) in response to detecting a final contact (450) of the scribed stroke at a location of a second area of the touch sensitive user interface (170) which corresponds with the second object. The method (400) then associates the first object with the second object (460), wherein one of the first and second areas of the touch sensitive user interface (170) is a touch sensitive display screen (105) and the other area of the touch sensitive user interface (170) is a touch sensitive keypad (165).

Description

METHOD AND DEVICE FOR ASSOCIATING OBJECTS
FIELD OF THE INVENTION
[0001] The present invention relates generally to the field of user interfaces and user control of an electronic device.
BACKGROUND OF THE INVENTION
[0002] Portable handheld electronic devices such as handheld wireless communications devices (e.g. cellphones) and personal digital assistants (PDA's) that are easy to transport are becoming commonplace. Such handheld electronic devices come in a variety of different form factors and support many features and functions.
[0003] A problem with such devices is the restriction on user interfaces given their small size: for example, keypads with a limited number of keys and display screens with room for only a few icons, compared with personal computers that have a full keyboard, a large screen and sophisticated graphical user interfaces including the use of a mouse. As small electronic devices become more powerful there is a desire to perform more complicated tasks; however, this is restricted by the limited nature of their user interfaces. Typically, complicated tasks involving multiple applications must be performed using numerous menu driven operations that are time consuming and inconvenient for users.
[0004] Various efforts have been made to improve user interfaces on small portable electronic devices, including the use of touch sensitive display screens which allow a user to employ a soft keyboard, for example, or to actuate an application icon using contact with the display screen. In alternative arrangements, a touch sensitive keypad may be used to receive scribed strokes of a user's finger in order to input data such as scribed letters which can then be displayed on a non-touch sensitive screen. In a yet further alternative arrangement, a full QWERTY keyboard may be temporarily connected to the electronic device for data entry or other user interface intensive tasks.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] In order that the invention may be readily understood and put into practical effect, reference will now be made to an exemplary embodiment as illustrated with reference to the accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views. The figures together with a detailed description below, are incorporated in and form part of the specification, and serve to further illustrate the embodiments and explain various principles and advantages, in accordance with the present invention where:
[0006] Fig. 1 is a schematic block diagram illustrating circuitry of an electronic device in accordance with the invention;
[0007] Fig. 2A and 2B illustrate an electronic device comprising a touch sensitive keypad integrated into an array of user actuable input keys in exploded perspective and section views respectively;
[0008] Fig. 3A and 3B illustrate operation of an electronic device touch sensitive display screen and touch sensitive keypad according to an embodiment;
[0009] Fig. 4 illustrates a flow chart for an algorithm according to an embodiment; and
[0010] Fig. 5 illustrates a flow chart for an algorithm according to another embodiment.
[0011] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
DETAILED DESCRIPTION
[0012] In general terms, in one aspect there is provided a method of associating objects in an electronic device, comprising: identifying a first object in response to detecting an initial contact of a scribed stroke at a location of a first area of a touch sensitive user interface which corresponds with the first object; identifying a second object in response to detecting a final contact of the scribed stroke at a location of a second area of the touch sensitive user interface which corresponds with the second object; and associating the first object with the second object. One of the first and second areas of the touch sensitive user interface is a touch sensitive display screen and the other area of the touch sensitive user interface is a touch sensitive keypad.
[0013] An object refers to an entity that represents some underlying data, state, function, operation or application. For example one of the objects may be data such as an email address from a contacts database, and the other object may be a temporary storage location or another application such as an email client. Associating one object with another refers to copying or moving the contents of one object to another and/or executing one of the objects using the contents of the other object; or to linking the first object to the second object for example as a short-cut key to an application. In another example, an email client (one object) may be executed and open a new email using the email address from the other object (a contact), or the contents of one object (eg email address from contacts database) may be copied into a temporary storage location (the other object). This enables drag-and-drop operations to be carried out on a small electronics device. Whilst examples of objects and associations have been given above, the skilled person will recognise that these terms are not so limited and will be familiar with other examples of computing objects and associations.
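To make these association semantics concrete, the following is a minimal sketch, assuming a simple object model in which an object is either data, a temporary storage slot or an application; all type and function names here are illustrative assumptions and are not taken from the patent.

```kotlin
// Hypothetical sketch of the association semantics described above; names are illustrative.
sealed interface UiObject
data class DataObject(val content: String) : UiObject            // e.g. an email address from contacts
data class StorageSlot(var stored: UiObject? = null) : UiObject  // e.g. a temporary store bound to a key
data class AppObject(val name: String, val launch: (String?) -> Unit) : UiObject  // e.g. an email client

/** Associate `first` with `second`: store/copy content, or execute an application using it. */
fun associate(first: UiObject, second: UiObject) {
    when (second) {
        is StorageSlot -> second.stored = first                           // copy/store the first object
        is AppObject   -> second.launch((first as? DataObject)?.content)  // run the app with the content
        is DataObject  -> println("Application-specific merge of $first into $second")
    }
}

fun main() {
    val address = DataObject("alice@example.com")
    val emailClient = AppObject("email") { to -> println("New email to $to") }
    associate(address, emailClient)   // open a new email using the address from the other object
}
```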
[0014] In an embodiment, the first area of the touch sensitive user interface is the touch sensitive display screen, and the second area (or other area) is the touch sensitive keypad. In such an embodiment, a user may drag-and-drop an email address from a contacts database open on the display screen to a temporary storage location associated with a key on the touch sensitive keypad. By touching the email address, and dragging this over the screen to the appropriate key of the touch sensitive keypad, the email address is stored and may be retrieved later; for example to copy into another application such as an email client newly displayed on the display screen. In an alternative embodiment, the first area of the touch sensitive user interface is the touch sensitive keypad, and the second area (or other area) is the touch sensitive display screen.
[0015] Before describing in detail embodiments that are in accordance with the present invention, it should be observed that the embodiments reside primarily in combinations of method steps and device components related to associating objects in an electronic device. Accordingly, the device components and method steps have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
[0016] In this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises ...a" does not, without more constraints, preclude the existence of additional identical elements in the method, or device that comprises the element. Also, throughout this specification the term "key" has the broad meaning of any key, button or actuator having a dedicated, variable or programmable function that is actuatable by a user.
[0017] It will be appreciated that embodiments of the invention described herein may be comprised of one or more conventional processors and unique stored program instructions that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of associating objects in an electronic device described herein. The non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and user input devices. As such, these functions may be interpreted as steps of a method to perform user function activation on an electronic device. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used. Thus, methods and means for these functions have been described herein. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
[0018] Referring to FIG. 1, there is a schematic diagram illustrating an electronic device 100, typically a wireless communications device, in the form of a mobile station or mobile telephone comprising a radio frequency communications unit 102 coupled to be in communication with a processor 103. The electronic device 100 also has a touch sensitive user interface 170. In this embodiment, the first area of the touch sensitive user interface comprises a touch sensitive display screen 105, and the second area (or other area) of the touch sensitive user interface comprises a touch sensitive keypad 165. However, the first area of the touch sensitive user interface can be the touch sensitive keypad 165 and the second area (or other area) of the touch sensitive user interface can be the touch sensitive display screen 105. There is also an alert module 115 that typically contains an alert speaker, vibrator motor and associated drivers. The touch sensitive display screen 105, touch sensitive keypad 165 and alert module 115 are coupled to be in communication with the processor 103. Typically the touch sensitive display screen 105 and the touch sensitive keypad 165 of the touch sensitive user interface 170 will be located adjacent each other in order to facilitate user operation.
[0019] The processor 103 includes an encoder/decoder 111 with an associated code Read Only Memory (ROM) 112 for storing data for encoding and decoding voice or other signals that may be transmitted or received by the electronic device 100. The processor 103 also includes a micro-processor with object association function 113 coupled, by a common data and address bus 117, to the encoder/decoder 111, a character Read Only Memory (ROM) 114, radio frequency communications unit 102, a Random Access Memory (RAM) 104, static programmable memory 116 and a Removable User Identity Module (RUIM) interface 118. The static programmable memory 116 and a RUIM card 119 (commonly referred to as a Subscriber Identity Module (SIM) card) operatively coupled to the RUIM interface 118 each can store, amongst other things, Preferred Roaming Lists (PRLs), subscriber authentication data, selected incoming text messages and a Telephone Number Database (TND phonebook) comprising a number field for telephone numbers and a name field for identifiers associated with one of the numbers in the name field. The RUIM card 119 and static memory 116 may also store passwords for allowing accessibility to password-protected functions on the electronic device 100.
[0020] The micro-processor with object association function 113 has ports for coupling to the display screen 105, the keypad, the alert module 115, microphone 135 and a communications speaker 140 that are integral with the device.
[0021] The character Read Only Memory 114 stores code for decoding or encoding text messages that may be received by the radio frequency communications unit 102. In this embodiment the character Read Only Memory 114, RUIM card 119, and static memory 116 may also store Operating Code (OC) for the micro-processor with object association function 113 and code for performing functions associated with the electronic device 100.
[0022] The radio frequency communications unit 102 is a combined receiver and transmitter having a common antenna 107. The radio frequency communications unit 102 has a transceiver 108 coupled to the antenna 107 via a radio frequency amplifier 109. The transceiver 108 is also coupled to a combined modulator/demodulator 110 that couples the radio frequency communications unit 102 to the processor 103.
[0023] The touch sensitive user interface 170 detects manual contact from a user's finger or stylus on either or both of the display screen 105 and the keypad 165. The detected manual contacts are interpreted by the processor 103 as points or lines of contact or touch across an xy co-ordinate system of the first (105) and second (165) area of the touch sensitive user interface 170. The interpretation of the detected manual contacts as points or lines of contact by the processor 103 will typically be implemented with the execution of program code as will be appreciated by those skilled in the art. In alternative embodiments, this function may be achieved using an ASIC or equivalent hardware.
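The interpretation step described above can be pictured with the following sketch, assuming the display screen and keypad share a single xy coordinate space stacked vertically; the class and parameter names are assumptions for illustration only.

```kotlin
// Illustrative only: classify raw contact points into the first (display) or second (keypad)
// area of the touch sensitive user interface, assuming the two areas share one xy space.
data class Contact(val x: Int, val y: Int)

enum class Area { DISPLAY_SCREEN, KEYPAD, NONE }

class TouchSensitiveUi(private val screenHeight: Int, private val keypadHeight: Int) {
    /** Which area of the touch sensitive user interface a contact point falls in. */
    fun areaOf(c: Contact): Area = when {
        c.y < screenHeight                -> Area.DISPLAY_SCREEN
        c.y < screenHeight + keypadHeight -> Area.KEYPAD
        else                              -> Area.NONE
    }

    /** Interpret a sequence of raw contacts as a scribed stroke of (area, point) samples. */
    fun interpretStroke(raw: List<Contact>): List<Pair<Area, Contact>> = raw.map { areaOf(it) to it }
}

fun main() {
    val ui = TouchSensitiveUi(screenHeight = 320, keypadHeight = 160)
    val stroke = listOf(Contact(100, 50), Contact(100, 300), Contact(100, 400))
    println(ui.interpretStroke(stroke))   // stroke starts on the screen and ends on the keypad
}
```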
[0024] FIG 2A and 2B illustrate in more detail an example touch sensitive keypad arrangement. Touch sensitive display screens 105 are well known to those skilled in the art and are not further described here. The touch sensitive keypad 165 comprises a number of user input keys 265 which are integrated in an overlaying relation with a capacitive sensor array 272 which detects changes in capacitance corresponding to the presence of a user's digit or other object such as a stylus. The touch sensitive keypad 165 or second area of the touch sensitive user interface 170 allows for receiving user contact, touch points or lines of contact with the keypad 165. Detection of a finger or stylus does not require pressure against the capacitive sensor array 272 or user input keys 265, but typically just a light touch or contact against the surface of the keypad 165; or even just close proximity. It is therefore possible to integrate the user input keys 265 and the capacitive sensor array 272, as the keys 265 require physical pressure or a tactile force for actuation whereas the capacitive sensors of the capacitive sensor array 272 do not. Thus it is possible to detect manual contact at the keypad 165 without actuating any of the user input keys 265. An example of a touch sensitive keypad 165 is the finger writing recognition tablet on the A668 mobile phone available from Motorola Incorporated. As shown, the user input keys 265 each have a plunger that passes through apertures 275 in the capacitive sensor array 272 and contacts a respective dome switch 280 on a switch substrate 285.
[0025] Whilst capacitive sensors are typically used, other sensor arrays may alternatively be used such as ultrasound sensors to detect the user input object's position. Similarly the "activation" of a sensor may be configured to correspond to contact between a user input object such as a finger and the surface of the tablet, or even close proximity of the distal end of a user input object with the sensor such that actual physical contact with the tablet surface may not be required.
[0026] The changes in capacitance detected at the capacitive sensor array 272 are translated into a contact location on an xy grid by the processor 103. Alternatively the points or strokes of contact may be captured by an ink trajectory processor as ink trajectories with respect to the co-ordinate system of the touch sensitive keypad 165. These inks or manual contact locations are then forwarded to the processor 103 and interpreted as manual contact locations for further processing as described in more detail below. A suitable ink trajectory processor may be that used in the Motorola™ A688 mobile phone.
[0027] FIG 3A and 3B illustrate an electronic device 100 which comprises a touch sensitive display screen 105 and a touch sensitive keypad 165 having a number of user actuable keys for entering user data and controls. The keypad 165 includes a send key (top left) 365 for sending messages or as a call key for voice communication, and a close key (top right) which can be used to close applications and terminate a voice call. The display screen 105 includes a number of icons 320 corresponding to various applications or functions that the user of the electronic device 100 may use.
[0028] Together, FIG 3A and 3B also illustrate a method of using the electronic device 100 (typically a mobile phone). A user's finger 310 can be used to drag an icon from the touch sensitive display screen 105 to the touch sensitive keypad 165. In this example, the icon is associated with a BluetoothTM application or first object. Movement of the (BluetoothTM) icon 320 across the touch sensitive display screen 105 is indicated by the partially drawn icon 325 which corresponds with the point of contact of the finger 310 across the display screen 105. The user's finger 310 moves from the touch sensitive display screen 105 to the touch sensitive keypad 165 as shown in FIG 3B. Here the user's finger 310 is touching the send key 365. The send key 365 in this example is associated with a storage location or second object. By dragging the icon 320 to the send key 365, the BluetoothTM application or first object is associated with the storage location or second object. In order to identify the first object, the initial contact of a scribed stroke or user "drag" operation is detected which corresponds to the location of an icon 320 on the touch sensitive display screen 105. In order to identify the second object with which to associate the first object, a final contact of the scribed stroke or a user "drop" operation is detected which corresponds to the location of a key 365 on the touch sensitive keypad 165.
[0029] The final contact corresponds to the lifting off of the user's finger 310 from the keypad 165. Thus a first object (BluetoothTM application) is associated (shortcut link) to a second object (storage location). This shortcut to the BluetoothTM application may then be used subsequently, for example when a different application is open or displayed on the display screen 105. When a user has completed an email, the BluetoothTM application may be dragged from the send key over to the email, causing the email to be sent via BluetoothTM.
[0030] In an alternative embodiment, the step of associating one object with another might be achieved by actuating a key (365) on the keypad 165 instead of simply terminating contact by lifting the user's finger 310 from the keypad 165. This means that in some embodiments, a touch sensitive keypad 165 may not be needed, and instead the icon 325 may be dragged across to the edge of the touch sensitive screen 105 and then a key 265 may be actuated to associate the object represented by the icon (BluetoothTM application) with the object represented by the actuated key (storage location).
[0031] FIG 4 illustrates in more detail a method of associating objects in an electronic device 100. This method 400 will typically be implemented by executing a software program from the static memory 116 on the microprocessor with object association function 113 which receives inputs from the touch sensitive user interface 170. The method 400 is activated on the electronic device 100 by the user selecting an object association mode at step 405, for example by selecting a menu option. The method then monitors the first area of the touch sensitive user interface or touch sensitive display screen 105 in this embodiment in order to detect an initial contact of a scribed stroke at a location corresponding to a first object at step 410. The scribed stroke corresponds to the movement of the point of contact of a user's finger 310 or stylus across the first and second areas of the touch sensitive user interface 105 and 165. The location corresponding to the first object may be indicated by an icon 320 as previously described; for example the BluetoothTM application icon of FIG 3A.
[0032] If no initial contact is detected (410N), for example after a predetermined time, then the method terminates at step 415. If however an initial contact is detected (410Y), then in response the first object (the Bluetooth™ application) is identified according to the location of the detected initial contact at step 420. For example, if the initial contact is at the location of the Bluetooth™ icon 320, then the Bluetooth™ application is identified as the first object. The method 400 then determines whether the point of contact moves over the first area of the touch sensitive user interface at step 425. If not (425N), there is no scribed stroke but only a stationary or temporary contact, and the method performs conventional object execution at step 430. For example, if the Bluetooth™ icon 320 is merely touched by the user's finger 310, then the Bluetooth™ application is launched or executed and the method then terminates. If however the point of contact moves (425Y), then the method displays on the touch sensitive screen movement of the icon 320 corresponding to or following movement of the point of contact of the scribed stroke over the display screen 105 at step 435. This movement of the icon was shown in FIG 3A by the partially drawn icon 325 following the user's finger across the display screen 105.
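A minimal sketch of the decision at steps 425 and 430, distinguishing a stationary tap (conventional object execution) from a scribed stroke (drag), might look as follows. The sample points and the movement threshold are assumptions for illustration.

```python
DRAG_THRESHOLD = 10  # pixels of movement below which the contact is a tap (assumed)

def classify_contact(samples):
    """Return 'execute' for a stationary or temporary contact, or 'drag' once
    the point of contact moves beyond the threshold (a scribed stroke)."""
    x0, y0 = samples[0]
    for x, y in samples[1:]:
        if abs(x - x0) > DRAG_THRESHOLD or abs(y - y0) > DRAG_THRESHOLD:
            return "drag"
    return "execute"

print(classify_contact([(42, 58), (43, 59)]))               # execute -> launch the object
print(classify_contact([(42, 58), (80, 120), (30, 250)]))   # drag    -> begin association
```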
[0033] The method 400 then determines whether the scribed stroke or point of contact extends or moves to the other area of the touch sensitive user interface (the touch sensitive keypad 165 in this embodiment) at step 440. This may be implemented by detecting touch at any location on the keypad 165. If the scribed stroke does not extend onto the touch sensitive keypad 165 (440N), then the method returns to the step of determining whether the point of contact of the scribed stroke moves over the touch sensitive display screen 105 at step 425. If however the scribed stroke does extend onto the touch sensitive keypad 165 (440Y), then the method displays on the display screen 105 an indication of the key 265, 365 on the touch sensitive keypad 165 corresponding to the point of contact of the scribed stroke at step 445. An example indication 330 is shown in FIG 3B, which displays both a label for the first object, in this case Bluetooth™, together with a label for the key, in this case "Send". Alternative indications may be used, for example simply displaying the symbol printed on the key 265 which is currently being touched by the user.
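One way to picture steps 440 and 445 is to classify the current point of contact by area and, once it lies in the keypad area, compose the indication 330 from the first object's label and the key's label. The boundary value and labels below are assumptions for illustration only.

```python
SCREEN_HEIGHT = 200  # assumed boundary between the display area and the keypad area

def area_of(point):
    """Classify a point of contact as belonging to the display or the keypad."""
    return "screen" if point[1] < SCREEN_HEIGHT else "keypad"

def indication(first_object_label, key_label):
    """Build the on-screen indication, e.g. the 'Bluetooth -> Send' label of FIG 3B."""
    return f"{first_object_label} -> {key_label}"

point = (30, 250)
if area_of(point) == "keypad":
    print(indication("Bluetooth", "Send"))  # shown on the display screen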
[0034] The method 400 then monitors the second area of the touch sensitive user interface (the keypad 165) to detect a final contact of the scribed stroke at a location corresponding to a second object at step 450. Detecting a final contact may comprise detecting lift off of the finger 310 or stylus from the keypad 165, and if this is at a key 265 which is associated with a second object (450Y), then in response the method identifies the second object at step 455. The second object (e.g. a temporary storage location) is identified according to the location of the detected final contact (e.g. the send key). If however a final contact is not detected after a predetermined time, or a final contact is detected which does not correspond with a second object (450N), for example because the final contact is between keys or is over a key not assigned to a second object, then the method 400 returns to determine whether the scribed stroke still extends over the second area of the touch sensitive user interface at step 440. Whilst locations of the second area of the touch sensitive user interface 170 which correspond to a second object have been described as also corresponding to keys 265, 365, this need not be the case. For example, the second objects may be assigned simply to xy coordinates on the keypad 165 and can be identified solely using the indication 330 on the display screen 105.
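The alternative noted above, in which second objects are assigned to xy coordinates on the keypad rather than to particular keys, might be sketched as a lookup of the lift-off point against assigned regions; returning no match corresponds to branch 450N. Region bounds and slot names are illustrative assumptions.

```python
# Assumed rectangular regions of the keypad, each assigned to a second object.
KEYPAD_REGIONS = [
    ((0, 220, 60, 260), "storage_slot_1"),    # (x0, y0, x1, y1) -> second object
    ((60, 220, 120, 260), "storage_slot_2"),
]

def second_object_at(lift_off_point):
    """Return the second object assigned to the lift-off coordinates, or None
    when the final contact does not correspond to a second object (450N)."""
    x, y = lift_off_point
    for (x0, y0, x1, y1), obj in KEYPAD_REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return obj
    return None

print(second_object_at((30, 250)))   # storage_slot_1
print(second_object_at((200, 250)))  # None -> keep monitoring
```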
[0035] In an alternative embodiment using a non touch sensitive keypad, the method identifies the second object in response to detecting actuation of a key on the keypad which corresponds with the second object. In this case, actuation of the key follows termination of the scribed stroke on the touch sensitive display screen.
[0036] Once a second object has been identified at step 455, the method associates the first and second objects at step 460. As described previously, association of two objects can cover a variety of actions including moving or copying content from one object to another, storing the content of the first object (in the second object - a temporary storage location), or providing a shortcut or other link from one object to another. Where one of the objects is an application, this application may be automatically executed upon associating the first and second objects. For example, a Bluetooth™ object may be started when associated with an email object in order to send the email over a Bluetooth™ connection.
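A hedged sketch of how the associating step 460 might dispatch between the actions described above (storing the first object in a storage location, or linking it to another object and automatically executing an application). The object model and field names are assumptions, not the patent's implementation.

```python
def launch(app_name):
    print(f"launching {app_name}")

def associate(first, second):
    """Associate the first object with the second, per the behaviours above."""
    if second["kind"] == "storage":
        second["content"] = first           # store / shortcut the first object under the key
    elif first["kind"] == "application":
        launch(first["name"])               # auto-execute, e.g. start Bluetooth
        second["sent_via"] = first["name"]  # e.g. send the email over Bluetooth
    else:
        second["content"] = first           # copy content from first to second
    return second

# Dragging the Bluetooth icon onto the send key stores a shortcut under the key.
print(associate({"kind": "application", "name": "Bluetooth"}, {"kind": "storage"}))
# Dragging the stored Bluetooth object onto an email sends the email via Bluetooth.
print(associate({"kind": "application", "name": "Bluetooth"}, {"kind": "email"}))
```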
[0037] FIG 5 illustrates a method of associating objects in an electronic device 100 in accordance with an alternative embodiment, in which an object is dragged from the keypad 165 to the screen 105. The method 500 is activated on the electronic device 100 by the user selecting an object association mode at step 505, for example by selecting a menu option. The method then attempts to detect an initial contact of a scribed stroke at a location of the first area of the touch sensitive user interface which corresponds with a first object at step 510. The first area of the touch sensitive user interface in this embodiment is the touch sensitive keypad 165 instead of the touch sensitive display screen 105. As previously described, the scribed stroke corresponds to the movement of the point of contact of a user's finger 310 or stylus across the first and second areas of the touch sensitive user interface 165 and 105. The location corresponding to the first object may be a key 265 as previously described; for example the send key 365 of FIG 3A.
[0038] If no initial contact is detected (510N), for example after a predetermined time, then the method terminates at step 515. If however an initial contact is detected (510Y), then the first object is identified according to the location of the detected initial contact at step 520. This first object may be the contents of a temporary storage location associated with the send key 365, for example a contact's email address. In another example, the object may be an application such as Bluetooth™. The method 500 then determines whether the point of contact moves over the first area of the touch sensitive user interface (the keypad 165) at step 525. If not (525N), there is no scribed stroke but only a stationary or temporary contact, and the method performs conventional object execution at step 530. For example, if the send key is merely touched by the user's finger 310, then the Bluetooth™ application may be launched or executed and the method then terminates. If the object associated with the send key is content, then no action is taken. If however the point of contact moves (525Y), then the method displays on the display screen 105 an indication of the key on the touch sensitive keypad 165 corresponding to the point of contact of the scribed stroke at step 535. An example indication 330 is shown in FIG 3B, which displays both a label for the first object, in this case Bluetooth™, together with a label for the key 365, in this case "Send".
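For illustration, the per-key behaviour at step 530 can be pictured as a lookup of whatever object is currently held under the key: an application is launched on a stationary tap, while stored content produces no action. The registry below and its entries are assumptions.

```python
# Assumed registry of objects currently held under keys.
KEY_STORE = {
    "send": {"kind": "application", "name": "Bluetooth"},
    "1": {"kind": "content", "name": "alice@example.com"},
}

def on_key_tap(key):
    """Conventional execution for a stationary tap on a key (step 530)."""
    obj = KEY_STORE.get(key)
    if obj and obj["kind"] == "application":
        print(f"launching {obj['name']}")
    # content objects produce no action on a stationary tap

on_key_tap("send")  # launching Bluetooth
on_key_tap("1")     # no action taken
```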
[0039] The method 500 then determines whether the scribed stroke or point of contact extends or moves to the second area of the touch sensitive user interface (the display screen 105) at step 540. This may be implemented by detecting touch at any location on the display screen 105, or within a limited region of the display screen 105 adjacent the keypad 165, for example. If the scribed stroke does not extend onto the touch sensitive display screen 105 (540N), then the method returns to the step of determining whether the point of contact of the scribed stroke moves over the first area of the touch sensitive user interface at step 525. If however the scribed stroke does extend onto the touch sensitive display screen 105 (540Y), then the method displays movement of an icon 320 corresponding to the first object and following movement of the point of contact of the scribed stroke over the display screen 105 at step 545. An example of this movement of the icon is shown in FIG 3A by the partially drawn icon 325 following the user's finger across the display screen 105.
[0040] The method 500 then attempts to detect a final contact of the scribed stroke at a location of the second area of the touch sensitive user interface which corresponds to a second object at step 550. Unlike in the method 400 of FIG 4, the second area of the touch sensitive user interface is the display screen 105 in this embodiment. Detecting a final contact may comprise detecting lift off of the finger 310 or stylus from the display screen 105, and if this is at an icon 320 which is associated with a second object (550Y), then the method identifies the second object at step 555. The second object (e.g. a user application) is identified according to the location of the detected final contact, which is typically indicated by an on-screen icon 320. If however a final contact is not detected after a predetermined time, or a final contact is detected which does not correspond with a second object (550N), for example because the final contact is between icons or is over an icon not assigned to a second object, then the method 500 returns to determine whether the scribed stroke still extends over the second area of the touch sensitive user interface at step 540.
[0041] Once a second object has been identified at step 555, the method associates the first and second objects at step 560. As described previously, association of two objects can cover a variety of actions including copying or moving the content from one object to another, storing the content of the first object (in the second object - a temporary storage location), or providing a shortcut or other link from one object to another. Where one of the objects is an application, this may be automatically executed upon associating the first and second objects. For example, a Bluetooth™ object may be started when associated with an email object in order to send the email over a Bluetooth™ connection.
[0042] Various example uses of the embodiments have already been described, including copying the contents of the first object into the second object, and optionally executing the second object in the same drag and drop user operation. This avoids the use of multiple menu selections, which is time consuming and inconvenient for the user. This is one example of transferring objects. The embodiments may also be used to store objects, for example storing content or an application in a temporary storage location (the second object). These stored objects may be persisted, for example in non-volatile memory. This might allow, for example, a draft SMS message to be saved even after switching the device off, or the device keys to be customised to provide a shortcut to an application.
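A minimal sketch of persisting key assignments so that stored objects survive a power cycle follows. The use of a JSON file and the file name are assumptions for illustration; an actual device would write to its own non-volatile memory.

```python
import json

def save_assignments(assignments, path="key_store.json"):
    """Write the key-to-object assignments to (assumed) non-volatile storage."""
    with open(path, "w") as f:
        json.dump(assignments, f)

def load_assignments(path="key_store.json"):
    """Reload the assignments after a power cycle; empty if none were stored."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

save_assignments({"send": {"kind": "content", "name": "Draft SMS text"}})
print(load_assignments()["send"]["name"])  # Draft SMS text
```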
[0043] The embodiments provide a number of advantages and functions, for example: seamless drag and drop operations across the display and the keypad; object storage through a drag and drop operation from a mobile device display to its keypad; object transfer through a drag and drop operation from the mobile device keypad to its display; and the ability to persist stored objects across mobile device power cycles and/or to quickly switch applications.
[0044] As described above, due to display size restrictions, existing small device user interface designs do not support object drag and drop operations. Storing and using objects is usually done through an on-screen menu which provides copy and paste functionality. The embodiments, however, take fewer steps to achieve the task. Instead of invoking menus and making selections, the embodiments use the convenient drag and drop method to perform the operation. They are also more flexible in terms of dropping objects, as the device can give continuous user interface feedback while objects are being moved but before they are dropped. For example, when a user is editing an MMS and wants to insert a picture, he can drag the picture object over the MMS text, and while the object is moving, the MMS editor will lay out the MMS contents dynamically and give an instant preview of what happens if the image is inserted at the current location. This allows the user to see the editing effects without committing the operation. Only if the user is satisfied with the preview does he proceed to drop the object, which completes the operation. This provides a seamless editing experience that cannot be achieved using menu-based operations.
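The live preview described above can be pictured as laying out the message as if the dragged image were inserted at the current hover position, without committing anything until the drop. The function and sample text below are illustrative assumptions only.

```python
def preview_layout(mms_text, hover_index, image_name):
    """Return the message laid out as if the image were inserted at the hover
    position; the real message is only changed when the object is dropped."""
    return mms_text[:hover_index] + f"[{image_name}]" + mms_text[hover_index:]

draft = "Here is the photo from the trip."
print(preview_layout(draft, 12, "beach.jpg"))  # shown while the object is dragged
```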
[0045] In an embodiment the device can be configured such that the drag and drop operation from the display to the keypad effectively stores and assigns the (first) object to the destination key (second object), while the drag and drop operation from the keypad to the display effectively applies the stored (first) object to the drop location (second object). As will be appreciated by those skilled in the art, the semantics of applying a stored object are application and object specific. For example, in the above scenario, applying the stored "Bluetooth" object to any screen may be configured to launch the Bluetooth™ application, and this serves as an easy way of customizing a shortcut key.
[0046] Further example dropping operations include: dropping a contact onto an SMS screen to start editing a message to that person; dropping a URL onto a browser to navigate to the web page; and dropping a text fragment onto any editor to paste the text.
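These drop operations amount to dispatching on the type of the dropped object and on the target; a small illustrative sketch follows, in which the handler strings and type names are assumptions rather than defined behaviour.

```python
def drop(obj, target):
    """Dispatch a dropped object according to its type and the drop target."""
    if obj["type"] == "contact" and target == "sms":
        return f"compose SMS to {obj['value']}"
    if obj["type"] == "url" and target == "browser":
        return f"navigate to {obj['value']}"
    if obj["type"] == "text":
        return f"paste '{obj['value']}' into {target}"
    return "no drop action defined"

print(drop({"type": "contact", "value": "Alice"}, "sms"))
print(drop({"type": "url", "value": "http://example.com"}, "browser"))
print(drop({"type": "text", "value": "hello"}, "editor"))
```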
[0047] Switching screens in mobile devices has always been troublesome. For example, where a user is editing an SMS and then wants to copy some web content into the message, he needs to launch the browser. The problem with known solutions is that, after the browser is launched, the user has no quick way to go back to the SMS editing screen. The user can always close the browser screen, however this is typically sub-optimal and may not always be what the user wants. In an embodiment, the screen can be treated as another type of object. Prior to launching the browser, the user can drag the entire screen (through some designated area, such as the screen header) to a key. After launching the browser and copying the content, the user can drag the key onto the display, effectively restoring the SMS edit screen.
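Treating the screen as an object can be pictured as storing a snapshot of the screen state under a key and restoring it later; the state model below is an assumption made purely for illustration.

```python
key_store = {}

def store_screen(key, screen_state):
    """Assign the current screen state to a key (drag screen header to key)."""
    key_store[key] = screen_state

def restore_screen(key):
    """Bring back the stored screen (drag the key onto the display)."""
    return key_store.get(key)

store_screen("1", {"app": "sms_editor", "draft": "Check out this page"})
# ... the user launches the browser and copies content ...
print(restore_screen("1"))  # the SMS edit screen state is restored
```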
[0048] In the foregoing specification, specific embodiments of the present invention have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present invention. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all of the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims.

Claims

WE CLAIM:
1. A method of associating objects in an electronic device, the method comprising: identifying a first object in response to detecting an initial contact of a scribed stroke at a location of a first area of a touch sensitive user interface which corresponds with the first object; identifying a second object in response to detecting a final contact of the scribed stroke at a location of a second area of the touch sensitive user interface which corresponds with the second object; and associating the first object with the second object, wherein one of the first and second areas of the touch sensitive user interface is a touch sensitive display screen and the other area of the touch sensitive user interface is a touch sensitive keypad.
2. A method as claimed in claim 1, wherein the location of the first area of the touch sensitive user interface corresponding to the first object comprises an icon on the touch sensitive display screen, and the location of the second area of the touch sensitive user interface corresponding to the second object comprises a key on the touch sensitive keypad.
3. A method as claimed in claim 1, wherein the location of the first area of the touch sensitive user interface corresponding to the first object comprises a key on the touch sensitive keypad, and the location of the second area of the touch sensitive user interface corresponding to the second object comprises an icon on the touch sensitive display screen.
4. A method as claimed in claim 2, further comprising displaying on the touch sensitive display screen an indication of the key on the touch sensitive keypad corresponding to the point of contact in the scribed stroke at the touch sensitive keypad.
5. A method as claimed in claim 4, further comprising displaying on the touch sensitive display screen movement of the icon of a said object corresponding to movement of the point of contact of the scribed stroke at the touch sensitive display screen.
6. A method as claimed in claim 1, wherein associating the first object with the second object comprises copying content from the first object into the second object.
7. A method as claimed in claim 6, wherein the second object is a temporary storage location.
8. A method as claimed in claim 6, wherein the second object is an application which is automatically executed upon associating the first and second objects.
9. A method of associating objects in an electronic device, the method comprising: identifying a first object in response to detecting an initial contact of a scribed stroke at a location of a first area of a touch sensitive display screen which corresponds with the first object; and identifying a second object in response to detecting actuation of a key on a keypad which corresponds with the second object, said actuation of the key following termination of the scribed stroke on the touch sensitive display screen; and associating the first object with the second object.
10. An electronic device comprising: a touch sensitive user interface for receiving scribed strokes and having a first area and a second area; a processor arranged to identify a first object in response to detecting an initial contact of the scribed stroke at a location of the first area of the touch sensitive user interface corresponding to the first object, identify a second object in response to detecting a final contact of the scribed stroke at a location of the second area of the touch sensitive user interface corresponding to the second object, and associate the first object with the second object, wherein one of the first and second areas of the touch sensitive user interface is a touch sensitive display screen and the other area of the touch sensitive user interface is a touch sensitive keypad.
11. An electronic device as claimed in claim 10, wherein the touch sensitive display screen is arranged to display an icon at the location of the first area of the touch sensitive user interface corresponding to the first object, and wherein the location of the second area of the touch sensitive user interface corresponding to the second object comprises a key on the touch sensitive keypad.
12. An electronic device as claimed in claim 10, wherein the touch sensitive display screen is arranged to display an icon at the location of the first area of the touch sensitive user interface corresponding to the second object, and wherein the location of the second area of the touch sensitive user interface corresponding to the first object comprises a key on the touch sensitive keypad.
13. An electronic device as claimed in claim 10, wherein the touch sensitive display screen is arranged to display an indication of the key on the touch sensitive keypad corresponding to the point of contact in the scribed stroke at the touch sensitive keypad.
14. An electronic device as claimed in claim 13, wherein the touch sensitive display screen is arranged to display movement of the icon of a said object corresponding to movement of the point of contact of the scribed stroke at the touch sensitive display screen.
15. An electronic device as claimed in claim 10, wherein associating the first object with the second object comprises copying content from the first object into the second object.
16. An electronic device as claimed in claim 15, wherein the second object is a temporary storage location.
17. An electronic device as claimed in claim 10, wherein the second object is an application which is automatically executed upon associating the first and second objects.
PCT/US2008/075784 2007-09-24 2008-09-10 Method and device for associating objects WO2009042399A2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
MX2010003243A MX2010003243A (en) 2007-09-24 2008-09-10 Method and device for associating objects.
EP08799388A EP2203807A2 (en) 2007-09-24 2008-09-10 Method and device for associating objects
KR1020117030347A KR20120013439A (en) 2007-09-24 2008-09-10 Method for associating objects in electronic device
BRPI0818011A BRPI0818011A8 (en) 2007-09-24 2008-09-10 METHOD AND DEVICE TO ASSOCIATE OBJECTS
CN200880108327A CN101809532A (en) 2007-09-24 2008-09-10 Method and device for associating objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/859,915 2007-09-24
US11/859,915 US20090079699A1 (en) 2007-09-24 2007-09-24 Method and device for associating objects

Publications (2)

Publication Number Publication Date
WO2009042399A2 true WO2009042399A2 (en) 2009-04-02
WO2009042399A3 WO2009042399A3 (en) 2009-06-18

Family

ID=40394430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/075784 WO2009042399A2 (en) 2007-09-24 2008-09-10 Method and device for associating objects

Country Status (8)

Country Link
US (1) US20090079699A1 (en)
EP (1) EP2203807A2 (en)
KR (2) KR101152008B1 (en)
CN (1) CN101809532A (en)
BR (1) BRPI0818011A8 (en)
MX (1) MX2010003243A (en)
RU (1) RU2446441C2 (en)
WO (1) WO2009042399A2 (en)

Families Citing this family (150)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
EP1947562A3 (en) * 2007-01-19 2013-04-03 LG Electronics Inc. Inputting information through touch input device
US8351666B2 (en) * 2007-11-15 2013-01-08 General Electric Company Portable imaging system having a seamless form factor
TWI407336B (en) * 2007-11-22 2013-09-01 Htc Corp Electronic devices and input modules thereof
US8063879B2 (en) 2007-12-20 2011-11-22 Research In Motion Limited Method and handheld electronic device including first input component and second touch sensitive input component
US8340634B2 (en) 2009-01-28 2012-12-25 Headwater Partners I, Llc Enhanced roaming services and converged carrier networks with device assisted services and a proxy
US8402111B2 (en) 2009-01-28 2013-03-19 Headwater Partners I, Llc Device assisted services install
US8548428B2 (en) 2009-01-28 2013-10-01 Headwater Partners I Llc Device group partitions and settlement platform
US8626115B2 (en) 2009-01-28 2014-01-07 Headwater Partners I Llc Wireless network service interfaces
US8391834B2 (en) 2009-01-28 2013-03-05 Headwater Partners I Llc Security techniques for device assisted services
US8630192B2 (en) 2009-01-28 2014-01-14 Headwater Partners I Llc Verifiable and accurate service usage monitoring for intermediate networking devices
US8406748B2 (en) 2009-01-28 2013-03-26 Headwater Partners I Llc Adaptive ambient services
US8589541B2 (en) 2009-01-28 2013-11-19 Headwater Partners I Llc Device-assisted services for protecting network capacity
US8635335B2 (en) 2009-01-28 2014-01-21 Headwater Partners I Llc System and method for wireless network offloading
US8924469B2 (en) 2008-06-05 2014-12-30 Headwater Partners I Llc Enterprise access control and accounting allocation for access networks
US8275830B2 (en) 2009-01-28 2012-09-25 Headwater Partners I Llc Device assisted CDR creation, aggregation, mediation and billing
US8898293B2 (en) 2009-01-28 2014-11-25 Headwater Partners I Llc Service offer set publishing to device agent with on-device service selection
US8346225B2 (en) 2009-01-28 2013-01-01 Headwater Partners I, Llc Quality of service for device assisted services
US8924543B2 (en) 2009-01-28 2014-12-30 Headwater Partners I Llc Service design center for device assisted services
US8725123B2 (en) 2008-06-05 2014-05-13 Headwater Partners I Llc Communications device with secure data path processing agents
US8832777B2 (en) 2009-03-02 2014-09-09 Headwater Partners I Llc Adapting network policies based on device service processor configuration
US8284170B2 (en) 2008-09-30 2012-10-09 Apple Inc. Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
EP2175354A1 (en) * 2008-10-07 2010-04-14 Research In Motion Limited Portable electronic device and method of controlling same
US9442648B2 (en) 2008-10-07 2016-09-13 Blackberry Limited Portable electronic device and method of controlling same
KR101534109B1 (en) * 2008-12-23 2015-07-07 삼성전자주식회사 Capacitive touch panel and touch system having the same
US8606911B2 (en) 2009-03-02 2013-12-10 Headwater Partners I Llc Flow tagging for service policy implementation
US10248996B2 (en) 2009-01-28 2019-04-02 Headwater Research Llc Method for operating a wireless end-user device mobile payment agent
US9578182B2 (en) 2009-01-28 2017-02-21 Headwater Partners I Llc Mobile device and service management
US10200541B2 (en) 2009-01-28 2019-02-05 Headwater Research Llc Wireless end-user device with divided user space/kernel space traffic policy system
US9647918B2 (en) 2009-01-28 2017-05-09 Headwater Research Llc Mobile device and method attributing media services network usage to requesting application
US8351898B2 (en) 2009-01-28 2013-01-08 Headwater Partners I Llc Verifiable device assisted service usage billing with integrated accounting, mediation accounting, and multi-account
US9955332B2 (en) 2009-01-28 2018-04-24 Headwater Research Llc Method for child wireless device activation to subscriber account of a master wireless device
US10237757B2 (en) 2009-01-28 2019-03-19 Headwater Research Llc System and method for wireless network offloading
US9572019B2 (en) 2009-01-28 2017-02-14 Headwater Partners LLC Service selection set published to device agent with on-device service selection
US9706061B2 (en) 2009-01-28 2017-07-11 Headwater Partners I Llc Service design center for device assisted services
US9351193B2 (en) 2009-01-28 2016-05-24 Headwater Partners I Llc Intermediate networking devices
US10715342B2 (en) 2009-01-28 2020-07-14 Headwater Research Llc Managing service user discovery and service launch object placement on a device
US10057775B2 (en) 2009-01-28 2018-08-21 Headwater Research Llc Virtualized policy and charging system
US10492102B2 (en) 2009-01-28 2019-11-26 Headwater Research Llc Intermediate networking devices
US9980146B2 (en) 2009-01-28 2018-05-22 Headwater Research Llc Communications device with secure data path processing agents
US9755842B2 (en) 2009-01-28 2017-09-05 Headwater Research Llc Managing service user discovery and service launch object placement on a device
US9253663B2 (en) 2009-01-28 2016-02-02 Headwater Partners I Llc Controlling mobile device communications on a roaming network based on device state
US11218854B2 (en) 2009-01-28 2022-01-04 Headwater Research Llc Service plan design, user interfaces, application programming interfaces, and device management
US10798252B2 (en) 2009-01-28 2020-10-06 Headwater Research Llc System and method for providing user notifications
US10264138B2 (en) 2009-01-28 2019-04-16 Headwater Research Llc Mobile device and service management
US10779177B2 (en) 2009-01-28 2020-09-15 Headwater Research Llc Device group partitions and settlement platform
US9858559B2 (en) 2009-01-28 2018-01-02 Headwater Research Llc Network service plan design
US9954975B2 (en) 2009-01-28 2018-04-24 Headwater Research Llc Enhanced curfew and protection associated with a device group
US9270559B2 (en) 2009-01-28 2016-02-23 Headwater Partners I Llc Service policy implementation for an end-user device having a control application or a proxy agent for routing an application traffic flow
US10783581B2 (en) 2009-01-28 2020-09-22 Headwater Research Llc Wireless end-user device providing ambient or sponsored services
US10484858B2 (en) 2009-01-28 2019-11-19 Headwater Research Llc Enhanced roaming services and converged carrier networks with device assisted services and a proxy
US8745191B2 (en) 2009-01-28 2014-06-03 Headwater Partners I Llc System and method for providing user notifications
US10326800B2 (en) 2009-01-28 2019-06-18 Headwater Research Llc Wireless network service interfaces
US8893009B2 (en) 2009-01-28 2014-11-18 Headwater Partners I Llc End user device that secures an association of application to service policy with an application certificate check
US9557889B2 (en) 2009-01-28 2017-01-31 Headwater Partners I Llc Service plan design, user interfaces, application programming interfaces, and device management
US9392462B2 (en) 2009-01-28 2016-07-12 Headwater Partners I Llc Mobile end-user device with agent limiting wireless data communication for specified background applications based on a stored policy
US10064055B2 (en) 2009-01-28 2018-08-28 Headwater Research Llc Security, fraud detection, and fraud mitigation in device-assisted services systems
US10841839B2 (en) 2009-01-28 2020-11-17 Headwater Research Llc Security, fraud detection, and fraud mitigation in device-assisted services systems
US9565707B2 (en) 2009-01-28 2017-02-07 Headwater Partners I Llc Wireless end-user device with wireless data attribution to multiple personas
US8793758B2 (en) 2009-01-28 2014-07-29 Headwater Partners I Llc Security, fraud detection, and fraud mitigation in device-assisted services systems
JP4904375B2 (en) * 2009-03-31 2012-03-28 京セラ株式会社 User interface device and portable terminal device
KR101510484B1 (en) 2009-03-31 2015-04-08 엘지전자 주식회사 Mobile Terminal And Method Of Controlling Mobile Terminal
US8836648B2 (en) 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
JP4799655B2 (en) * 2009-10-05 2011-10-26 株式会社東芝 Small equipment
KR101651128B1 (en) * 2009-10-05 2016-08-25 엘지전자 주식회사 Mobile terminal and method for controlling application execution thereof
US9213414B1 (en) * 2009-11-13 2015-12-15 Ezero Technologies Llc Keyboard with integrated touch control
CN102656869B (en) 2009-12-22 2015-11-25 摩托罗拉移动公司 For the method and apparatus of n-back test in the electronic device
US8479107B2 (en) * 2009-12-31 2013-07-02 Nokia Corporation Method and apparatus for fluid graphical user interface
US8239785B2 (en) * 2010-01-27 2012-08-07 Microsoft Corporation Edge gestures
US20110185320A1 (en) * 2010-01-28 2011-07-28 Microsoft Corporation Cross-reference Gestures
US20110185299A1 (en) * 2010-01-28 2011-07-28 Microsoft Corporation Stamp Gestures
US8261213B2 (en) 2010-01-28 2012-09-04 Microsoft Corporation Brush, carbon-copy, and fill gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US9519356B2 (en) * 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US20110191719A1 (en) * 2010-02-04 2011-08-04 Microsoft Corporation Cut, Punch-Out, and Rip Gestures
US20110191704A1 (en) * 2010-02-04 2011-08-04 Microsoft Corporation Contextual multiplexing gestures
US9367205B2 (en) * 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US20110209098A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P On and Off-Screen Gesture Combinations
US9965165B2 (en) * 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US9310994B2 (en) * 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US8799827B2 (en) * 2010-02-19 2014-08-05 Microsoft Corporation Page manipulations using on and off-screen gestures
US9274682B2 (en) * 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US8751970B2 (en) * 2010-02-25 2014-06-10 Microsoft Corporation Multi-screen synchronous slide gesture
US20110209089A1 (en) * 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen object-hold and page-change gesture
US8473870B2 (en) * 2010-02-25 2013-06-25 Microsoft Corporation Multi-screen hold and drag gesture
US9454304B2 (en) * 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US8707174B2 (en) * 2010-02-25 2014-04-22 Microsoft Corporation Multi-screen hold and page-flip gesture
US20110209058A1 (en) * 2010-02-25 2011-08-25 Microsoft Corporation Multi-screen hold and tap gesture
US8539384B2 (en) * 2010-02-25 2013-09-17 Microsoft Corporation Multi-screen pinch and expand gestures
US9075522B2 (en) * 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US8493357B2 (en) * 2011-03-04 2013-07-23 Integrated Device Technology, Inc Mechanical means for providing haptic feedback in connection with capacitive sensing mechanisms
US9154826B2 (en) 2011-04-06 2015-10-06 Headwater Partners Ii Llc Distributing content and service launch objects to mobile devices
EP2695037B1 (en) * 2011-04-06 2019-06-12 Headwater Research LLC Managing service user discovery and service launch object placement on a device
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US8893033B2 (en) 2011-05-27 2014-11-18 Microsoft Corporation Application notifications
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US8928336B2 (en) 2011-06-09 2015-01-06 Ford Global Technologies, Llc Proximity switch having sensitivity control and method therefor
US8975903B2 (en) 2011-06-09 2015-03-10 Ford Global Technologies, Llc Proximity switch having learned sensitivity and method therefor
US10004286B2 (en) 2011-08-08 2018-06-26 Ford Global Technologies, Llc Glove having conductive ink and method of interacting with proximity sensor
US20130057587A1 (en) 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US9143126B2 (en) 2011-09-22 2015-09-22 Ford Global Technologies, Llc Proximity switch having lockout control for controlling movable panel
US20130100030A1 (en) * 2011-10-19 2013-04-25 Oleg Los Keypad apparatus having proximity and pressure sensing
US8994228B2 (en) 2011-11-03 2015-03-31 Ford Global Technologies, Llc Proximity switch having wrong touch feedback
US10112556B2 (en) 2011-11-03 2018-10-30 Ford Global Technologies, Llc Proximity switch having wrong touch adaptive learning and method
BR112014010649B1 (en) * 2011-11-03 2022-01-11 Glowbl COMMUNICATION INTERFACE, COMMUNICATION METHOD, AND MEMORY
US8878438B2 (en) 2011-11-04 2014-11-04 Ford Global Technologies, Llc Lamp and proximity switch assembly and method
US9831870B2 (en) 2012-04-11 2017-11-28 Ford Global Technologies, Llc Proximity switch assembly and method of tuning same
US9520875B2 (en) 2012-04-11 2016-12-13 Ford Global Technologies, Llc Pliable proximity switch assembly and activation method
US9065447B2 (en) 2012-04-11 2015-06-23 Ford Global Technologies, Llc Proximity switch assembly and method having adaptive time delay
US8933708B2 (en) 2012-04-11 2015-01-13 Ford Global Technologies, Llc Proximity switch assembly and activation method with exploration mode
US9660644B2 (en) 2012-04-11 2017-05-23 Ford Global Technologies, Llc Proximity switch assembly and activation method
US9559688B2 (en) 2012-04-11 2017-01-31 Ford Global Technologies, Llc Proximity switch assembly having pliable surface and depression
US9287864B2 (en) 2012-04-11 2016-03-15 Ford Global Technologies, Llc Proximity switch assembly and calibration method therefor
US9944237B2 (en) 2012-04-11 2018-04-17 Ford Global Technologies, Llc Proximity switch assembly with signal drift rejection and method
US9197206B2 (en) 2012-04-11 2015-11-24 Ford Global Technologies, Llc Proximity switch having differential contact surface
US9531379B2 (en) 2012-04-11 2016-12-27 Ford Global Technologies, Llc Proximity switch assembly having groove between adjacent proximity sensors
US9568527B2 (en) 2012-04-11 2017-02-14 Ford Global Technologies, Llc Proximity switch assembly and activation method having virtual button mode
US9219472B2 (en) 2012-04-11 2015-12-22 Ford Global Technologies, Llc Proximity switch assembly and activation method using rate monitoring
US9184745B2 (en) 2012-04-11 2015-11-10 Ford Global Technologies, Llc Proximity switch assembly and method of sensing user input based on signal rate of change
US9136840B2 (en) 2012-05-17 2015-09-15 Ford Global Technologies, Llc Proximity switch assembly having dynamic tuned threshold
US8981602B2 (en) 2012-05-29 2015-03-17 Ford Global Technologies, Llc Proximity switch assembly having non-switch contact and method
US9337832B2 (en) 2012-06-06 2016-05-10 Ford Global Technologies, Llc Proximity switch and method of adjusting sensitivity therefor
US9641172B2 (en) 2012-06-27 2017-05-02 Ford Global Technologies, Llc Proximity switch assembly having varying size electrode fingers
KR102061776B1 (en) * 2012-09-05 2020-01-02 후아웨이 테크놀러지 컴퍼니 리미티드 Method for reordering objects and an electronic device thereof
US8922340B2 (en) 2012-09-11 2014-12-30 Ford Global Technologies, Llc Proximity switch based door latch release
US8796575B2 (en) 2012-10-31 2014-08-05 Ford Global Technologies, Llc Proximity switch assembly having ground layer
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
US9654426B2 (en) 2012-11-20 2017-05-16 Dropbox, Inc. System and method for organizing messages
US9935907B2 (en) 2012-11-20 2018-04-03 Dropbox, Inc. System and method for serving a message client
US9729695B2 (en) 2012-11-20 2017-08-08 Dropbox Inc. Messaging client application interface
US9311204B2 (en) 2013-03-13 2016-04-12 Ford Global Technologies, Llc Proximity interface development system having replicator and method
WO2014159862A1 (en) 2013-03-14 2014-10-02 Headwater Partners I Llc Automated credential porting for mobile devices
US9625992B2 (en) 2013-10-21 2017-04-18 I-Interactive Llc Remote control with dual activated touch sensor input
US9478124B2 (en) 2013-10-21 2016-10-25 I-Interactive Llc Remote control with enhanced touch surface input
TWI502474B (en) * 2013-11-28 2015-10-01 Acer Inc Method for operating user interface and electronic device thereof
CN103744589B (en) * 2013-12-12 2018-07-13 华为终端(东莞)有限公司 A kind of moving method and device of content of pages
KR102158692B1 (en) * 2014-01-13 2020-09-22 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US10038443B2 (en) 2014-10-20 2018-07-31 Ford Global Technologies, Llc Directional proximity switch assembly
KR20160046633A (en) * 2014-10-21 2016-04-29 삼성전자주식회사 Providing Method for inputting and Electronic Device
CN104571812B (en) * 2014-12-10 2020-04-24 联想(北京)有限公司 Information processing method and electronic equipment
US9654103B2 (en) 2015-03-18 2017-05-16 Ford Global Technologies, Llc Proximity switch assembly having haptic feedback and method
US9548733B2 (en) 2015-05-20 2017-01-17 Ford Global Technologies, Llc Proximity sensor assembly having interleaved electrode configuration
US10698504B2 (en) * 2015-06-15 2020-06-30 Microsoft Technology Licensing, Llc Detecting input pressure on a stylus pen

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1739533A2 (en) 2005-06-20 2007-01-03 LG Electronics Inc. Apparatus and method for processing data of a mobile terminal

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2784825B2 (en) * 1989-12-05 1998-08-06 ソニー株式会社 Information input control device
JPH08307954A (en) * 1995-05-12 1996-11-22 Sony Corp Device and method for coordinate input and information processor
US6259044B1 (en) * 2000-03-03 2001-07-10 Intermec Ip Corporation Electronic device with tactile keypad-overlay
JP2003005912A (en) * 2001-06-20 2003-01-10 Hitachi Ltd Display device with touch panel and display method
US6938221B2 (en) * 2001-11-30 2005-08-30 Microsoft Corporation User interface for stylus-based user input
KR20030046891A (en) * 2001-12-07 2003-06-18 에스케이텔레텍주식회사 Folder type mobile communication terminal having touch screen and function key on the outside of upper folder
US7092495B2 (en) * 2001-12-13 2006-08-15 Nokia Corporation Communication terminal
US20040001073A1 (en) * 2002-06-27 2004-01-01 Jan Chipchase Device having a display
JP4115198B2 (en) * 2002-08-02 2008-07-09 株式会社日立製作所 Display device with touch panel
US6943777B2 (en) * 2002-10-10 2005-09-13 Motorola, Inc. Electronic device with user interface capability and method therefor
US9244602B2 (en) * 2005-08-24 2016-01-26 Lg Electronics Inc. Mobile communications terminal having a touch input unit and controlling method thereof
KR100801089B1 (en) * 2005-12-13 2008-02-05 삼성전자주식회사 Mobile device and operation method control available for using touch and drag

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1739533A2 (en) 2005-06-20 2007-01-03 LG Electronics Inc. Apparatus and method for processing data of a mobile terminal

Also Published As

Publication number Publication date
KR20120013439A (en) 2012-02-14
RU2010116287A (en) 2011-11-10
EP2203807A2 (en) 2010-07-07
CN101809532A (en) 2010-08-18
WO2009042399A3 (en) 2009-06-18
MX2010003243A (en) 2010-04-21
US20090079699A1 (en) 2009-03-26
BRPI0818011A2 (en) 2015-04-14
KR101152008B1 (en) 2012-06-01
RU2446441C2 (en) 2012-03-27
BRPI0818011A8 (en) 2015-10-27
KR20100045522A (en) 2010-05-03

Similar Documents

Publication Publication Date Title
US20090079699A1 (en) Method and device for associating objects
US7443316B2 (en) Entering a character into an electronic device
CN100444092C (en) Method for shifting a shortcut in an electronic device, a display unit of the device, and an electronic device
JP5094158B2 (en) Terminal and control method of terminal with touch screen
US20100088628A1 (en) Live preview of open windows
US20080136784A1 (en) Method and device for selectively activating a function thereof
US20120176333A1 (en) Mobile communication device capable of providing canadidate phone number list and method of controlling operation of the mobile communication device
US20070070045A1 (en) Entering a character into an electronic device
JP2009530944A (en) Improved mobile communication terminal and method therefor
US20110115722A1 (en) System and method of entering symbols in a touch input device
TW200810501A (en) Electronic apparatus and method for symbol input
US11379116B2 (en) Electronic apparatus and method for executing application thereof
JP2001069223A (en) Communication equipment
KR101354820B1 (en) Electronic device and mode controlling method the same and mobile communication terminal
US8115743B2 (en) Terminal with touch screen and method for inputting message therein
JP5793054B2 (en) Portable terminal device, program, and execution suppression method
US20090104928A1 (en) Portable electronic device and a method for entering data on such a device
KR100640504B1 (en) Appratus and method for charactericstic recognizing of hand held devices
JP5961448B2 (en) Information processing apparatus, program, and control method for information processing apparatus
JP6205021B2 (en) Information processing apparatus, program, and control method for information processing apparatus
US9690532B2 (en) Mobile electronic device and character input method
KR20060003612A (en) Wireless communication terminal and its method for providing input character preview function
WO2009009305A1 (en) Entering a character into an electronic device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200880108327.3

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08799388

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 767/KOLNP/2010

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 2008799388

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20107006320

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: MX/A/2010/003243

Country of ref document: MX

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2010116287

Country of ref document: RU

ENP Entry into the national phase

Ref document number: PI0818011

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20100324