WO2000075766A1 - Self-service terminal - Google Patents

Self-service terminal

Info

Publication number
WO2000075766A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
probe
contact
area
user
Application number
PCT/GB2000/001997
Other languages
French (fr)
Inventor
Donald Macinnes
Alfredo Rizo-Patron
Huimin Fu
Original Assignee
NCR International, Inc.
Priority claimed from GB9923660A (patent GB2355086A)
Application filed by NCR International, Inc.
Priority to AU50878/00A (patent AU5087800A)
Publication of WO2000075766A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04805 Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection

Definitions

  • the present invention relates to a self-service terminal (SST).
  • the invention relates to an SST having a touch sensitive screen.
  • SSTs are public access terminals that provide services for members of the public. SSTs that process cash are referred to as ATMs (automated teller machines); whereas SSTs that do not process cash are referred to as information kiosks.
  • SSTs are commonly sited in public areas, such as shopping centres, retail outlets, and such like. Typical services provided by SSTs include dispensing cash and providing users with information. With the increased use of the Internet, some SSTs are now Web-enabled, that is, they allow a user to browse the Internet.
  • a common user interface provided by SSTs is a touchscreen.
  • a touchscreen enables a user to select areas on the display by pointing at the area using a probe, such as one of their fingers or a stylus.
  • SST owners prefer not to provide a stylus because of the possibility of theft or vandalism of the stylus. Thus, users generally have to make selections using one of their fingers.
  • Web pages are designed for use at a home or office computer having an accurate pointing device such as a mouse.
  • Web pages generally include small hypertext links and other active areas not designed for touchscreen use.
  • Partially-sighted users also experience problems in selecting small hypertext links.
  • in some touchscreens there tends to be a difference between a point on the touchscreen and the corresponding point displayed on the screen; this difference is termed "drift". This means that the area on the screen that is pointed at by a user does not correspond exactly to the area sensed by the touchscreen. If two hypertext links are located in close proximity, a user may intend to select one of the links but, because of drift in the touchscreen, the other link may be selected by the touchscreen.
  • Fig 1 illustrates a computer display 3, which presents a generic "web page" of the type downloaded from a public-access network, such as the Internet.
  • the web page contains several options 6.
  • a user selects an option 6 by manipulating a cursor 9, using a pointing device (not shown), such as a mouse, as indicated in Fig 2.
  • options 6 contain a cluster 7 of options.
  • the cluster may be small, compared with the size of the finger F. Further, the cluster 7 can become even smaller, if displayed on a small screen. Small screens are used, for example, in Automated Teller Machines, ATMs.
  • with a small cluster 7 of displayed options, a user's fingertip will almost necessarily contact more than one option at a given time, particularly if the screen is small. A single option cannot be selected from the cluster 7 using the fingertip.
  • contact of a pointing device, such as a fingertip, with a touch-sensitive display screen is detected.
  • Options near the pointing device (fingertip) are enlarged, and the enlarged options can be selected using the pointing device (fingertip).
  • a self-service terminal comprising a screen for displaying a plurality of selectable options, and touch sensing means for detecting an area of the screen that is in contact with a probe, characterised in that the terminal further comprises control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe to assist selection of a selectable option by a user.
  • a user is able to see a magnified image of exactly what area of the screen the user's finger is touching, thereby enabling the user to move his finger slightly, if necessary, to select the desired selectable option. This avoids the user accidentally selecting an undesired selectable option due to, for example, the size of his finger, the angle of view, or touchscreen drift.
  • the size of any hypertext links is effectively increased.
  • the enlarged image includes the area in contact with the probe and the area in the immediate vicinity of the area in contact with the probe.
  • control means provides indication means for indicating what part of the enlarged image is in contact with the probe.
  • the indication means may be in the form of a pointer.
  • the indication means may be implemented by the control means highlighting the part of the enlarged image that is in contact with the probe.
  • the control means may be operative to cease displaying the enlarged image on the screen when the probe is removed from contact with the screen. Alternatively, when the probe is removed from contact with the screen the control means may continue to display the enlarged image until the probe is re-applied to the screen and a new enlarged image is shown.
  • control means are operative to display the enlarged image on an area of the screen that is not obscured by a user's hand.
  • control means are operative to display the enlarged image on a fixed area of the screen, so that the enlarged image always appears in the same place.
  • the touch sensing means are operable to select an option on removal of the probe from contact with the screen.
  • an additional contact may be required to select an option.
  • the enlarged image may be displayed as a graphical window that is configurable by a user so that the user is able to resize the window.
  • the window may also allow the user to select the desired magnification.
  • the control means may be implemented in software.
  • the SST may be an ATM.
  • a method of assisting a user to select options at a self-service terminal having a touch sensitive screen, comprising the steps of: detecting an area of the screen that is in contact with a probe, and displaying on the screen an enlarged image of at least the area in contact with the probe.
  • the step of displaying on the screen an enlarged image of at least the area in contact with the probe includes displaying on the screen an area in the immediate vicinity of the area in contact with the probe.
  • the method includes the further step of indicating on the enlarged image what part of the image is in contact with the probe.
  • a touch sensitive screen for displaying a plurality of selectable options, the screen including touch sensing means for detecting an area of the screen that is in contact with a probe, and control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe for assisting selection of a selectable option by a user.
  • a computer program product for use with a computer having a touch sensitive screen responsive to a probe, the product comprising computer program code means, when the program is loaded, for responding to touch sensing means and for displaying on the screen an enlarged image of at least an area in contact with the probe, and for displaying indication means for indicating what part of the enlarged image is in contact with the probe, for assisting selection of a selectable option by a user.
  • a method of operating a computer display comprising the following steps: a) detecting proximity of a human finger to a displayed option; and b) in response, enlarging the option.
  • the method may further comprise the following step, executed before the enlargement step: c) ascertaining whether spacing between the option and neighbouring options falls below a threshold.
  • the method may further comprise the step of enlarging options which neighbour the option.
  • the method may further comprise the step of detecting selection of an enlarged option.
  • a method of operating a computer display which presents options to a user for selection, comprising the following steps: a) detecting proximity of a human-controlled object to an option; b) in response to the detected proximity, generating a pointing icon; and c) causing the pointing icon to follow motion of the object.
  • the method may further comprise the steps of: d) detecting proximity of the pointing icon to a target option; and e) providing a venue for the user to select the target option.
  • according to a seventh aspect of the invention there is provided a system, comprising:
  • the enlargement may only occur if spacing between the options within the region falls below a threshold.
  • a) a touch-sensitive display screen; b) means for displaying multiple options within a space approximately the size of an average adult fingerprint; and c) means for allowing a user to select one of said options, using said touch-sensitive display.
  • the space may be approximately 12 mm by 12 mm.
  • the system may further comprise an automated teller machine, which utilises the screen in its transactions.
  • Figs 1 and 2 illustrate events occurring in a prior-art touch-screen display;
  • Fig 3 illustrates an observation made by the inventor, regarding the displays of Figs 1 and 2, if operated under certain conditions;
  • Figs 4 and 5 illustrate one form of the invention;
  • Fig 6 is a flow chart illustrating logic implemented by one form of the invention;
  • Figs 7 to 10 illustrate operation of another form of the invention;
  • Fig 11 is a flow chart illustrating logic implemented by one form of the invention;
  • Fig 12 illustrates another form of the invention;
  • Fig 13 illustrates part of Fig 12;
  • Figs 14 to 18 illustrate how the invention handles a specific problem;
  • Fig 19 illustrates spacing D between options;
  • Fig 20 illustrates one form of the invention;
  • Fig 21 is a simplified block diagram of a self-service terminal according to one embodiment of the present invention;
  • Fig 22 is a simplified block diagram of part of the terminal of Fig 21;
  • Fig 23 is a pictorial representation of a user interacting with the display of Fig 21;
  • Fig 24 is a flowchart illustrating the steps used by the terminal of Fig 21.
  • a user's fingertip F makes contact with a region of the touch-sensitive display 12.
  • the invention produces an enlargement of the region of contact, indicated by enlargement 15 in Fig 5.
  • the enlarged options 6E are of sufficient size that the fingertip F can select one option, without interference by the others.
  • Fig 6 is a flow chart illustrating logic implemented by this form of the invention.
  • Block 40 detects the touch of fingertip F in Fig 4.
  • Block 45 in Fig 6 inquires whether the spacing of the option-buttons 6 in Fig 4 is below a threshold, thereby creating difficulty in selecting one button 6 over the others, using a human fingertip F.
  • the threshold may require, for example, that a spacing 1/2 inch wide surround every option.
  • Fig 19 illustrates this spacing as distance D.
  • if the answer is negative, block 60 in Fig 6 is reached, and processing of the signals produced by the touch screen 12 proceeds in the usual manner. If the answer is affirmative, block 50 is reached.
  • Block 50 represents the steps of generating an enlarged copy of a region surrounding the location of the touching, and displaying that copy on the screen.
  • the region may be one inch in diameter, or a square one inch on a side, and that region is enlarged.
  • Image 15 in Fig 5 represents such an enlarged copy.
  • region 15 in Fig 5 covers an area which was previously devoted to other display functions, and possibly contained other option-buttons (not shown). When region 15 is generated, those functions and buttons become displaced.
  • the touch-screen system (not shown) which responds to the touching of the screen must, in effect, be re-programmed to recognise that the options 6E have been moved.
  • Block 65 is then reached, wherein inquiry is made as to whether one of the enlarged buttons 6E in Fig 5 has been chosen. If so, block 70 is reached, wherein the option is executed, in the usual manner. Then block 75 restores the screen 12 to its previous condition, as indicated in Fig 4.
  • if no option is chosen, block 80 is reached, which, together with block 85, implements a time-out function: the enlarged region 15 in Fig 5 is displayed for a specific, limited time, such as five seconds. After that time expires, region 15 is dissolved, and the screen 12 resumes its previous appearance, as in Fig 4. If, after block 55 is executed, an option is neither selected, nor the time of block 80 has expired, the logic idles in loop 90 until one of those two events occurs.
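The decision logic of Fig 6 can be sketched as follows. This is a hedged illustration only: the option representation, function names, and use of centre-to-centre distances are assumptions, and the 0.5-inch threshold and 1.0-inch enlargement radius are taken from the examples given in the text (the time-out of blocks 80 and 85 is omitted).

```python
import itertools
import math

# Example values from the text; the patent presents these only as examples.
SPACING_THRESHOLD = 0.5   # inches: required spacing D around an option (block 45)
ENLARGE_RADIUS = 1.0      # inches: vicinity of the touch point that is enlarged (block 50)

def options_near(touch, options, radius=ENLARGE_RADIUS):
    """Return the options whose centres lie within `radius` of the touch point."""
    x, y = touch
    return [o for o in options if math.hypot(o["x"] - x, o["y"] - y) <= radius]

def min_spacing(options):
    """Smallest centre-to-centre distance between any two options (distance D of Fig 19)."""
    return min(math.hypot(a["x"] - b["x"], a["y"] - b["y"])
               for a, b in itertools.combinations(options, 2))

def handle_touch(touch, options):
    """Blocks 40-60 of Fig 6: decide whether a touch triggers enlargement."""
    nearby = options_near(touch, options)
    if len(nearby) >= 2 and min_spacing(nearby) < SPACING_THRESHOLD:
        return {"action": "enlarge", "options": nearby}
    return {"action": "normal"}
```

A touch inside a tight cluster yields an "enlarge" result listing the nearby options; a touch elsewhere falls through to normal processing, as in block 60.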
  • a screen 12 displays an image of a fingertip FI.
  • An actual fingertip F is also shown.
  • the owner of the fingertip F is instructed by a message M to place fingertip F over the image FI.
  • the touch-sensitive screen detects the contact, and responds by surrounding the finger-image (not shown) with a disc, or halo H.
  • An arrow A is generated, which extends from the halo H.
  • a message 100 is displayed, telling the user to move the arrow to the target option. In Fig 9, the user does so.
  • the system responds by highlighting the option selected, as indicated.
  • the system may instruct the user to hold the arrow at the selected option for a period of time, as indicated in Fig 9. When the time expires, the option becomes selected, and a confirming message is displayed so indicating, as in Fig 10.
  • the system displays a message stating, "Three beeps will sound. Hold the arrow at the desired option during all three beeps." Then, the system sounds three beeps through a speaker (not shown), and inquires whether the arrow A was held at an option during all three beeps. If so, the option is executed.
  • the system displays a message stating, "To select this option, lift your finger, and press the halo again."
  • the system looks for a termination of finger contact, followed by resumed contact in the same general vicinity.
  • Fig 11 is a flow chart of logic implemented by the second embodiment.
  • Block 150 detects a touch on screen 12 in Fig 7 and, specifically, a touch of the finger image FI. If no touch is detected, the logic idles in loop 151 in Fig 11. When a touch is detected, the logic reaches block 155, which displays a message, such as message 100 in Fig 8.
  • block 170 is reached, wherein a message is generated telling the user how to select the highlighted option.
  • a message is shown in Fig 9.
  • a keypad 200 for moving a cursor C is displayed on the touch-sensitive screen 12.
  • That keypad contains five "hot spots." Four of them are identical in size and shape, and indicated by circles 205 in Fig 13. The circles 205 are about 5/8 inch in diameter, which is roughly the size of the U.S. ten-cent coin, or dime. Touching a circle 205 causes the cursor C to move in the corresponding direction.
  • the fifth hot spot occupies the "enter" key 210, which corresponds, in function, to the "enter" key on a computer keyboard.
  • the user would first move the cursor C to the option desired, using hot spots 205.
  • the user presses the "enter" button 210.
  • Circuitry and software, indicated by block 220 in Fig 12, interact with keypad 200, as indicated by arrow 225, in order to implement the functions just described. Such implementation is known in the art.
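The keypad behaviour of Figs 12 and 13 can be sketched as below. The step size, co-ordinate convention (y increasing downwards), and option-lookup tolerance are illustrative assumptions, not values given by the patent.

```python
STEP = 0.1  # assumed cursor movement per hot-spot press, in screen units

def press(cursor, key):
    """Move the (x, y) cursor C in response to one of the four direction hot spots 205."""
    x, y = cursor
    moves = {"up": (0, -STEP), "down": (0, STEP),
             "left": (-STEP, 0), "right": (STEP, 0)}
    dx, dy = moves[key]
    return (round(x + dx, 6), round(y + dy, 6))

def enter(cursor, options, tolerance=0.05):
    """The "enter" hot spot 210: return the option under the cursor, if any."""
    for o in options:
        if abs(o["x"] - cursor[0]) <= tolerance and abs(o["y"] - cursor[1]) <= tolerance:
            return o["name"]
    return None
```

The user first steers the cursor with repeated direction presses, then commits the choice with the enter hot spot, mirroring the two-step interaction described above.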
  • in Fig 14 an option 6H is shown, and that option is about to be exercised.
  • the exercise may cause immediate execution of a computer process (not shown), or, in other cases, cause a display of additional options 300, as in Fig 15.
  • Fig 16 illustrates how the invention can handle the display of the additional options 300.
  • assume that the original option 6H, together with its neighbours, was initially enlarged by the invention, as indicated by options 6E in Fig 16, and that the enlarged option 6H was selected.
  • the result of exercising option 6H is to call up another menu of options, rather than causing execution of a computer process.
  • the invention displays the additional options 300 in Fig 16, in their ordinary size. That is, for example, the additional options 300, located on the right side of the Figure, are displayed as the same size as original options, such as 6H, located at the left side of the Figure.
  • the additional options 300 may not fit the display 3, because the enlarged options 6E consume some space which the additional options 300 would have occupied.
  • the invention dissolves the enlarged options 6E, as in Fig 18, whenever an enlarged option calls up additional options 300.
  • the additional options 300 can, themselves, call up further additional options.
  • the principles described in Figs 6 or 11 are applied to the additional options. That is, for example, if the user's fingertip touches them, and if their spacing falls below the threshold, they are enlarged. If not, they remain their normal size. These processes are repeated for yet additional options.
  • options in the vicinity of a finger-touch on the screen 3 are enlarged.
  • a user may select one of the enlarged options. If that option calls up additional options, the enlarged options, presently displayed, are removed, and the additional options are displayed in the usual manner.
  • FIG 20 illustrates one form of the invention.
  • a public kiosk 400 such as an Automated Teller Machine, ATM, contains a touch-sensitive screen 405. Among the components contained within the ATM are those contained within dashed box 410.
  • a control system 415 for the touch screen 405 detects the positions on the screen 405 which are touched, and issues signals on line 416 indicating the co-ordinates of the touch-points.
  • Computer 420 receives those coordinates, and uses them in execution of the processes described above. The computer 420 also controls the images displayed on screen 405.
  • Fig 19 illustrates one type of spacing. If distance D is less than 0.5 inch, for example, then enlargement occurs if a touch occurs at point P.
  • the options enlarged are those residing within, for example, 1.0 inch of point P.
  • a mere touch of the screen does not cause enlargement. That is, if a user touches the screen, but no options are located within the vicinity, such as within one inch, then no enlargement occurs. Consequently, a user may drag a finger across the screen, but nothing would happen, until an option comes within one inch of the finger.
  • This approach has the advantage of not distorting the screen. That is, if a magnified region of the screen jumped into view every time a person touched the screen, that may interfere with reading the screen.
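The drag behaviour described above (nothing happens until an option comes within one inch of the finger) can be sketched as follows. The sampling of finger positions into a path and the option representation are illustrative assumptions.

```python
import math

VICINITY = 1.0  # inches: an option within this distance of the finger triggers enlargement

def drag_events(path, options):
    """For each sampled finger position on a drag, report whether enlargement triggers."""
    events = []
    for x, y in path:
        near = any(math.hypot(o["x"] - x, o["y"] - y) <= VICINITY for o in options)
        events.append("enlarge" if near else "idle")
    return events
```

While the finger is far from every option the display is left undistorted; only when an option enters the one-inch vicinity does the magnified region appear.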
  • a touch screen was described above. Such screens are known in the art. Further, actual contact with the screen is not necessarily required. Proximity of a finger to the screen can be detected, as known in the art. That is, if a finger approaches within a small distance, such as 1/10 inch (2.54 mm), that proximity can be considered equivalent to contact, for purposes of option selection and enlargement.
  • it is not required that a human finger be used for the touching.
  • a human finger encased within a glove can be used.
  • a stylus such as a pencil, held in a human hand, can be used.
  • a disabled person may not use a hand, but may use an artificial hand, or prosthesis.
  • the prosthesis may hold the stylus just described.
  • any object capable of touching the screen at a discrete spot of approximately the size of a human fingerprint can be used.
  • a two-stage selection process can be identified. First, a user makes a preliminary selection, by touching a cluster of closely-spaced options. In response, the invention enlarges options contained within the cluster.
  • the preliminary selection is not necessarily a selection of an option.
  • the preliminary selection may be caused by physical contact with an option. However, in general, that situation will be accidental.
  • That option may be the same as the option described in the previous paragraph. That coincidence, in general, will be accidental.
  • multiple options are displayed, such as those shown in Fig 19. These options are sufficiently small that two, or more, fit into a space approximately the size of an average adult human fingerprint. That space is about 1/2 inch by 1/2 inch. Despite the small size of the displayed options, the invention allows a user to select a desired option, to the exclusion of others.
  • a self-service terminal 510 having a user interface 512 comprising a screen 514 for displaying a plurality of selectable options (such as hypertext links), touch sensing means 516 in the form of a touch panel aligned with and located adjacent to the screen 514, a card receiving slot 518 and a printer slot 520.
  • the screen 514 has an associated display driver 522;
  • the touch panel 516 has an associated touch panel driver 524;
  • the card receiving slot 518 has an associated motorised card reader module 526; and
  • the printer slot 520 has an associated printer module 530. All of the drivers 522, 524 and the modules 526, 530 are connected to a terminal controller 532 that controls the operation of the terminal 510.
  • the terminal controller 532 is also connected to a network connection 534 for communicating with an IP (Internet Protocol) network 536, such as the Internet, an Intranet, or an Extranet.
  • the terminal controller 532 includes a processor 540, associated memory 542, and storage space 544 in the form of a hard disk drive.
  • the hard disk 544 stores the operating system for the terminal 510, the application program that controls the terminal 510, and control means in the form of a zoom-in program.
  • the zoom-in program may be based on a conventional zoom-in tool such as ZoomIn version 3.1 available from Microsoft (trade mark) Corporation.
  • the zoom-in program is responsive to the touch panel driver 524 for displaying on the screen 514 an enlarged image of the area in contact with a probe.
  • the operating system kernel 550 (Fig 22), the application program 552 (Fig 22) for controlling the terminal, and the zoom-in program 554 (Fig 22) are loaded into memory 542 (step 600 of Fig 24).
  • when the user 560 touches the screen 514 with a finger 562, the touch panel 516 senses this (step 602) and the panel driver 524 conveys co-ordinate data to the controller 532 for enabling the controller 532 to determine what pixel of the screen 514 has been touched by the user 560.
  • the co-ordinate data is generally a single Cartesian (x,y) co-ordinate representing the pixel touched by the finger 562.
  • where the finger 562 touches a plurality of pixels, the touch panel driver 524 performs calculations to determine from the plurality of pixels touched by the finger 562 what the centre pixel is; the co-ordinates of this centre pixel are sent to the controller 532.
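The centre-pixel calculation attributed to the touch panel driver can be sketched as below. The patent does not specify the calculation; taking the rounded arithmetic mean of the touched pixel co-ordinates is an assumption.

```python
def centre_pixel(touched):
    """Return the centre of a non-empty collection of touched (x, y) pixel co-ordinates.

    Assumed behaviour of the touch panel driver: average the touched pixels
    and round to the nearest whole pixel, yielding the single co-ordinate
    that is forwarded to the controller.
    """
    xs = [p[0] for p in touched]
    ys = [p[1] for p in touched]
    return (round(sum(xs) / len(xs)), round(sum(ys) / len(ys)))
```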
  • the zoom-in program 554 uses this single co-ordinate to instruct the display driver 522 to display a magnification window 570 (step 604) having:
  • indication means 564 in the form of a pointer shaped like an arrow, pointing at the pixel represented by the single co-ordinate. It will be appreciated that a conventional zoom-in application such as ZoomIn version 3.1 can be easily modified to include a pointer that points at the pixel represented by the single co-ordinate.
  • the magnification window 570 overlies part of the main window 572 and is located so that it is not obscured by the user's hand.
  • the zoom-in program 554 instructs the display driver to display all of the pixels within a certain (x,y) distance of the pixel represented by the single co-ordinate so that a magnified image in the vicinity of the single co-ordinate is displayed on window 570.
  • although the zoom-in program 554 enlarges the area around the finger 562 and adds a pointer 564 that points to the pixel represented by the single co-ordinate, it does not affect the size of any touch zones within the touch panel 516; thus, the operation of the touch panel 516 is unaffected by the zoom-in program 554.
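The magnification step itself can be sketched as below. The patent does not describe how the magnified pixels are produced; the square region, the magnification factor, and the nearest-neighbour scaling are all illustrative assumptions. Note that, as stated above, only the displayed image is enlarged; the touch zones are not resized.

```python
def magnify(image, centre, radius, factor):
    """Return a `factor`-times enlarged copy of the square region around `centre`.

    `image` is a list of rows of pixel values; pixels outside the image read
    as 0. This models displaying, in the magnification window, all pixels
    within a certain distance of the single co-ordinate.
    """
    cx, cy = centre
    h, w = len(image), len(image[0])
    region = [[image[y][x] if 0 <= x < w and 0 <= y < h else 0
               for x in range(cx - radius, cx + radius + 1)]
              for y in range(cy - radius, cy + radius + 1)]
    # Nearest-neighbour enlargement: repeat each pixel `factor` times each way.
    return [[row[i // factor] for i in range(len(row) * factor)]
            for row in region for _ in range(factor)]
```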
  • the pointer 564 gives the user 560 an indication of the exact area of the screen 514 he is pointing at.
  • magnification window 570 can be moved by the user to any desired position on the screen 514.
  • as the technology used for creating and manipulating graphical windows on a display is well known in the art, it will not be described in detail herein.
  • three small hypertext links 578, 580, 582 are displayed on screen 514.
  • the user 560 places his finger 562 near to the desired hypertext link 580.
  • the magnification window 570 is opened (step 604) and the user 560 is able to move his finger 562 to guide the pointer 564 to the desired hypertext link 580.
  • the touch panel 516 continually monitors the position of the finger 562 to detect removal (step 606) or movement (step 608) of the finger 562.
  • the touch panel driver 524 sends the new single co-ordinate and the zoom-in program 554 uses this co-ordinate to instruct the display driver 522 to update (step 610) the contents of the magnification window 570.
  • the user 560 removes his finger 562 from the touch panel 516 and the hypertext link 580 is selected (step 612) by the touch panel driver 524.
  • the magnification window 570 is then closed by the zoom-in program 554 until the touch panel 516 is touched again (step 602).
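The event flow of steps 602-612 above can be sketched as a simple state machine. The event encoding (tuples of kind and position) and the returned log are illustrative assumptions used to make the flow concrete.

```python
def run(events):
    """Process ("touch", (x, y)), ("move", (x, y)) and ("lift", None) events.

    Returns a log of display actions mirroring the Fig 24 flow: a touch opens
    the magnification window, movement updates it, and lifting the finger
    selects whatever the pointer is over and closes the window.
    """
    log, window_open, position = [], False, None
    for kind, pos in events:
        if kind == "touch":                   # step 602: contact detected
            window_open, position = True, pos
            log.append(("open", pos))         # step 604: window displayed
        elif kind == "move" and window_open:  # step 608: finger moved
            position = pos
            log.append(("update", pos))       # step 610: window contents updated
        elif kind == "lift" and window_open:  # step 606: finger removed
            log.append(("select", position))  # step 612: option under pointer selected
            log.append(("close", None))       # window closed until the next touch
            window_open = False
    return log
```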
  • the above embodiment has the advantage that any drift in the touch panel 516 is corrected because the user 560 can identify exactly what point on the screen 514 the touch panel 516 is sensing.
  • the above embodiment also has the advantage that a user 560 is provided with a magnified view of what point he is touching.
  • the screen 514 and touch panel 516 may be incorporated into a single integral touchscreen unit.
  • the probe may be a stylus.
  • the control means may be implemented in hardware or firmware.

Abstract

A self-service terminal (510) having a touch sensitive screen is described. The terminal (510) comprises a screen (514) for displaying a plurality of selectable options, and touch sensing means (516) for detecting an area of the screen (514) that is in contact with a probe (562). The terminal (510) further comprises control means (552) responsive to the touch sensing means (516) for displaying on the screen an enlarged image of at least the area in contact with the probe (562) to assist selection of a selectable option (578, 580, 582) by a user. A method of assisting a user to select options at a self-service terminal, and a touch sensitive screen are also described.

Description

SELF-SERVICE TERMINAL
The present invention relates to a self-service terminal (SST). In particular the invention relates to an SST having a touch sensitive screen.
SSTs are public access terminals that provide services for members of the public. SSTs that process cash are referred to as ATMs (automated teller machines); whereas SSTs that do not process cash are referred to as information kiosks.
SSTs are commonly sited in public areas, such as shopping centres, retail outlets, and such like. Typical services provided by SSTs include dispensing cash and providing users with information. With the increased use of the Internet, some SSTs are now Web-enabled, that is, they allow a user to browse the Internet.
A common user interface provided by SSTs is a touchscreen. As is well known in the art, a touchscreen enables a user to select areas on the display by pointing at the area using a probe, such as one of their fingers or a stylus. SST owners prefer not to provide a stylus because of the possibility of theft or vandalism of the stylus. Thus, users generally have to make selections using one of their fingers.
One problem associated with browsing the Internet at an SST is that Web pages are designed for use at a home or office computer having an accurate pointing device such as a mouse. Web pages generally include small hypertext links and other active areas not designed for touchscreen use. For a novice user of an SST, or for an SST user who is not used to selecting areas on the screen using his finger, it is very difficult to select the small hypertext links accurately. Partially-sighted users also experience problems in selecting small hypertext links.
Another problem associated with using some touchscreens is that there tends to be a difference between a point on the touchscreen and the corresponding point displayed on the screen. This difference is termed the drift. This means that the area on the screen that is pointed at by a user does not correspond exactly to the area sensed by the touchscreen. If two hypertext links are located in close proximity, a user may select one of the links but, because of drift in the touchscreen, the other link may be selected by the touchscreen.
Fig 1 illustrates a computer display 3, which presents a generic "web page" of the type downloaded from a public-access network, such as the Internet. The web page contains several options 6. A user (not shown) selects an option 6 by manipulating a cursor 9, using a pointing device (not shown), such as a mouse, as indicated in Fig 2.
A problem can arise using this approach, because of the size of the human finger. As shown in Fig 3, options 6 contain a cluster 7 of options. In a normal web page, the cluster may be small, compared with the size of the finger F. Further, the cluster 7 can become even smaller, if displayed on a small screen. Small screens are used, for example, in Automated Teller Machines, ATMs.
With a small cluster 7 of displayed options, a user's fingertip will almost necessarily contact more than one option at a given time, particularly if the screen is small. A single option cannot be selected from the cluster 7, using the fingertip.
It is among the objects of one or more embodiments of the present invention to obviate or mitigate one or more of the above disadvantages, or of other disadvantages associated with the prior art.
In one form of the invention, presence of a pointing device (such as a fingertip) near a touch-sensitive display screen is detected. Options near the pointing device (fingertip) are enlarged, and the enlarged options can be selected using the pointing device (fingertip).
According to a first aspect of the present invention there is provided a self-service terminal comprising a screen for displaying a plurality of selectable options, and touch sensing means for detecting an area of the screen that is in contact with a probe, characterised in that the terminal further comprises control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe to assist selection of a selectable option by a user.
By virtue of this aspect of the invention, a user is able to see a magnified image of exactly what area of the screen the user's finger is touching, thereby enabling the user to move his finger slightly, if necessary, to select the desired selectable option. This avoids the user accidentally selecting an undesired selectable option due to, for example, the size of his finger, the angle of view, or touchscreen drift. By providing a magnified image of at least the area in contact with the probe, the size of any hypertext links is effectively increased.
By magnifying the area of the screen in contact with the probe, a user can determine exactly what point on the screen is in contact with the probe, so that the touch sensitive zones on the screen do not require enlargement. This is particularly advantageous when the screen is used for displaying numerous selectable options, such as is common on a "Web page".
Preferably, the enlarged image includes the area in contact with the probe and the area in the immediate vicinity of the area in contact with the probe.
Preferably, the control means provides indication means for indicating what part of the enlarged image is in contact with the probe. The indication means may be in the form of a pointer. Alternatively, the indication means may be implemented by the control means highlighting the part of the enlarged image that is in contact with the probe.
This allows a user to modify the position of the probe so that the probe is located exactly where the user desires. This gives the user fine control over the placement of the probe on the screen, thereby minimising the possibility of accidental selection of an undesired option.
The control means may be operative to cease displaying the enlarged image on the screen when the probe is removed from contact with the screen. Alternatively, when the probe is removed from contact with the screen the control means may continue to display the enlarged image until the probe is re-applied to the screen and a new enlarged image is shown.
Preferably, the control means are operative to display the enlarged image on an area of the screen that is not obscured by a user's hand. Alternatively, the control means are operative to display the enlarged image on a fixed area of the screen, so that the enlarged image always appears in the same place.
Preferably, the touch sensing means are operable to select an option on removal of the probe from contact with the screen. Alternatively, an additional contact may be required to select an option.
The enlarged image may be displayed as a graphical window that is configurable by a user so that the user is able to resize the window. The window may also allow the user to select the desired magnification.
This has the advantage that the user can determine the size of the image to suit their preference or requirement. For example, a partially-sighted person may require a larger enlarged image than a person having perfect eyesight.
The control means may be implemented in software.
The SST may be an ATM.
According to a second aspect of the invention there is provided a method of assisting a user select options at a self-service terminal having a touch sensitive screen, the method comprising the steps of: detecting an area of the screen that is in contact with a probe, and displaying on the screen an enlarged image of at least the area in contact with the probe. Preferably, the step of displaying on the screen an enlarged image of at least the area in contact with the probe includes displaying on the screen an area in the immediate vicinity of the area in contact with the probe.
Preferably, the method includes the further step of indicating on the enlarged image what part of the image is in contact with the probe.
According to a third aspect of the invention there is provided a touch sensitive screen for displaying a plurality of selectable options, the screen including touch sensing means for detecting an area of the screen that is in contact with a probe, and control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe for assisting selection of a selectable option by a user.
According to a fourth aspect of the invention there is provided a computer program product for use with a computer having a touch sensitive screen responsive to a probe, the product comprising computer program code means, when the program is loaded, for responding to touch sensing means and for displaying on the screen an enlarged image of at least an area in contact with the probe, and for displaying indication means for indicating what part of the enlarged image is in contact with the probe, for assisting selection of a selectable option by a user.
According to a fifth aspect of the invention there is provided a method of operating a computer display, the method comprising the following steps: a) detecting proximity of a human finger to a displayed option; and b) in response, enlarging the option.
The method may further comprise the following step, executed before the enlargement step: c) ascertaining whether spacing between the option and neighbouring options falls below a threshold.
The method may further comprise the step of enlarging options which neighbour the option.
The method may further comprise the step of detecting selection of an enlarged option.
According to a sixth aspect of the invention there is provided a method of operating a computer display which presents options to a user for selection, comprising the following steps: a) detecting proximity of a human-controlled object to an option; b) in response to the detected proximity, generating a pointing icon; and c) causing the pointing icon to follow motion of the object.
The method may further comprise the steps of: d) detecting proximity of the pointing icon to a target option; and e) providing a venue for the user to select the target option.
According to a seventh aspect of the invention there is provided a system, comprising:
a) a display screen; and b) a system for i) presenting options in a region on a screen, ii) detecting contact between an external object and the region, iii) enlarging options contained within the region, and iv) detecting touch of an enlarged option.
The enlargement may only occur if spacing between the options within the region falls below a threshold.
According to an eighth aspect of the invention there is provided a system comprising:
a) a touch-sensitive display screen; b) means for displaying multiple options within a space approximately the size of an average adult fingerprint; and c) means for allowing a user to select one of said options, using said touch-sensitive display.
The space may be approximately 12 mm by 12 mm.
The system may further comprise an automated teller machine, which utilises the screen in its transactions.
These and other aspects of the invention will become apparent from the following specific description, given by way of example, with reference to the accompanying drawings, in which:
Figs 1 to 2 illustrate events occurring in a prior-art touch-screen display;
Fig 3 illustrates an observation made by the inventor, regarding the displays of Figs 1 and 2, if operated under certain conditions;
Figs 4 and 5 illustrate one form of the invention;
Fig 6 is a flow chart illustrating logic implemented by one form of the invention;
Figs 7 to 10 illustrate operation of another form of the invention;
Fig 11 is a flow chart illustrating logic implemented by one form of the invention;
Fig 12 illustrates another form of the invention;
Fig 13 illustrates part of Fig 12;
Figs 14 to 18 illustrate how the invention handles a specific problem;
Fig 19 illustrates spacing D between options;
Fig 20 illustrates one form of the invention;
Fig 21 is a simplified block diagram of a self-service terminal according to one embodiment of the present invention;
Fig 22 is a simplified block diagram of part of the terminal of Fig 21;
Fig 23 is a pictorial representation of a user interacting with the display of Fig 21; and
Fig 24 is a flowchart illustrating the steps used by the terminal of Fig 21.
In Fig 4, a user's fingertip F makes contact with a region of the touch-sensitive display 12. In response to this contact, the invention produces an enlargement of the region of contact, indicated by enlargement 15 in Fig 5. The enlarged options 6E are of sufficient size that the fingertip F can select one option, without interference by the others.
Fig 6 is a flow chart illustrating logic implemented by this form of the invention. Block 40 detects the touch of fingertip F in Fig 4. Block 45 in Fig 6 inquires whether the spacing of the option-buttons 6 in Fig 4 is below a threshold, thereby creating difficulty in selecting one button 6 over the others, using a human fingertip F. The threshold may require, for example, that a spacing 1/2 inch wide surround every option. Fig 19 illustrates this spacing as distance D.
If the answer to the inquiry is negative, block 60 in Fig 6 is reached, and processing of the signals produced by the touch screen 12 proceeds in the usual manner. If the answer is affirmative, block 50 is reached.
Block 50 represents the steps of generating an enlarged copy of a region surrounding the location of the touching, and displaying that copy on the screen. For example, the region may be one inch in diameter, or a square one inch on a side, and that region is enlarged. Image 15 in Fig 5 represents such an enlarged copy.
The enlargement is sufficiently great that the spacing between the options equals, or exceeds the threshold stated above. In this manner, two modes of operation arise. In one mode, if a fingertip approaches a cluster of "large" options, having a spacing which exceeds the threshold, then no enlargement is undertaken. In the second mode, if the fingertip approaches a cluster of "small" options, in which spacing fails to exceed the threshold, then the enlargement is undertaken. In either case, options having a spacing exceeding the threshold are presented to the customer. Block 55 in Fig 6 adjusts the co-ordinates assigned to the option-buttons 6E in Fig 5 within the enlargement. That is, region 15 in Fig 5 covers an area which was previously devoted to other display functions, and possibly contained other option-buttons (not shown). When region 15 is generated, those functions and buttons become displaced. The touch-screen system (not shown) which responds to the touching of the screen must, in effect, be re-programmed to recognise that the options 6E have been moved.
Accordingly, block 55 adjusts the co-ordinates in question. It establishes that, when a touch occurs on one of the three enlarged buttons 6E in Fig 5, the touched button should be actuated, and not any buttons previously displayed, because the latter have been displaced by the enlarged buttons 6E. This is termed a "co-ordinate adjustment" because, for example, the co-ordinates (x = 6, y = 2) in Fig 5 are now occupied by an enlarged button 6E. That fact is recorded by block 55 in Fig 6.
Block 65 is then reached, wherein inquiry is made as to whether one of the enlarged buttons 6E in Fig 5 has been chosen. If so, block 70 is reached, wherein the option is executed, in the usual manner. Then block 75 restores the screen 12 to its previous condition, as indicated in Fig 4.
If the answer to the inquiry of block 65 is negative, block 80 is reached, which, together with block 85, implements a time-out function: the enlarged region 15 in Fig 5 is displayed for a specific, limited time, such as five seconds. After that time expires, region 15 is dissolved, and the screen 12 resumes its previous appearance, as in Fig 4. If, after block 55 is executed, no option is selected and the time of block 80 has not expired, the logic idles in loop 90 until one of those two events occurs.
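The decision logic of Fig 6 (blocks 40 to 60) can be sketched in Python as follows. The data structure, the option-finding radius, and all names are illustrative assumptions; the patent does not prescribe any particular implementation:

```python
# Sketch of the Fig 6 logic: enlarge a cluster of options only when their
# spacing falls below a threshold.  Units are inches; all values are
# illustrative assumptions.
from dataclasses import dataclass

THRESHOLD = 0.5   # minimum clear spacing around an option (block 45)
RADIUS = 1.0      # options within this distance of the touch are considered


@dataclass
class Option:
    x: float
    y: float
    label: str


def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5


def options_near(options, touch, radius=RADIUS):
    """Options residing within `radius` of the touch point."""
    return [o for o in options if distance((o.x, o.y), touch) <= radius]


def spacing_below_threshold(cluster, threshold=THRESHOLD):
    """Block 45: is any pair of options in the cluster too closely spaced?"""
    return any(
        distance((a.x, a.y), (b.x, b.y)) < threshold
        for i, a in enumerate(cluster)
        for b in cluster[i + 1:]
    )


def handle_touch(options, touch):
    """Return the cluster to enlarge (block 50), or None to process the
    touch in the usual manner (block 60)."""
    cluster = options_near(options, touch)
    if cluster and spacing_below_threshold(cluster):
        return cluster
    return None
```

A touch near two closely spaced options returns both for enlargement, while a touch near a single, well-separated option returns None, matching the two modes of operation described above.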
Second Embodiment
The preceding discussion was framed in terms of changing the relative size between (1) the option-buttons 6 in Fig 4 and (2) the size of the fingertip F. Another approach is to change the effective size of the fingertip F, as will now be explained.
In Fig 7, a screen 12 displays an image of a fingertip FI. An actual fingertip F is also shown. The owner of the fingertip F is instructed by a message M to place fingertip F over the image FI. When this occurs, as in Fig 8, the touch-sensitive screen detects the contact, and responds by surrounding the finger-image (not shown) with a disc, or halo H.
An arrow A is generated, which extends from the halo H. A message 100 is displayed, telling the user to move the arrow to the target option. In Fig 9, the user does so. The system responds by highlighting the option selected, as indicated.
The user must now confirm selection of the highlighted option. A venue, or approach, for making this selection can be provided in numerous different ways. The system may instruct the user to hold the arrow at the selected option for a period of time, as indicated in Fig 9. When the time expires, the option becomes selected, and a confirming message is displayed so indicating, as in Fig 10.
In an alternate method of selection, the system displays a message stating, "Three beeps will sound. Hold the arrow at the desired option during all three beeps." Then, the system sounds three beeps through a speaker (not shown), and inquires whether the arrow A was held at an option during all three beeps. If so, the option is executed.
In yet another method of selection, the system displays a message stating, "To select this option, lift your finger, and press the halo again." The system looks for a termination of finger contact, followed by resumed contact in the same general vicinity.
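The first of these selection venues, the hold-for-a-period approach, can be sketched as follows. The class name, the two-second dwell period, and the caller-supplied timestamps are assumptions for illustration:

```python
# Sketch of dwell-time selection: an option is confirmed once the pointer
# has remained over it for a fixed period.  Timestamps are supplied by the
# caller so the logic is testable; all names are hypothetical.
DWELL_SECONDS = 2.0


class DwellSelector:
    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.current = None   # option currently under the arrow
        self.since = None     # time at which the arrow entered it

    def update(self, option, now):
        """Feed the option under the arrow (or None) at time `now`.
        Returns the option once it has been dwelt on long enough."""
        if option != self.current:
            self.current, self.since = option, now
            return None
        if option is not None and now - self.since >= self.dwell:
            self.current, self.since = None, None  # reset after selection
            return option
        return None
```

Moving the arrow to a different option restarts the timer, so an option is only selected after a deliberate, sustained hold.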
Fig 11 is a flow chart of logic implemented by the second embodiment. Block 150 detects a touch on screen 12 in Fig 7 and, specifically, a touch of the finger image FI. If no touch is detected, the logic idles in loop 151 in Fig 11. When a touch is detected, the logic reaches block 155, which displays a message, such as message 100 in Fig 8.
Then the logic reaches block 160 in Fig 11, which inquires whether arrow A in Fig 8 has reached an option- button 6. If not, the logic idles in loop 161 in Fig 11. When an option-button is reached, the YES branch is taken from block 160, leading to block 165. The latter highlights the option, as indicated by the heavy rectangle in Fig 9.
Next, in Fig 11, block 170 is reached, wherein a message is generated telling the user how to select the highlighted option. One such message is shown in Fig 9. Several approaches to selecting the highlighted option were discussed above.
Third Embodiment
The two embodiments discussed above involved a change in relative size between (1) the effective selection point of the finger F and (2) the option-buttons to be selected. In another embodiment, shown in Fig 12, a keypad 200 for moving a cursor C is displayed on the touch-sensitive screen 12.
That keypad contains five "hot spots." Four of them are identical in size and shape, and indicated by circles 205 in Fig 13. The circles 205 are about 5/8 inch in diameter, which is roughly the size of the U.S. ten-cent coin, or dime. Touching a circle 205 causes the cursor C to move in the corresponding direction.
The fifth hot spot occupies the "enter" key 210, which corresponds, in function, to the "enter" key on a computer keyboard. Thus, to select an option-button, the user would first move the cursor C to the option desired, using hot spots 205. When the option is reached, as indicated by highlighting, the user presses the "enter" button 210.
Circuitry and software, indicated by block 220 in Fig 12, interacts with keypad 200, as indicated by arrow 225, in order to implement the functions just described. Such implementation is known in the art.
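The keypad behaviour of Figs 12 and 13 can be sketched as follows. The pixel step size and the key names are hypothetical; only the move-then-enter control flow follows the description:

```python
# Sketch of the on-screen keypad of Figs 12 and 13: four direction hot
# spots move the cursor by a fixed step, and the "enter" hot spot selects
# whatever the cursor currently rests on.  Values are illustrative.
STEP = 10  # cursor movement per press, in pixels

HOT_SPOTS = {
    "up":    (0, -STEP),
    "down":  (0, STEP),
    "left":  (-STEP, 0),
    "right": (STEP, 0),
}


def press(cursor, key):
    """Return (new_cursor, selected) after a keypad press."""
    if key == "enter":
        return cursor, True  # select the option under the cursor
    dx, dy = HOT_SPOTS[key]
    return (cursor[0] + dx, cursor[1] + dy), False
```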
Fourth Embodiment
In Fig 14, an option 6H is shown, and that option is about to be exercised. The exercise may cause immediate execution of a computer process (not shown), or, in other cases, cause a display of additional options 300, as in Fig 15.
Fig 16 illustrates how the invention can handle the display of the additional options 300. Assume that the original option 6H, together with its neighbours, were initially enlarged by the invention, as indicated by options 6E in Fig 16, and that enlarged option 6H was selected. Assume also that the result of exercising option 6H is to call up another menu of options, rather than causing execution of a computer process.
Under these assumptions, the invention displays the additional options 300 in Fig 16, in their ordinary size. That is, for example, the additional options 300, located on the right side of the Fig, are displayed as the same size as original options, such as 6H, located at the left side of the Fig.
One reason for this approach is the assumption that the designer who created the program which displays the additional options 300 made arrangements so that the options would fit into the display 3. Consequently, the invention does not enlarge additional options 300, because enlargement may prevent their fitting into the display 3.
However, even without enlargement, the additional options 300 may not fit the display 3, because the enlarged options 6E consume some space which the additional options 300 would have occupied. To prevent this problem, the invention dissolves the enlarged options 6E, as in Fig 18, whenever an enlarged option calls up additional options 300.
The additional options 300 can, themselves, call up further additional options. In such a case, the principles described in Figs 6 or 11 are applied to the additional options. That is, for example, if the user's fingertip touches them, and if their spacing falls below the threshold, they are enlarged. If not, they remain their normal size. These processes are repeated for yet additional options.
Consequently, in one form of the invention, options in the vicinity of a finger-touch on the screen 3 are enlarged. A user may select one of the enlarged options. If that option calls up additional options, the enlarged options, presently displayed, are removed, and the additional options are displayed in the usual manner.
If the additional options are too closely spaced when a user's finger approaches them, those in the vicinity of the finger are enlarged. If the user selects an enlarged option, and if the selected option calls up further options, the enlarged options are removed, and the further options are displayed in the usual manner. This process repeats, so long as additional options are called up by the enlarged options .
One Form of Invention
Fig 20 illustrates one form of the invention. A public kiosk 400, such as an Automated Teller Machine, ATM, contains a touch-sensitive screen 405. Among the components contained within the ATM are those contained within dashed box 410. A control system 415 for the touch screen 405 detects the positions on the screen 405 which are touched, and issues signals on line 416 indicating the co-ordinates of the touch-points. Computer 420 receives those coordinates, and uses them in execution of the processes described above. The computer 420 also controls the images displayed on screen 405.
Additional Considerations
1. It was stated above that enlargement of options in the vicinity of the finger is undertaken, if the options are spaced too closely. Fig 19 illustrates one type of spacing. If distance D is less than 0.5 inch, for example, then enlargement occurs if a touch occurs at point P. The options enlarged are those residing within, for example, 1.0 inch of point P.
2. In one form of the invention, a mere touch of the screen does not cause enlargement. That is, if a user touches the screen, but no options are located within the vicinity, such as within one inch, then no enlargement occurs. Consequently, a user may drag a finger across the screen, but nothing would happen, until an option comes within one inch of the finger.
This approach has the advantage of not distorting the screen. That is, if a magnified region of the screen jumped into view every time a person touched the screen, that may interfere with reading the screen.
3. In one embodiment, it is not necessary to evaluate the spacing between options. That is, whenever proximity of a finger, or other touching agent, is detected, the options within a certain radius of the finger are enlarged.
4. A touch screen was described above. Such screens are known in the art. Further, actual contact with the screen is not necessarily required. Proximity of a finger to the screen can be detected, as known in the art. That is, if a finger approaches within a small distance, such as 1/10 inch (2.54 mm), that proximity can be considered equivalent to contact, for purposes of option selection and enlargement.
Both situations can be classified as "tactile selection". Actual touching is clearly tactile. Close proximity is also viewed as tactile, because the proximity can be detected by equipment associated with the display screen.
5. It is not necessarily required that a human finger be used for the touching. For example, in cold weather, a human finger encased within a glove can be used. Alternately, a stylus, such as a pencil, held in a human hand, can be used.
A disabled person may not use a hand, but may use an artificial hand, or prosthesis. The prosthesis may hold the stylus just described.
In general, any object capable of touching the screen at a discrete spot of approximately the size of a human fingerprint can be used.
6. A two-stage selection process can be identified. First, a user makes a preliminary selection, by touching a cluster of closely-spaced options. In response, the invention enlarges options contained within the cluster.
The preliminary selection is not necessarily a selection of an option. The preliminary selection may be caused by physical contact with an option. However, in general, that situation will be accidental.
Next, a final selection is made, wherein the user selects one of the enlarged options. That option may be the same as the option described in the previous paragraph. That coincidence, in general, will be accidental.
7. In one form of the invention, multiple options are displayed, such as those shown in Fig 19. These options are sufficiently small that two, or more, fit into a space approximately the size of an average adult human fingerprint. That space is about 1/2 inch by 1/2 inch. Despite the small size of the displayed options, the invention allows a user to select a desired option, to the exclusion of others.
Another Form of the Invention
Referring to Fig 21, there is shown a self-service terminal 510 having a user interface 512 comprising a screen 514 for displaying a plurality of selectable options (such as hypertext links), touch sensing means 516 in the form of a touch panel aligned with and located adjacent to the screen 514, a card receiving slot 518 and a printer slot 520.
The screen 514 has an associated display driver 522, the touch panel 516 has a touch panel driver 524, the card receiving slot 518 has an associated motorised card reader module 526, and the printer slot 520 has an associated printer module 530. All of the drivers 522, 524 and the modules 526, 530 are connected to a terminal controller 532 that controls the operation of the terminal 510. The terminal controller 532 is also connected to a network connection 534 for communicating with an IP (Internet Protocol) network 536, such as the Internet, an Intranet, or an Extranet.
Referring to Fig 22, which is a block diagram of the terminal of Fig 21 excluding the user interface 512, the terminal controller 532 includes a processor 540, associated memory 542, and storage space 544 in the form of a hard disk drive. The hard disk 544 stores the operating system for the terminal 510, the application program that controls the terminal 510, and control means in the form of a zoom-in program. The zoom-in program may be based on a conventional zoom-in tool such as ZoomIn version 3.1 available from Microsoft (trade mark) Corporation. The zoom-in program is responsive to the touch panel driver 524 for displaying on the screen 514 an enlarged image of the area in contact with a probe.
Referring to Figs 22, 23 and 24, on power-up of the SST 510, the operating system kernel 550 (Fig 22), the application program 552 (Fig 22) for controlling the terminal, and the zoom-in program 554 (Fig 22) are loaded into memory 542 (step 600 of Fig 24).
When a user 560 places a probe 562, such as his finger, in contact with the touch panel 516, the touch panel 516 senses this (step 602) and the panel driver 524 conveys coordinate data to the controller 532 for enabling the controller 532 to determine what pixel of the screen 514 has been touched by the user 560. The co-ordinate data is generally a single Cartesian (x,y) co-ordinate representing the pixel touched by the finger 562. As is known in the art, the touch panel driver 524 performs calculations to determine, from the plurality of pixels touched by the finger 562, what the centre pixel is; the co-ordinates of this centre pixel are sent to the controller 532.
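One plausible way for the driver to reduce the plurality of touched pixels to a single centre co-ordinate (an assumption; the text does not specify the calculation) is the centroid of the touched pixels:

```python
# Illustrative centre-pixel calculation: the centroid of all pixels
# reported as touched, rounded to the nearest whole pixel.
def centre_pixel(touched):
    """Return the (x, y) centroid of a non-empty list of touched pixels."""
    n = len(touched)
    cx = sum(x for x, _ in touched) / n
    cy = sum(y for _, y in touched) / n
    return round(cx), round(cy)
```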
The zoom-in program 554 uses this single co-ordinate to instruct the display driver 522 to display a magnification window 570 (step 604) having:
(1) an enlarged image of the area of the screen in the vicinity of the pixel represented by the single co-ordinate, and
(2) indication means 564, in the form of a pointer shaped like an arrow, pointing at the pixel represented by the single co-ordinate. It will be appreciated that a conventional zoom-in application such as ZoomIn version 3.1 can be easily modified to include a pointer that points at the pixel represented by the single co-ordinate. The magnification window 570 overlies part of the main window 572 and is located so that it is not obscured by the user's hand.
The zoom-in program 554 instructs the display driver to display all of the pixels within a certain (x,y) distance of the pixel represented by the single co-ordinate so that a magnified image in the vicinity of the single co-ordinate is displayed on window 570.
Thus, the zoom-in program 554 enlarges the area around the finger 562 and adds a pointer 564 that points to the pixel represented by the single co-ordinate; it does not affect the size of any touch zones within the touch panel 516, so the operation of the touch panel 516 is unaffected by the zoom-in program 554. The pointer 564 gives the user 560 an indication of the exact area of the screen 514 he is pointing at.
As for conventional "windows" on screens, the magnification window 570 can be moved by the user to any desired position on the screen 514. As the technology used for creating and manipulating graphical windows on a display is well known in the art it will not be described in detail herein.
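How the zoom-in program might derive and place the magnification window can be sketched as follows. The half-width, magnification factor, and the reduction of the "not obscured by the user's hand" rule to "opposite half of the screen" are all illustrative assumptions:

```python
# Sketch of magnification-window geometry: take the pixels within a
# half-width of the reported co-ordinate as the source region, and place
# the window in the half of the screen away from the touch so the user's
# hand does not obscure it.  All values are illustrative.
HALF = 50     # half-width of the source region, in pixels
SCALE = 3     # magnification factor


def magnification_window(coord, screen_w, screen_h):
    """Return (source_rect, window_origin) for a touch at `coord`.
    source_rect is (left, top, right, bottom), clipped to the screen."""
    x, y = coord
    src = (max(0, x - HALF), max(0, y - HALF),
           min(screen_w, x + HALF), min(screen_h, y + HALF))
    win_size = 2 * HALF * SCALE
    win_x = 0 if x > screen_w // 2 else screen_w - win_size
    win_y = 0 if y > screen_h // 2 else screen_h - win_size
    return src, (win_x, win_y)
```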
As shown in Fig 23, three small hypertext links 578,580,582 are displayed on screen 514. Initially, the user 560 places his finger 562 near to the desired hypertext link 580. When the user 560 touches the screen 514, the magnification window 570 is opened (step 604) and the user 560 is able to move his finger 562 to guide the pointer 564 to the desired hypertext link 580. The touch panel 516 continually monitors the position of the finger 562 to detect removal (step 606) or movement (step 608) of the finger 562.
If the pointer is moved then the touch panel driver 524 sends the new single co-ordinate and the zoom-in program 554 uses this co-ordinate to instruct the display driver 522 to update (step 610) the contents of the magnification window 570. When the user has moved his finger 562 so that the pointer 564 is located directly above the desired hypertext link 580, the user 560 then removes his finger 562 from the touch panel 516 and the hypertext link 580 is selected (step 612) by the touch panel driver 524. The magnification window 570 is then closed by the zoom-in program 554 until the touch panel 516 is touched again (step 602) .
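The flow of Fig 24 (steps 602 to 612) can be sketched as a small state machine driven by touch events. Event delivery, drawing, and link actions are stubbed out as caller-supplied functions; only the control flow is intended to match the description, and all names are hypothetical:

```python
# Sketch of the Fig 24 control flow: open the magnification window on
# touch (602/604), update it on movement (608/610), and select the link
# under the pointer on release (606/612), then close the window.
def run(events, select, open_window, update_window, close_window):
    """Process a stream of ('touch'|'move'|'release', coord) events."""
    touching = False
    coord = None
    for kind, pos in events:
        if kind == "touch" and not touching:       # step 602
            touching, coord = True, pos
            open_window(pos)                       # step 604
        elif kind == "move" and touching:          # step 608
            coord = pos
            update_window(pos)                     # step 610
        elif kind == "release" and touching:       # step 606
            touching = False
            select(coord)                          # step 612
            close_window()
    return coord
```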
It will be appreciated that the above embodiment has the advantage that any drift in the touch panel 516 is corrected because the user 560 can identify exactly what point on the screen 514 the touch panel 516 is sensing. The above embodiment also has the advantage that a user 560 is provided with a magnified view of what point he is touching.
It will now be appreciated that conventional web-enabled SSTs can be updated by adding a software module (the control means) to provide an SST according to one embodiment of the invention; this enables SSTs to be upgraded in the field, that is, existing SSTs may be retro-fitted with the control means to provide an embodiment of the invention.
Various modifications may be made to the above described embodiment within the scope of the invention, for example, the screen 514 and touch panel 516 may be incorporated into a single integral touchscreen unit. In other embodiments the probe may be a stylus. In other embodiments, the control means may be implemented in hardware or firmware.

Claims

1. A self-service terminal (510) comprising a screen (514) for displaying a plurality of selectable options, and touch sensing means (516) for detecting an area of the screen (514) that is in contact with a probe (562), characterised in that the terminal (510) further comprises control means (552) responsive to the touch sensing means (516) for displaying on the screen an enlarged image of at least the area in contact with the probe (562) to assist selection of a selectable option (578,580,582) by a user.
2. A terminal according to claim 1, wherein the enlarged image includes the area in contact with the probe (562) and the area in the immediate vicinity of the area in contact with the probe (562).
3. A terminal according to claim 1 or 2, wherein the control means (552) provides indication means (564) for indicating what part of the enlarged image is in contact with the probe (562).
4. A terminal according to any preceding claim, wherein the control means (552) are operative to display the enlarged image on an area of the screen that is not obscured by a user's hand.
5. A terminal according to any preceding claim, wherein the enlarged image is displayed as a graphical window that is configurable by a user so that the user is able to resize the window and to select a desired magnification.
6. A method of assisting a user to select options at a self-service terminal having a touch sensitive screen, the method comprising the steps of: detecting (step 602) an area of the screen that is in contact with a probe, and displaying (step 604) on the screen an enlarged image of at least the area in contact with the probe.
7. The method of claim 6, wherein the step of displaying on the screen an enlarged image of at least the area in contact with the probe includes displaying on the screen an area in the immediate vicinity of the area in contact with the probe.
8. The method of claim 6 or 7, wherein the method includes the further step of indicating on the enlarged image what part of the image is in contact with the probe.
9. A touch sensitive screen for displaying a plurality of selectable options, the screen including touch sensing means (516) for detecting an area of the screen that is in contact with a probe, and control means responsive to the touch sensing means for displaying on the screen an enlarged image of at least the area in contact with the probe for assisting selection of a selectable option by a user.
10. A computer program product (552) for use with a computer (532) having a touch sensitive screen (514,516) responsive to a probe (562), the product (552) comprising computer program code means, when the program is loaded, for responding to touch sensing means (516) and for displaying on the screen (514) an enlarged image of at least an area in contact with the probe (562), and for displaying indication means (564) for indicating what part of the enlarged image is in contact with the probe (562), for assisting selection of a selectable option by a user.
PCT/GB2000/001997 1999-06-02 2000-05-24 Self-service terminal WO2000075766A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU50878/00A AU5087800A (en) 1999-06-02 2000-05-24 Self-service terminal

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US32432599A 1999-06-02 1999-06-02
US09/324,325 1999-06-02
GB9923660A GB2355086A (en) 1999-10-06 1999-10-06 Selection of displayed options, for self-service terminals
GB9923660.6 1999-10-06

Publications (1)

Publication Number Publication Date
WO2000075766A1 true WO2000075766A1 (en) 2000-12-14

Family

ID=26315985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2000/001997 WO2000075766A1 (en) 1999-06-02 2000-05-24 Self-service terminal

Country Status (2)

Country Link
AU (1) AU5087800A (en)
WO (1) WO2000075766A1 (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0291690A (en) * 1988-09-28 1990-03-30 Toshiba Corp Enlargement/reduction display system
EP0476972A2 (en) * 1990-09-17 1992-03-25 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
WO1994029788A1 (en) * 1993-06-15 1994-12-22 Honeywell Inc. A method for utilizing a low resolution touch screen system in a high resolution graphics environment
US5553225A (en) * 1994-10-25 1996-09-03 International Business Machines Corporation Method and apparatus for combining a zoom function in scroll bar sliders
JPH10269022A (en) * 1997-03-25 1998-10-09 Hitachi Ltd Portable information processor with communication function
WO1998052118A1 (en) * 1997-05-15 1998-11-19 Sony Electronics, Inc. Display of menu items on a computer screen
WO1999054807A1 (en) * 1998-04-17 1999-10-28 Koninklijke Philips Electronics N.V. Graphical user interface touch screen with an auto zoom feature


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"INTERNET KIOSK TOUCH PANEL SHELL", IBM TECHNICAL DISCLOSURE BULLETIN,US,IBM CORP. NEW YORK, vol. 39, no. 8, 1 August 1996 (1996-08-01), pages 85 - 87, XP000638146, ISSN: 0018-8689 *
PATENT ABSTRACTS OF JAPAN vol. 014, no. 296 (P - 1067) 26 June 1990 (1990-06-26) *
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 01 29 January 1999 (1999-01-29) *

Cited By (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7066335B2 (en) 2001-12-19 2006-06-27 Pretech As Apparatus for receiving and distributing cash
EP1349057A3 (en) * 2002-03-01 2006-05-31 Siemens Aktiengesellschaft Method for optical emphasis of informations contained in an area of the user interface of a computer workstation
EP1349057A2 (en) * 2002-03-01 2003-10-01 Siemens Aktiengesellschaft Method for optical emphasis of informations contained in an area of the user interface of a computer workstation
CN1304937C (en) * 2002-04-03 2007-03-14 日本先锋公司 Display portion integrated type touch panel apparatus and method for manufacturing the same
WO2004017227A1 (en) * 2002-08-16 2004-02-26 Myorigo Oy Varying-content menus for touch screens
WO2004051392A3 (en) * 2002-11-29 2004-12-29 Koninkl Philips Electronics Nv User interface with displaced representation of touch area
WO2004051392A2 (en) 2002-11-29 2004-06-17 Koninklijke Philips Electronics N.V. User interface with displaced representation of touch area
JP2006520024A (en) * 2002-11-29 2006-08-31 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ User interface using moved representation of contact area
US8042044B2 (en) 2002-11-29 2011-10-18 Koninklijke Philips Electronics N.V. User interface with displaced representation of touch area
KR101016981B1 (en) * 2002-11-29 2011-02-28 코닌클리케 필립스 일렉트로닉스 엔.브이. Data processing system, method of enabling a user to interact with the data processing system and computer-readable medium having stored a computer program product
US8681132B2 (en) * 2004-03-18 2014-03-25 Koninklijke Philips N.V. Scanning display apparatus
WO2006003590A2 (en) * 2004-06-29 2006-01-12 Koninklijke Philips Electronics, N.V. A method and device for preventing staining of a display device
WO2006003590A3 (en) * 2004-06-29 2006-05-18 Koninkl Philips Electronics Nv A method and device for preventing staining of a display device
US8427445B2 (en) 2004-07-30 2013-04-23 Apple Inc. Visual expander
US7760187B2 (en) 2004-07-30 2010-07-20 Apple Inc. Visual expander
EP1674976A3 (en) * 2004-12-22 2012-05-02 Microsoft Corporation Improving touch screen accuracy
US11818458B2 (en) 2005-10-17 2023-11-14 Cutting Edge Vision, LLC Camera touchpad
US11153472B2 (en) 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
US9671867B2 (en) 2006-03-22 2017-06-06 Volkswagen Ag Interactive control device and method for operating the interactive control device
DE102006037155B4 (en) * 2006-03-27 2016-02-25 Volkswagen Ag Multimedia device and method for operating a multimedia device
EP1847915A3 (en) * 2006-04-19 2009-11-18 LG Electronics Inc. Touch screen device and method of displaying and selecting menus thereof
US7737958B2 (en) 2006-04-19 2010-06-15 Lg Electronics Inc. Touch screen device and method of displaying and selecting menus thereof
US9041658B2 (en) 2006-05-24 2015-05-26 Lg Electronics Inc Touch screen device and operating method thereof
US7782308B2 (en) 2006-05-24 2010-08-24 Lg Electronics Inc. Touch screen device and method of displaying images thereon
US7916125B2 (en) 2006-05-24 2011-03-29 Lg Electronics Inc. Touch screen device and method of displaying images thereon
US8028251B2 (en) 2006-05-24 2011-09-27 Lg Electronics Inc. Touch screen device and method of selecting files thereon
US9058099B2 (en) 2006-05-24 2015-06-16 Lg Electronics Inc. Touch screen device and operating method thereof
US9348511B2 (en) 2006-10-26 2016-05-24 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US9632695B2 (en) 2006-10-26 2017-04-25 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US9207855B2 (en) 2006-10-26 2015-12-08 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
WO2008052100A3 (en) * 2006-10-26 2008-09-12 Apple Inc Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
EP2437155A1 (en) * 2006-10-26 2012-04-04 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US7856605B2 (en) 2006-10-26 2010-12-21 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
CN102999290B (en) * 2006-12-07 2017-12-05 微软技术许可有限责任公司 The method and computer system of target item are presented over the display
KR101486688B1 (en) 2006-12-07 2015-01-29 마이크로소프트 코포레이션 Operating touch screen interfaces
CN101553775B (en) * 2006-12-07 2012-11-14 微软公司 Operating touch screen interfaces
RU2623181C2 (en) * 2006-12-07 2017-06-27 МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи Operation interfaces of touch screen
TWI450136B (en) * 2006-12-07 2014-08-21 Microsoft Corp Operating touch screen interfaces
EP2126672A4 (en) * 2006-12-07 2016-01-06 Microsoft Technology Licensing Llc Operating touch screen interfaces
WO2008070815A1 (en) * 2006-12-07 2008-06-12 Microsoft Corporation Operating touch screen interfaces
US7692629B2 (en) 2006-12-07 2010-04-06 Microsoft Corporation Operating touch screen interfaces
EP2189885A3 (en) * 2006-12-28 2013-01-16 Samsung Electronics Co., Ltd. Method to provide menu, using menu set and multimedia device using the same
EP1942402A1 (en) 2006-12-28 2008-07-09 Samsung Electronics Co., Ltd. Method to provide menu, using menu set and multimedia device using the same
EP1993028A1 (en) 2007-05-15 2008-11-19 High Tech Computer Corp. Method and device for handling large input mechanisms in touch screens
US8766911B2 (en) 2007-05-16 2014-07-01 Volkswagen Ag Multifunction display and operating device and method for operating a multifunction display and operating device having improved selection operation
WO2009104064A1 (en) 2008-02-18 2009-08-27 Nokia Corporation Apparatus, method and computer program product for manipulating a reference designator listing
US9529524B2 (en) 2008-03-04 2016-12-27 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US8201109B2 (en) 2008-03-04 2012-06-12 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
US9904405B2 (en) 2008-03-20 2018-02-27 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method for controlling screen in the same
EP2104024B1 (en) * 2008-03-20 2018-05-02 LG Electronics Inc. Portable terminal capable of sensing proximity touch and method for controlling screen using the same
EP2131272A2 (en) * 2008-06-02 2009-12-09 LG Electronics Inc. Mobile communication terminal having proximity sensor and display controlling method therein
JP2010102662A (en) * 2008-10-27 2010-05-06 Sharp Corp Display apparatus and mobile terminal
EP2199899A1 (en) * 2008-12-22 2010-06-23 BRITISH TELECOMMUNICATIONS public limited company Touch sensitive display
EP2377000A4 (en) * 2008-12-31 2017-08-09 QUALCOMM Incorporated Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis
JP2010181940A (en) * 2009-02-03 2010-08-19 Zenrin Datacom Co Ltd Apparatus and method for processing image
EP2399187A4 (en) * 2009-02-20 2017-03-08 Nokia Technologies Oy Method and apparatus for causing display of a cursor
WO2010095109A1 (en) 2009-02-20 2010-08-26 Nokia Corporation Method and apparatus for causing display of a cursor
US8510665B2 (en) 2009-03-16 2013-08-13 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8756534B2 (en) 2009-03-16 2014-06-17 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8584050B2 (en) 2009-03-16 2013-11-12 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8661362B2 (en) 2009-03-16 2014-02-25 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9846533B2 (en) 2009-03-16 2017-12-19 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US10761716B2 (en) 2009-03-16 2020-09-01 Apple, Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9875013B2 (en) 2009-03-16 2018-01-23 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8370736B2 (en) 2009-03-16 2013-02-05 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
CN102414654A (en) * 2009-05-07 2012-04-11 创新科技有限公司 Methods for searching digital files on a user interface
EP2433201A4 (en) * 2009-05-21 2017-09-13 Sony Interactive Entertainment Inc. Touch screen disambiguation based on prior ancillary touch input
WO2010135160A2 (en) * 2009-05-21 2010-11-25 Microsoft Corporation Method of visualizing an input location
US8416193B2 (en) 2009-05-21 2013-04-09 Microsoft Corporation Method of visualizing an input location
WO2010135160A3 (en) * 2009-05-21 2011-03-10 Microsoft Corporation Method of visualizing an input location
US10705692B2 (en) 2009-05-21 2020-07-07 Sony Interactive Entertainment Inc. Continuous and dynamic scene decomposition for user interface
CN102439543A (en) * 2009-05-21 2012-05-02 微软公司 Method of visualizing an input location
WO2010135128A1 (en) 2009-05-21 2010-11-25 Sony Computer Entertainment Inc. Touch screen disambiguation based on prior ancillary touch input
US9927964B2 (en) 2009-05-21 2018-03-27 Sony Computer Entertainment Inc. Customization of GUI layout based on history of use
US9372614B2 (en) 2009-07-09 2016-06-21 Qualcomm Incorporated Automatic enlargement of viewing area with selectable objects
EP2452260A4 (en) * 2009-07-09 2016-02-24 Qualcomm Inc Automatic enlargement of viewing area with selectable objects
US8358281B2 (en) 2009-12-15 2013-01-22 Apple Inc. Device, method, and graphical user interface for management and manipulation of user interface elements
WO2011081889A1 (en) * 2009-12-15 2011-07-07 Apple Inc. Device, method, and graphical user interface for management and manipulation of user interface elements
US8296889B2 (en) 2009-12-17 2012-10-30 Whirlpool Corporation Laundry treatment appliance control system
US8713975B2 (en) 2009-12-17 2014-05-06 Whirlpool Corporation Laundry treating appliance control system
JP2011154524A (en) * 2010-01-27 2011-08-11 Fujitsu Toshiba Mobile Communications Ltd Three-dimensional input device
WO2011105981A1 (en) * 2010-02-26 2011-09-01 Echostar Ukraine, L.L.C. System and methods for enhancing operation of a graphical user interface
CN101825990A (en) * 2010-04-28 2010-09-08 宇龙计算机通信科技(深圳)有限公司 Touch point positioning method and system and touch screen device
EP2386941A3 (en) * 2010-05-14 2012-02-01 Sony Corporation Information processing apparatus and method, and program
EP2402846A3 (en) * 2010-06-29 2012-03-14 Lg Electronics Inc. Mobile terminal and method for controlling operation of the mobile terminal
US9285967B2 (en) 2010-07-30 2016-03-15 Jaguar Land Rover Limited Computing device with improved function selection and method
WO2012013987A1 (en) * 2010-07-30 2012-02-02 Jaguar Cars Limited Computing device with improved function selection and method
EP2601569A1 (en) * 2010-08-08 2013-06-12 Qualcomm Incorporated Method and system for adjusting display content
JP2012084125A (en) * 2010-09-14 2012-04-26 Sanyo Electric Co Ltd Image display device
EP2463765A3 (en) * 2010-12-07 2013-10-02 Sony Ericsson Mobile Communications AB Touch input disambiguation
CN102707872A (en) * 2011-02-28 2012-10-03 微软公司 Scrollable list navigation using persistent headings
JP2012203644A (en) * 2011-03-25 2012-10-22 Kyocera Corp Electronic device
EP2523083A1 (en) * 2011-05-13 2012-11-14 Harman Becker Automotive Systems GmbH System and method for operating a touchscreen and a processing unit
US9244605B2 (en) 2011-05-31 2016-01-26 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US9092130B2 (en) 2011-05-31 2015-07-28 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US10664144B2 (en) 2011-05-31 2020-05-26 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US11256401B2 (en) 2011-05-31 2022-02-22 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8677232B2 (en) 2011-05-31 2014-03-18 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8661339B2 (en) 2011-05-31 2014-02-25 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US8719695B2 (en) 2011-05-31 2014-05-06 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
US9690449B2 (en) 2012-11-02 2017-06-27 Microsoft Technology Licensing, Llc Touch based selection of graphical elements
WO2014071245A1 (en) * 2012-11-02 2014-05-08 Microsoft Corporation Touch based selection of graphical elements
US20140129985A1 (en) * 2012-11-02 2014-05-08 Microsoft Corporation Touch based selection of graphical elements
EP2755124A1 (en) * 2013-01-15 2014-07-16 BlackBerry Limited Enhanced display of interactive elements in a browser
US10152228B2 (en) 2013-01-15 2018-12-11 Blackberry Limited Enhanced display of interactive elements in a browser
US9798454B2 (en) 2013-03-22 2017-10-24 Oce-Technologies B.V. Method for performing a user action upon a digital item
WO2014190951A1 (en) * 2013-05-29 2014-12-04 Gu Hongbo System and method for mapping occluded area
US20140359507A1 (en) * 2013-05-30 2014-12-04 Samsung Electronics Co., Ltd. Method and apparatus for displaying images in touchscreen-based devices
KR102120651B1 (en) 2013-05-30 2020-06-09 삼성전자 주식회사 Method and apparatus for displaying a seen in a device comprising a touch screen
US9886741B2 (en) 2013-05-30 2018-02-06 Samsung Electronics Co., Ltd. Method and apparatus for displaying images in touchscreen-based devices
KR20140140759A (en) * 2013-05-30 2014-12-10 삼성전자주식회사 Method and apparatus for displaying a seen in a device comprising a touch screen
EP2808776A3 (en) * 2013-05-30 2015-04-01 Samsung Electronics Co., Ltd Method and apparatus for displaying images in touchscreen-based devices
WO2016100548A3 (en) * 2014-12-17 2016-11-03 Datalogic ADC, Inc. Floating soft trigger for touch displays on electronic device
EP3647927A4 (en) * 2018-02-08 2021-01-27 Rakuten, Inc. Selection device, selection method, program, and non-transitory computer-readable information recording medium
CN109407939B (en) * 2018-10-12 2021-03-30 深圳鑫想科技有限责任公司 Terminal image amplification method and device and computer readable storage medium
CN109407939A (en) * 2018-10-12 2019-03-01 重阳健康数据技术(深圳)有限责任公司 A kind of terminal image amplification method, device and computer readable storage medium

Also Published As

Publication number Publication date
AU5087800A (en) 2000-12-28

Similar Documents

Publication Publication Date Title
WO2000075766A1 (en) Self-service terminal
US7245293B2 (en) Display unit with touch panel and information processing method
US6037929A (en) Coordinate input system and method of controlling same
US10055046B2 (en) Touch-sensitive electronic apparatus for media applications, and methods therefor
US6727892B1 (en) Method of facilitating the selection of features at edges of computer touch screens
US6587131B1 (en) Method for assisting user to operate pointer
CN102929520B (en) The input of touch screen terminal and output intent and device
EP0990202B1 (en) Graphical user interface touch screen with an auto zoom feature
JP4093823B2 (en) View movement operation method
US7760187B2 (en) Visual expander
US6181325B1 (en) Computer system with precise control of the mouse pointer
JP3319647B2 (en) Character input device
KR100999895B1 (en) Input device
EP1191430A1 (en) Graphical user interface for devices having small tactile displays
CN102934065B (en) Signal conditioning package
US20120013645A1 (en) Display and method of displaying icon image
CN102262504A (en) User interaction gestures with virtual keyboard
EP1292880A1 (en) Immediate mouse control of measuring functionalities for medical images
WO2007082290A2 (en) User interface for a touch-screen based computing device and method therefor
GB2355086A (en) Selection of displayed options, for self-service terminals
JP5492627B2 (en) Information display device and information display method
JPH1145141A (en) Data input device
KR101403079B1 (en) method for zooming in touchscreen and terminal using the same
JPWO2002046899A1 (en) Window display control method, window display control device, and computer-readable recording medium recording program
JPH11175212A (en) Touch operation processing method for touch panel device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP