US20110279415A1 - Method and system for implementing a user interface for a device employing written graphical elements - Google Patents

Info

Publication number
US20110279415A1
US20110279415A1 (application US12/942,927 · US94292710A)
Authority
US
United States
Prior art keywords
graphical element
element icon
function
icon
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/942,927
Inventor
James Marggraff
Alexander Chisholm
Tracy L. Edgecomb
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Leapfrog Enterprises Inc
Original Assignee
Leapfrog Enterprises Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/803,806 (published as US20040229195A1)
Priority claimed from US10/861,243 (published as US20060033725A1)
Application filed by Leapfrog Enterprises Inc
Priority to US12/942,927
Publication of US20110279415A1
Status: Abandoned

Classifications

    • G: PHYSICS; G06: COMPUTING, CALCULATING OR COUNTING; G06F: ELECTRIC DIGITAL DATA PROCESSING; G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0321: Detection arrangements using opto-electronic means in co-operation with a patterned surface, by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. a pen optically detecting position-indicative tags printed on a paper sheet
    • G06F 3/03545: Pointing devices displaced or positioned by the user; pens or stylus
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G06V 30/1423: Image acquisition using hand-held instruments, the instrument generating sequences of position coordinates corresponding to handwriting
    • G06V 30/36: Character recognition; digital ink; matching; classification

Definitions

  • Embodiments of the invention relate to the control and use of interactive devices, computers, electronic devices, appliances, toys, and the like.
  • Devices such as optical readers or optical pens conventionally emit light that reflects off a surface to a detector or imager. As the device is moved relative to the surface (or vice versa), successive images are rapidly captured. By analyzing the images, movement of the optical device relative to the surface can be tracked.
  • One type of optical pen is used with a sheet of paper on which very small dots are printed. The dots are printed on the page in a pattern with a nominal spacing of about 0.3 millimeters (0.01 inches). The pattern of dots within any region on the page is unique to that region. The optical pen essentially takes a snapshot of the surface, perhaps 100 times a second or more. By interpreting the dot positions captured in each snapshot, the optical pen can precisely determine its position relative to the page.
  • An optical pen with Bluetooth or other wireless capability can be linked to other devices and used for sending electronic mail (e-mail) or faxes.
  • A typical prior art optical pen will implement its intended functionality by the user operating one or more buttons/switches or controls of the optical pen to activate one or more software programs, routines, embedded devices, or the like.
  • The pen may contain or be in communication with a computer system. Upon actuation of such controls, the pen device performs its intended function. Accessing the capabilities of increasingly powerful optical pens through the limited number and configuration of switches, buttons, etc. provided on the pen itself, or on any remotely coupled computer system, is not a satisfactory arrangement.
  • One prior art solution uses the optical pen to recognize a user-defined command, and uses that command to invoke some function of the pen (e.g., PCT publication WO/01/48590 A1). For example, a user's writing can be recognized (e.g., in real-time) and interpreted as a command for the optical pen.
  • The drawback with this solution is that interaction with and control of the pen's functions requires real-time recognition of the user's handwriting (e.g., as the user writes the command down on a sheet of paper).
  • This solution is not satisfactory because interaction with more complex functionality of an optical pen requires the user to repeatedly write down one or more commands to access different choices, options, or functions provided by the pen. While the solution might be satisfactory for exceedingly simple, single-step applications (e.g., “turn off”, “store”, etc.), it is overly cumbersome and limiting where more complex, satisfying, rich functionality is desired.
  • Accordingly, a user interface method and system that enables interaction with more complex functionality of an optical pen device having an associated computer system, and that enables more efficient access to the different choices, options, and functions provided by the pen device, would be valuable.
  • What is further desired is a method and interactive interface useful for interacting with an operating system resident on, or in communication with, a pen device.
  • In one embodiment, the present invention is implemented as a method for implementing a user interface for a device employing user-created or written graphical elements and/or printed graphical elements that are on a surface.
  • The method includes recognizing a created graphical element icon (e.g., created by a user) on a surface. Once recognized, a function related to the graphical element icon is accessed and an output in accordance with the function is provided.
  • The functionality may reside on the pen device, and the written graphical element may be written using the pen device.
  • The graphical element icon can be a symbol, character, or mark created on the surface by the user that is recognized as such by the interpreting functionality (e.g., optical sensors, embedded computer system, etc.) of the device.
  • The output is typically an audio output provided via an audio output device (e.g., a speaker coupled to, or resident on, the device).
  • The function is persistently associated with the graphical element icon, enabling a subsequent access of the function (e.g., at some later time) by a subsequent actuation (e.g., tapping) of the graphical element icon with the pen device.
  • The selection of a written or printed graphical element icon causes the pen device to audibly render a list of further selections that may be written and selected by the user.
  • In this case, the first graphic element icon functions as a menu item that, when selected, causes the pen device to render sub-menu items related to that icon. Any, or all, of the sub-menu items may be written on the surface and themselves selected, thereby causing the pen device to perform related functionality.
  • FIG. 1 is a block diagram of a device upon which embodiments of the present invention can be implemented.
  • FIG. 2 is a block diagram of another device upon which embodiments of the present invention can be implemented.
  • FIG. 3 shows an exemplary sheet of paper provided with a pattern of marks according to one embodiment of the present invention.
  • FIG. 4 shows an enlargement of a pattern of marks on an exemplary sheet of paper according to one embodiment of the present invention.
  • FIG. 5 shows a computer-controlled flowchart of the steps of a device user interface process in accordance with one embodiment of the present invention.
  • FIG. 6 shows a computer-controlled flowchart of the steps of a hierarchical device user interface process in accordance with one embodiment of the present invention.
  • FIG. 7 shows a menu item tree directory according to an embodiment of the present invention.
  • FIG. 8A shows a menu item audible prompting process in accordance with one embodiment of the present invention.
  • FIG. 8B shows a menu item selection process in accordance with one embodiment of the present invention.
  • FIG. 8C shows a sub-menu items selection process in accordance with one embodiment of the present invention.
  • FIG. 9 shows a plurality of different types of graphical item icons on a surface in accordance with one embodiment of the present invention.
  • FIG. 1 is a block diagram of a pen device 100 upon which embodiments of the present invention can be implemented.
  • pen device 100 may be referred to as an optical device, more specifically as an optical reader, optical pen or digital pen.
  • the device may contain a computer system and an operating system resident thereon. Application programs may also reside thereon.
  • pen device 100 includes a processor 32 inside a housing 62 .
  • housing 62 has the form of a pen or other writing or marking utensil or instrument.
  • Processor 32 is operable for processing information and instructions used to implement the functions of pen device 100 , which are described below.
  • the pen device 100 may include an audio output device 36 and a display device 40 coupled to the processor 32 .
  • the audio output device and/or the display device are physically separated from pen device 100 , but in communication with pen device 100 through either a wired or wireless connection.
  • pen device 100 can include a transceiver or transmitter (not shown in FIG. 1 ).
  • the audio output device 36 may include a speaker or an audio jack (e.g., for an earphone or headphone).
  • the display device 40 may be a liquid crystal display (LCD) or some other suitable type of display.
  • pen device 100 may include input buttons 38 coupled to the processor 32 for activating and controlling the pen device 100 .
  • the input buttons 38 allow a user to input information and commands to pen device 100 or to turn pen device 100 on or off.
  • Pen device 100 also includes a power source 34 such as a battery.
  • Pen device 100 also includes a light source or optical emitter 44 and a light sensor or optical detector 42 coupled to the processor 32 .
  • the optical emitter 44 may be a light emitting diode (LED), for example, and the optical detector 42 may be a charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) imager array, for example.
  • the optical emitter 44 illuminates surface 70 or a portion thereof. Light reflected from the surface 70 is received at and recorded by optical detector 42 .
  • The surface 70 may be a sheet of paper, although the present invention is not so limited.
  • The surface 70 may comprise an LCD (liquid crystal display), a CRT (cathode ray tube), a touch screen, a surface comprising electronic ink, reconfigurable paper, or another type of electronically active surface (e.g., the display of a laptop or tablet PC).
  • a pattern of markings is printed on surface 70 .
  • the end of pen device 100 that holds optical emitter 44 and optical detector 42 is placed against or near surface 70 .
  • The pattern of markings is read and recorded by optical emitter 44 and optical detector 42 .
  • The markings on surface 70 are used to determine the position of pen device 100 relative to surface 70 (see FIGS. 3 and 4 ).
  • the markings on surface 70 are used to encode information (see FIGS. 5 and 6 ).
  • the captured images of surface 70 can be analyzed (processed) by pen device 100 to decode the markings and recover the encoded information.
  • Pen device 100 of FIG. 1 also includes a memory unit 48 coupled to the processor 32 .
  • memory unit 48 is a removable memory unit embodied as a memory cartridge or a memory card.
  • memory unit 48 includes random access (volatile) memory (RAM) and read-only (non-volatile) memory (ROM) for storing information and instructions for processor 32 .
  • pen device 100 includes a writing element 52 situated at the same end of pen device 100 as the optical detector 42 and the optical emitter 44 .
  • Writing element 52 can be, for example, a pen, pencil, marker or the like, and may or may not be retractable. In certain applications, writing element 52 is not needed.
  • a user can use writing element 52 to make marks (e.g., graphical elements) on surface 70 , including characters such as letters, words, numbers, mathematical symbols and the like. These marks can be scanned (imaged) and interpreted by pen device 100 according to their position on the surface 70 . The position of the user-produced marks can be determined using a pattern of marks that are printed on surface 70 ; refer to the discussion of FIGS. 3 and 4 , below.
  • The user-produced markings can be interpreted by pen device 100 using optical character recognition (OCR) techniques that recognize handwritten characters.
  • Surface 70 may be any surface suitable for writing on, such as, for example, a sheet of paper, although surfaces consisting of materials other than paper may be used. Also, surface 70 may or may not be flat. For example, surface 70 may be embodied as the surface of a globe. Furthermore, surface 70 may be smaller or larger than a conventional (e.g., 8.5×11 inch) page of paper.
  • FIG. 2 is a block diagram of another device 200 upon which embodiments of the present invention can be implemented.
  • Device 200 includes processor 32 , power source 34 , audio output device 36 , input buttons 38 , memory unit 48 , optical detector 42 , optical emitter 44 and writing element 52 , previously described herein.
  • optical detector 42 , optical emitter 44 and writing element 52 are embodied as optical device 201 in housing 62
  • processor 32 , power source 34 , audio output device 36 , input buttons 38 and memory unit 48 are embodied as platform 202 in housing 74 .
  • optical device 201 is coupled to platform 202 by a cable 102 ; however, a wireless connection can be used instead.
  • The elements illustrated by FIG. 2 can be distributed between optical device 201 and platform 202 in combinations other than those described above.
  • FIG. 3 shows a sheet of paper 15 provided with a pattern of marks according to one embodiment of the present invention.
  • sheet of paper 15 is provided with a coding pattern in the form of optically readable position code 17 that consists of a pattern of marks 18 .
  • the marks 18 in FIG. 3 are greatly enlarged for the sake of clarity. In actuality, the marks 18 may not be easily discernible by the human visual system, and may appear as grayscale on sheet of paper 15 . In one embodiment, the marks 18 are embodied as dots; however, the present invention is not so limited.
  • FIG. 4 shows an enlarged portion 19 of the position code 17 of FIG. 3 .
  • An optical device such as devices 100 and 200 ( FIGS. 1 and 2 ) is positioned to record an image of a region of the position code 17 .
  • the optical device fits the marks 18 to a reference system in the form of a raster with raster lines 21 that intersect at raster points 22 .
  • Each of the marks 18 is associated with a raster point 22 .
  • mark 23 is associated with raster point 24 .
  • The displacement of a mark from its associated raster point is determined. Using these displacements, the pattern in the image/raster is compared to patterns in the reference system.
  • Each pattern in the reference system is associated with a particular location on the surface 70 .
  • Thus, by matching the pattern in the image/raster with a pattern in the reference system, the position of the pattern on the surface 70 , and hence the position of the optical device relative to the surface 70 , can be determined.
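  • The following minimal sketch illustrates this position-decoding idea: imaged dots are snapped to the nearest raster point, each dot's displacement is quantized to a direction code, and the resulting pattern is used as a key into a reference table. The grid pitch, the four-direction encoding, and the table contents are illustrative assumptions, not the patented algorithm.

```python
# Illustrative sketch of displacement-based position decoding (assumed
# encoding; the actual dot-pattern code is not specified by this document).
RASTER_PITCH = 0.3  # nominal dot spacing in millimeters

# Hypothetical reference table: displacement pattern -> page position (mm).
PATTERN_TO_POSITION = {
    ((0, 1), (1, 0), (0, -1), (-1, 0)): (12.0, 34.5),
}

def nearest_raster_point(x, y):
    """Snap an imaged dot to the nearest raster intersection."""
    return (round(x / RASTER_PITCH) * RASTER_PITCH,
            round(y / RASTER_PITCH) * RASTER_PITCH)

def displacement_code(x, y):
    """Quantize a dot's offset from its raster point to one of four directions."""
    rx, ry = nearest_raster_point(x, y)
    dx, dy = x - rx, y - ry
    if abs(dx) >= abs(dy):                    # dominant horizontal offset
        return (1, 0) if dx > 0 else (-1, 0)
    return (0, 1) if dy > 0 else (0, -1)      # dominant vertical offset

def decode_position(imaged_dots):
    """Return the absolute page position for a snapshot of dots, if known."""
    key = tuple(displacement_code(x, y) for x, y in imaged_dots)
    return PATTERN_TO_POSITION.get(key)

snapshot = [(0.00, 0.33), (0.33, 0.00), (0.30, 0.27), (0.27, 0.30)]
print(decode_position(snapshot))  # (12.0, 34.5)
```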
  • each region on surface 70 is indicated by the letters A, B, C and D (these characters are not printed on surface 70 , but are used herein to indicate positions on surface 70 ). There may be many such regions on the surface 70 . Associated with each region on surface 70 is a unique pattern of marks. The regions on surface 70 may overlap because even if some marks are shared between overlapping regions, the pattern of marks in a region is still unique to that region.
  • a user may create a character consisting, for example, of a circled letter “M” at position A on surface 70 (generally, the user may create the character at any position on surface 70 ).
  • the user may create such a character in response to a prompt (e.g., an audible prompt) from pen device 100 .
  • pen device 100 records the pattern of markings that are uniquely present at the position where the character is created.
  • the pen device 100 associates that pattern of markings with the character just created.
  • When pen device 100 is subsequently positioned over the circled “M,” pen device 100 recognizes the pattern of marks associated therewith and recognizes the position as being associated with a circled “M.” In effect, pen device 100 recognizes the character using the pattern of markings at the position where the character is located, rather than by recognizing the character itself.
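  • A minimal sketch of this recognize-by-position behavior follows: once a graphic element icon is created, the device records the surface region it occupies, and later taps are resolved by location lookup rather than by re-recognizing the character. The registry class and bounding-box representation are illustrative assumptions.

```python
# Illustrative location registry: taps are resolved by position, not OCR.
class IconRegistry:
    def __init__(self):
        self._icons = []  # list of (bounding_box, label) pairs

    def register(self, bounding_box, label):
        """Associate a surface region (x0, y0, x1, y1) with an icon label."""
        self._icons.append((bounding_box, label))

    def resolve(self, x, y):
        """Return the label of the icon at (x, y), or None if none is there."""
        for (x0, y0, x1, y1), label in self._icons:
            if x0 <= x <= x1 and y0 <= y <= y1:
                return label
        return None

registry = IconRegistry()
registry.register((10, 10, 18, 18), "menu")  # circled "M" written here once
print(registry.resolve(14, 12))              # "menu": later tap, no re-OCR
```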
  • The characters described above comprise “graphic elements” that are associated with one or more commands of the pen device 100 .
  • Graphic elements that are associated with, and are used to access, functions of the pen device 100 that implement commands are referred to hereafter as “graphic element icons,” in order to distinguish them from other written characters, marks, etc. that are not associated with accessing functions or applications of the pen device 100 .
  • A user can create (write) a graphic element icon that identifies a particular command, and can invoke that command repeatedly simply by positioning the pen device 100 (the writing instrument) over the graphic element icon (e.g., the written character).
  • The user does not have to write the character for a command each time the command is to be invoked by the pen device 100 ; instead, the user can write the graphic element icon for a command one time and invoke the command repeatedly using the same written graphic element icon.
  • This attribute is referred to as “persistence” and is described in greater detail below. Persistence also applies to graphical element icons that are not user-written but are pre-printed on the surface and are nevertheless selectable by the pen device 100 .
  • the graphic element icons can include a letter or number with a line circumscribing the letter or number.
  • the line circumscribing the letter or number may be a circle, oval, square, polygon, etc.
  • Such graphic elements appear like “buttons” that can be selected by the user, instead of ordinary letters and numbers.
  • the user can visually distinguish graphic element icons such as functional icons from ordinary letters and numbers, which may be treated as data by the pen device 100 .
  • the pen device may also be able to better distinguish functional or menu item type graphic elements from non-functional or non-menu item type graphic elements. For instance, a user may create a graphic element icon that is the letter “M” which is enclosed by a circle to create an interactive “menu” graphic element icon.
  • the pen device 100 may be programmed to recognize an overlapping circle or square with the letter “M” in it as a functional graphic element as distinguished from the letter “M” in a word.
  • the graphic element icon may also include a small “check mark” symbol adjacent thereto, within a certain distance (e.g., 1 inch, 1.5 inches, etc.). The checkmark will be associated with the graphic element icon.
  • Computer code for recognizing such functional graphic elements and distinguishing them from other non-functional graphic elements can reside in the memory unit in the pen device.
  • the processor can recognize the graphic element icons and can identify the locations of those graphic element icons so that the pen device 100 can perform various functions, operations, and the like associated therewith.
  • the memory unit may comprise computer code for correlating any graphic elements produced by the user with their locations on the surface.
  • the pen device 100 recognizes a “down-touch” or “down-stroke” or being placed down upon the surface (e.g., when the user begins writing) and recognizes an “up-stroke” or being picked up from the surface (e.g., when the user finishes writing).
  • Such down-strokes and up-strokes can be interpreted by the pen device 100 as, for example, indicators as to when certain functionality is invoked and what particular function/application is invoked (e.g., triggering OCR processing).
  • A down-stroke quickly followed by an up-stroke (e.g., a tap of the pen device on the surface) can be associated with a special action depending upon the application (e.g., selecting a graphic element icon, text string, etc.), as sketched below.
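  • One way such stroke events might be classified is shown in this sketch; the duration and travel thresholds are illustrative assumptions.

```python
# Illustrative stroke classification: short contact with little travel is a
# tap (a selection); anything longer is treated as writing.
TAP_MAX_SECONDS = 0.3    # assumed threshold
TAP_MAX_TRAVEL_MM = 1.0  # assumed threshold

def classify_stroke(down_time, up_time, path_length_mm):
    """Classify a down-stroke/up-stroke pair as a 'tap' or as 'writing'."""
    duration = up_time - down_time
    if duration <= TAP_MAX_SECONDS and path_length_mm <= TAP_MAX_TRAVEL_MM:
        return "tap"      # e.g., select a graphic element icon
    return "writing"      # e.g., new ink that may trigger OCR processing

print(classify_stroke(0.00, 0.15, 0.4))   # "tap"
print(classify_stroke(0.00, 1.80, 25.0))  # "writing"
```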
  • The term “graphic element” may include any suitable marking created by the user, and is distinguishable from a “graphic element icon,” which refers to a functional graphic element that is used to access one or more functions of the device.
  • graphic element icons can be created by the pen device 100 (e.g., drawn by the user) or can be pre-existing (e.g., a printed element on a sheet of paper).
  • Example graphic elements include, but are not limited to symbols, indicia such as letters and/or numbers, characters, words, shapes, lines, etc. They can be regular or irregular in shape.
  • User written/created graphic elements are typically created using the pen device 100 .
  • Graphic element icons usually, but not always, incorporate a circumscribing line (e.g., a circle) around a character (e.g., the letter “M”) to give them an added degree of distinctiveness to both the user and the pen device 100 .
  • For example, an up-stroke after finishing a circle around the character can specifically indicate to the pen device 100 that the user has just created a graphic element icon.
  • FIG. 5 shows a flowchart of the steps of a computer-implemented process 550 in accordance with one embodiment of the present invention.
  • Process 550 depicts the basic operating steps of a user interface process as implemented by a device (e.g., pen device 100 ) in accordance with one embodiment of the present invention as it interprets user input in the form of graphic elements, writing, marks, etc. and provides the requested functionality to the user.
  • Process 550 begins in step 551 , where the computer-implemented functionality of the pen device 100 recognizes a created graphical element icon (e.g., created by a user). Alternatively, the graphic element may be preprinted on the surface and its location known to the pen device 100 .
  • To recognize a user-written graphical element, the pen device 100 uses its optical sensor and processor to perform OCR (optical character recognition) on the writing. In one embodiment, the writing's unique location on the surface is then also recorded.
  • Once the graphical element icon is recognized, a function related to it is accessed. This function can be, for example, a menu function that can enunciate (e.g., audibly render) a predetermined list of functions (e.g., menu choices or sub-menu options) for subsequent activation by the user.
  • Next, an audio output in accordance with the function is provided. This audio output can be, for example, the enunciation of what particular choice the user is at within the list of choices.
  • The function is persistently associated with the graphical element icon, enabling a subsequent access of the function (e.g., at some later time) by a subsequent actuation (e.g., tapping with the pen device 100 ) of the graphical element icon.
  • In the case of a menu function, for example, the listed menu choices can be subsequently accessed by the user at some later time by simply actuating the menu graphic element icon (e.g., tapping it).
  • The output of the pen device 100 can be visual output (e.g., via a display, indicator lights, etc.) in addition to, or instead of, audio output.
  • The visual output and/or audio output can come directly from the pen device 100 , or can be from another device (e.g., personal computer, speaker, LCD display, etc.) communicatively coupled to the pen device 100 .
  • In this manner, embodiments of the present invention implement a user interface means for navigating the functionality of a computer system, particularly the pen-based computer system comprising, for example, the pen device 100 .
  • The user interface as implemented by the graphical element icons provides a method of interacting with a number of software applications that execute within the pen device 100 .
  • Output from the pen device 100 may include audio output, and thus the user interface means enables the user to carry on a “dialog” with the applications and functionality of the pen device 100 .
  • The user interface enables the user to create mutually recognized items such as graphic element icons that allow the user and the pen device 100 to interact with one another.
  • The mutually recognized items are typically symbols or marks or icons that the user draws on a surface, typically a sheet of paper.
  • Different manners of interaction call up different computer-implemented functionality of the pen device.
  • The menu functionality allows the user to iterate through a list of functions that are related to the graphic element (e.g., the number of taps on the menu graphic element icon iterates through a list of functions), as illustrated in the sketch below. Audio from the pen device can enunciate the function or mode as the taps are done. One of the enunciated functions/modes can then be selected by the user through some further interaction (e.g., drawing or selecting a previously drawn checkmark graphic element associated with the graphic element icon).
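  • A minimal sketch of this tap-to-iterate behavior, assuming a simple wrap-around cursor and using `print` as a stand-in for the pen's speech output:

```python
# Illustrative menu icon: each tap announces the next option; the checkmark
# activates whichever option was announced last.
class MenuIcon:
    def __init__(self, options):
        self.options = options
        self.index = -1                 # nothing announced yet

    def tap(self, say):
        """Advance to the next option (wrapping around) and announce it."""
        self.index = (self.index + 1) % len(self.options)
        say(self.options[self.index])

    def activate(self):
        """Checkmark selection: return the currently announced option."""
        return self.options[self.index] if self.index >= 0 else None

menu = MenuIcon(["system", "games", "reference", "tools"])
menu.tap(print)         # announces "system"
menu.tap(print)         # announces "games"
print(menu.activate())  # "games" is selected via the checkmark
```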
  • The functionality, options, and further sub-menus of the particular selected function can then be accessed by the user.
  • If one of the audibly rendered sub-options is itself a menu graphical icon, it can be selected by the user drawing its representation on the surface and selecting it.
  • FIG. 6 shows a flowchart of the computer implemented steps of a process 650 in accordance with one embodiment of the present invention.
  • Process 650 depicts the basic operating steps of a user interface process for accessing (e.g., navigating through) a number of nested, hierarchical functions of an interactive device (e.g., pen device 100 ) in accordance with one embodiment of the present invention.
  • Process 650 is described with reference to FIGS. 8A, 8B, and 8C.
  • Process 650 begins in step 651 , where the computer-implemented functionality of the pen device 100 recognizes a created graphic element icon, shown in FIG. 8A as a menu icon “M”. Like step 551 , the graphic element icon may be written by the user or preprinted on the surface. In one case, the graphic element icon can provide a list of choices of further graphic element icons (e.g., a hierarchical arrangement) that are associated therewith and which themselves may provide further choices.
  • In step 652 , and as shown in FIG. 8A , once recognized, a first hierarchical menu of functions related to the graphic element icon is accessed.
  • the menu icon “M” of step 651 causes a list of sub-options (e.g., system “S”, games “G”, reference “R”, and tools “T”) to be audibly rendered (e.g., via audible prompts), one option at a time, as shown in FIG. 8A .
  • the options are rendered in response to successive selections of the menu icon of step 651 by the pen device (e.g., pen device 100 ).
  • In this example, one of the enunciated functions, the reference graphic element icon “R”, is selected through an appropriate number of actuations of the menu graphic element icon (e.g., taps) and an actuation of the associated checkmark icon 870 .
  • The activated function may prompt the creation of a second graphic element icon for a second hierarchical menu of functions.
  • The second graphic element icon, the reference icon “R” in this example, may then be drawn on the surface by the user. The selection thereof, as shown in FIG. 8C , causes the pen device to audibly render the second hierarchical menu of functions.
  • In step 655 , one of the enunciated functions of the second graphic element icon is activated through an appropriate number of actuations to select one of the second-hierarchical-level functions.
  • one menu can invoke a number of sub-menus which themselves have even further sub-menus.
  • different levels of graphic element icons can be hierarchically arranged.
  • top-level graphic element icons which present menus of functions are referred to as group graphic element icons.
  • Application graphic element icons are second-level graphic element icons that generally present menus of configuration options or application settings for a given application.
  • application graphic element icons can be considered as a special case of a group graphic element icon.
  • an application graphic element icon has a specialized application related default behavior associated with it.
  • the menu items may include directory names, subdirectory names, application names, or names of specific data sets.
  • directory or subdirectory names include, but are not limited to, “tools” (e.g., for interactive useful functions applicable under many different circumstances), “reference” (e.g., for reference materials such as dictionaries), “games” (e.g., for different games), etc.
  • specific application (or subdirectory) names include “calculator”, “spell checker”, and “translator”.
  • Specific examples of data sets may include a set of foreign words and their definitions, a phone list, a calendar, a to-do list, etc. Additional examples of menu items are shown in FIG. 7 .
  • For a translator function, for example, the pen device can instruct the user to write the name of a second language and circle it. After the user does this, the pen device can further instruct the user to write down a word in English and then select the circled second language to hear the written word translated. The audio output device in the pen device then recites the word in the second language.
  • FIG. 7 shows a menu item tree directory according to an embodiment of the present invention including the graphical element icon representation of each option.
  • The menu item tree directory can embody an audio menu starting from the menu graphic element icon.
  • A first audio subdirectory would be a tools T subdirectory.
  • Under the tools T subdirectory there could be a translator TR subdirectory, a calculator C subdirectory, a spell checker SC subdirectory, a personal assistant PA subdirectory, an alarm clock AL subdirectory, and a tutor TU function.
  • Under the translator TR subdirectory there would be Spanish SP, French FR, and German GE translator functions; a data-structure sketch of this tree follows.
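  • One plausible way to model this tree, with interior nodes for directories/subdirectories and leaves for functions (the nested-dictionary representation is an illustrative choice, not from the patent):

```python
# Illustrative model of the FIG. 7 menu item tree.
MENU_TREE = {
    "menu": {
        "tools": {
            "translator": {"Spanish": None, "French": None, "German": None},
            "calculator": None,
            "spell checker": None,
            "personal assistant": None,
            "alarm clock": None,
            "tutor": None,
        },
        "reference": {},
        "games": {},
        "system": {},
    }
}

def options_at(path):
    """List the options audibly rendered under a directory path."""
    node = MENU_TREE
    for name in path:
        node = node[name]
    return list(node) if isinstance(node, dict) else []

print(options_at(["menu"]))                         # top-level subdirectories
print(options_at(["menu", "tools", "translator"]))  # language functions
```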
  • a user may proceed or navigate down any desired path by listening to recitations of the various menu items and then selecting the menu item desired.
  • the subsequent selection of the desired menu item may occur in any suitable manner.
  • a user can cause the pen device to scroll through the audio menu by “down touching” (e.g., down-stroke) on a created graphic element.
  • the “down touching” may be recognized by the electronics in the pen device as an “actuation” by using any suitable mechanism.
  • For instance, the pen device may be programmed to recognize the image change associated with its downward movement toward the selected graphic element.
  • a pressure sensitive switch may be provided in the pen device so that when the end of the pen device applies pressure to the paper, the pressure switch activates. This informs the pen device to scroll through the audio menu. For instance, after selecting the circled letter “M” with the pen device (to thereby cause the pressure switch in the pen device to activate), the audio output device in the pen device may recite “tools” and nothing more. The user may select the circled letter “M” a second time to cause the audio output device to recite the menu item “reference”. This can be repeated as often as desired to scroll through the audio menu. To select a particular menu item, the user can create a distinctive mark on the paper or provide a specific gesture with the scanning apparatus.
  • the user may draw a “checkmark” (or other graphic element) next to the circled letter “M” after hearing the word “tools” to select the subdirectory “tools”.
  • a user may navigate towards the intended directory, subdirectory, or function in the menu item tree.
  • the creation of a different graphic element or a different gesture may be used to cause the pen device to scroll upward.
  • buttons or other actuators may be provided in the pen device to scroll through the menu.
  • the user may select the menu graphic element icon.
  • Software in the scanning apparatus recognizes the circled letter as being the menu symbol and causes the scanning apparatus to recite the menu items “tools”, “reference”, “games”, and “system” sequentially and at spaced timing intervals, without down touching by the user.
  • Audio instructions can be provided to the user.
  • For example, the pen device may say “To select the ‘tools’ directory, write the letter ‘T’ and circle it.” To select the menu item, the user may create the letter “T” and circle it. This indicates to the pen device that the user has selected the subdirectory “tools”.
  • the pen device can recite the menu items under the “tools” directory for the user.
  • The user can also proceed directly to a particular directory, subdirectory, or function in the menu item tree by creating a graphic element representing that directory, subdirectory, or function on a sheet and interacting therewith.
  • If the menu item already resides on the surface, the user can interact with it at any time to select its functions.
  • the order of items within the directories, subdirectories, option menus, etc. of the graphic element icons depicted in FIG. 7 can be changed by the user.
  • the user can access a certain application and use that application to change the order in which the items of one or more directories, subdirectories, etc., are audibly rendered.
  • The user can change the specific audio output associated with one or more items within a given directory/subdirectory, etc. For example, the user can record her own voice for an item, use a prerecorded song (e.g., an MP3), or the like, and use it accordingly as the item's audibly rendered output.
  • Additional items for one or more directories, subdirectories, etc. can be added through, for example, software and/or firmware updates provided to the pen device (e.g., uploading new software-based functionality).
  • In one embodiment, a respective state of multiple instances of a graphic element icon can be persistently associated with each specific instance. For example, in a case where two or more graphic element icons exist on a common surface (e.g., created by the user, preprinted, or the like), their state, or their particular location within their directory of options, can be independently retained, or remembered, for each icon.
  • For example, if a first menu icon is currently on option three (e.g., “games”) and a second menu icon is currently on option one (e.g., “tools”), the user can go off and perform other tasks using other applications (e.g., calculator, dictionary, etc.) and come back at some later time to either the first or second menu icon, and they will correctly retain their last state (e.g., “games” for the first and “tools” for the second menu icon).
  • Similarly, a respective state of multiple instances of a graphic element icon can be coordinated among the multiple instances and persistently associated with each specific instance.
  • With coordinated state, where two or more graphic element icons exist on a common surface (e.g., created by the user, preprinted, or the like), their state can be remembered for each icon, but that state can be coordinated such that the options span across each instance. For example, if a first menu icon is currently on option two (e.g., “system”), a second menu icon will have its state coordinated such that it will be on option three (e.g., “tools”). The user can perform other intervening tasks and come back at some later time to either the first or second menu icon, and they will correctly retain their coordinated state (e.g., “system” for the first and “tools” for the second).
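  • The sketch below contrasts the two policies under stated assumptions: in the independent case each drawn instance owns its own cursor into the option list, while in the coordinated case all instances share one cursor, so the options span across instances.

```python
# Illustrative per-icon menu state (independent vs. coordinated policies).
OPTIONS = ["system", "games", "reference", "tools"]

class MenuState:
    """Cursor into the option list; each tap advances and wraps around."""
    def __init__(self):
        self.cursor = -1

    def tap(self):
        self.cursor = (self.cursor + 1) % len(OPTIONS)
        return OPTIONS[self.cursor]

# Independent policy: every drawn instance owns its own state object, so
# each icon retains its own last-announced option across intervening tasks.
first, second = MenuState(), MenuState()
first.tap(); first.tap(); first.tap()  # first icon advanced to its third option
second.tap()                           # second icon independently on its first

# Coordinated policy: all instances share one state object, so tapping any
# instance continues the iteration begun at another.
shared = MenuState()
instances = {"icon_1": shared, "icon_2": shared}
instances["icon_1"].tap()              # option one, via the first icon
print(instances["icon_2"].tap())       # option two continues via the second
```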
  • FIG. 9 shows a surface 910 (e.g., a sheet of paper) having a number of graphic element icons written thereon in accordance with one embodiment of the present invention.
  • FIG. 9 shows examples of group graphic element icons (e.g., the menu icon “M” and the games icon “G”) and an application icon (e.g., the calculator icon “C”).
  • The graphic element icons can be written on the sheet of paper 910 by the user or can be preprinted.
  • Group graphic element icons generally audibly render a list of options. For example, repeatedly tapping at location 901 with the pen device 100 proceeds through the options of the menu directory (e.g., system, games, reference, and tools), as described in the discussion of FIG. 7 .
  • For example, tapping twice on the menu icon will cause the pen device 100 to audibly render “system” and then audibly render “games”, indicating the selection of the games subdirectory.
  • The games subdirectory can then be activated by touching location 902 (e.g., the checkmark), and the activation can be confirmed to the user through an audio tone.
  • In response, the pen device 100 audibly prompts the user to create (e.g., draw) a games graphic element icon as shown in FIG. 9 .
  • Repeatedly tapping the games graphic element icon with the pen device 100 then causes the pen device 100 to proceed through the options of the games subdirectory (e.g., word scramble, funky potatoes, and doodler), as described in the discussion of FIG. 7 .
  • A selected one of the games subdirectory items can then be chosen through a tap at location 904 (e.g., the checkmark associated with the games icon), or alternatively, by drawing the checkmark if it is not already there.
  • A touch at the calculator icon “C” launches the calculator application.
  • In this manner, the calculator icon does not render a list of menu items or subdirectory options, but rather directly launches an application itself, in this case the calculator application.
  • An OCR (optical character recognition) process needs to be performed on a mark, single character (e.g., the letter “M”), or text string (e.g., a word) only once, as it is first written by the user (e.g., the “M” shown in FIG. 9 ).
  • The pen device 100 includes functionality whereby the location of the graphic elements on the surface 910 can be determined by the pen device 100 reading data encoded on the surface 910 . This enables the pen device 100 to remember the location of the particular character, particular symbol, particular text string, etc.
  • The pen device 100 can thus identify subsequent selections of a particular word by recognizing the same location of the particular word on a surface (e.g., when the user touches the pen device 100 onto the particular word at some later time).
  • Upon such a subsequent selection, the results of the earlier-performed OCR process are recalled, and these results are used by, for example, an active application (e.g., a dictionary).
  • The ability to store results of an OCR process (e.g., on words, characters, numbers, etc.) means that resource-intensive OCR processing need only be performed once by the computer system resources of the pen device 100 , as sketched below.
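  • A minimal sketch of this cache-by-location idea; the region identifier and the `ocr()` stand-in are assumptions for illustration.

```python
# Illustrative OCR-once cache keyed by surface region.
def ocr(image):
    """Placeholder for the device's expensive handwriting recognizer."""
    return "president"

ocr_cache = {}  # region id -> recognized text

def recognize_at(region, image):
    """Run OCR only on the first encounter with a region; reuse afterwards."""
    if region not in ocr_cache:
        ocr_cache[region] = ocr(image)   # expensive: performed exactly once
    return ocr_cache[region]             # later taps: cheap location lookup

first = recognize_at("region-42", image=None)   # triggers OCR
again = recognize_at("region-42", image=None)   # recalled from the cache
print(first, again)
```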
  • FIG. 9 also shows a user-written word 906 (e.g., a text string) created using a “prompt-and-believe” function of the pen device 100 .
  • The particular word, graphic element, etc. can be created by the user in response to an audible prompt from the pen device 100 , wherein the pen device prompts the user to write the particular word (e.g., “president”) and subsequently stores the location of the written word with the association (e.g., from the prompt). The subsequent selection of the created word is recognized by location in the manner described above.
  • For example, pen device 100 can instruct the user to write the word “president” 906 .
  • The user then writes the word “president”, and the pen device 100 will treat, or in other words believe, upon a subsequent selection of the word, that what the user wrote in response to the prompt was in fact the word “president.”
  • In other words, the pen device 100 associates the label “president” with whatever the user wrote in response to the prompt.
  • Depending upon the application, the user can be prompted to underline the word, put a box around the word, or otherwise add some distinguishing mark/graphic element.
  • the pen device 100 When the user is done writing the prompted word, the pen device 100 recognizes the fact that the user is finished by, for example, recognizing the inactivity (e.g., the user is no longer writing) as a data entry termination event. In this manner, a “timeout” mechanism can be used to recognize the end of data entry. Another termination event could be a case where the word is underlined or boxed as described above. Additional examples of termination events are described in the commonly assigned United States patent application “TERMINATION EVENTS”, by Marggraff et al., filed on Jan. 12, 2005, Attorney Docket No. LEAP-P0320, and is incorporated herein in its entirety.
  • The prompt-and-believe feature of embodiments of the present invention enables the creation of graphic elements having meanings that are mutually understood between the user and the pen device 100 .
  • Graphic elements created using the “prompt-and-believe” function can be associated with labels for other applications, options, menus, functions, etc., whereby selection of the prompt-and-believe graphic element (e.g., by tapping) can invoke any of the above. Reducing the requirement for OCR processing lowers the computational demands on the pen device 100 and thus improves the responsiveness of the user interface.
  • a pen device can incorporate one or more position location mechanisms such as, for example, motion sensors, gyroscopes, etc., and be configured to accurately store a precise location of a given surface (e.g., a sheet of paper).
  • the precise location of the surface can be stored by, for example, sequentially touching opposite corners of the surface (e.g., a rectangular sheet of paper).
  • The pen device would then recognize the location of graphic elements written by the user on the surface by comparing the stored precise location of the surface with the results of its location determination means, as in the sketch below.
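  • A sketch of this corner-calibration idea under stated assumptions (sensor coordinates in millimeters, opposite corners touched in sequence):

```python
# Illustrative corner calibration: later pen positions are expressed as
# fractions of the stored page rectangle.
class CalibratedSurface:
    def __init__(self, corner_a, corner_b):
        (ax, ay), (bx, by) = corner_a, corner_b   # opposite corners touched
        self.x0, self.y0 = min(ax, bx), min(ay, by)
        self.width, self.height = abs(bx - ax), abs(by - ay)

    def to_page_fraction(self, x, y):
        """Map a sensed position to (0..1, 0..1) page coordinates."""
        return (x - self.x0) / self.width, (y - self.y0) / self.height

page = CalibratedSurface((0.0, 0.0), (215.9, 279.4))  # 8.5x11 in. sheet in mm
print(page.to_page_fraction(107.95, 139.7))           # (0.5, 0.5): the center
```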

Abstract

A method and system for implementing a user interface for a device through user-created graphical elements. The method includes recognizing a graphical element icon created by a user. Once recognized, a function related to the graphical element icon is accessed and an output in accordance with the function is provided. The function is persistently associated with the graphical element icon. Menu selection and navigation are implemented through interaction with the graphical element icon. A listing of options associated with the graphical element icon is audibly rendered. In response to a selection of one of the options, the selected option is invoked.

Description

    CROSS REFERENCES TO RELATED APPLICATIONS
  • This application is a Continuation of and claims priority to U.S. patent application Ser. No. 11/034,491, filed on Jan. 12, 2005, which is incorporated herein by reference. The application Ser. No. 11/034,491 is a Continuation-in-Part of the co-pending, commonly-owned U.S. patent application Ser. No. 10/803,806, filed Mar. 17, 2004, by James Marggraff et al., entitled “Scanning Apparatus,” and hereby incorporated by reference in its entirety. The application Ser. No. 11/034,491 is a Continuation-in-Part of the co-pending, commonly-owned U.S. patent application Ser. No. 10/861,243, filed Jun. 3, 2004, by James Marggraff et al., entitled “User Created Interactive Interface,” and hereby incorporated by reference in its entirety.
  • This application is related to U.S. patent application Ser. No. 11/035,003, “TERMINATION EVENTS”, by Marggraff et al., filed on Jan. 12, 2005, which is incorporated herein in its entirety. This application is also related to U.S. patent application Ser. No. 11/034,489, “PROVIDING A USER INTERFACE HAVING INTERACTIVE ELEMENTS ON A WRITABLE SURFACE”, by Marggraff et al., filed on Jan. 12, 2005, which is incorporated herein in its entirety.
  • TECHNICAL FIELD
  • Embodiments of the invention relate to the control and use of interactive devices, computers, electronic devices, appliances, toys, and the like.
  • BACKGROUND ART
  • Devices such as optical readers or optical pens conventionally emit light that reflects off a surface to a detector or imager. As the device is moved relative to the surface (or vice versa), successive images are rapidly captured. By analyzing the images, movement of the optical device relative to the surface can be tracked.
  • One type of optical pen is used with a sheet of paper on which very small dots are printed. The dots are printed on the page in a pattern with a nominal spacing of about 0.3 millimeters (0.01 inches). The pattern of dots within any region on the page is unique to that region. The optical pen essentially takes a snapshot of the surface, perhaps 100 times a second or more. By interpreting the dot positions captured in each snapshot, the optical pen can precisely determine its position relative to the page.
  • Applications that utilize information about the position of an optical pen relative to a surface have been or are being devised. An optical pen with Bluetooth or other wireless capability can be linked to other devices and used for sending electronic mail (e-mail) or faxes.
  • The increasing power of embedded computer systems and the complexity of the functions they are able to implement have created a need for a more intuitive and user-friendly manner of accessing such power. A typical prior art optical pen will implement its intended functionality by the user operating one or more buttons/switches or controls of the optical pen to activate one or more software programs, routines, embedded devices, or the like. The pen may contain or be in communication with a computer system. Upon actuation of such controls, the pen device performs its intended function. Accessing the capabilities of increasingly powerful optical pens through the limited number and configuration of switches, buttons, etc. provided on the pen itself, or on any remotely coupled computer system, is not a satisfactory arrangement.
  • One prior art solution uses the optical pen to recognize a user-defined command, and uses that command to invoke some function of the pen (e.g., PCT publication WO/01/48590 A1). For example, a user's writing can be recognized (e.g., in real-time) and interpreted as a command for the optical pen. The drawback with this solution is that interaction with and control of the pen's functions requires real-time recognition of the user's handwriting (e.g., as the user writes the command down on a sheet of paper). This solution is not satisfactory because interaction with more complex functionality of an optical pen requires the user to repeatedly write down one or more commands to access different choices, options, or functions provided by the pen. While the solution might be satisfactory for exceedingly simple, single-step applications (e.g., “turn off”, “store”, etc.), it is overly cumbersome and limiting where more complex, satisfying, rich functionality is desired.
  • DISCLOSURE OF THE INVENTION
  • Accordingly, a user interface method and system that enables interaction with more complex functionality of an optical pen device having a computer system associated therewith, and that enables more efficient access to the different choices, options, and functions provided by the pen device, would be valuable. What is further desired is a method and interactive interface useful for interacting with an operating system resident on, or in communication with, a pen device. Embodiments in accordance with the present invention provide these and other advantages.
  • In one embodiment, the present invention is implemented as a method for implementing a user interface for a device employing user created or written graphical elements and/or printed graphical elements that are on a surface. The method includes recognizing a created graphical element icon (e.g., created by a user) on a surface. Once recognized, a function related to the graphical element icon is accessed and an output in accordance with the function is provided. The functionality may reside on the pen device and the written graphical element may be written using the pen device. The graphical element icon can be a symbol, character, or mark created on the surface by the user, that is recognized as such by interpreting functionality (e.g., optical sensors, embedded computer system, etc.) of the device. The output is typically an audio output provided via an audio output device (e.g., a speaker coupled to, or resident on, the device). The function is persistently associated with the graphical element icon, enabling a subsequent access of the function (e.g., at some later time) by a subsequent actuation (e.g., tapping) of the graphical element icon by the pen device.
  • In one embodiment, the selection of a written or printed graphical element icon causes the pen device to audibly render a list of further selections that may be written and selected by the user. In this case, the first graphic element icon functions as a menu item that when selected causes the pen device to render sub-menu items related to the first graphic element icon. Any, or all, of the sub-menu items may be written on the surface and themselves selected, thereby causing the pen device to perform related functionality.
  • These and other objects and advantages of the present invention will be recognized by one skilled in the art after having read the following detailed description, which is illustrated in the various drawing figures.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention:
  • FIG. 1 is a block diagram of a device upon which embodiments of the present invention can be implemented.
  • FIG. 2 is a block diagram of another device upon which embodiments of the present invention can be implemented.
  • FIG. 3 shows an exemplary sheet of paper provided with a pattern of marks according to one embodiment of the present invention.
  • FIG. 4 shows an enlargement of a pattern of marks on an exemplary sheet of paper according to one embodiment of the present invention.
  • FIG. 5 shows a flowchart of the computer-implemented steps of a device user interface process in accordance with one embodiment of the present invention.
  • FIG. 6 shows a flowchart of the computer-implemented steps of a hierarchical device user interface process in accordance with one embodiment of the present invention.
  • FIG. 7 shows a menu item tree directory according to an embodiment of the present invention.
  • FIG. 8A shows a menu item audible prompting process in accordance with one embodiment of the present invention.
  • FIG. 8B shows a menu item selection process in accordance with one embodiment of the present invention.
  • FIG. 8C shows a sub-menu items selection process in accordance with one embodiment of the present invention.
  • FIG. 9 shows a plurality of different types of graphical item icons on a surface in accordance with one embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with the preferred embodiments, it will be understood that they are not intended to limit the invention to these embodiments. On the contrary, the invention is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the invention as defined by the appended claims. Furthermore, in the following detailed description of embodiments of the present invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be recognized by one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects of the embodiments of the present invention.
  • Notation and Nomenclature
  • Some portions of the detailed descriptions which follow are presented in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to convey most effectively the substance of their work to others skilled in the art. A procedure, computer executed step, logic block, process, etc., is here, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
  • It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as “processing,” “computing,” “configuring,” “generating,” or the like, refer to the action and processes of a microcontroller, computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within registers and memories into other data similarly represented as physical quantities.
  • Embodiments of the Invention
  • FIG. 1 is a block diagram of a pen device 100 upon which embodiments of the present invention can be implemented. In general, pen device 100 may be referred to as an optical device, more specifically as an optical reader, optical pen or digital pen. The device may contain a computer system and an operating system resident thereon. Application programs may also reside thereon.
  • In the embodiment of FIG. 1, pen device 100 includes a processor 32 inside a housing 62. In one embodiment, housing 62 has the form of a pen or other writing or marking utensil or instrument. Processor 32 is operable for processing information and instructions used to implement the functions of pen device 100, which are described below.
  • In the present embodiment, the pen device 100 may include an audio output device 36 and a display device 40 coupled to the processor 32. In other embodiments, the audio output device and/or the display device are physically separated from pen device 100, but in communication with pen device 100 through either a wired or wireless connection. For wireless communication, pen device 100 can include a transceiver or transmitter (not shown in FIG. 1). The audio output device 36 may include a speaker or an audio jack (e.g., for an earphone or headphone). The display device 40 may be a liquid crystal display (LCD) or some other suitable type of display.
  • In the embodiment of FIG. 1, pen device 100 may include input buttons 38 coupled to the processor 32 for activating and controlling the pen device 100. For example, the input buttons 38 allow a user to input information and commands to pen device 100 or to turn pen device 100 on or off. Pen device 100 also includes a power source 34 such as a battery.
  • Pen device 100 also includes a light source or optical emitter 44 and a light sensor or optical detector 42 coupled to the processor 32. The optical emitter 44 may be a light emitting diode (LED), for example, and the optical detector 42 may be a charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) imager array, for example. The optical emitter 44 illuminates surface 70 or a portion thereof. Light reflected from the surface 70 is received at and recorded by optical detector 42.
  • The surface 70 may be a sheet of paper, although the present invention is not so limited. For example, the surface 70 may comprise an LCD (liquid crystal display), a CRT (cathode ray tube), a touch screen, a surface comprising electronic ink, reconfigurable paper, or another type of electronically active surface (e.g., the display of a laptop or tablet PC).
  • In one embodiment, a pattern of markings is printed on surface 70. The end of pen device 100 that holds optical emitter 44 and optical detector 42 is placed against or near surface 70. As pen device 100 is moved relative to the surface 70, the pattern of markings is illuminated by optical emitter 44 and read and recorded by optical detector 42. As discussed in more detail further below, in one embodiment, the markings on surface 70 are used to determine the position of pen device 100 relative to surface 70 (see FIGS. 3 and 4). In another embodiment, the markings on surface 70 are used to encode information (see FIGS. 5 and 6). The captured images of surface 70 can be analyzed (processed) by pen device 100 to decode the markings and recover the encoded information.
  • Additional descriptions regarding surface markings for encoding information and the reading/recording of such markings by electronic devices can be found in the following patents and patent applications that are assigned to Anoto and that are all herein incorporated by reference in their entirety: U.S. Pat. No. 6,502,756, U.S. application Ser. No. 10/179,966, filed on Jun. 26, 2002, WO 01/95559, WO 01/71473, WO 01/75723, WO 01/26032, WO 01/75780, WO 01/01670, WO 01/75773, WO 01/71475, WO 00/73983, and WO 01/16691.
  • Pen device 100 of FIG. 1 also includes a memory unit 48 coupled to the processor 32. In one embodiment, memory unit 48 is a removable memory unit embodied as a memory cartridge or a memory card. In another embodiment, memory unit 48 includes random access (volatile) memory (RAM) and read-only (non-volatile) memory (ROM) for storing information and instructions for processor 32.
  • In the embodiment of FIG. 1, pen device 100 includes a writing element 52 situated at the same end of pen device 100 as the optical detector 42 and the optical emitter 44. Writing element 52 can be, for example, a pen, pencil, marker or the like, and may or may not be retractable. In certain applications, writing element 52 is not needed. In other applications, a user can use writing element 52 to make marks (e.g., graphical elements) on surface 70, including characters such as letters, words, numbers, mathematical symbols and the like. These marks can be scanned (imaged) and interpreted by pen device 100 according to their position on the surface 70. The position of the user-produced marks can be determined using a pattern of marks that are printed on surface 70; refer to the discussion of FIGS. 3 and 4, below. In one embodiment, the user-produced markings can be interpreted by pen device 100 using optical character recognition (OCR) techniques that recognize handwritten characters.
  • As mentioned above, surface 70 may be any surface suitable on which to write, such as, for example, a sheet of paper, although surfaces consisting of materials other than paper may be used. Also, surface 70 may or may not be flat. For example, surface 70 may be embodied as the surface of a globe. Furthermore, surface 70 may be smaller or larger than a conventional (e.g., 8.5×11 inch) page of paper.
  • FIG. 2 is a block diagram of another device 200 upon which embodiments of the present invention can be implemented. Device 200 includes processor 32, power source 34, audio output device 36, input buttons 38, memory unit 48, optical detector 42, optical emitter 44 and writing element 52, previously described herein. However, in the embodiment of FIG. 2, optical detector 42, optical emitter 44 and writing element 52 are embodied as optical device 201 in housing 62, and processor 32, power source 34, audio output device 36, input buttons 38 and memory unit 48 are embodied as platform 202 in housing 74. In the present embodiment, optical device 201 is coupled to platform 202 by a cable 102; however, a wireless connection can be used instead. The elements illustrated by FIG. 2 can be distributed between optical device 201 and platform 202 in combinations other than those described above.
  • FIG. 3 shows a sheet of paper 15 provided with a pattern of marks according to one embodiment of the present invention. In the embodiment of FIG. 3, sheet of paper 15 is provided with a coding pattern in the form of optically readable position code 17 that consists of a pattern of marks 18. The marks 18 in FIG. 3 are greatly enlarged for the sake of clarity. In actuality, the marks 18 may not be easily discernible by the human visual system, and may appear as grayscale on sheet of paper 15. In one embodiment, the marks 18 are embodied as dots; however, the present invention is not so limited.
  • FIG. 4 shows an enlarged portion 19 of the position code 17 of FIG. 3. An optical device such as devices 100 and 200 (FIGS. 1 and 2) is positioned to record an image of a region of the position code 17. In one embodiment, the optical device fits the marks 18 to a reference system in the form of a raster with raster lines 21 that intersect at raster points 22. Each of the marks 18 is associated with a raster point 22. For example, mark 23 is associated with raster point 24. For the marks in an image/raster, the displacement of a mark from the raster point associated with the mark is determined. Using these displacements, the pattern in the image/raster is compared to patterns in the reference system. Each pattern in the reference system is associated with a particular location on the surface 70. Thus, by matching the pattern in the image/raster with a pattern in the reference system, the position of the pattern on the surface 70, and hence the position of the optical device relative to the surface 70, can be determined.
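  • By way of illustration, the raster-fitting step can be sketched as follows in Python. The four-direction displacement encoding, the pitch value, and the reference_system lookup table are simplifying assumptions for illustration; they are not the actual encoding scheme used by Anoto:

```python
# A sketch of position decoding from dot displacements. The four-direction
# encoding, pitch, helper names, and the reference_system table are all
# illustrative assumptions, not the actual Anoto scheme.

RASTER_PITCH = 0.3  # nominal dot spacing in millimeters

def nearest_raster_point(x, y):
    """Snap an observed mark to the nearest raster intersection."""
    return (round(x / RASTER_PITCH), round(y / RASTER_PITCH))

def displacement_pattern(marks):
    """Quantize each mark's offset from its raster point into one of four
    directions, yielding a pattern key for the captured region."""
    key = []
    for (x, y) in marks:
        gx, gy = nearest_raster_point(x, y)
        dx, dy = x - gx * RASTER_PITCH, y - gy * RASTER_PITCH
        if abs(dx) > abs(dy):
            key.append('R' if dx > 0 else 'L')
        else:
            key.append('U' if dy > 0 else 'D')
    return ''.join(key)

# Each unique local pattern maps to one position on the page.
reference_system = {'RULD': (12.0, 34.5)}  # single illustrative entry

def decode_position(marks):
    return reference_system.get(displacement_pattern(marks))

# Four marks, each displaced about 0.05 mm from its raster point:
print(decode_position([(0.35, 0.0), (0.0, 0.35), (0.55, 0.3), (0.3, 0.25)]))
```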
  • Additional descriptions regarding surface markings for encoding information and the reading/recording of such markings by electronic devices can be found in the following patents and patent applications that are assigned to Anoto and that are all herein incorporated by reference in their entirety: U.S. Pat. No. 6,502,756, U.S. application Ser. No. 10/179,966, filed on Jun. 26, 2002, WO 01/95559, WO 01/71473, WO 01/75723, WO 01/26032, WO 01/75780, WO 01/01670, WO 01/75773, WO 01/71475, WO 00/73983, and WO 01/16691.
  • With reference back to FIG. 1, four positions or regions on surface 70 are indicated by the letters A, B, C and D (these characters are not printed on surface 70, but are used herein to indicate positions on surface 70). There may be many such regions on the surface 70. Associated with each region on surface 70 is a unique pattern of marks. The regions on surface 70 may overlap because even if some marks are shared between overlapping regions, the pattern of marks in a region is still unique to that region.
  • In the example of FIG. 1, using pen device 100 (specifically, using writing element 52), a user may create a character consisting, for example, of a circled letter “M” at position A on surface 70 (generally, the user may create the character at any position on surface 70). The user may create such a character in response to a prompt (e.g., an audible prompt) from pen device 100. When the user creates the character, pen device 100 records the pattern of markings that are uniquely present at the position where the character is created. The pen device 100 associates that pattern of markings with the character just created. When pen device 100 is subsequently positioned over the circled “M,” pen device 100 recognizes the pattern of marks associated therewith and recognizes the position as being associated with a circled “M.” In effect, pen device 100 recognizes the character using the pattern of markings at the position where the character is located, rather than by recognizing the character itself.
  • In one embodiment, the characters described above comprise “graphic elements” that are associated with one or more commands of the pen device 100. It should be noted that graphic elements that are associated with, and are used to access, the functions implemented by the pen device 100 are referred to hereafter as “graphic element icons,” in order to distinguish them from other written characters, marks, etc. that are not associated with accessing functions or applications of the pen device 100. In the example just described, a user can create (write) a graphic element icon that identifies a particular command, and can invoke that command repeatedly by simply positioning pen device 100 over the graphic element icon (e.g., the written character). In one embodiment, the writing instrument is positioned over the graphical character. In other words, the user does not have to write the character for a command each time the command is to be invoked by the pen device 100; instead, the user can write the graphic element icon for a command one time and invoke the command repeatedly using the same written graphic element icon. This attribute is referred to as “persistence” and is described in greater detail below. Persistence also applies to graphical element icons that are not user written but are pre-printed on the surface and are nevertheless selectable by the pen device 100.
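  • The persistence attribute amounts to binding the unique dot pattern under a newly written icon to the recognized command, so that later taps at the same location re-invoke the command without any re-recognition. A minimal sketch, assuming a simple registry keyed by region (all names are hypothetical):

```python
# Sketch of persistence: bind the unique dot pattern of a region to a
# recognized icon so later taps re-invoke the same command. The registry
# API is an illustrative assumption.

class IconRegistry:
    def __init__(self):
        self._by_region = {}  # region pattern id -> command name

    def bind(self, region_id, command):
        """Called once, when the icon is first written and recognized."""
        self._by_region[region_id] = command

    def lookup(self, region_id):
        """Called on every subsequent tap; no recognition is needed."""
        return self._by_region.get(region_id)

registry = IconRegistry()
registry.bind("pattern@A", "menu")   # user writes a circled "M" at position A
print(registry.lookup("pattern@A"))  # a later tap resolves to "menu"
```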
  • In one embodiment, the graphic element icons can include a letter or number with a line circumscribing the letter or number. The line circumscribing the letter or number may be a circle, oval, square, polygon, etc. Such graphic elements appear to be like “buttons” that can be selected by the user, instead of ordinary letters and numbers. By creating a graphic element icon of this kind, the user can visually distinguish graphic element icons such as functional icons from ordinary letters and numbers, which may be treated as data by the pen device 100. Also, by creating graphic element icons of this kind, the pen device may also be able to better distinguish functional or menu item type graphic elements from non-functional or non-menu item type graphic elements. For instance, a user may create a graphic element icon that is the letter “M” which is enclosed by a circle to create an interactive “menu” graphic element icon.
  • The pen device 100 may be programmed to recognize an overlapping circle or square with the letter “M” in it as a functional graphic element, as distinguished from the letter “M” in a word. The graphic element icon may also include a small “check mark” symbol adjacent thereto, within a certain distance (e.g., 1 inch, 1.5 inches, etc.). The checkmark will be associated with the graphic element icon. Computer code for recognizing such functional graphic elements and distinguishing them from other non-functional graphic elements can reside in the memory unit in the pen device. The processor can recognize the graphic element icons and can identify the locations of those graphic element icons so that the pen device 100 can perform various functions, operations, and the like associated therewith. In these embodiments, the memory unit may comprise computer code for correlating any graphic elements produced by the user with their locations on the surface. The pen device 100 recognizes a “down-touch” or “down-stroke” (i.e., being placed down upon the surface, e.g., when the user begins writing) and recognizes an “up-stroke” (i.e., being picked up from the surface, e.g., when the user finishes writing). Such down-strokes and up-strokes can be interpreted by the pen device 100 as, for example, indicators as to when certain functionality is invoked and which particular function/application is invoked (e.g., triggering OCR processing). Particularly, a down-stroke quickly followed by an up-stroke (e.g., a tap of the pen device on the surface) can be associated with a special action depending upon the application (e.g., selecting a graphic element icon, text string, etc.).
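  • The down-stroke/up-stroke interpretation might be modeled as follows; the timing threshold and class names are assumptions for illustration, since no specific values are given above:

```python
# Sketch of classifying pen strokes. The 0.25 s tap threshold is an
# invented value for illustration; the text above specifies no number.
import time

TAP_MAX_SECONDS = 0.25  # assumed threshold separating a tap from writing

class StrokeInterpreter:
    def __init__(self):
        self._down_at = None

    def down_stroke(self):
        """Pen placed down upon the surface."""
        self._down_at = time.monotonic()

    def up_stroke(self):
        """Pen picked up; a quick down-up pair is treated as a tap."""
        if self._down_at is None:
            return None
        held = time.monotonic() - self._down_at
        self._down_at = None
        return "tap" if held <= TAP_MAX_SECONDS else "write"

pen = StrokeInterpreter()
pen.down_stroke()
print(pen.up_stroke())  # "tap" when released quickly
```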
  • It should be noted that the generic term “graphic element” may include any suitable marking created by the user, and is distinguishable from a graphic element icon which refers to a functional graphic element that is used to access one or more functions of the device.
  • As mentioned above, it should be noted that graphic element icons can be created by the pen device 100 (e.g., drawn by the user) or can be pre-existing (e.g., a printed element on a sheet of paper). Example graphic elements include, but are not limited to symbols, indicia such as letters and/or numbers, characters, words, shapes, lines, etc. They can be regular or irregular in shape. User written/created graphic elements are typically created using the pen device 100. Additionally, graphic element icons usually, but not always, incorporate a circumscribing line (e.g., circle) around a character (e.g., the letter “M”) to give them an added degree of distinctiveness to both the user and the pen device 100. For example, in one embodiment, an up-stroke after finishing a circle around the character can specifically indicate to the pen device 100 that the user has just created a graphic element icon.
  • FIG. 5 shows a flowchart of the steps of a computer implemented process 550 in accordance with one embodiment of the present invention. Process 550 depicts the basic operating steps of a user interface process as implemented by a device (e.g., pen device 100) in accordance with one embodiment of the present invention as it interprets user input in the form of graphic elements, writing, marks, etc., and provides the requested functionality to the user.
  • Process 550 begins in step 551, where the computer implemented functionality of the pen device 100 recognizes a created graphical element icon (e.g., created by a user). Alternatively, the graphic element may be preprinted on the surface and its location known to the pen device 100. At step 551, if the user is writing the graphic element for the first time, the pen device 100 uses the optical sensor and the processor to perform OCR (optical character recognition) on the writing to identify the user written graphical element. Its unique location on the surface is then also recorded, in one embodiment. In step 552, once recognized, a function related to the graphical element icon is accessed. This function can be, for example, a menu function that can enunciate (e.g., audibly render) a predetermined list of functions (e.g., menu choices or sub-menu options) for subsequent activation by the user. In step 553, an audio output in accordance with the function is provided. This audio output can be, for example, the enunciation of what particular choice the user is at within the list of choices. In step 554, the function is persistently associated with the graphical element icon, enabling a subsequent access of the function (e.g., at some later time) by a subsequent actuation (e.g., tapping with the pen device 100) of the graphical element icon. For example, in the case of a menu function, the listed menu choices can be subsequently accessed by the user at some later time by simply actuating the menu graphic element icon (e.g., tapping it).
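  • A minimal runnable sketch of steps 551 through 554 follows; the PenDevice API shown is an illustrative assumption, not the actual implementation:

```python
# Minimal runnable sketch of process 550 (steps 551-554). Every name here
# is an illustrative assumption; the text defines the steps, not this API.

class PenDevice:
    def __init__(self):
        self.functions = {"M": "menu"}   # graphic element label -> function
        self.registry = {}               # surface region -> function (persistence)

    def speak(self, text):
        print(f"[audio] {text}")

def process_550(pen, label, region_id):
    function = pen.functions[label]      # step 552: access the related function
    pen.speak(f"{function} opened")      # step 553: output per the function
    pen.registry[region_id] = function   # step 554: persist the association

# Step 551: a circled "M" is recognized (via OCR on its first writing).
process_550(PenDevice(), label="M", region_id="pos-A")
```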
  • It should be noted that the output of the pen device 100 can be visual output (e.g., via a display, indicator lights, etc.) in addition to, or instead of, audio output. The visual output and/or audio output can come directly from the pen device 100, or can be from another device (e.g., personal computer, speaker, LCD display, etc.) communicatively coupled to the pen device 100.
  • It is appreciated that a plurality of different graphic elements may exist on the surface at any time, and the selection thereof may cause various functions to be executed by the pen device 100, for example, to invoke applications, invoke sub-menu options, etc.
  • In this manner, embodiments of the present invention implement a user interface means for navigating the functionality of a computer system, particularly the pen based computer system comprising, for example, the pen device 100. The user interface as implemented by the graphical element icons provides the method of interacting with a number of software applications that execute within the pen device 100. As described above, output from the pen device 100 may include audio output, and thus, the user interface means enables the user to carry on a “dialog” with the applications and functionality of the pen device 100. In other words, the user interface enables the user to create mutually recognized items such as graphic element icons that allow the user and the pen device 100 to interact with one another. As described above, the mutually recognized items are typically symbols or marks or icons that the user draws on a surface, typically a sheet of paper.
  • Different graphic element icons have different meaning and different manners of interaction with the user. Generally, for a given graphic element icon, the manner of interaction will call up different computer implemented functionality of the pen device. For illustration purposes, in the case of the menu example above, the menu functionality allows the user to iterate through a list of functions that are related to the graphic element (e.g., the number of taps on the menu graphic element icon iterates through a list of functions). Audio from the pen device can enunciate the function or mode as the taps are done. One of the enunciated functions/modes can then be selected by the user through some further interaction (e.g., drawing or selecting a previously drawn checkmark graphic element associated with the graphic element icon). Once selected, the functionality and options and further sub-menus of the particular selected function can then be accessed by the user. Alternatively, if one of the audibly rendered sub-options is itself a menu graphical icon, it can be selected by the user drawing its representation on the surface and selecting it.
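  • The tap-to-iterate dialog of the menu example can be sketched as follows, using the top-level options of FIG. 7; the control flow and wrap-around behavior are illustrative assumptions:

```python
# Sketch of tap-to-iterate menu behavior: each tap advances to and
# announces the next option; a checkmark selection activates the current
# option. Option names follow FIG. 7; the control flow is an assumption.

class MenuIcon:
    OPTIONS = ["system", "games", "reference", "tools"]

    def __init__(self):
        self.index = -1  # nothing announced yet

    def tap(self):
        """Advance to the next option and enunciate it."""
        self.index = (self.index + 1) % len(self.OPTIONS)
        current = self.OPTIONS[self.index]
        print(f"[audio] {current}")
        return current

    def checkmark(self):
        """Activate the most recently announced option."""
        return self.OPTIONS[self.index]

menu = MenuIcon()
menu.tap()               # [audio] system
menu.tap()               # [audio] games
print(menu.checkmark())  # "games" is activated
```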
  • FIG. 6 shows a flowchart of the computer implemented steps of a process 650 in accordance with one embodiment of the present invention. Process 650 depicts the basic operating steps of a user interface process for accessing (e.g., navigating through) a number of nested, hierarchical functions of an interactive device (e.g., pen device 100) in accordance with one embodiment of the present invention. Process 650 is described with reference to FIGS. 8A, 8B, and 8C.
  • Process 650 begins in step 651, where the computer implemented functionality of the pen device 100 recognizes a created graphic element icon, shown in FIG. 8A as a menu icon “M”. Like step 551, the graphic element icon may be written by the user or preprinted on the surface. In one case, the graphic element icon can provide a list of choices of further graphic element icons (e.g., a hierarchical arrangement) that are associated therewith and which themselves may provide further choices. In step 652, and as shown in FIG. 8A, once recognized, a first hierarchical menu of functions related to the graphic element icon is accessed. In this example, once recognized, the menu icon “M” of step 651 causes a list of sub-options (e.g., system “S”, games “G”, reference “R”, and tools “T”) to be audibly rendered (e.g., via audible prompts), one option at a time, as shown in FIG. 8A. The options are rendered in response to successive selections of the menu icon of step 651 by the pen device (e.g., pen device 100).
  • In step 653, and as illustrated in FIG. 8B, one of the enunciated functions, in this example the reference graphic element icon “R”, is selected through an appropriate number of actuations of the menu graphic element icon (e.g., taps) and an actuation of the associated checkmark icon 870. In step 654, the activated function may prompt the creation of a second graphic element icon for a second hierarchical menu of functions. The second graphic element icon, the reference icon “R” in this example, may then be drawn on the surface by the user. The selection thereof, as shown in FIG. 8C, will cause a second listing of submenu items to be audibly rendered (e.g., via audible prompts) in the manner described above (e.g., thesaurus “TH”, dictionary “D”, and help “H”). Subsequently, in step 655, one of the enunciated functions of the second graphic element icon is activated through an appropriate number of actuations to select one of the second hierarchical level functions.
  • In this manner, one menu can invoke a number of sub-menus which themselves have even further sub-menus. Thus, different levels of graphic element icons can be hierarchically arranged. Generally, top-level graphic element icons which present menus of functions are referred to as group graphic element icons. Application graphic element icons are second-level graphic element icons that generally present menus of configuration options or application settings for a given application. Application graphic element icons can thus be considered a special case of group graphic element icons. Generally, an application graphic element icon has a specialized application related default behavior associated with it.
  • In this manner, the user may then select a menu item from the list of menu items. The menu items may include directory names, subdirectory names, application names, or names of specific data sets. Examples of directory or subdirectory names include, but are not limited to, “tools” (e.g., for interactive useful functions applicable under many different circumstances), “reference” (e.g., for reference materials such as dictionaries), “games” (e.g., for different games), etc. Examples of specific application (or subdirectory) names include “calculator”, “spell checker”, and “translator”. Specific examples of data sets may include a set of foreign words and their definitions, a phone list, a calendar, a to-do list, etc. Additional examples of menu items are shown in FIG. 7.
  • Specific audio instructions can be provided for the various menu items. For instance, after the user selects the “calculator” menu item, the pen device may instruct the user to draw the numbers 0-9, and the operators +, −, ×, /, and = on the sheet of paper and then select the numbers to perform a math calculation. In another example, after the user selects the “translator” menu item, the pen device can instruct the user to write the name of a second language and circle it. After the user does this, the pen device can further instruct the user to write down a word in English and then select the circled second language to hear the written word translated into the second language. After doing so, the audio output device in the pen device may recite the word in the second language.
  • FIG. 7 shows a menu item tree directory according to an embodiment of the present invention, including the graphical element icon representation of each option. The menu item tree directory can embody an audio menu starting from the menu graphic element icon. Starting from the top of FIG. 7, a first audio subdirectory would be a tools T subdirectory. Under the tools T subdirectory, there could be a translator TR subdirectory, a calculator C subdirectory, a spell checker SC subdirectory, a personal assistant PA subdirectory, an alarm clock AL subdirectory, and a tutor TU function. Under the translator TR subdirectory, there would be Spanish SP, French FR, and German GE translator functions. Under the personal assistant PA subdirectory, there would be calendar C, phone list PL, and to do list TD functions or subdirectories. Under the reference R subdirectory, there could be a thesaurus TH function, a dictionary D subdirectory, and a help H function. Under the dictionary D subdirectory, there can be an English E function, a Spanish SF function, and a French FR function. Under the games G subdirectory, there can be games such as word scramble WS, funky potatoes FP, and doodler DO. Other games could also be present in other embodiments of the invention. Under the system S subdirectory, there can be a security SE function and a personalization P function.
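  • Transcribed as a data structure, the FIG. 7 tree described above might be represented as a nested mapping; the representation itself is an illustrative choice (leaf values of None denote functions rather than subdirectories):

```python
# The FIG. 7 directory tree as a nested mapping. Structure follows the
# paragraph above; the dict representation is an illustrative choice.

MENU_TREE = {
    "tools": {
        "translator": {"Spanish": None, "French": None, "German": None},
        "calculator": None,
        "spell checker": None,
        "personal assistant": {"calendar": None, "phone list": None,
                               "to do list": None},
        "alarm clock": None,
        "tutor": None,
    },
    "reference": {
        "thesaurus": None,
        "dictionary": {"English": None, "Spanish": None, "French": None},
        "help": None,
    },
    "games": {"word scramble": None, "funky potatoes": None, "doodler": None},
    "system": {"security": None, "personalization": None},
}
```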
  • Details pertaining to some of the above directories, subdirectories, and functions are provided below. As illustrated by the menu item tree directory, a user may proceed or navigate down any desired path by listening to recitations of the various menu items and then selecting the menu item desired. The subsequent selection of the desired menu item may occur in any suitable manner. For example, in some embodiments, a user can cause the pen device to scroll through the audio menu by “down touching” (e.g., down-stroke) on a created graphic element. The “down touching” may be recognized by the electronics in the pen device as an “actuation” by using any suitable mechanism. For instance, the pen device may be programmed to recognize the image change associated with its downward movement toward the selected graphic element.
  • In another example, a pressure sensitive switch may be provided in the pen device so that when the end of the pen device applies pressure to the paper, the pressure switch activates. This informs the pen device to scroll through the audio menu. For instance, after selecting the circled letter “M” with the pen device (to thereby cause the pressure switch in the pen device to activate), the audio output device in the pen device may recite “tools” and nothing more. The user may select the circled letter “M” a second time to cause the audio output device to recite the menu item “reference”. This can be repeated as often as desired to scroll through the audio menu. To select a particular menu item, the user can create a distinctive mark on the paper or provide a specific gesture with the scanning apparatus. For instance, the user may draw a “checkmark” (or other graphic element) next to the circled letter “M” after hearing the word “tools” to select the subdirectory “tools”. Using a method such as this, a user may navigate towards the intended directory, subdirectory, or function in the menu item tree. The creation of a different graphic element or a different gesture may be used to cause the pen device to scroll upward. Alternatively, buttons or other actuators may be provided in the pen device to scroll through the menu. Once “tools” is selected, it will function as described above, but with respect to its subdirectory menu.
  • In other embodiments, after creating the menu graphic element icon (e.g., letter “M” with a circle), the user may select the menu graphic element icon. Software in the scanning apparatus recognizes the circled letter as being the menu symbol and causes the scanning apparatus to recite the menu items “tools”, “reference”, “games”, and “system” sequentially and at spaced timing intervals, without down touching by the user. Audio instructions can be provided to the user. For example, the pen device may say “To select the ‘tools’ directory, write the letter ‘T’ and circle it.” To select the menu item, the user may create the letter “T” and circle it. This indicates to the pen device that the user has selected the subdirectory “tools”. Then, the pen device can recite the menu items under the “tools” directory for the user. Thus, it is possible to proceed directly to a particular directory, subdirectory, or function in the menu item tree by creating a graphic element representing that directory, subdirectory, or function on a sheet and interacting therewith. Alternatively, if the menu item already resides on the surface, the user can interact with it at any time to select its functions.
  • It should be noted that the order of items within the directories, subdirectories, option menus, etc. of the graphic element icons depicted in FIG. 7 can be changed by the user. For example, the user can access a certain application and use that application to change the order in which the items of one or more directories, subdirectories, etc., are audibly rendered. Similarly, the user can change the specific audio output associated with one or more items within a given directory/subdirectory, etc. For example, the user can record her own voice for an item, use a prerecorded song (e.g., an MP3 file), or the like, and use it accordingly as the item's audibly rendered output. Additionally, it should be noted that additional items for one or more directories, subdirectories, etc., can be added through, for example, software and/or firmware updates provided to the pen device (e.g., uploading new software based functionality).
  • It should be noted that a respective state of multiple instances of a graphic element icon (e.g., multiple menu icons) can be persistently associated with each specific instance. For example, in a case where two or more graphic element icons exist on a common surface (e.g., created by the user, preprinted, or the like) their state, or their particular location within their directory of options can be independently retained, or remembered, for each icon. For example, if a first menu icon is currently on option three (e.g., “games”), and a second menu icon is currently on option one (e.g., “tools”), the user can go off and perform other tasks using other applications (e.g., calculator, dictionary, etc.) and come back at some later time to either the first or second menu icon and they will correctly retain their last state (e.g., “games” for the first and “tools” for the second menu icon).
  • Similarly, it should be noted that a respective state of multiple instances of a graphic element icon (e.g., multiple menu icons) can be coordinated among the multiple instances and persistently associated with each specific instance. With coordinated state, where two or more graphic element icons exist on a common surface (e.g., created by the user, preprinted, or the like) their state can be remembered for each icon, but that state can be coordinated such that the options span across each instance. For example, if a first menu icon is currently on option two (e.g., “system”), a second menu icon will have its state coordinated such that it will be on option three (e.g., “tools”). The user can perform other intervening tasks and come back at some later time to either the first or second menu icon and they will correctly retain their coordinated state (e.g., “system” for the first and “tools” for the second).
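  • The two behaviors, independent and coordinated state, can be contrasted in a short sketch; the shared-cursor scheme shown for coordinated state is an assumption about one possible implementation:

```python
# Independent state: each icon instance remembers its own menu position.
# Coordinated state: all instances advance one shared cursor.
# Both schemes here are illustrative assumptions.

OPTIONS = ["system", "games", "reference", "tools"]

class IndependentIcon:
    def __init__(self):
        self.index = -1
    def tap(self):
        self.index = (self.index + 1) % len(OPTIONS)
        return OPTIONS[self.index]

class SharedCursor:
    """One cursor spanning every coordinated icon on the surface."""
    def __init__(self):
        self.index = -1

class CoordinatedIcon:
    def __init__(self, cursor):
        self.cursor = cursor
    def tap(self):
        self.cursor.index = (self.cursor.index + 1) % len(OPTIONS)
        return OPTIONS[self.cursor.index]

a, b = IndependentIcon(), IndependentIcon()
a.tap(); a.tap(); a.tap()  # a advances to "reference"
print(b.tap())             # b keeps its own state: "system"

shared = SharedCursor()
c, d = CoordinatedIcon(shared), CoordinatedIcon(shared)
c.tap()                    # "system"
print(d.tap())             # "games": d continues where c left off
```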
  • FIG. 9 shows a surface 910 (e.g., a sheet of paper) having a number of graphic element icons written thereon in accordance with one embodiment of the present invention. FIG. 9 shows examples of group graphic element icons (e.g., the menu icon “M” and the games icon “G”) and an application icon (e.g., the calculator icon “C”). The graphic element icons can be written on the sheet of paper 910 by the user or can be preprinted. As described above, group graphic element icons generally audibly render a list of options. For example, repeatedly tapping at location 901 with the pen device 100 proceeds through the options of the menu directory (e.g., system, games, reference, and tools), as described in the discussion of FIG. 7. For example, tapping twice on the menu icon will cause the pen device 100 to audibly render “system” and then audibly render “games”, indicating the selection of the games subdirectory. The games subdirectory can then be activated by touching location 902 (e.g., the checkmark), and the activation can be confirmed to the user through an audio tone.
  • Subsequently, the pen device 100 audibly prompts the user to create (e.g., draw) a games graphic element icon as shown in FIG. 9. Repeatedly tapping the games icon at location 903 with the pen device 100 then causes the pen device 100 to proceed through the options of the games subdirectory (e.g., word scramble, funky potatoes, and doodler), as described in the discussion of FIG. 7. One of the games subdirectory items can then be selected through a tap at location 904 (e.g., the checkmark associated with the games icon), or alternatively, by drawing the checkmark if it is not already there.
  • Referring still to FIG. 9, a touch at the calculator icon “C” launches the calculator application. In this manner, the calculator icon does not render a list of menu items or subdirectory options, but rather directly launches an application itself, in this case the calculator application. Once the calculator application is invoked, the pen device 100 confirms the activation (e.g., by rendering an audio tone) and audibly prompts the user through a series of actions to prepare the calculator for use (e.g., by instructing the user to draw the numbers 0-9, and the operators +, −, ×, /, and = on the surface and then select the numbers to perform a math calculation).
  • Importantly, in the above examples, it should be noted that an OCR (optical character recognition) process needs to be performed on a mark, single character (e.g., the letter “M”), or a text string (e.g., a word) only once, as it is first written by the user (e.g., “M” shown in FIG. 9). As described above, the pen device 100 includes functionality whereby the location of the graphic elements on the surface 910 can be determined by the pen device 100 reading data encoded on the surface 910. This enables the pen device 100 to remember the location of the particular character, particular symbol, particular text string, etc. The pen device 100 can thus identify subsequent selections of a particular word by recognizing the same location of the particular word on a surface (e.g., when the user touches the pen device 100 onto the particular word at some later time). Upon subsequent selections of the word by the user, the results of the earlier performed OCR process are recalled, and these results are used by, for example, an active application (e.g., dictionary). Thus, the ability to store results of an OCR process (e.g., on words, characters, numbers, etc.), and to subsequently recall those results for use with one or more applications at a later time, greatly improves the responsiveness and the performance of the user interface implemented by embodiments of the present invention. Resource intensive OCR processing need only be performed once by the computer system resources of the pen device 100.
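  • In effect, the device caches OCR results keyed by surface location, as the following sketch illustrates (all names are hypothetical):

```python
# Sketch of "OCR once, recall by location": the first selection of a word
# runs OCR and caches the result under the word's surface location; later
# selections are resolved by location lookup alone. Names are hypothetical.

ocr_cache = {}  # surface location id -> recognized text

def expensive_ocr(image):
    """Stand-in for the real, resource intensive recognition step."""
    return "president"

def select(location_id, image=None):
    if location_id not in ocr_cache:
        ocr_cache[location_id] = expensive_ocr(image)  # first touch only
    return ocr_cache[location_id]

select("page3@(12,40)", image="captured-strokes")  # OCR runs once here
print(select("page3@(12,40)"))                     # later tap: cache hit
```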
  • FIG. 9 also shows a user written word 906 (e.g., text string) created using a “prompt and believe” function of the pen device 100. In accordance with embodiments of the present invention, it should be noted that some words, text strings, marks, symbols, or other graphic elements need not be processed at all using OCR. For example, the particular word, graphic element, etc., can be created by the user in response to an audible prompt from the pen device 100, wherein the pen device prompts the user to write the particular word (e.g., “president”) and subsequently stores the location of the written word with the association (e.g., from the prompt). The subsequent selection of the created word is recognized by location in the manner described above. For example, pen device 100 can instruct the user to write the word “president” 906. In response to the prompt, the user writes the word “president”, and the pen device 100 will treat, or in other words believe, upon a subsequent selection of the word, that what the user wrote in response to the prompt was in fact the word “president.” In other words, the pen device 100 associates the label “president” with whatever the user wrote in response to the prompt. Depending upon the application, the user can be prompted to underline the word, put a box around the word, or otherwise add some distinguishing mark/graphic element.
  • When the user is done writing the prompted word, the pen device 100 recognizes the fact that the user is finished by, for example, recognizing the inactivity (e.g., the user is no longer writing) as a data entry termination event. In this manner, a “timeout” mechanism can be used to recognize the end of data entry. Another termination event could be a case where the word is underlined or boxed as described above. Additional examples of termination events are described in the commonly assigned United States patent application “TERMINATION EVENTS”, by Marggraff et al., filed on Jan. 12, 2005, Attorney Docket No. LEAP-P0320, which is incorporated herein in its entirety.
  • In this manner, the prompt-and-believe feature of embodiments of the present invention enables the creation of graphic elements having meanings that are mutually understood between the user and the pen device 100. Importantly, it should be understood that no OCR processing is done on the word “president”. Graphic elements created using the “prompt-and-believe” function can be associated with labels for other applications, options, menus, functions, etc., whereby selection of the prompt-and-believe graphic element (e.g., by tapping) can invoke any of the above. Reducing the requirement for OCR processing lowers the computational demands on the pen device 100 and thus improves the responsiveness of the user interface.
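  • A minimal sketch of the prompt-and-believe flow, assuming a simple binding table (all names are hypothetical):

```python
# Sketch of the "prompt and believe" flow: the device prompts for a word,
# then binds the prompted label to whatever region the user wrote in,
# skipping OCR entirely. All names are illustrative assumptions.

bindings = {}  # surface region -> believed label

def prompt_and_believe(speak, label, written_region):
    speak(f"Please write the word '{label}'.")
    # No OCR: whatever was written in this region is taken on trust.
    bindings[written_region] = label

prompt_and_believe(print, "president", written_region="page1@(40,80)")
print(bindings["page1@(40,80)"])  # a later selection resolves to "president"
```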
  • Although embodiments of the present invention have been described in the context of using surfaces encoded with markings in order to determine location of the pen device, it should be noted that embodiments of the present invention are suitable for use with pen devices that determine location using other means that do not require encoded surfaces. For example, in one embodiment, a pen device can incorporate one or more position location mechanisms such as, for example, motion sensors, gyroscopes, etc., and be configured to accurately store a precise location of a given surface (e.g., a sheet of paper). The precise location of the surface can be stored by, for example, sequentially touching opposite corners of the surface (e.g., a rectangular sheet of paper). The pen device would then recognize the location of graphic elements written by the user on the surface by comparing the stored precise location of the surface with the results of its location determination means.
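  • Such corner-based registration might be sketched as follows; the sensor interface and the linear mapping are assumptions for illustration:

```python
# Sketch of locating writing without an encoded surface: touch two opposite
# corners to register the page, then express pen positions as fractions of
# the page extent. The coordinate handling shown is a hypothetical scheme.

class PageFrame:
    def __init__(self, corner_a, corner_b):
        """Register a rectangular sheet from two opposite corner touches."""
        (x0, y0), (x1, y1) = corner_a, corner_b
        self.x0, self.y0 = min(x0, x1), min(y0, y1)
        self.w, self.h = abs(x1 - x0), abs(y1 - y0)

    def locate(self, x, y):
        """Map an absolute sensor position to page-relative coordinates."""
        return ((x - self.x0) / self.w, (y - self.y0) / self.h)

page = PageFrame((0.0, 0.0), (215.9, 279.4))  # 8.5 x 11 inch sheet, in mm
print(page.locate(107.95, 139.7))             # (0.5, 0.5): page center
```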
  • The foregoing descriptions of specific embodiments of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.

Claims (20)

1. A method for interpreting user commands, comprising:
recognizing a created graphical element icon on a surface;
accessing a function related to the graphical element icon;
providing an output in accordance with the function; and
associating the function with the graphical element icon.
2. The method of claim 1, wherein the output comprises an audio output related to the function.
3. The method of claim 1, further comprising:
enabling a subsequent access of the function in response to a subsequent selection of the graphical element icon by storing the association of the function with the graphical element icon.
4. The method of claim 3, wherein the storing of the association of the function with the graphical element icon implements a persistent availability of the function, for a predetermined amount of time, via interaction with the graphical element icon.
5. The method of claim 1, wherein the graphical element icon is created by a pen device on the surface.
6. The method of claim 5, wherein the surface comprises a sheet of paper.
7. The method of claim 1, further comprising:
accessing one of a plurality of functions related to the graphical element icon by interpreting at least one actuation of the graphical element icon, wherein the at least one actuation selects the one of the plurality of functions.
8. The method of claim 7, wherein the at least one actuation comprises recognizing at least one tap of the graphical element icon.
9. The method of claim 7, further comprising:
providing one of a plurality of audio outputs when the one of the plurality of functions is selected.
10. The method of claim 7, wherein the plurality of functions comprises a predetermined menu of options.
11. The method of claim 7, wherein the plurality of functions comprises a plurality of configuration options of an application related to the graphical element icon.
12. The method of claim 11, wherein at least one of the plurality of configuration options comprises a default configuration of the application.
13. The method of claim 1, further comprising:
implementing a hierarchy of functions; and
providing access to the hierarchy of functions via a corresponding hierarchy of graphical element icons.
14. The method of claim 13, further comprising:
recognizing at least one actuation of the graphical element icon to select a first hierarchical level function;
prompting the creation of a second graphical element icon;
recognizing at least one actuation of the second graphical element icon to select a second hierarchical level function;
providing an audio output related to the second hierarchical level function; and
associating the second hierarchical level function with the second graphical element icon.
15. A method for interpreting user commands, comprising:
recognizing a created graphical element icon on a surface;
accessing a function related to the graphical element icon;
providing an audio output in accordance with the function; and
enabling a subsequent access of the function by storing an association of the function with the graphical element icon, wherein the storing of the association implements a persistent availability of the function for a predetermined amount of time, via interaction with the graphical element icon.
16. The method of claim 15, wherein the graphical element icon is created by a pen device on the surface.
17. The method of claim 16, wherein the surface comprises a sheet of paper.
18. The method of claim 15, further comprising:
accessing one of a plurality of functions related to the graphical element icon by interpreting at least one actuation of the graphical element icon, wherein the at least one actuation selects the one of the plurality of functions.
19. The method of claim 18, wherein the at least one actuation comprises recognizing at least one tap of the graphical element icon.
20. A method of interacting with a pen based computer system, said method comprising:
recognizing selection of a first graphical icon on a writable surface, said selection performed using a writing instrument of said pen based computer system;
in response to said selection, audibly rendering a listing of first options associated with said first graphical icon wherein said first options are operable to be invoked by said pen based computer system; and
in response to a selection of one of said first options, invoking said one of said first options.
US12/942,927 2004-03-17 2010-11-09 Method and system for implementing a user interface for a device employing written graphical elements Abandoned US20110279415A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/942,927 US20110279415A1 (en) 2004-03-17 2010-11-09 Method and system for implementing a user interface for a device employing written graphical elements

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US10/803,806 US20040229195A1 (en) 2003-03-18 2004-03-17 Scanning apparatus
US10/861,243 US20060033725A1 (en) 2004-06-03 2004-06-03 User created interactive interface
US11/034,491 US7831933B2 (en) 2004-03-17 2005-01-12 Method and system for implementing a user interface for a device employing written graphical elements
US12/942,927 US20110279415A1 (en) 2004-03-17 2010-11-09 Method and system for implementing a user interface for a device employing written graphical elements

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/034,491 Continuation US7831933B2 (en) 2004-03-17 2005-01-12 Method and system for implementing a user interface for a device employing written graphical elements

Publications (1)

Publication Number Publication Date
US20110279415A1 true US20110279415A1 (en) 2011-11-17

Family

ID=36678060

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/034,491 Expired - Fee Related US7831933B2 (en) 2004-03-17 2005-01-12 Method and system for implementing a user interface for a device employing written graphical elements
US12/942,927 Abandoned US20110279415A1 (en) 2004-03-17 2010-11-09 Method and system for implementing a user interface for a device employing written graphical elements

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/034,491 Expired - Fee Related US7831933B2 (en) 2004-03-17 2005-01-12 Method and system for implementing a user interface for a device employing written graphical elements

Country Status (7)

Country Link
US (2) US7831933B2 (en)
EP (1) EP1681624A1 (en)
JP (1) JP2006244463A (en)
KR (1) KR100806241B1 (en)
CN (1) CN1855012A (en)
CA (1) CA2532611A1 (en)
WO (1) WO2006076075A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014036397A3 (en) * 2012-08-31 2014-05-15 Ebay Inc. Expanded icon functionality
US20170242494A1 (en) * 2008-11-25 2017-08-24 Kenji Yoshida Handwriting input/output system, handwriting input sheet, information input system, and information input assistance sheet

Families Citing this family (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7916124B1 (en) 2001-06-20 2011-03-29 Leapfrog Enterprises, Inc. Interactive apparatus using print media
US7853193B2 (en) * 2004-03-17 2010-12-14 Leapfrog Enterprises, Inc. Method and device for audibly instructing a user to interact with a function
US7831933B2 (en) 2004-03-17 2010-11-09 Leapfrog Enterprises, Inc. Method and system for implementing a user interface for a device employing written graphical elements
US20060077184A1 (en) * 2004-03-17 2006-04-13 James Marggraff Methods and devices for retrieving and using information stored as a pattern on a surface
US8316068B2 (en) * 2004-06-04 2012-11-20 Telefonaktiebolaget Lm Ericsson (Publ) Memory compression
US20070042335A1 (en) * 2005-05-11 2007-02-22 Ctb Mcgraw-Hill System and method for assessment or survey response collection using a remote, digitally recording user input device
US7922099B1 (en) 2005-07-29 2011-04-12 Leapfrog Enterprises, Inc. System and method for associating content with an image bearing surface
JP4741908B2 (en) * 2005-09-08 2011-08-10 キヤノン株式会社 Information processing apparatus and information processing method
US7936339B2 (en) * 2005-11-01 2011-05-03 Leapfrog Enterprises, Inc. Method and system for invoking computer functionality by interaction with dynamically generated interface regions of a writing surface
US8261967B1 (en) 2006-07-19 2012-09-11 Leapfrog Enterprises, Inc. Techniques for interactively coupling electronic content with printed media
US20080098315A1 (en) * 2006-10-18 2008-04-24 Dao-Liang Chou Executing an operation associated with a region proximate a graphic element on a surface
US8126965B2 (en) * 2007-02-28 2012-02-28 Fuji Xerox Co., Ltd. Paper based meeting service management tool
US8638319B2 (en) 2007-05-29 2014-01-28 Livescribe Inc. Customer authoring tools for creating user-generated content for smart pen applications
US8374992B2 (en) * 2007-05-29 2013-02-12 Livescribe, Inc. Organization of user generated content captured by a smart pen computing system
WO2008150919A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Electronic annotation of documents with preexisting content
US8284951B2 (en) * 2007-05-29 2012-10-09 Livescribe, Inc. Enhanced audio recording for smart pen computing systems
US8416218B2 (en) * 2007-05-29 2013-04-09 Livescribe, Inc. Cyclical creation, transfer and enhancement of multi-modal information between paper and digital domains
KR20100029219A (en) * 2007-05-29 2010-03-16 Livescribe, Inc. Multi-modal smartpen computing system
US8194081B2 (en) * 2007-05-29 2012-06-05 Livescribe, Inc. Animation of audio ink
US20090021495A1 (en) * 2007-05-29 2009-01-22 Edgecomb Tracy L Communicating audio and writing using a smart pen computing system
US8254605B2 (en) * 2007-05-29 2012-08-28 Livescribe, Inc. Binaural recording for smart pen computing systems
WO2008150887A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Self-addressing paper
US8689139B2 (en) * 2007-12-21 2014-04-01 Adobe Systems Incorporated Expandable user interface menu
US8566752B2 (en) 2007-12-21 2013-10-22 Ricoh Co., Ltd. Persistent selection marks
US8944824B2 (en) * 2008-04-03 2015-02-03 Livescribe, Inc. Multi-modal learning system
US20090251441A1 (en) * 2008-04-03 2009-10-08 Livescribe, Inc. Multi-Modal Controller
US7810730B2 (en) 2008-04-03 2010-10-12 Livescribe, Inc. Decoupled applications for printed materials
US20090251338A1 (en) * 2008-04-03 2009-10-08 Livescribe, Inc. Ink Tags In A Smart Pen Computing System
US8446297B2 (en) 2008-04-03 2013-05-21 Livescribe, Inc. Grouping variable media inputs to reflect a user session
US20090251440A1 (en) * 2008-04-03 2009-10-08 Livescribe, Inc. Audio Bookmarking
US8149227B2 (en) * 2008-04-03 2012-04-03 Livescribe, Inc. Removing click and friction noise in a writing device
US8446298B2 (en) * 2008-04-03 2013-05-21 Livescribe, Inc. Quick record function in a smart pen computing system
US9058067B2 (en) * 2008-04-03 2015-06-16 Livescribe Digital bookclip
US8300252B2 (en) * 2008-06-18 2012-10-30 Livescribe, Inc. Managing objects with varying and repeated printed positioning information
US8423916B2 (en) * 2008-11-20 2013-04-16 Canon Kabushiki Kaisha Information processing apparatus, processing method thereof, and computer-readable storage medium
US8289287B2 (en) * 2008-12-30 2012-10-16 Nokia Corporation Method, apparatus and computer program product for providing a personalizable user interface
US8860548B2 (en) * 2009-04-02 2014-10-14 Devecka Enterprises, Inc. Methods and apparatus for art supply usage compliance
US8819597B2 (en) * 2009-04-10 2014-08-26 Google Inc. Glyph entry on computing device
WO2011008862A2 (en) * 2009-07-14 2011-01-20 Zoomii, Inc. Markup language-based authoring and runtime environment for interactive content platform
CN102782614B (en) * 2009-12-28 2016-06-01 Motorola Mobility LLC Method for associating objects on a touch screen using an input gesture
US8642873B2 (en) * 2010-02-12 2014-02-04 ThinkGeek, Inc. Interactive electronic apparel incorporating a drum kit image
US20110307840A1 (en) * 2010-06-10 2011-12-15 Microsoft Corporation Erase, circle, prioritize and application tray gestures
DE102010033270A1 (en) * 2010-08-03 2012-02-09 Qing Holdings Ltd. Method, device and system for retrieving and locally storing data
US9021402B1 (en) 2010-09-24 2015-04-28 Google Inc. Operation of mobile device interface using gestures
US20120216152A1 (en) * 2011-02-23 2012-08-23 Google Inc. Touch gestures for remote control operations
US8793313B2 (en) 2011-09-08 2014-07-29 Red 5 Studios, Inc. Systems, methods and media for distributing peer-to-peer communications
US20130104039A1 (en) * 2011-10-21 2013-04-25 Sony Ericsson Mobile Communications Ab System and Method for Operating a User Interface on an Electronic Device
US9261961B2 (en) 2012-06-07 2016-02-16 Nook Digital, Llc Accessibility aids for users of electronic devices
US8632411B1 (en) 2012-06-28 2014-01-21 Red 5 Studios, Inc. Exchanging virtual rewards for computing resources
US8834268B2 (en) 2012-07-13 2014-09-16 Red 5 Studios, Inc. Peripheral device control and usage in a broadcaster mode for gaming environments
KR20140008985A (en) * 2012-07-13 2014-01-22 Samsung Electronics Co., Ltd. User interface apparatus in a user terminal and method therefor
US9658746B2 (en) * 2012-07-20 2017-05-23 Nook Digital, Llc Accessible reading mode techniques for electronic devices
US9383834B2 (en) 2012-12-26 2016-07-05 Xerox Corporation System and method for creating and modifying physically transient handwritten digital documents
US9971495B2 (en) 2013-01-28 2018-05-15 Nook Digital, Llc Context based gesture delineation for user interaction in eyes-free mode
US9478146B2 (en) 2013-03-04 2016-10-25 Xerox Corporation Method and system for capturing reading assessment data
KR102157270B1 (en) * 2013-04-26 2020-10-23 Samsung Electronics Co., Ltd. User terminal device with a pen and control method thereof
KR20150039378A (en) * 2013-10-02 2015-04-10 Samsung Medison Co., Ltd. Medical device, controller for a medical device, and method for controlling a medical device
WO2016128484A1 (en) * 2015-02-13 2016-08-18 Dover Europe Sarl Hierarchical icons for graphical user interface
US10754495B1 (en) * 2016-04-05 2020-08-25 Bentley Systems, Incorporated 3-D screen menus
CN111209034B (en) * 2020-01-13 2023-03-03 Chengdu Zhuoying Technology Co., Ltd. Method for configuring a visual dynamic page for a large TV screen
US11546458B2 (en) * 2020-06-10 2023-01-03 Micron Technology, Inc. Organizing applications for mobile devices

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5943039A (en) * 1991-02-01 1999-08-24 U.S. Philips Corporation Apparatus for the interactive handling of objects
US6091675A (en) * 1996-07-13 2000-07-18 Samsung Electronics Co., Ltd. Integrated CD-ROM driving apparatus for driving different types of CD-ROMs in multimedia computer systems
US6426761B1 (en) * 1999-04-23 2002-07-30 International Business Machines Corporation Information presentation system for a graphical user interface
US6476834B1 (en) * 1999-05-28 2002-11-05 International Business Machines Corporation Dynamic creation of selectable items on surfaces
US20030014615A1 (en) * 2001-06-25 2003-01-16 Stefan Lynggaard Control of a unit provided with a processor
US6570597B1 (en) * 1998-11-04 2003-05-27 Fuji Xerox Co., Ltd. Icon display processor for displaying icons representing sub-data embedded in or linked to main icon data
US6646633B1 (en) * 2001-01-24 2003-11-11 Palm Source, Inc. Method and system for a full screen user interface and data entry using sensors to implement handwritten glyphs
US20050024341A1 (en) * 2001-05-16 2005-02-03 Synaptics, Inc. Touch screen with user interface enhancement
US20050251755A1 (en) * 2004-05-06 2005-11-10 Pixar Toolbar slot method and apparatus
US7530023B2 (en) * 2001-11-13 2009-05-05 International Business Machines Corporation System and method for selecting electronic documents from a physical document and for displaying said electronic documents over said physical document

Family Cites Families (327)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AUPQ131399A0 (en) 1999-06-30 1999-07-22 Silverbrook Research Pty Ltd A method and apparatus (NPAGE02)
US2182334A (en) 1939-02-18 1939-12-05 Crespo Joseph Panoramic device
US2932907A (en) 1956-01-16 1960-04-19 Joseph A Stieber Map projections demonstrator
US3304612A (en) 1963-12-23 1967-02-21 Union Oil Co Method and apparatus for converting cartograph coordinates to permanent digital form
US3292489A (en) 1964-07-09 1966-12-20 Ibm Hierarchical search system
US3466391A (en) 1966-09-07 1969-09-09 Marconi Co Ltd Electrical position resolver employing capacitive probe coupled to resistive layer containing two pairs of conductive strips energized by four different frequency signals
US3591718A (en) 1968-04-18 1971-07-06 Shintron Co Inc Graphical input tablet
US3782734A (en) 1971-03-15 1974-01-01 S Krainin Talking book, an educational toy with multi-position sound track and improved stylus transducer
US3798370A (en) 1972-04-17 1974-03-19 Elographics Inc Electrographic sensor for determining planar coordinates
US3921165A (en) 1973-03-21 1975-11-18 Ibm High resolution graphic data tablet
US3911215A (en) 1974-03-18 1975-10-07 Elographics Inc Discriminating contact sensor
US4079194A (en) 1976-08-09 1978-03-14 Victor Kley Graphical data entry pad
US4220815B1 (en) 1978-12-04 1996-09-03 Elographics Inc Nonplanar transparent electrographic sensor
NL7904469A (en) 1979-06-07 1980-12-09 Philips Nv Device for reading a printed code and converting it to an audio signal.
US4686332A (en) 1986-06-26 1987-08-11 International Business Machines Corporation Combined finger touch and stylus detection system for use on the viewing surface of a visual display device
US4337375A (en) 1980-06-12 1982-06-29 Texas Instruments Incorporated Manually controllable data reading apparatus for speech synthesizers
US4464118A (en) 1980-06-19 1984-08-07 Texas Instruments Incorporated Didactic device to improve penmanship and drawing skills
JPS585611A (en) 1981-07-01 1983-01-13 Toyota Motor Corp Device for guiding running operation
DE3126886A1 (en) 1981-07-08 1983-01-27 Olympia Werke Ag Device for text processing and text editing
US4425099A (en) 1981-10-13 1984-01-10 Texas Instruments Incorporated Educational aid for use with workbook
US4604065A (en) 1982-10-25 1986-08-05 Price/Stern/Sloan Publishers, Inc. Teaching or amusement apparatus
US4604058A (en) 1982-11-01 1986-08-05 Teledyne Industries, Inc. Dental appliance
US4492819A (en) 1982-12-30 1985-01-08 Kurta Corporation Graphic tablet and method
US4570149A (en) 1983-03-15 1986-02-11 Koala Technologies Corporation Simplified touch tablet data device
US4603231A (en) 1983-03-31 1986-07-29 Interand Corporation System for sensing spatial coordinates
US4650926A (en) 1984-10-26 1987-03-17 Scriptel Corporation Electrographic system and method
US4627819A (en) 1985-01-23 1986-12-09 Price/Stern/Sloan Publishers, Inc. Teaching or amusement apparatus
GB2202664B (en) 1985-10-22 1990-08-15 Garry Douglas Robb Automated service systems
US4739299A (en) 1986-01-17 1988-04-19 Interlink Electronics, Inc. Digitizer pad
US4764800A (en) * 1986-05-07 1988-08-16 Advanced Micro Devices, Inc. Seal structure for an integrated circuit
US5057024A (en) 1986-08-01 1991-10-15 Sprott Glenn C Computerized globe/almanac system
US4748318A (en) 1986-10-22 1988-05-31 Bearden James D Wand for a hand-held combined light pen and bar code reader
US4839634A (en) 1986-12-01 1989-06-13 More Edward S Electro-optic slate for input/output of hand-entered textual and graphic information
US5194852A (en) 1986-12-01 1993-03-16 More Edward S Electro-optic slate for direct entry and display and/or storage of hand-entered textual and graphic information
US4787040A (en) 1986-12-22 1988-11-22 International Business Machines Corporation Display system for automotive vehicle
GB8702728D0 (en) 1987-02-06 1987-03-11 Price Stern Sloan Publishers Teaching & amusement apparatus
JP2658039B2 (en) 1987-03-20 1997-09-30 Canon Inc Information processing device
GB2207027B (en) 1987-07-15 1992-01-08 Matsushita Electric Works Ltd Voice encoding and composing system
US5030117A (en) 1987-09-25 1991-07-09 Delorme David M Digital global map generating system
US4841387A (en) 1987-12-15 1989-06-20 Rindfuss Diane J Arrangement for recording and indexing information
US5113178A (en) 1988-01-29 1992-05-12 Aisin Seiki K.K. Position display apparatus
US4853498A (en) 1988-06-13 1989-08-01 Tektronix, Inc. Position measurement apparatus for capacitive touch panel system
US4924387A (en) 1988-06-20 1990-05-08 Jeppesen John C Computerized court reporting system
US4913463A (en) 1988-10-27 1990-04-03 Texas Instruments Incorporated Hinged case providing sectional cover with anti-pinch interleaving through
US5007085A (en) 1988-10-28 1991-04-09 International Business Machines Corporation Remotely sensed personal stylus
US4853499A (en) 1988-12-12 1989-08-01 Calcomp Inc. Ground switching technique for silkscreened digitizer grids
US5256901A (en) * 1988-12-26 1993-10-26 Ngk Insulators, Ltd. Ceramic package for memory semiconductor
JPH0322259A (en) 1989-03-22 1991-01-30 Seiko Epson Corp Small-sized data display and reproducing device
US5484292A (en) 1989-08-21 1996-01-16 Mctaggart; Stephen I. Apparatus for combining audio and visual indicia
US5167508A (en) 1989-08-21 1992-12-01 Mc Taggart Stephen I Electronic book
US5209665A (en) * 1989-10-12 1993-05-11 Sight & Sound Incorporated Interactive audio visual work
US5184003A (en) 1989-12-04 1993-02-02 National Computer Systems, Inc. Scannable form having a control mark column with encoded data marks
JP2784825B2 (en) * 1989-12-05 1998-08-06 Sony Corp Information input control device
CA2044404C (en) 1990-07-31 1998-06-23 Dan S. Bloomberg Self-clocking glyph shape codes
US5168147A (en) 1990-07-31 1992-12-01 Xerox Corporation Binary image processing for decoding self-clocking glyph shape codes
US5128525A (en) 1990-07-31 1992-07-07 Xerox Corporation Convolution filtering for decoding self-clocking glyph shape codes
JPH04121923A (en) 1990-09-12 1992-04-22 Sony Corp Switch structure for electronic apparatus
US5053585A (en) 1990-10-12 1991-10-01 Interlink Electronics, Incorporated Multipurpose keyboard using digitizer pad featuring spatial minimization of a pressure contact area and method of making same
US5149919A (en) 1990-10-31 1992-09-22 International Business Machines Corporation Stylus sensing system
US5117071A (en) 1990-10-31 1992-05-26 International Business Machines Corporation Stylus sensing system
GB9024526D0 (en) * 1990-11-12 1991-01-02 Eden Group Ltd Electronic display apparatus
US5574804A (en) 1990-12-21 1996-11-12 Olschafskie; Francis Hand-held scanner
US5301243A (en) 1990-12-21 1994-04-05 Francis Olschafskie Hand-held character-oriented scanner with external view area
GB9103768D0 (en) 1991-02-22 1991-04-10 King Reginald A Educational apparatus
JPH04274510A (en) 1991-02-28 1992-09-30 Casio Comput Co Ltd Input device
US5220649A (en) 1991-03-20 1993-06-15 Forcier Mitchell D Script/binary-encoded-character processing method and system with moving space insertion mode
US5848187A (en) 1991-11-18 1998-12-08 Compaq Computer Corporation Method and apparatus for entering and manipulating spreadsheet cell data
JP3120085B2 (en) 1991-11-21 2000-12-25 Sega Corp Electronic devices and information carriers
US5220136A (en) 1991-11-26 1993-06-15 Elographics, Inc. Contact touchscreen with an improved insulated spacer arrangement
US5221833A (en) 1991-12-27 1993-06-22 Xerox Corporation Methods and means for reducing bit error rates in reading self-clocking glyph codes
JPH05334470A (en) 1991-12-27 1993-12-17 Xerox Corp Self-clocking graphic mark code
US5314336A (en) 1992-02-07 1994-05-24 Mark Diamond Toy and method providing audio output representative of message optically sensed by the toy
US5788508A (en) 1992-02-11 1998-08-04 John R. Lee Interactive computer aided natural learning method and apparatus
JPH05265633A (en) 1992-03-18 1993-10-15 Gunze Ltd Touch panel
US5217376A (en) 1992-03-20 1993-06-08 Marcel Gosselin Drawing aid
US5852434A (en) 1992-04-03 1998-12-22 Sekendur; Oral F. Absolute optical position determination
US5356296A (en) * 1992-07-08 1994-10-18 Harold D. Pierce Audio storybook
US5438430A (en) 1992-09-25 1995-08-01 Xerox Corporation Paper user interface for image manipulations such as cut and paste
US5739814A (en) 1992-09-28 1998-04-14 Sega Enterprises Information storage system and book device for providing information in response to the user specification
ATE195030T1 (en) 1992-09-28 2000-08-15 Olympus Optical Co Point code recording medium and information recording system
US5401916A (en) 1992-09-29 1995-03-28 Ncr Corporation Method and apparatus for capturing handwritten information and providing visual feedback
US5217378A (en) 1992-09-30 1993-06-08 Donovan Karen R Painting kit for the visually impaired
WO1994015272A1 (en) 1992-12-22 1994-07-07 Morgan Michael W Pen-based electronic teaching system
US5409381A (en) * 1992-12-31 1995-04-25 Sundberg Learning Systems, Inc. Educational display device and method
JPH06266490A (en) 1993-03-12 1994-09-22 Toshiba Corp Information input device and position recognition system for information input
US5510606A (en) 1993-03-16 1996-04-23 Worthington; Hall V. Data collection system including a portable data collection terminal with voice prompts
US5474457A (en) * 1993-06-09 1995-12-12 Bromley; Eric Interactive talking picture machine
US5413486A (en) * 1993-06-18 1995-05-09 Joshua Morris Publishing, Inc. Interactive book
US5572651A (en) * 1993-10-15 1996-11-05 Xerox Corporation Table-based user interface for retrieving and manipulating indices between data structures
DE69423296T2 (en) 1993-11-30 2000-11-30 Hewlett Packard Co Arrangement for data entry
US5835726A (en) 1993-12-15 1998-11-10 Check Point Software Technologies Ltd. System for securing the flow of and selectively modifying packets in a computer network
JP3546337B2 (en) 1993-12-21 2004-07-28 Xerox Corp User interface device for computing system and method of using graphic keyboard
US5604517A (en) 1994-01-14 1997-02-18 Binney & Smith Inc. Electronic drawing device
US5561446A (en) 1994-01-28 1996-10-01 Montlick; Terry F. Method and apparatus for wireless remote information retrieval and pen-based data entry
IL108566A0 (en) 1994-02-04 1994-05-30 Baron Research & Dev Company L Handwriting input apparatus using more than one sensing technique
US5466158A (en) * 1994-02-14 1995-11-14 Smith, Iii; Jay Interactive book device
US5480306A (en) 1994-03-16 1996-01-02 Liu; Chih-Yuan Language learning apparatus and method utilizing optical code as input medium
US5574519A (en) 1994-05-03 1996-11-12 Eastman Kodak Company Talking photoalbum
JP2939119B2 (en) 1994-05-16 1999-08-25 Sharp Corp Handwritten character input display device and method
US5649023A (en) 1994-05-24 1997-07-15 Panasonic Technologies, Inc. Method and apparatus for indexing a plurality of handwritten objects
US6008799A (en) 1994-05-24 1999-12-28 Microsoft Corporation Method and system for entering data using an improved on-screen keyboard
US6164534A (en) 1996-04-04 2000-12-26 Rathus; Spencer A. Method and apparatus for accessing electronic data via a familiar printed medium
US5932863A (en) 1994-05-25 1999-08-03 Rathus; Spencer A. Method and apparatus for accessing electronic data via a familiar printed medium
US5624265A (en) 1994-07-01 1997-04-29 TV Interactive Data Corporation Printed publication remote control for accessing interactive media
US5652412A (en) 1994-07-11 1997-07-29 Sia Technology Corp. Pen and paper information recording system
EP0693739A3 (en) 1994-07-13 1997-06-11 Yashima Denki Kk Method and apparatus capable of storing and reproducing handwriting
US5640193A (en) * 1994-08-15 1997-06-17 Lucent Technologies Inc. Multimedia service access by reading marks on an object
US5951298A (en) * 1994-08-23 1999-09-14 Werzberger; Bernice Floraine Interactive book assembly
US6262719B1 (en) * 1994-09-02 2001-07-17 Packard Bell Nec, Inc. Mouse emulation with a passive pen
US5974558A (en) * 1994-09-02 1999-10-26 Packard Bell Nec Resume on pen contact
US5801687A (en) * 1994-09-30 1998-09-01 Apple Computer, Inc. Authoring tool comprising nested state machines for use in a computer system
US5652714A (en) * 1994-09-30 1997-07-29 Apple Computer, Inc. Method and apparatus for capturing transient events in a multimedia product using an authoring tool on a computer system
US5661506A (en) * 1994-11-10 1997-08-26 Sia Technology Corporation Pen and paper information recording system using an imaging pen
CA2163316A1 (en) 1994-11-21 1996-05-22 Roger L. Collins Interactive play with a computer
US6018656A (en) 1994-12-30 2000-01-25 Sony Corporation Programmable cellular telephone and system
US5760773A (en) 1995-01-06 1998-06-02 Microsoft Corporation Methods and apparatus for interacting with data objects using action handles
US5636995A (en) 1995-01-17 1997-06-10 Stephen A. Schwartz Interactive story book and graphics tablet apparatus and methods for operating the same
FI99071C (en) 1995-02-15 1997-09-25 Nokia Mobile Phones Ltd Procedure for use of applications in a mobile telephone as well as a mobile telephone
US5801702A (en) * 1995-03-09 1998-09-01 Terrabyte Technology System and method for adding network links in a displayed hierarchy
US5520544A (en) * 1995-03-27 1996-05-28 Eastman Kodak Company Talking picture album
US5730602A (en) 1995-04-28 1998-03-24 Penmanship, Inc. Computerized method and apparatus for teaching handwriting
JPH08335134A (en) 1995-06-07 1996-12-17 Canon Inc Information processor
US5978773A (en) 1995-06-20 1999-11-02 Neomedia Technologies, Inc. System and method for using an ordinary article of commerce to access a remote computer
JPH0926769A (en) * 1995-07-10 1997-01-28 Hitachi Ltd Picture display device
JP3111024B2 (en) 1995-07-19 2000-11-20 Canon Inc Apparatus and method for manufacturing color filter, method for manufacturing display apparatus, and method for manufacturing apparatus provided with display apparatus
US6124851A (en) * 1995-07-20 2000-09-26 E Ink Corporation Electronic book with multiple page displays
DE69637146T2 (en) 1995-08-03 2008-02-28 Interval Research Corp., Palo Alto COMPUTER INTERACTOR SYSTEM AND METHOD FOR PROVIDING IT
US7498509B2 (en) 1995-09-28 2009-03-03 Fiberspar Corporation Composite coiled tubing end connector
US5635726A (en) 1995-10-19 1997-06-03 Lucid Technologies Inc. Electro-optical sensor for marks on a sheet
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system
US5767457A (en) 1995-11-13 1998-06-16 Cirque Corporation Apparatus and method for audible feedback from input device
US5697793A (en) * 1995-12-14 1997-12-16 Motorola, Inc. Electronic book and method of displaying at least one reading metric therefor
US5663748A (en) * 1995-12-14 1997-09-02 Motorola, Inc. Electronic book having highlighting feature
US5694102A (en) 1995-12-21 1997-12-02 Xerox Corporation Vector reconstruction of asynchronously captured tiled embedded data blocks
US6000621A (en) 1995-12-21 1999-12-14 Xerox Corporation Tilings of mono-code and dual-code embedded data pattern strips for robust asynchronous capture
TW394879B (en) 1996-02-09 2000-06-21 Sega Enterprises Kk Graphics processing system and its data input device
US5877458A (en) 1996-02-15 1999-03-02 Kke/Explore Acquisition Corp. Surface position location system and method
US5686705A (en) 1996-02-15 1997-11-11 Explore Technologies, Inc. Surface position location system and method
US5902968A (en) * 1996-02-20 1999-05-11 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
US5757361A (en) 1996-03-20 1998-05-26 International Business Machines Corporation Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary
JP3193628B2 (en) 1996-03-28 2001-07-30 Olympus Optical Co Ltd Code printing device
JP3061765B2 (en) 1996-05-23 2000-07-10 Xerox Corp Computer-based document processing method
JP3378900B2 (en) 1996-06-25 2003-02-17 Fujitsu Ltd Object editing method, object editing system, and recording medium
US5956034A (en) 1996-08-13 1999-09-21 Softbook Press, Inc. Method and apparatus for viewing electronic reading materials
US5847698A (en) 1996-09-17 1998-12-08 Dataventures, Inc. Electronic book device
US5903729A (en) * 1996-09-23 1999-05-11 Motorola, Inc. Method, system, and article of manufacture for navigating to a resource in an electronic network
US6218964B1 (en) * 1996-09-25 2001-04-17 Christ G. Ellis Mechanical and digital reading pen
US5803748A (en) * 1996-09-30 1998-09-08 Publications International, Ltd. Apparatus for producing audible sounds in response to visual indicia
US5973420A (en) 1996-10-03 1999-10-26 Colortronics Technologies L.L.C. Electrical system having a clear conductive composition
US5790114A (en) 1996-10-04 1998-08-04 Microtouch Systems, Inc. Electronic whiteboard with multi-functional user interface
US6130666A (en) 1996-10-07 2000-10-10 Persidsky; Andre Self-contained pen computer with built-in display
US5889506A (en) 1996-10-25 1999-03-30 Matsushita Electric Industrial Co., Ltd. Video user's environment
JP2002515149A (en) 1996-11-08 2002-05-21 Neomedia Technologies, Inc. Automatic access of electronic information by machine readable code of printed documents
US6313828B1 (en) 1996-11-12 2001-11-06 Carlos Landetta Chombo Electronic book
US6088023A (en) * 1996-12-10 2000-07-11 Willow Design, Inc. Integrated pointing and drawing graphics system for computers
US5937110A (en) 1996-12-20 1999-08-10 Xerox Corporation Parallel propagating embedded binary sequences for characterizing objects in N-dimensional address space
US6215901B1 (en) 1997-03-07 2001-04-10 Mark H. Schwartz Pen based computer handwriting instruction
US5970455A (en) 1997-03-20 1999-10-19 Xerox Corporation System for capturing and retrieving audio data and corresponding hand-written notes
KR100224618B1 (en) 1997-03-27 1999-10-15 Yun Jong-yong View changing method for multi-purpose educational device
WO1998051035A1 (en) 1997-05-09 1998-11-12 Neomedia Technologies, Inc. Method and system for accessing electronic resources via machine-readable data on intelligent documents
US6104387A (en) 1997-05-14 2000-08-15 Virtual Ink Corporation Transcription system
KR100208019B1 (en) 1997-07-16 1999-07-15 Yun Jong-yong Multi-purpose training system
JP3475048B2 (en) 1997-07-18 2003-12-08 Sharp Corp Handwriting input device
US5957697A (en) * 1997-08-20 1999-09-28 Ithaca Media Corporation Printed book augmented with an electronic virtual book and associated electronic data
US5910009A (en) * 1997-08-25 1999-06-08 Leff; Ruth B. Communication aid using multiple membrane switches
US6252564B1 (en) * 1997-08-28 2001-06-26 E Ink Corporation Tiled displays
JPH1178369A (en) 1997-09-03 1999-03-23 Plus Kogyo Kk Display system
US6201903B1 (en) 1997-09-30 2001-03-13 Ricoh Company, Ltd. Method and apparatus for pen-based faxing
US6518950B1 (en) * 1997-10-07 2003-02-11 Interval Research Corporation Methods and systems for providing human/computer interfaces
US6256638B1 (en) * 1998-04-14 2001-07-03 Interval Research Corporation Printable interfaces and digital linkmarks
US6215476B1 (en) 1997-10-10 2001-04-10 Apple Computer, Inc. Flat panel display with integrated electromagnetic pen digitizer
WO1999019823A2 (en) 1997-10-10 1999-04-22 Interval Research Corporation Methods and systems for providing human/computer interfaces
JPH11122401A (en) 1997-10-17 1999-04-30 Noritsu Koki Co Ltd Device for preparing photograph provided with voice code
GB9722766D0 (en) * 1997-10-28 1997-12-24 British Telecomm Portable computers
CN1142471C (en) 1997-11-21 2004-03-17 资通电脑股份有限公司 Method and apparatus for operation by handwritten alphabets and symbols
US6297824B1 (en) * 1997-11-26 2001-10-02 Xerox Corporation Interactive interface for viewing retrieval results
US6181329B1 (en) * 1997-12-23 2001-01-30 Ricoh Company, Ltd. Method and apparatus for tracking a hand-held writing instrument with multiple sensors that are calibrated by placing the writing instrument in predetermined positions with respect to the writing surface
US5992817A (en) 1998-02-04 1999-11-30 Klitsner Industrial Design, Llc Keyboard interface device
US6148173A (en) * 1998-02-26 2000-11-14 Eastman Kodak Company System for initialization of an image holder that stores images with associated audio segments
US6456749B1 (en) * 1998-02-27 2002-09-24 Carnegie Mellon University Handheld apparatus for recognition of writing, for remote communication, and for user defined input templates
US6144371A (en) 1998-03-18 2000-11-07 International Business Machines Corporation Thinkscribe combined electronic and paper based scheduling
US6331867B1 (en) * 1998-03-20 2001-12-18 Nuvomedia, Inc. Electronic book with automated look-up of terms within reference titles
US6665490B2 (en) 1998-04-01 2003-12-16 Xerox Corporation Obtaining and using data associating annotating activities with portions of recordings
US6330976B1 (en) * 1998-04-01 2001-12-18 Xerox Corporation Marking medium area with encoded identifier for producing action through network
US6064855A (en) * 1998-04-27 2000-05-16 Ho; Frederick Pak Wai Voice book system
WO1999059101A2 (en) 1998-05-12 1999-11-18 E-Ink Corporation Microencapsulated electrophoretic electrostatically-addressed media for drawing device applications
US6100877A (en) 1998-05-14 2000-08-08 Virtual Ink, Corp. Method for calibrating a transcription system
JP4144935B2 (en) 1998-06-08 2008-09-03 Noritsu Koki Co Ltd Reception method and reception apparatus for creating a photograph with sound
US6199042B1 (en) 1998-06-19 2001-03-06 L&H Applications Usa, Inc. Reading system
US6331865B1 (en) * 1998-10-16 2001-12-18 Softbook Press, Inc. Method and apparatus for electronically distributing and viewing digital contents
US6089943A (en) 1998-10-30 2000-07-18 Tai Sun Plastic Novelties Ltd. Toy
US6392632B1 (en) * 1998-12-08 2002-05-21 Winbond Electronics Corp. Optical mouse having an integrated camera
US5997309A (en) 1998-12-31 1999-12-07 Metheny; Jeff Erasable drawing board
JP2000206631A (en) 1999-01-18 2000-07-28 Olympus Optical Co Ltd Photographing device
US20020000468A1 (en) 1999-04-19 2002-01-03 Pradeep K. Bansal System and method for scanning & storing universal resource locator codes
US6396481B1 (en) 1999-04-19 2002-05-28 Ecrio Inc. Apparatus and method for portable handwriting capture
US7178718B2 (en) * 1999-05-25 2007-02-20 Silverbrook Research Pty Ltd Methods and systems for object identification and interaction
AUPQ291299A0 (en) 1999-09-17 1999-10-07 Silverbrook Research Pty Ltd A self mapping surface and related applications
US6830196B1 (en) 1999-05-25 2004-12-14 Silverbrook Research Pty Ltd Identity-coded surface region
US7170499B1 (en) 1999-05-25 2007-01-30 Silverbrook Research Pty Ltd Handwritten text capture via interface surface
AUPQ363299A0 (en) 1999-10-25 1999-11-18 Silverbrook Research Pty Ltd Paper based information interface
US7099019B2 (en) 1999-05-25 2006-08-29 Silverbrook Research Pty Ltd Interface surface printer using invisible ink
US7721948B1 (en) 1999-05-25 2010-05-25 Silverbrook Research Pty Ltd Method and system for online payments
EP1188143B1 (en) 1999-05-28 2010-05-26 Anoto AB Position determination
US6502756B1 (en) 1999-05-28 2003-01-07 Anoto Ab Recording of information
JP2003503905A (en) 1999-06-28 2003-01-28 Anoto Ab Recording information
SE516561C2 (en) 1999-06-28 2002-01-29 C Technologies Ab Reading pen for reading text with light emitting diodes placed in the body on the large face of a printed circuit board to supply illumination
US6405167B1 (en) * 1999-07-16 2002-06-11 Mary Ann Cogliano Interactive book
US6304989B1 (en) 1999-07-21 2001-10-16 Credence Systems Corporation Built-in spare row and column replacement analysis system for embedded memories
US6763995B1 (en) * 1999-08-09 2004-07-20 Pil, L.L.C. Method and system for illustrating sound and text
US6363239B1 (en) * 1999-08-11 2002-03-26 Eastman Kodak Company Print having attached audio data storage and method of providing same
SE0000939L (en) 2000-02-18 2001-08-19 Anoto Ab Input unit arrangement
AU7046700A (en) 1999-08-30 2001-03-26 Anoto Ab Notepad
US6183262B1 (en) 1999-09-10 2001-02-06 Shao-Chien Tseng Magnetic drawing board structure
SE517445C2 (en) 1999-10-01 2002-06-04 Anoto Ab Position determination on a surface provided with a position coding pattern
US6304898B1 (en) 1999-10-13 2001-10-16 Datahouse, Inc. Method and system for creating and sending graphical email
US6564249B2 (en) 1999-10-13 2003-05-13 Dh Labs, Inc. Method and system for creating and sending handwritten or handdrawn messages
US6493734B1 (en) * 1999-10-15 2002-12-10 Softbook Press, Inc. System and method to efficiently generate and switch page display views on a portable electronic book
US6322369B1 (en) * 1999-10-20 2001-11-27 Yetta L. Patterson Christian learning tool
US6241528B1 (en) 1999-10-29 2001-06-05 Edwin C. Myers Reusable writing table
US6886036B1 (en) * 1999-11-02 2005-04-26 Nokia Corporation System and method for enhanced data access efficiency using an electronic book over data networks
US7006116B1 (en) * 1999-11-16 2006-02-28 Nokia Corporation Tangibly encoded media identification in a book cover
US20030046256A1 (en) 1999-12-23 2003-03-06 Ola Hugosson Distributed information management
US7295193B2 (en) 1999-12-23 2007-11-13 Anoto Ab Written command
US6724373B1 (en) * 2000-01-05 2004-04-20 Brother International Corporation Electronic whiteboard hot zones for controlling local and remote personal computer functions
US6532314B1 (en) 2000-01-28 2003-03-11 Learning Resources, Inc. Talking toy scanner
US6697602B1 (en) * 2000-02-04 2004-02-24 Mattel, Inc. Talking book
AU4060701A (en) 2000-02-16 2001-08-27 Telefonaktiebolaget Lm Ericsson (Publ) Method and system for configuring and unlocking an electronic reading device
US6738053B1 (en) 2000-02-16 2004-05-18 Telefonaktiebolaget Lm Ericsson (Publ) Predefined electronic pen applications in specially formatted paper
US6885878B1 (en) * 2000-02-16 2005-04-26 Telefonaktiebolaget L M Ericsson (Publ) Method and system for using an electronic reading device as a general application input and navigation interface
US6992655B2 (en) 2000-02-18 2006-01-31 Anoto Ab Input unit arrangement
KR100460105B1 (en) 2000-02-22 2004-12-03 LG Electronics Inc. Method for searching a menu in a mobile communication terminal
US6556188B1 (en) * 2000-02-25 2003-04-29 Ncr Corporation Three-dimensional check image viewer and a method of handling check images in an image-based check processing system
US6572378B1 (en) * 2000-02-29 2003-06-03 Rehco, Llc Electronic drawing assist toy
SE0000949L (en) 2000-03-21 2001-09-22 Anoto Ab Location information
SE516109C2 (en) 2000-03-21 2001-11-19 Anoto Ab Procedure, systems and computer programs for document management using position coding patterns
SE518962C2 (en) 2000-03-21 2002-12-10 Anoto Ab Product and method for encoding data into a matrix-shaped coding pattern
SE517984C2 (en) 2000-03-21 2002-08-13 Anoto Ab Arrangement for input of information
US6442350B1 (en) 2000-04-04 2002-08-27 Eastman Kodak Company Camera with sound recording capability
US7094977B2 (en) 2000-04-05 2006-08-22 Anoto Ip Lic Handelsbolag Method and system for information association
SE516310C2 (en) 2000-04-05 2001-12-17 Anoto Ab Product with two coding patterns, including raster points; and method, computer program and device for reading the coding patterns
SE517875C2 (en) 2000-04-05 2002-07-30 Anoto Ab Procedure and arrangement for associating a value document with a subset of a position coding pattern
SE516281C2 (en) 2000-04-05 2001-12-10 Anoto Ab Product and method for information management, where a removable writing surface with absolute position coding pattern is written and read
US6771283B2 (en) * 2000-04-26 2004-08-03 International Business Machines Corporation Method and system for accessing interactive multimedia information or services by touching highlighted items on physical documents
US20010056439A1 (en) * 2000-04-26 2001-12-27 International Business Machines Corporation Method and system for accessing interactive multimedia information or services by touching marked items on physical documents
US6668156B2 (en) 2000-04-27 2003-12-23 Leapfrog Enterprises, Inc. Print media receiving unit including platform and print media
US6661405B1 (en) 2000-04-27 2003-12-09 Leapfrog Enterprises, Inc. Electrographic position location apparatus and method
WO2001086612A1 (en) 2000-05-12 2001-11-15 Jrl Enterprises, Inc. An interactive, computer-aided handwriting method and apparatus with enhanced digitization tablet
US6956562B1 (en) 2000-05-16 2005-10-18 Palmsource, Inc. Method for controlling a handheld computer by entering commands onto a displayed feature of the handheld computer
US6421524B1 (en) * 2000-05-30 2002-07-16 International Business Machines Corporation Personalized electronic talking book
SE516567C2 (en) 2000-06-07 2002-01-29 Anoto Ab Procedure and apparatus for secure wireless transmission of information
US6304667B1 (en) 2000-06-21 2001-10-16 Carmen T. Reitano System and method for incorporating dyslexia detection in handwriting pattern recognition systems
GB2365614A (en) 2000-06-30 2002-02-20 Gateway Inc An apparatus and method of generating an audio recording having linked data
US20020077902A1 (en) 2000-06-30 2002-06-20 Dwight Marcus Method and apparatus for verifying review and comprehension of information
US7289110B2 (en) * 2000-07-17 2007-10-30 Human Messaging Ab Method and arrangement for identifying and processing commands in digital images, where the user marks the command, for example by encircling it
US6933928B1 (en) * 2000-07-18 2005-08-23 Scott E. Lilienthal Electronic book player with audio synchronization
US20020023957A1 (en) 2000-08-21 2002-02-28 A. John Michaelis Method and apparatus for providing audio/visual feedback to scanning pen users
AU2001239008A1 (en) * 2000-08-31 2002-03-13 The Gadget Factory Pty Ltd Computer publication
US6704699B2 (en) 2000-09-05 2004-03-09 Einat H. Nir Language acquisition aide
US6647369B1 (en) 2000-10-20 2003-11-11 Silverbrook Research Pty Ltd. Reader to decode sound and play sound encoded in infra-red ink on photographs
JP4552308B2 (en) 2000-10-24 2010-09-29 Panasonic Corp Ultrasonic coordinate input device and interactive board
US6940491B2 (en) * 2000-10-27 2005-09-06 International Business Machines Corporation Method and system for generating hyperlinked physical copies of hyperlinked electronic documents
WO2002042894A1 (en) 2000-11-25 2002-05-30 Silverbrook Research Pty Ltd Interactive printer
US7193618B2 (en) 2000-12-01 2007-03-20 Hewlett-Packard Development Company, L.P. Electronic ink ball point pen with memory
JP2004516542A (en) * 2000-12-15 2004-06-03 Finger System Inc. Pen-type optical mouse device and method of controlling pen-type optical mouse device
US6924822B2 (en) * 2000-12-21 2005-08-02 Xerox Corporation Magnification methods, systems, and computer program products for virtual three-dimensional books
US7015910B2 (en) * 2000-12-21 2006-03-21 Xerox Corporation Methods, systems, and computer program products for the display and operation of virtual three-dimensional books
US7139982B2 (en) * 2000-12-21 2006-11-21 Xerox Corporation Navigation methods, systems, and computer program products for virtual three-dimensional books
US7069518B2 (en) * 2000-12-21 2006-06-27 Xerox Corporation Indexing methods, systems, and computer program products for virtual three-dimensional books
US7240291B2 (en) * 2000-12-21 2007-07-03 Xerox Corporation Methods, systems, and computer program products for display of information relating to a virtual three-dimensional book
US6798907B1 (en) 2001-01-24 2004-09-28 Advanced Digital Systems, Inc. System, computer software product and method for transmitting and processing handwritten data
US6802586B2 (en) 2001-02-27 2004-10-12 Hewlett-Packard Development Company, L.P. Method and apparatus for software updates
JP2002297308A (en) 2001-03-30 2002-10-11 Brother Ind Ltd Input device
FR2823337B1 (en) 2001-04-05 2004-10-15 Netseniors Method for reading, processing, transmission and operation of a bar code
US7107533B2 (en) 2001-04-09 2006-09-12 International Business Machines Corporation Electronic book with multimode I/O
US6831632B2 (en) * 2001-04-09 2004-12-14 I. C. + Technologies Ltd. Apparatus and methods for hand motion tracking and handwriting recognition
US6535799B2 (en) * 2001-04-30 2003-03-18 International Business Machines Corporation Dynamic technique for using corrective actions on vehicles undergoing excessive turns
WO2002093467A1 (en) 2001-05-11 2002-11-21 Anoto Ab Electronic pen with actuation through removal of cap
WO2002093530A1 (en) 2001-05-11 2002-11-21 Shoot The Moon Products Ii, Llc Interactive book reading system using rf scanning circuit
US6954199B2 (en) 2001-06-18 2005-10-11 Leapfrog Enterprises, Inc. Three dimensional interactive system
US7085693B2 (en) * 2001-06-19 2006-08-01 International Business Machines Corporation Manipulation of electronic media using off-line media
US6641401B2 (en) 2001-06-20 2003-11-04 Leapfrog Enterprises, Inc. Interactive apparatus with templates
US6608618B2 (en) 2001-06-20 2003-08-19 Leapfrog Enterprises, Inc. Interactive apparatus using print media
US6966495B2 (en) * 2001-06-26 2005-11-22 Anoto Ab Devices, method and computer program for position determination
US20030016212A1 (en) 2001-06-27 2003-01-23 Stefan Lynggaard Method, computer program product and device for wireless connection
US20030001020A1 (en) 2001-06-27 2003-01-02 Kardach James P. Paper identification information to associate a printed application with an electronic application
US20030013483A1 (en) 2001-07-06 2003-01-16 Ausems Michiel R. User interface for handheld communication device
US20030024975A1 (en) 2001-07-18 2003-02-06 Rajasekharan Ajit V. System and method for authoring and providing information relevant to the physical world
US6516181B1 (en) * 2001-07-25 2003-02-04 Debbie Giampapa Kirwan Interactive picture book with voice recording features and method of use
GB2378073B (en) * 2001-07-27 2005-08-31 Hewlett Packard Co Paper-to-computer interfaces
AU2002355530A1 (en) 2001-08-03 2003-02-24 John Allen Ananian Personalized interactive digital catalog profiling
JP4261145B2 (en) 2001-09-19 2009-04-30 Ricoh Co Ltd Information processing apparatus, information processing apparatus control method, and program for causing computer to execute the method
US6584249B1 (en) 2001-10-17 2003-06-24 Oplink Communications, Inc. Miniature optical dispersion compensator with low insertion loss
US20030089777A1 (en) 2001-11-15 2003-05-15 Rajasekharan Ajit V. Method and system for authoring and playback of audio coincident with label detection
US20030095115A1 (en) 2001-11-22 2003-05-22 Taylor Brian Stylus input device utilizing a permanent magnet
EP1315085B1 (en) 2001-11-27 2006-01-11 Sun Microsystems, Inc. Automatic image-button creation process
US20030134257A1 (en) 2002-01-15 2003-07-17 Ahmed Morsy Interactive learning apparatus
KR20040091016A (en) 2002-02-06 2004-10-27 Leapfrog Enterprises, Inc. Write-on interactive apparatus and method
US6938222B2 (en) 2002-02-08 2005-08-30 Microsoft Corporation Ink gestures
KR20050027093A (en) 2002-05-24 2005-03-17 SMTM Technologies LLC Method and system for skills-based testing and training
US7402042B2 (en) 2002-05-30 2008-07-22 Mattel, Inc. Electronic learning device for an interactive multi-sensory reading system
MXPA04012018A (en) 2002-05-30 2005-03-07 Mattel Inc Electronic teaching/learning device for an interactive multi-sensory reading system.
US7062090B2 (en) 2002-06-28 2006-06-13 Microsoft Corporation Writing guide for a free-form document editor
US6915103B2 (en) 2002-07-31 2005-07-05 Hewlett-Packard Development Company, L.P. System for enhancing books with special paper
US6966777B2 (en) * 2002-08-01 2005-11-22 Teresa Robotham Tool device, system and method for teaching reading
US20040104890A1 (en) * 2002-09-05 2004-06-03 Leapfrog Enterprises, Inc. Compact book and apparatus using print media
US7386804B2 (en) * 2002-09-13 2008-06-10 E-Book Systems Pte. Ltd. Method, system, apparatus, and computer program product for controlling and browsing a virtual book
US6943670B2 (en) * 2002-10-24 2005-09-13 Tlcd, Ltd. Writing instrument with display module
US7090020B2 (en) 2002-10-30 2006-08-15 Schlumberger Technology Corp. Multi-cycle dump valve
JP4244614B2 (en) 2002-10-31 2009-03-25 Hitachi Ltd Handwriting input device, program, and handwriting input method system
US20040125075A1 (en) 2002-12-31 2004-07-01 Diercks Richard A. DVD remote control with interchangeable, title-specific interactive panels
US20040229195A1 (en) 2003-03-18 2004-11-18 Leapfrog Enterprises, Inc. Scanning apparatus
US7080103B2 (en) 2003-05-08 2006-07-18 International Business Machines Corporation Personal information management system and method with audit functionality
AU2003304306A1 (en) 2003-07-01 2005-01-21 Nokia Corporation Method and device for operating a user-input area on an electronic display device
US6985138B2 (en) 2003-08-29 2006-01-10 Motorola, Inc. Input writing device
US7555705B2 (en) 2003-09-10 2009-06-30 Microsoft Corporation Annotation management in a pen-based computing system
CN1581131B (en) * 2003-10-07 2010-05-12 Zhao Shunpei Reading material capable of automatically identifying content
US20050208458A1 (en) 2003-10-16 2005-09-22 Leapfrog Enterprises, Inc. Gaming apparatus including platform
US7848573B2 (en) 2003-12-03 2010-12-07 Microsoft Corporation Scaled text replacement of ink
US7558744B2 (en) 2004-01-23 2009-07-07 Razumov Sergey N Multimedia terminal for product ordering
EP1569140A3 (en) 2004-01-30 2006-10-25 Hewlett-Packard Development Company, L.P. Apparatus, methods and software for associating electronic and physical documents
US20060125805A1 (en) 2004-03-17 2006-06-15 James Marggraff Method and system for conducting a transaction using recognized text
US7853193B2 (en) 2004-03-17 2010-12-14 Leapfrog Enterprises, Inc. Method and device for audibly instructing a user to interact with a function
US20060067576A1 (en) 2004-03-17 2006-03-30 James Marggraff Providing a user interface having interactive elements on a writable surface
US7831933B2 (en) 2004-03-17 2010-11-09 Leapfrog Enterprises, Inc. Method and system for implementing a user interface for a device employing written graphical elements
JP4626251B2 (en) 2004-10-06 2011-02-02 Hitachi Ltd Combustor and combustion method of combustor
JP4580293B2 (en) 2005-06-30 2010-11-10 Toshiba Corp Image forming apparatus

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5943039A (en) * 1991-02-01 1999-08-24 U.S. Philips Corporation Apparatus for the interactive handling of objects
US6091675A (en) * 1996-07-13 2000-07-18 Samsung Electronics Co., Ltd. Integrated CD-ROM driving apparatus for driving different types of CD-ROMs in multimedia computer systems
US6570597B1 (en) * 1998-11-04 2003-05-27 Fuji Xerox Co., Ltd. Icon display processor for displaying icons representing sub-data embedded in or linked to main icon data
US6426761B1 (en) * 1999-04-23 2002-07-30 International Business Machines Corporation Information presentation system for a graphical user interface
US6476834B1 (en) * 1999-05-28 2002-11-05 International Business Machines Corporation Dynamic creation of selectable items on surfaces
US6646633B1 (en) * 2001-01-24 2003-11-11 Palm Source, Inc. Method and system for a full screen user interface and data entry using sensors to implement handwritten glyphs
US20050024341A1 (en) * 2001-05-16 2005-02-03 Synaptics, Inc. Touch screen with user interface enhancement
US20030014615A1 (en) * 2001-06-25 2003-01-16 Stefan Lynggaard Control of a unit provided with a processor
US7530023B2 (en) * 2001-11-13 2009-05-05 International Business Machines Corporation System and method for selecting electronic documents from a physical document and for displaying said electronic documents over said physical document
US20050251755A1 (en) * 2004-05-06 2005-11-10 Pixar Toolbar slot method and apparatus
US7565625B2 (en) * 2004-05-06 2009-07-21 Pixar Toolbar slot method and apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170242494A1 (en) * 2008-11-25 2017-08-24 Kenji Yoshida Handwriting input/output system, handwriting input sheet, information input system, and information input assistance sheet
WO2014036397A3 (en) * 2012-08-31 2014-05-15 Ebay Inc. Expanded icon functionality
US9495069B2 (en) 2012-08-31 2016-11-15 Paypal, Inc. Expanded icon functionality

Also Published As

Publication number Publication date
US7831933B2 (en) 2010-11-09
US20060067577A1 (en) 2006-03-30
KR100806241B1 (en) 2008-02-22
WO2006076075A2 (en) 2006-07-20
WO2006076075A3 (en) 2006-10-05
JP2006244463A (en) 2006-09-14
CN1855012A (en) 2006-11-01
EP1681624A1 (en) 2006-07-19
CA2532611A1 (en) 2006-07-12
KR20060082422A (en) 2006-07-18

Similar Documents

Publication Publication Date Title
US7831933B2 (en) Method and system for implementing a user interface for a device employing written graphical elements
US20060066591A1 (en) Method and system for implementing a user interface for a device through recognized text and bounded areas
US20060067576A1 (en) Providing a user interface having interactive elements on a writable surface
KR100814052B1 (en) A method and device for associating a user writing with a user-writable element
US7853193B2 (en) Method and device for audibly instructing a user to interact with a function
US20060078866A1 (en) System and method for identifying termination of data entry
US7281664B1 (en) Method and system for hierarchical management of a plurality of regions of an encoded surface used by a pen computer
US20070280627A1 (en) Recording and playback of voice messages associated with note paper
EP1681623A1 (en) Device user interface through recognized text and bounded areas
WO2006076118A2 (en) Interactive device and method
CA2535505A1 (en) Computer system and method for audibly instructing a user

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION