Publication number: US 20050278647 A1
Publication type: Application
Application number: US 11/139,612
Publication date: 15 Dec 2005
Filing date: 31 May 2005
Priority date: 9 Nov 2000
Also published as: US6918091, US20020085037, US20060085763, WO2002039245A2, WO2002039245A3, WO2002039245A9
Inventors: Joseph Leavitt, Scott Mills
Original Assignee: Change Tools, Inc.
User definable interface system and method
US 20050278647 A1
Abstract
In a cursor-based computing environment having a display, a user definable interface (UDI) is displayed upon activation by a user. The UDI has a plurality of levels each having a plurality of buttons and is displayed in a selectable position about a pointer position in a display area to reduce pointer commute. The user selects a visual appearance and shape of the UDI, and the number of buttons. The user assigns a command to each of the plurality of buttons at each of the plurality of levels by dragging and dropping from one or more applications of the apparatus.
Claims(24)
1. A system, comprising:
a display that displays a user definable interface (UDI), the UDI having a plurality of levels each having a plurality of command regions, wherein the visual appearance of the UDI remains substantially the same for each of the plurality of levels, wherein only one of the levels appears at a given time;
a selecting device that selects a visual appearance of the UDI including a shape and number of the command regions; and
an assigning device that assigns a command to each of the plurality of command regions for each respective one of the plurality of levels from one or more applications associated with the system.
2. The system of claim 1, wherein the display, selecting device, and assigning device are in a hand-held device.
3. The system of claim 1, wherein the display, selecting device, and assigning device are in video equipment.
4. The system of claim 1, wherein the display, selecting device, and assigning device are in a home entertainment system.
5. The system of claim 1, wherein the display, selecting device, and assigning device are in a camera.
6. The system of claim 1, wherein the display, selecting device, and assigning device are in a wireless telephone.
7. The system of claim 1, wherein the display, selecting device, and assigning device are in a copier.
8. The system of claim 1, wherein the display, selecting device, and assigning device are in a remote control.
9. The system of claim 1, wherein the display, selecting device, and assigning device are in a household appliance.
10. The system of claim 1, wherein the display, selecting device, and assigning device are in a commercial appliance.
11. The system of claim 1, further comprising:
an interaction device that is one of one or more buttons, one or more rotary dials, a touch screen, a voice input system, or combinations thereof, the interaction device being used to choose one of the command regions at each of the levels.
12. The system of claim 1, wherein:
an application window is displayed as part of the UDI upon a user choosing one of the command regions that was assigned to the application;
the application window allows the user to interact with the application; and
the command regions are accessible to the user after the application window is displayed.
13. A method, comprising:
(a) managing a visual appearance of a UDI, which includes a shape and number of command regions, the visual appearance remaining substantially the same at each of a plurality of levels, wherein only one of the levels appears at a given time;
(b) assigning application functions to each of the command regions; and
(c) managing the UDI in response to interactions with the command regions.
14. The method of claim 13, wherein steps (a)-(c) are performed in a hand-held device.
15. The method of claim 13, wherein steps (a)-(c) are performed in video equipment.
16. The method of claim 13, wherein steps (a)-(c) are performed in a home entertainment system.
17. The method of claim 13, wherein steps (a)-(c) are performed in a camera.
18. The method of claim 13, wherein steps (a)-(c) are performed in a wireless telephone.
19. The method of claim 13, wherein steps (a)-(c) are performed in a copier.
20. The method of claim 13, wherein steps (a)-(c) are performed in a remote control.
21. The method of claim 13, wherein steps (a)-(c) are performed in a household appliance.
22. The method of claim 13, wherein steps (a)-(c) are performed in a commercial appliance.
23. The method of claim 13, further comprising:
(d) interacting with one of one or more buttons, one or more rotary dials, a touch screen, a voice input system or combinations thereof, the interacting step being used to choose one of the command regions at each of the levels.
24. A user definable interface (UDI), comprising:
a display device that displays a user-selectable geometric arrangement of command regions that remains substantially the same at each of a plurality of levels, wherein only one of the plurality of levels appears at a given time; and
an associating device that allows one or more functions to be associated by a user or pre-assigned to each command region, so as to define the plurality of levels of the command regions.
Description
    CROSS-REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application is a continuation of U.S. Ser. No. 09/986,765, filed Nov. 9, 2001 (now allowed), which application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 60/247,643, filed Nov. 9, 2000, and 60/325,179, filed Sep. 28, 2001, which are all incorporated herein by reference in their entireties.
  • COPYRIGHT AND TRADEMARK NOTICES
  • [0002]
    A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
  • [0003]
    Zenu is a trademark of Change Tools Inc. Unix is a registered trademark of The Open Group. Microsoft, Microsoft Windows, Windows NT and/or other Microsoft products referenced herein are either trademarks or registered trademarks of Microsoft Corporation. Various terms and icons in the figures may be trademarks or registered trademarks of other companies.
  • BACKGROUND OF THE INVENTION
  • [0004]
    1. Field of the Invention
  • [0005]
    The present invention relates generally to the field of user interfaces within a data processing system and more particularly, to a user definable interface overlay capable of manipulating multiple functions and windows in a graphical display.
  • [0006]
    2. Background Art
  • [0007]
    The manipulation of data in a data processing system is well known in the art and the large amounts of data that are available to the user of a modern state-of-the-art data processing system often become overwhelming in magnitude and complexity. Similarly, many consumer devices have interfaces requiring human interaction to control the device or a peripheral connected thereto. As a result of this increasing complexity, simplified interface methods and systems are needed between the user and the data processing system or device.
  • [0008]
    One example of a simplified system and method is the utilization of a graphic user interface (“GUI”). A GUI is an interface system by which a user interacts with system components, and/or system applications via a visible display having, for example, windows or view ports, icons, menus, pointing devices, etc. One of the many advantages of GUIs in the computer field is their ability to represent computer application programs, documents and data as graphical display elements or icons as opposed to text-based elements.
  • [0009]
    Menu driven software programs are a specific example of a GUI. Such software programs enable a user to choose from a list of items that can be accessed directly by pulling down different menus from the menu bar, rather than requiring the user to remember the name and syntax of a command. GUIs were developed so that novice users could more easily make a selection among available commands and, thus, operate computers. In the computer field, these menu driven software programs eventually led to the development of a windowing environment in which the user may have multiple programs and files operable at one time with a selection among multiple commands. Each command appears in a window of the program data file being worked on. To effect selection within applications and switching between windows, a hand-operated pointing device becomes a critical component of a computer system running windows-based software applications. One example pointing device is a mouse.
  • [0010]
    Applications running in a windowed environment typically have a main menu bar with more specific commands being displayed in “pull down” menus stemming from specific portions of the main menu bar command headings. When the user wants to execute a command, the user must move the pointing device so that a cursor on the display points to the command on the desired menu heading. The command heading activates a pull down menu that displays a plurality of commands available for execution. In some instances, computer systems create hierarchies of menus (also referred to as “nesting”) leading to submenus to avoid excessively large menus or inappropriate menu chains. A command from the pull down menu may then be selected for execution. In accordance with conventional methods, only one command is executed at any given time since the pull down menu is typically limited to a single column of possible choices or objects. Movement amongst the menu bar and the pull down menus requires a great deal of movement of the pointing device (and thereby the cursor) to manipulate multiple windows or applications and their related commands. This movement is called “cursor commute.” This results in a time-consuming, less efficient and confusing user interface. Therefore, it is difficult for the young, the elderly, handicapped, or any novice user to traverse and coordinate the position of the pointing device and, thus, the cursor with which the execution is made.
  • [0011]
    One attempt to avoid a long horizontal list of menu options has resulted in “pop-up menus.” These menus have the advantage of bringing the menu to the cursor, rather than having to move the cursor to the menu. When a trigger event occurs, for example depressing the right button (known in the art as “right clicking”) on the pointing device (e.g., a mouse), a window is displayed next to the cursor position and the menu items to be related are listed. When the user chooses a menu item, the menu is removed and the action corresponding to the item is initiated. Pop-up menus, however, are limited to the number of commands they can contain and they often cover up part of the work area.
  • [0012]
    Pie menus enhance pop-up menus by allowing directional selection to choose menu items. A pie menu is similar to a pop-up menu, but the pie shaped menu items surround the cursor position in a circle. In their two-dimensional form, pie menus may be round menus. The menu items are positioned around a small inactive region in the center of the circle like slices of a pie, rather than in rows or columns as in conventional linear menus. In operation, the cursor is initially located in the center of the pie in a small inactive region. The active regions representing the menu items are therefore adjacent the cursor, but each in a different direction, and menu items are selected by clicking the mouse and then pointing in the direction of the menu item.
  • [0013]
    What is needed is an interface to provide users with a definable interface that minimizes cursor commute and does not clutter the work area.
  • BRIEF SUMMARY OF THE INVENTION
  • [0014]
    The present invention relates to a user definable interface that minimizes cursor commute.
  • [0015]
    The present invention is to be implemented in a cursor-based computing environment having a display. According to the present invention a user definable interface (UDI) is displayed upon activation by a user, wherein the UDI has a plurality of buttons and is displayed in a relative position about a cursor position to reduce cursor commute. The present invention permits the user to select a visual appearance and shape of the UDI, and the number of buttons. The present invention also permits the user to assign a command to each of the plurality of buttons by dragging and dropping from one or more applications of the apparatus.
  • [0016]
    The present invention further permits the user to form a first group of buttons and at least a second group of buttons. The user is permitted to assign a first icon representing a first specific one of the one or more applications to a first given button of the first group and assign commands, associated with the first specific one of the one or more applications to the second group of buttons. The present invention further permits the user to assign a second icon representing a second specific one of the one or more applications to a second given button of the first group and assign commands, associated with the second specific one of the one or more applications to the second group of buttons. The appearance of, and commands associated with, the second group of buttons change based on which button of the first group of buttons is selected.
  • [0017]
    The present invention further permits the user to activate the UDI, wherein activation by the user comprises at least one of pressing a hotkey, clicking a mouse button, or turning on the apparatus.
  • [0018]
    In a data processing system having a user defined interface (UDI), an alternative method of the present invention comprises the steps of managing the UDI in response to user commands, providing at least one template that defines positions for a plurality of command regions corresponding to the UDI, and providing a theme that defines attributes and commands for the plurality of command regions.
  • [0019]
    The present invention is also characterized as an apparatus comprising a user defined interface (UDI) having a plurality of command regions, a command processor that manages an interactive skin (IS) and a customizer. The IS includes a template that defines position information for the plurality of command regions corresponding to the UDI and at least one of default attributes and default commands for the plurality of command regions, and a theme that defines attributes if the template only defines default commands for the plurality of command regions, or commands if the template only defines default attributes for the plurality of command regions. The customizer permits user replacing or user extending of the default attributes or the default commands of one or more of the plurality of command regions. Typically the user is an end user of the apparatus, but the invention is not so limited.
  • [0020]
    The customizer permits a user to: hide the UDI; hide a portion of the UDI; have the UDI display upon launch; launch the UDI from a system tray; and scale the size of the UDI. Moreover, the UDI can be displayed in a relative position about a cursor position. The customizer permits a user to define that relative position.
  • [0021]
    Another aspect of the present invention is directed to a user definable interface that enables each user to control interaction with any given software package or operating system through a customized set of interactive nestable commands and functions based upon user preference with the convenience of edit functionality.
  • [0022]
    Another aspect of the present invention relates to a user definable interface that allows selection of multiple actions with a single user interaction.
  • [0023]
    It is also another aspect of the present invention to provide a user definable interface that is invisible until prompted by the user and can be set to disappear again after a user selection.
  • [0024]
    Another aspect of the present invention is directed to a user definable interface that is executable during work on an active file.
  • [0025]
    Another aspect of the present invention relates to a device, method and computer program product that provide an efficient on-screen work environment tailored to the user's needs.
  • [0026]
    It is yet another aspect of the present invention to provide a translucent executable user definable interface on a display screen that enables a user to observe the work space depicted beneath the user definable interface through a centrally positioned window in the user definable interface.
  • [0027]
    It is another aspect of the present invention to permit selection of commands or functions by “clicking” an icon from a plurality of icons that enclose or partially enclose a central window. Clicking on a button causes one or more additional interface buttons to appear, launches an application, opens a file, or opens a container.
  • [0028]
    Another aspect of the present invention is directed to an Internet browser and application launching tool.
  • [0029]
    The user definable interface of the present invention provides a number of advantages over other interface overlays known in the art. For example, it allows users to customize commands according to the user's preference. In addition, it eliminates screen clutter by being invisible until activated and disappearing once a selection is made. Further, the present invention provides the user with the ability to view the work area on a display screen while the user definable interface is activated.
  • [0030]
    These and additional features and advantages of the invention will be set forth in the detailed description that follows, and in part will be readily apparent to those skilled in the art from that description or recognized by practicing the invention as described herein.
  • [0031]
    It is to be understood that both the foregoing general description and the following detailed description are merely exemplary of the invention and are intended to provide an overview or framework for understanding the nature and character of the invention as it is claimed.
  • BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
  • [0032]
    The features and advantages of the present invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings/figures in which like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit of a reference number identifies the drawing in which the reference number first appears.
  • [0033]
    FIG. 1 is a screen grab depicting a cluttered desktop.
  • [0034]
    FIGS. 2A, 2B and 2C are screen grabs depicting a Zenu™ UDI, in accordance with the present invention.
  • [0035]
    FIG. 3A illustrates a conventional toolbar accessing a web page.
  • [0036]
    FIG. 3B illustrates a Zenu™ UDI accessing the web page of FIG. 3A, in accordance with the present invention.
  • [0037]
    FIG. 4 illustrates a Zenu™ UDI corresponding to the present invention.
  • [0038]
    FIG. 5 illustrates the opening of a file with the Zenu™ UDI of FIG. 4, in accordance with the present invention.
  • [0039]
    FIG. 6 illustrates an alternative control capability of a Zenu™ UDI, in accordance with the present invention.
  • [0040]
    FIG. 7 illustrates a Zenu™ UDI configured with an instant messaging plug-in, in accordance with the present invention.
  • [0041]
    FIGS. 8A-F illustrate six exemplary Zenu™ UDIs, in accordance with the present invention.
  • [0042]
    FIG. 9A illustrates a Zenu™ UDI and an interactive skin control panel, which is accessed by the user selecting the Zenu™ UDI customization button, in accordance with the present invention.
  • [0043]
    FIG. 9B illustrates an alternative to the Zenu™ UDI and interactive skin control panel of FIG. 9A, in accordance with the present invention.
  • [0044]
    FIG. 10A illustrates a Zenu™ UDI and a functionality control panel, in accordance with the present invention.
  • [0045]
    FIG. 10B illustrates a Zenu™ UDI and a properties control panel, which permits the user to define various “Startup Options”, in accordance with the present invention.
  • [0046]
    FIG. 11 depicts an exemplary architecture having a command processor that manages an interactive skin (IS), in accordance with the present invention.
  • [0047]
    FIGS. 12 through 19 are flow diagrams illustrating the operation of an exemplary Zenu™ UDI system and method according to an embodiment of the present invention.
  • [0048]
    FIG. 20 illustrates an example of a computer system capable of carrying out the functionality described herein, in accordance with the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • [0049]
    The preferred embodiment of the present invention will now be discussed in detail. While specific features, configurations and arrangements are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other steps, configurations and arrangements may be used without departing from the spirit and scope of the invention. Indeed, for the sake of brevity, conventional electronics, software and/or computer architecture, and other functional aspects of the method/apparatus (and components of the individual operating components of the apparatus) may not be described in detail herein. Furthermore, for purposes of brevity, the invention is frequently described herein as pertaining to data processing devices, such as personal computers or laptop computers, or set-top boxes in a television computing environment. It should be appreciated, however, that many other devices having a user viewable display for interaction therewith, and/or control thereof, could be readily modified to include the present invention, and thus the techniques described herein could be used in connection with other such devices. Moreover, it should be understood that the spatial descriptions (e.g., "next to", "above", "below", "up", "down", etc.) made herein are for purposes of illustration only.
  • [0050]
    The term “button” is used herein according to its customary meaning to refer to a graphical representation of an electrical push-button appearing as part of a graphical user interface, as would be apparent to a person skilled in the relevant art. Moving the pointer device over the graphical “button” and pressing (or “clicking”) one of the physical buttons of the pointing device, for example, starts some software action such as closing a window or deleting a file.
  • [0051]
    The term “command” is used herein to refer to a software action taken when a button is activated. A command can launch an application, open a file, or perform some predefined function or set of functions.
  • [0052]
    The term “cursor” is used herein according to its customary meaning to refer to a movable symbol on a display device that shows where the user is working, whether typing in text, drawing lines, or moving something around. The cursor can be moved with the arrow keys or a pointing device. It usually appears in text programs as a blinking dash or rectangle, or an arrow. In graphics programs the cursor is often called a pointer, and can take many different shapes such as a brush, pencil, or hand, as would be apparent to a person skilled in the relevant art.
  • [0053]
    The term “display device” is used herein according to its customary meaning to refer to a device capable of displaying an image, such as a cathode ray tube (CRT) monitor, liquid crystal display (LCD), plasma display, or like device used to display text, graphics, images, etc., to a user, as would be apparent to a person skilled in the relevant art.
  • [0054]
    The term “pointing device” is used herein according to its customary meaning to refer to a mouse, track ball, touch pad, joy stick, voice activated control system, or the like device used to position a cursor on a display device, as would be apparent to a person skilled in the relevant art.
  • [0055]
    The terms “user definable interface” and “Zenu™ UDI” are used interchangeably herein to refer to the present invention as described below.
  • [0056]
    The terms "window" and "windows" are used herein according to their customary meaning to refer to portions of the display device that are divided into areas, each of which functions as a separate input/output device under the control of a different application program, as would be apparent to a person skilled in the relevant art. This gives the user the ability to see the output of several processes at once and to choose which one will receive input by selecting its window, usually with a pointing device. WIMP refers to Windows, Icons, Menus and Pointers (or windows, icons, mouse, pull-down menus), the style of graphical user interface invented at Xerox PARC, later popularized by the Apple Macintosh and now available in other varieties such as the X Window System, OSF/Motif, NeWS, RISC OS, and Microsoft® Windows, as would be apparent to a person skilled in the relevant art.
  • [0000]
    I. Overview
  • [0057]
    FIG. 1 illustrates a conventional desktop 100 as presented on a display 102 in a window 104. The desktop 100 includes a taskbar 106, and a plurality of applications, folders, files, shortcuts, and the like (referred to generally as 108) cluttering the desktop. The desktop typically occupies the whole display, and attempts to represent the top of an office desk (i.e., a real desktop). On a conventional graphical user interface, the icons on the screen resemble objects that would be found on a real desktop, such as file folders, a clock, etc. Users like to locate applications, folders, files, shortcuts, and the like on the desktop for easy access. As is typical, access is no longer easy when the desktop becomes cluttered. Among the many advantages and uses of the present invention, it brings new order to the desktop.
  • [0058]
    FIG. 2A illustrates a Zenu™ 200, according to one embodiment of the present invention. In a cursor-based computing apparatus having a display 102, the Zenu™ 200 comprises a user definable interface (UDI) that is displayed upon activation by a user. The UDI has a plurality of buttons and is displayed in a relative position about the cursor position to substantially reduce cursor commute. The Zenu™ (UDI) 200 permits the user to select a visual appearance and shape of the UDI, as well as other characteristics, such as the number of buttons to be displayed and the commands associated with those buttons. Also, the Zenu™ 200 permits the user to assign commands to the buttons by dragging and dropping from one or more applications associated with (e.g., capable of running on, or otherwise coupled to) the apparatus.
  • [0059]
    In this embodiment, Zenu™ 200 can have multiple groups of buttons. The multiple groups of buttons can have different functionality. For example, as illustrated in FIG. 2A, a first group of buttons on the lower half of the Zenu™ 200 (buttons 204, 206 and those located on the outer circumference therebetween) can each have a first class of functionality. The second group of buttons (such as the remaining buttons on the outer circumference on the top portion of Zenu™ 200) can have a second class of functionality, the second class of functionality having some association with the first class of functionality. For example, the first class of functionality can cause icons to appear on other buttons, and the second class of functionality can cause some action associated with another button to occur.
  • [0060]
    An example of the association between the first class of functionality of the first group of buttons and the second class of functionality of the second group of buttons is illustrated at FIG. 2B. By way of example, not limitation, reference is made to the “My computer” button 206. My computer button 206 is a button in the first group. When My computer button 206 is clicked or otherwise selected using a pointing device, software action causes icons to appear on the second group of buttons. In this example, buttons 208 through 216 of the second group of buttons display icons corresponding to options, commands, files, or the like, associated with My computer button 206 of the first group of buttons.
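    The association described in this example can be modeled as a simple lookup from a first-group button to the icons and commands that should populate the second group. The following C++ sketch is offered only as an illustration under that assumption; the names (ButtonGroupModel, onFirstGroupClick) and the example icon and command strings are hypothetical and are not taken from the patent.

      // Hypothetical sketch: clicking a first-group button repopulates the
      // second-group buttons with the icons and commands assigned to it.
      #include <iostream>
      #include <map>
      #include <string>
      #include <vector>

      struct Command {
          std::string icon;    // icon shown on a second-group button
          std::string action;  // command string executed when that button is clicked
      };

      class ButtonGroupModel {
      public:
          // Associate a first-group button with the commands that should
          // appear on the second group when it is chosen.
          void assign(const std::string& firstGroupButton, std::vector<Command> cmds) {
              associations_[firstGroupButton] = std::move(cmds);
          }

          // Called when the user clicks a first-group button; returns the
          // commands to display on the second group (empty if none assigned).
          const std::vector<Command>& onFirstGroupClick(const std::string& button) const {
              static const std::vector<Command> none;
              auto it = associations_.find(button);
              return it != associations_.end() ? it->second : none;
          }

      private:
          std::map<std::string, std::vector<Command>> associations_;
      };

      int main() {
          ButtonGroupModel model;
          model.assign("MyComputer",   {{"drive_c.ico", "open:C:\\"},
                                        {"docs.ico",    "open:C:\\Documents"}});
          model.assign("Applications", {{"zenu.ico",    "launch:zenu.exe"},
                                        {"find.ico",    "launch:find.exe"}});

          // Selecting "MyComputer" determines what the second group shows.
          for (const Command& c : model.onFirstGroupClick("MyComputer"))
              std::cout << c.icon << " -> " << c.action << '\n';
      }

    In this sketch, repopulating the second group is simply a matter of looking up a different first-group key, which mirrors the behavior shown in FIGS. 2B and 2C.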
  • [0061]
    Similarly, as shown in FIG. 2C, when the user clicks or otherwise selects Internet browser button 220 of the first group of buttons, the second group of buttons will display features, commands, URLs, or the like, associated with the Internet browser button 220. Selecting the applications button 220 causes various icons corresponding to resident applications to be displayed on the second group of buttons, such as Zenu™ button 222, find button 224, Internet browser button 226, etc.
  • [0062]
    Zenu™ 200 can comprise additional groups of buttons as illustrated generally at 230. The commands associated with button groups 230 can comprise common cursor control operations as illustrated by the arrows at upper and lower groups 230, or the like.
  • [0063]
    FIG. 3A illustrates a conventional menu driven display, which in this case is a tool bar 302 of Microsoft® Internet Explorer. Illustrated in the main window is a web page 304 corresponding to the Internet address at 306. FIG. 3B illustrates Microsoft® Internet Explorer at the same web page after being launched by a previously invoked Zenu™ 310. The commands of tool bar 302 in FIG. 3A are illustrated in the upper button group of Zenu™ 310, as shown generally at 312.
  • [0064]
    Thus, according to the present invention, use of the Zenu™ 310 in this example simplifies the user's interaction with the Microsoft® Internet Explorer application by providing common Microsoft® Internet Explorer commands on the Zenu™ 310 for easy access by the user. Also, as will be described in detail below, the user can define the commands associated with the first group of buttons. For example, the web page displayed in FIG. 3B could be the home page associated with the Microsoft® Internet Explorer application. In this case, the user's selection of the icon 314 would bring up this web page. Further description of the functionality including the operation and definability of a Zenu™ UDI will be addressed in the next sections.
  • [0000]
    II. Functionality
  • [0065]
    A. Title Operation
  • [0066]
    1. What Can the Zenu™ UDI Do?
  • [0067]
    FIG. 4 illustrates user definable interface (UDI or Zenu™) 400 used to launch applications, files, or web pages, or the like, according to an embodiment of the present invention. In this embodiment, Zenu™ 400 is in the form of a ring 402 having a group of buttons along its outer circumference and an open central section 404. Selected buttons have been associated with various software applications, files, folders, and the like. The illustration of such software applications, files and folders in the context of a personal computer or laptop is provided by way of example, not limitation. The present invention can be implemented with any apparatus having a display device for user interaction with the device. In this example, selection of the resume button 406 opens the file “resume.doc” and launches the associated software application (Microsoft® Word in this example), assuming that the application was not currently running. FIG. 5 shows the opened file and an associated application. Alternatively, the Zenu™ 400 can be modified by the user to launch various software applications. As would become apparent to a person skilled in the relevant art, the launching of software applications, opening of files, or accessing web sites are only examples of the type of launching that can be done from a Zenu™ UDI. The present invention should not be limited to such examples.
  • [0068]
    According to the present invention, the Zenu™ UDI also functions as a controller. For example, after opening of the file 406, the appearance and command functionality of the buttons on ring 402 of Zenu™ 400 change, as illustrated in FIG. 5. The buttons of Zenu™ 400 now correspond to different commands than those shown in FIG. 4. Upon opening of the resume.doc file, the commands associated with the Zenu™ UDI automatically switch to commands that correspond to various Microsoft® Word menu choices/actions. See, for example, "bold" action 502.
  • [0069]
    The Zenu™ UDI can have default commands associated with the various software applications, such as Microsoft® Word, or any other software application or control system capable of being controlled by a display device. Alternatively, the Zenu™ UDI can be modified by the user to control various software applications. As would become apparent to a person skilled in the relevant art, the control of software applications, control systems or other apparatus coupled to a display device are only examples of the type of control functionality that can be performed by a Zenu™ UDI. The present invention should not be limited to such examples.
  • [0070]
    Such alternative control capability of a Zenu™ UDI is illustrated in FIG. 6. In this figure, a Zenu™ 600 is visible on an automobile dashboard display device 602. The pointer device can be buttons integrated on the steering wheel, rotary dials and buttons on the dashboard, a touch screen on the display itself, a voice input system, and combinations thereof, as would become apparent to a person skilled in the relevant art.
  • [0071]
    A major advantage of such Zenu™ UDI integration is commonality. Once users become accustomed to the Zenu™ UDI's appearance, operation and definability, their efficiency in using new devices incorporating a Zenu™ UDI will dramatically improve. Many consumers complain that "I can't even program the clock on my VCR, let alone my . . . ." This unwillingness and frustration of the general public to program consumer electronic devices, controllers, appliances, and the like can be redressed by integration of the Zenu™ UDI into a wide variety of devices. Thus, in other applications, the Zenu™ UDI can be integrated with hand-held controllers, such as remote controls for televisions, video equipment, home entertainment systems, and cameras, and with household devices, including wireless telephones and copiers, as well as commercial appliances and tools, and the like, as would become apparent to a person skilled in the relevant art.
  • [0072]
    A further use of the Zenu™ UDI is as a container. FIG. 7 illustrates a Zenu™ 700 configured with an instant messaging plug-in. In this example, a container 702 is appended to the top of the Zenu™ 700 upon selection of the instant messaging plug-in button 704. This button 704 can be a button on the outer button ring 706 or elsewhere on the Zenu™ 700. Once the container 702 is displayed, the upper buttons of the Zenu™ 700 are automatically flipped to form an outer bottom hemisphere ring (hemi-ring) 708. Also, upper inner ring buttons (shown generally at 710) are re-located to the bottom inner ring automatically. Various controls for the container 702 can be located at region 712, or along top 714 or sides 716, 718 of the container 702.
  • [0073]
    Alternative types of Zenu™ UDI containers include, but are not limited to tickers, video clip viewing, image viewing, quick file viewing, or the like, as would become apparent to a person skilled in the relevant art.
  • [0074]
    2. What Can the Zenu™ UDI Look Like?
  • [0075]
    The user definable characteristics of a Zenu™ UDI are extensive. A main definable characteristic of a Zenu™ UDI is its shape. FIGS. 8A-F illustrate six exemplary Zenu™ UDIs, presented by way of example and not limitation. FIG. 8A illustrates a rectangular Zenu™ UDI comprising rows and columns of Zenu™ buttons. FIG. 8B illustrates a Zenu™ UDI similar to FIG. 8A, but the rows have an arched appearance. FIG. 8C illustrates a ring shaped Zenu™ UDI, with a central transparent window 800 and the same Zenu™ buttons of FIGS. 8A and 8B connected in a contiguous manner. Alternatively, a ring shaped Zenu™ UDI can have round buttons arranged in a noncontiguous manner, as illustrated in FIG. 8F.
  • [0076]
    Two more stylized Zenu™ UDIs are shown in FIGS. 8D and 8E. In FIG. 8D, the Zenu™ UDI is in a rectangle, but in contrast to FIG. 8A, the Zenu™ buttons enclose a central rectangular area 802. The central rectangular area can comprise an advertisement, corporate brand, customizable text or images, or the like, or it can be transparent like area 800. Alternatively, the central section may comprise a Zenu™ container as described above in connection with FIG. 7. A further stylized Zenu™ UDI is shown in FIG. 8E. Here, noncontiguous Zenu™ buttons surround a central circular portion similar to the enclosed area 802 of FIG. 8D. An inner border 804 is also included.
  • [0077]
    B. Definability
  • [0078]
    1. Defining Zenu™ UDI Look
  • [0079]
    Another aspect of the present invention is the ability of users to readily modify the appearance of the Zenu™ UDI. According to this aspect of the present invention, the user's ability to define the appearance of the Zenu™ UDI is hereafter referred to as providing an "interactive skin" for the Zenu™ UDI. FIG. 9A illustrates a Zenu™ 900 and an interactive skin control panel 902, which is accessed by the user selecting the Zenu™ customization button 904. The interactive skin control panel 902 presents to the user a plurality of predetermined Zenu™ interactive skins (shown generally at 906), and a separate Zenu™ 908 for interacting with the interactive skin control panel 902. Interactive skin control panel 902 permits the user to select an interactive skin of the Zenu™ UDI according to various possibilities as described above in connection with FIGS. 8A-8F, for example.
  • [0080]
    The user's ability to define the appearance of the Zenu™ UDI using the interactive skin control panel 902 to select an interactive skin is not limited to selecting the shape and arrangement of buttons, as described above in connection with FIGS. 8A-F. Various Zenu™ interactive skin attributes can be modified, including, but not limited to, color, shading, texture mapping, animation, scaling, and various other computer graphic effects, as would be apparent to a person skilled in the relevant art.
  • [0081]
    The interactive skin control panel 902 illustrated in FIG. 9A is an example of a novice user control panel. FIG. 9B illustrates an alternative to the Zenu™ UDI and an interactive skin control panel of FIG. 9A, in accordance with the present invention. Examples of advanced user control panels are illustrated in FIGS. 10A and 10B.
  • [0082]
    2. Defining Zenu™ UDI Functionality
  • [0083]
    In addition to permitting the user to define an interactive skin for a Zenu™ UDI, according to another embodiment of the present invention, users can also define the functionality of the Zenu™ UDI. FIG. 10A illustrates a Zenu™ 1000 and a Zenu™ UDI functionality control panel 1002. The Zenu™ UDI functionality control panel 1002 is also accessible via the Zenu™ customization button 904. The Zenu™ functionality control panel 1002 represents an advanced user control panel.
  • [0084]
    Various software application commands can be assigned using the “plugins” 1004 and 1006. The currently available applications are listed in window 1004 and their associated commands are listed in window 1006. “Glossary Commands” are available to the user in a window 1008. New glossary commands can be added via a button 1010, and/or edited via a button 1012. Button resets are available via button 1014. Selection of an available theme, as described in detail below, can be applied via a pull-down menu 1016. Alternatively, a program to be started upon clicking the button being defined can be selected by a “Browse” button 1018. The new command string for the button being defined is displayed in a window 1020. Conventional control panel buttons “OK”, “Cancel” and “Apply to Button” (1022, 1024 and 1026, respectively) are also provided.
  • [0085]
    In this figure, the Zenu™ functionality control panel 1002 illustrates the functionality of an exemplary button 1003 (as shown at the top of the Zenu™ functionality control panel 1002). Button 1003 is “Square,” and its current command is associated with the Internet Explorer “Refresh” action. Button 1003's command string is listed in window 1020. The Refresh icon (two opposing arrows) is shown on the Zenu™ 1000.
  • [0086]
    A “ToolTip per Theme” option permits the user to assign, via a pull-down menu, text that is to be displayed when the cursor floats over a button. The ToolTip text explains the command to be performed, such as “My Computer” when the cursor passes over button 206, as shown in FIG. 2B.
  • [0087]
    FIG. 10B illustrates a Zenu™ 1050 and Zenu™ properties control panel 1052, which permits the user to define various "Startup Options" 1054. Alternatively, the user can select to "Reset Overrides" by selecting tab 1056 (the Startup Options 1054 obscure the Reset Overrides options in the figure).
  • [0088]
    3. Disabilities Act Compliance
  • [0089]
    In 1998, Congress amended the Rehabilitation Act to require Federal agencies to make their electronic and information technology accessible to people with disabilities. Inaccessible technology interferes with an individual's ability to obtain and use information quickly and easily. Section 508 of the Rehabilitation Act of 1973, as amended (29 U.S.C. 794d), was enacted to eliminate barriers in information technology, to make available new opportunities for people with disabilities, and to encourage development of technologies that will help achieve these goals. The law applies to all Federal agencies when they develop, procure, maintain, or use electronic and information technology. Under Section 508, agencies must give disabled employees and members of the public access to information that is comparable to the access available to others.
  • [0090]
    According to another embodiment, the Zenu™ UDI of the present invention can be adopted for the following, non-exhaustive list of Technical Standards of Subpart B, Section 508: 1194.21 Software applications and operating systems; 1194.22 Web-based intranet and internet information and applications; 1194.23 Telecommunications products; 1194.24 Video and multimedia products; 1194.25 Self contained, closed products; and 1194.26 Desktop and portable computers. Those skilled in the art will readily envision other similar applications for the Zenu™ UDI of the present invention.
  • [0000]
    III. Exemplary Architecture
  • [0091]
    A. Interactive Skin
  • [0092]
    This section describes an exemplary architecture for implementing a Zenu™ UDI having a plurality of command regions. Command regions correspond to the various Zenu™ buttons described above, for example. According to an embodiment of the present invention, FIG. 11 depicts an exemplary architecture 1100 having a command processor 1104 that manages an interactive skin (IS) 1106. IS 1106 comprises a template 1108 and a theme 1110. The template 1108 defines position information for the plurality of command regions corresponding to the UDI (not shown). Template 1108 also defines default attributes 1112 or default commands 1114 for the plurality of command regions. The theme 1110 defines (1) attributes 1116 if the template 1108 only defines default commands 1114 for the plurality of command regions, and/or (2) commands 1118 if the template 1108 only defines default attributes 1112 for the plurality of command regions. A customizer 1120 is provided to permit the user to replace or extend any of the default attributes 1112 or the default commands 1114 of any of the plurality of command regions.
  • [0093]
    B. Exemplary Architecture Syntax
  • [0094]
    This section describes the various syntactical expressions used to create the UDI interface and the functionality applied to certain click areas, whether they appear as buttons or just a portion of an image. These settings are stored in a Template file (.tpl), and a Theme file (.thm) located in the subdirectories “Template” and “Theme” respectively, for example. Settings can also be stored in a text initialization file (.ini).
  • [0095]
    Order of precedence dictates which settings are used. Settings that originate in the Template file can be replaced by settings in a Theme file, and the resulting settings can further be replaced by user/application-defined settings in the initialization (i.e., customization) file.
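    As a rough illustration of this order of precedence, the three setting sources can be merged as successive overrides of a key/value map: template first, then theme, then the initialization (customization) file. The C++ sketch below assumes the settings have already been parsed into maps; the function and variable names are hypothetical, not part of the patent.

      // Hypothetical sketch of the setting precedence: template values are the
      // base, theme values replace them, and .ini (customization) values
      // replace the result; keys that are not overridden pass through unchanged.
      #include <iostream>
      #include <map>
      #include <string>

      using Settings = std::map<std::string, std::string>;

      Settings mergeSettings(const Settings& templateSettings,
                             const Settings& themeSettings,
                             const Settings& iniSettings) {
          Settings result = templateSettings;           // lowest precedence
          for (const auto& [key, value] : themeSettings)
              result[key] = value;                      // theme overrides template
          for (const auto& [key, value] : iniSettings)
              result[key] = value;                      // user .ini overrides both
          return result;
      }

      int main() {
          Settings tpl = {{"Bitmap", "TWO"}, {"BtnType", "3"}, {"Tooltip", "Default"}};
          Settings thm = {{"Tooltip", "My Computer"}};
          Settings ini = {{"Bitmap", "Cat Eye"}};

          for (const auto& [key, value] : mergeSettings(tpl, thm, ini))
              std::cout << key << '=' << value << ";\n";
          // Prints: Bitmap=Cat Eye; BtnType=3; Tooltip=My Computer;
      }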
  • [0096]
    For the purpose of this document, action areas will be called “buttons” although they can appear as any bitmap that is specified in the resource file, thereby making it possible for a button to take most any shape or look that is possible using combinations of background bitmaps, and button bitmaps.
  • [0097]
    1. Settings
  • [0098]
    The currently available settings for UDI buttons or action/click areas are defined as follows in Table 1:
    TABLE 1
    A: Indicates that this area is available to be morphed into a Quicklaunch or Internet Favorites button.
    Transparent: Indicates whether there is a transparency color in the bitmap that is to be used for the button.
    Bitmap: The name of the bitmap resource that is to be used for the button.
    BtnType: The number of states that the bitmap has, for instance: normal, pressed, flyover, and disabled.
    Tooltip: The tool tip to display when a user hovers the mouse cursor over the button. *Note, when in a template or theme file there is no need for a theme specific indication because that particular file is already theme specific. However, in the initialization file, tool tips must be associated with specific themes because the .ini file itself is not theme specific.
    Tweak: Provides a means of adjusting button locations more accurately than a dialog resource allows.
    AutoRepeat: Indicates whether a button repeats the "click" command if a user holds the left mouse button down while clicking. An example might be a button that is used for scrolling.
    CheckButton: Indicates whether the button stays in the pressed position until another button on the window is pressed.
    Icon: Specifies the name of the icon (or bitmap) located in the resource file to be used with a button.
    IconType: Indicates whether the name specified by "Icon" was a bitmap or an icon.
    Template: Indicates opening of the window with the specified name.
    Theme: If the window specified by "Template" is found, this setting will make the window apply the theme specified.
  • [0099]
    2. Settings Syntax
  • [0100]
    Acceptable Values are listed below in Table 2:
    TABLE 2
    A: "QL" for Quicklaunch or "BM" for Internet Favorites bookmark
    Transparent: True or False
    Bitmap: The name of the bitmap in the resource file.
    BtnType: A number between 1 and 4
    Tooltip: Text
    Tweak: X, Y with both values being an integer number
    AutoRepeat: True or False
    CheckButton: True or False
    Icon: Name of an icon or bitmap located in the resource file
    IconType: "Bitmap" or "Icon"
    Template: The text name of a Template.
    Theme: The text name of a Theme to apply to the window.

    *There should not be any spaces in the text, except when a name has a space in it, for instance "Bitmap=Cat Eye;" where the bitmap is named "Cat Eye".
  • [0101]
    The following is an example string using proper syntax:
      • BtnType=3; Bitmap=TWO; Transparent=FALSE; Template=Zenu; Theme=MyComputer;.
  • [0103]
    Where: BtnType indicates the button has 3 states—normal, pressed, and flyover. Bitmap specifies that there is a bitmap in the resource file with the name of “TWO” that is to be applied to this button. “Transparent=False;” states that there is no transparency color in the bitmap. The “Template=zenu; Theme=MyComputer;” indicates that when the button is clicked, the main window named “Zenu™” will change its theme to the “MyComputer” theme.
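    For illustration, a settings string of this form can be parsed by splitting on semicolons and then on the equals sign. The C++ sketch below is an assumed, minimal parser rather than the actual implementation; note that values may legitimately contain spaces (e.g., "Bitmap=Cat Eye;").

      // Hypothetical sketch: parse "BtnType=3; Bitmap=TWO; Transparent=FALSE;"
      // into key/value pairs; values may contain spaces (e.g. "Bitmap=Cat Eye;").
      #include <iostream>
      #include <map>
      #include <sstream>
      #include <string>

      std::map<std::string, std::string> parseSettings(const std::string& text) {
          std::map<std::string, std::string> settings;
          std::istringstream stream(text);
          std::string entry;
          while (std::getline(stream, entry, ';')) {       // split on ';'
              auto eq = entry.find('=');
              if (eq == std::string::npos) continue;        // skip empty tails
              std::string key   = entry.substr(0, eq);
              std::string value = entry.substr(eq + 1);
              key.erase(0, key.find_first_not_of(' '));     // trim "; " leftovers
              settings[key] = value;
          }
          return settings;
      }

      int main() {
          auto s = parseSettings("BtnType=3; Bitmap=TWO; Transparent=FALSE; "
                                 "Template=Zenu; Theme=MyComputer;");
          for (const auto& [key, value] : s)
              std::cout << key << " -> " << value << '\n';
      }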
  • [0104]
    3. Button Command Syntax
  • [0105]
    Each command entered for a button to process upon clicking must be separated with a semicolon, for instance:
      • CMD:http://www.cuspis.com;c:\winnt\system32\calc.exe;.
  • [0107]
    The "CMD:" shown above is a keyword specifying that this command is not from a plugin. If the user wishes to process a plugin command, the "PLUGIN:" keyword would have to precede the command itself, for instance:
      • PLUGIN:word.bold;.
  • [0109]
    If the user wants to process both plugin and non-plugin commands, the appropriate keyword "CMD:" or "PLUGIN:" must precede each command in the text. In addition to "CMD:", the user can also specify a double-click action by using the keyword "CMDDBL:".
  • [0110]
    For Instance:
      • CMDDBL:“Template=zenu;Theme=Quicklaunch; Icon=QUICKLAUNCH;IconType=Icon; Tooltip=Quicklaunch”;.
  • [0112]
    Another example is as follows:
      • CMD:http://www.cuspis.com;c:\winnt\system32\calc.exe;PLUGIN:word.bold;.
  • [0115]
    In response to the text of this second example, the program will open the web browser (if it is not open already) and navigate to http://www.cuspis.com; next it will open the calculator program (if the path to the file is correct); and, if MS Word is open, it will process the plugin command word.bold.
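    One way to interpret such a command string, sketched below under assumptions, is to walk the semicolon-separated entries and route each one to a shell handler or a plugin handler depending on the most recent "CMD:" or "PLUGIN:" keyword. The handler functions here are placeholders, and the double-click keyword "CMDDBL:" is omitted for brevity.

      // Hypothetical sketch: walk a command string such as
      //   CMD:http://www.cuspis.com;c:\winnt\system32\calc.exe;PLUGIN:word.bold;
      // and route each entry to a shell or plugin handler based on the last
      // keyword seen. The double-click keyword "CMDDBL:" is omitted for brevity.
      #include <iostream>
      #include <sstream>
      #include <string>

      void runShellCommand(const std::string& target) {     // placeholder handler
          std::cout << "[shell ] " << target << '\n';       // e.g. open URL or file
      }

      void runPluginCommand(const std::string& target) {    // placeholder handler
          std::cout << "[plugin] " << target << '\n';       // e.g. word.bold
      }

      void processButtonCommand(const std::string& commandString) {
          enum class Mode { Shell, Plugin } mode = Mode::Shell;  // "CMD:" is the default
          std::istringstream stream(commandString);
          std::string entry;
          while (std::getline(stream, entry, ';')) {
              if (entry.rfind("CMD:", 0) == 0)         { mode = Mode::Shell;  entry.erase(0, 4); }
              else if (entry.rfind("PLUGIN:", 0) == 0) { mode = Mode::Plugin; entry.erase(0, 7); }
              if (entry.empty()) continue;
              auto handler = (mode == Mode::Shell) ? &runShellCommand : &runPluginCommand;
              handler(entry);
          }
      }

      int main() {
          processButtonCommand("CMD:http://www.cuspis.com;"
                               "c:\\winnt\\system32\\calc.exe;PLUGIN:word.bold;");
      }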
  • [0116]
    4. Theme Specific Commands
  • [0117]
    A user can specify commands that are only activated while using a specific theme. For instance, if the user wanted a button to open the calculator program when using a theme called “Math”, but wanted this button to open http://www.amazon.com any other time, the user could write the following command:
      • AllOther=CMD:http://www.amazon.com;
      • Math=CMD:C:\winnt\system32\calc.exe;.
  • [0120]
    In the above sample command, if the theme called "Math" is the current theme, the calculator program will open; otherwise, for all other themes, this button will open the browser to amazon.com.
  • [0121]
    The following is yet another sample command:
      • Math=CMD:c:\winnt\system32\calc.exe;Favorites=http://www.msn.com;.
  • [0123]
    If a theme called "Retro" is in use and the user then clicks the button with the command above, the Zenu™ UDI will not use either command.
  • [0124]
    Instead, the Zenu™ UDI will look for the default command for the button in the template resource file.
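    The resolution rule just described can be sketched as a lookup that tries the active theme name first, then an "AllOther" entry, and finally signals that the template's default command should be used. This is an assumed illustration only; the function name and data layout are hypothetical.

      // Hypothetical sketch: pick the command for the active theme. Commands are
      // stored per theme name; if neither the active theme nor "AllOther" is
      // present, the caller falls back to the template's default command.
      #include <iostream>
      #include <map>
      #include <optional>
      #include <string>

      std::optional<std::string> resolveThemeCommand(
              const std::map<std::string, std::string>& themeCommands,
              const std::string& activeTheme) {
          if (auto it = themeCommands.find(activeTheme); it != themeCommands.end())
              return it->second;                      // e.g. the "Math" entry matched
          if (auto it = themeCommands.find("AllOther"); it != themeCommands.end())
              return it->second;                      // generic fallback entry
          return std::nullopt;                        // use the template default
      }

      int main() {
          std::map<std::string, std::string> perTheme = {
              {"AllOther", "CMD:http://www.amazon.com;"},
              {"Math",     "CMD:C:\\winnt\\system32\\calc.exe;"}};
          std::cout << resolveThemeCommand(perTheme, "Math").value() << '\n';   // calculator
          std::cout << resolveThemeCommand(perTheme, "Retro").value() << '\n';  // AllOther

          std::map<std::string, std::string> noFallback = {
              {"Math", "CMD:C:\\winnt\\system32\\calc.exe;"}};
          // With theme "Retro" neither key matches, so the template default is used.
          std::cout << resolveThemeCommand(noFallback, "Retro").value_or("<template default>") << '\n';
      }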
  • [0125]
    5. Click and Drag
  • [0126]
    Users can click and drag shortcuts from the Windows desktop or Windows Explorer to a Zenu™ button of the present invention. This will cause the button to have the same action as the shortcut. If a file that is not a shortcut is dragged from Windows Explorer to a Zenu™ button, the Zenu™ UDI will make the button a shortcut pointing to the file that was dragged. For instance, if the user drags a Microsoft® Word or notepad document onto a Zenu™ button, clicking that Zenu™ button will now open the document that was dragged onto the button. This overrides the default action of the button defined in the template or theme file as well as user-defined commands.
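    In implementation terms, a dropped file could simply become the button's effective command, taking precedence over the template, theme, and user-defined commands. The sketch below illustrates that override rule under assumptions; the ZenuButton structure and command strings are hypothetical.

      // Hypothetical sketch: a dropped file overrides every other command source
      // for the button; clicking the button afterwards opens the dropped path.
      #include <iostream>
      #include <optional>
      #include <string>

      struct ZenuButton {
          std::string templateCommand;               // default from template/theme/.ini
          std::optional<std::string> droppedTarget;  // set by a drag-and-drop

          void onFileDropped(const std::string& path) {
              droppedTarget = path;                  // e.g. "C:\\docs\\resume.doc"
          }

          std::string effectiveCommand() const {
              // The dropped target wins over the template, theme, and user commands.
              return droppedTarget ? "CMD:" + *droppedTarget + ";" : templateCommand;
          }
      };

      int main() {
          ZenuButton button{"CMD:http://www.cuspis.com;"};
          std::cout << button.effectiveCommand() << '\n';   // default action
          button.onFileDropped("C:\\docs\\resume.doc");
          std::cout << button.effectiveCommand() << '\n';   // now opens the document
      }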
  • [0127]
    6. Hot Key
  • [0128]
    A "Hot Key" can be assigned by the user to show/hide the Zenu™ UDI. For example, a specific, default set of key strokes, say Alt+F10, can be used. The user can readily change the default Hot Key by right clicking anywhere on the UDI, and accessing "customize" and then the "Hot Key" feature. Next, all the user needs to do is press the desired key combination for the Hot Key, and it will be recorded in the text box of the Hot Key window. When the user is finished selecting the desired Hot Key, the user simply clicks "OK". This will change the Hot Key and store it so that the next time the Zenu™ UDI is executed, it will use the same Hot Key combination to show/hide.
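    One plausible way to persist the chosen Hot Key, sketched below purely as an assumption, is to record the captured key combination in the customization (.ini) file and read it back on the next launch, falling back to the default when no override is present. The file name and key name used here are hypothetical.

      // Hypothetical sketch: store the user's chosen show/hide Hot Key in an
      // .ini-style customization file and read it back on the next launch,
      // falling back to a default when no override has been recorded.
      #include <fstream>
      #include <iostream>
      #include <string>

      const std::string kDefaultHotKey = "Alt+F10";   // assumed default combination
      const std::string kIniPath = "zenu.ini";        // assumed customization file name

      void saveHotKey(const std::string& combo) {
          std::ofstream out(kIniPath);                // overwrites the file for brevity
          out << "HotKey=" << combo << '\n';
      }

      std::string loadHotKey() {
          std::ifstream in(kIniPath);
          std::string line;
          while (std::getline(in, line))
              if (line.rfind("HotKey=", 0) == 0)
                  return line.substr(7);              // strip the "HotKey=" prefix
          return kDefaultHotKey;                      // no override recorded
      }

      int main() {
          std::cout << "startup hot key:  " << loadHotKey() << '\n';
          saveHotKey("Ctrl+Shift+Z");                 // user records a new combination
          std::cout << "next launch uses: " << loadHotKey() << '\n';
      }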
  • [0129]
    D. Sizing of the Zenu™ UDI
  • [0130]
    The Zenu™ customizer of the present invention also permits the user to change the size of the Zenu™ UDI (i.e., the space occupied on the screen by the Zenu™ UDI). Sizing can be an integral component of the tool, permitting the user to scale (stretch or shrink) the Zenu™ UDI to match the desired size. Sizing could be arbitrary, as in permitting the user to click on an edge or "handle" and change the size of the Zenu™ UDI. In a preferred embodiment, however, the user would be presented with a finite number of size options, say three sizes: smaller, normal, and larger. This allows exact scaling of the Zenu™ UDI to eliminate distortion and to maintain its aspect ratio. In another embodiment, up to ten different sizes are available, including "full screen", which becomes an attractive option when the Zenu™ UDI acts as the container for chat, video, browsing, mail, and the like.
  • [0131]
    The template and the theme can be designed to include knowledge of the multiple sizes available. Alternatively, all that is required is the "Normal" set of templates and theme components. As the user selects to change the size, the template/theme combination is scanned for prior knowledge of possible sizes. The customizer can then present the user with the additional choice(s). Once a new size is chosen, imagery designed specifically for that size is used. In the event that an image is not provided or available at the different size, the normal image is scaled to match the destination, as would become apparent to a person skilled in the art of developing window-based applications. Sizing of the Zenu™ UDI provides greater flexibility when implemented with the templates and themes. The system allows designers to choose whether to re-use a simple graphic at multiple resolutions, or to duplicate an image at different resolutions, increasing or decreasing the amount of detail included in the image. This is similar to a conventional desktop icon; an icon (.ico) file can contain up to four different images, two each in black and white and color at 16×16 and 32×32 pixels.
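    The size-selection behavior described above (use purpose-built imagery when the template/theme provides it, otherwise scale the "Normal" image) might be sketched as follows; the resource lookup is an assumption standing in for the real template/theme machinery, and the names and dimensions are hypothetical.

      // Hypothetical sketch: choose the bitmap for a requested UDI size. If the
      // template/theme supplies imagery for that size it is used directly;
      // otherwise the "Normal" image is scaled to the requested size.
      #include <iostream>
      #include <map>
      #include <string>

      struct Image { std::string name; int width = 0; int height = 0; };

      Image imageForSize(const std::map<std::string, Image>& providedSizes,
                         const std::string& requestedSize,
                         double scaleFactor) {
          if (auto it = providedSizes.find(requestedSize); it != providedSizes.end())
              return it->second;                              // purpose-built artwork
          Image normal = providedSizes.at("Normal");          // fall back and scale
          normal.width  = static_cast<int>(normal.width  * scaleFactor);
          normal.height = static_cast<int>(normal.height * scaleFactor);
          normal.name += " (scaled)";
          return normal;
      }

      int main() {
          std::map<std::string, Image> sizes = {
              {"Normal", {"ring_normal.bmp", 320, 320}},
              {"Larger", {"ring_large.bmp",  480, 480}}};

          Image a = imageForSize(sizes, "Larger",  1.5);   // dedicated artwork exists
          Image b = imageForSize(sizes, "Smaller", 0.75);  // scaled from "Normal"
          std::cout << a.name << ' ' << a.width << 'x' << a.height << '\n';
          std::cout << b.name << ' ' << b.width << 'x' << b.height << '\n';
      }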
  • [0132]
    E. Exemplary Architecture Operation
  • [0133]
    FIGS. 12 through 19 are flow diagrams illustrating the operation of an exemplary Zenu™ UDI system and method according to an embodiment of the present invention. For ease of explaining this example, the Zenu™ UDI comprises an application executing on a personal computer in a Microsoft® Windows environment. The Zenu™ UDI is launched from a file resident in the computer system, such as on a fixed drive or other memory medium.
  • [0134]
    Turning to FIG. 12, a step 1202 represents launching of the Zenu™ UDI. Thus, once launched in this manner, the Zenu™ UDI is loaded in the computer's random access memory (RAM) and either appears as an icon in the Windows system tray, or is displayed for the first time. At a step 1204, available themes and templates are enumerated according to their associated file names so as to create a main UDI window, as shown at a step 1206. As part of the launching process, a decision is made at a step 1208 as to whether the default Hot Key has been overridden. If not, the default Hot Key is assumed, as shown at a step 1210. If the default Hot Key was overridden, the system will use the Hot Key override, as shown at a step 1212. Next, at a decision step 1214, it is determined whether the UDI is to be shown at startup. If YES, the UDI is displayed, as shown generally at a step 1216. Otherwise, the UDI is started and placed in the Windows system tray as an icon, as shown at a step 1218. The system then waits for an event, shown generally at a step 1220.
  • [0135]
    FIG. 13 is a flow diagram representing further details of step 1206 (create main UDI window) of FIG. 12. FIG. 13 illustrates how the Zenu™ UDI obtains its appearance and what it does in order to render such appearance using system windows from Microsoft® Windows. Creating basic windows for the UDI begins at a start step 1302. Then a set of application resources is passed to a Template .dll, at a step 1304. The Template .dll comprises the resources within the UDI that give the Zenu™ UDI its shape and control the position of the buttons. At a next step 1306, a Theme .dll is loaded, as specified in an .ini file, or the like. The loaded Theme .dll stores all of the individual overrides of the Template in terms of its default appearance. At a next step 1308, the relevant "UDI window look" is applied, together with other window settings. The details of step 1308 are described below in connection with FIG. 14. Next, at a step 1310, the buttons of the Zenu™ UDI are created as basic window definitions. The details of button creation are described below in connection with FIG. 15. Various settings of the buttons are then applied at a step 1312. The details of the button settings are described below in connection with FIG. 16. At a next step 1314, the Zenu™ UDI waits for an event to occur. At this stage, two events can occur: a button can be clicked, or a drop-file-on-button event can occur. A "button click" is described below in connection with FIGS. 17 and 18. A "drop-file-on-button" event is described below in connection with FIG. 19.
  • [0136]
    FIG. 14 describes the details of the main UDI window look and other window settings as introduced at step 1308. This process starts at a step 1402. Based on the availability of the template and theme information, a decision is made (step 1404) to determine if the “main window look” is overridden in the theme file. If YES (i.e., the theme is going to be used), a region is created according to a “main UDI window look” in the theme, as shown at a step 1406. Otherwise, information from the default template is used to create the region, as shown at a step 1408. Next, at a step 1410, the window pop-up position information is retrieved from the .ini file. In other words, the position where the Zenu™ UDI was last displayed is obtained, or a cursor-relative position is determined. At a step 1412 the window is then registered as an application object with the operating system for tracking and access purposes. Thus, the flow of FIG. 14 produces a “shell” of the window for the Zenu™ UDI. Next, at a step 1414, the process flows to step 1310, which is further described in connection with FIG. 15.
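    A minimal sketch of the positioning portion of FIG. 14 (step 1410) follows, assuming the last pop-up position is stored under illustrative PosX/PosY keys in the .ini file; when no stored position exists, the current cursor position is used so the UDI pops up around the pointer.

```cpp
#include <windows.h>

// Restore the last pop-up position from the .ini file (step 1410), or fall back to a
// cursor-relative position so the UDI appears around the pointer.
POINT popupPosition(const wchar_t* iniPath) {
    int x = static_cast<int>(GetPrivateProfileIntW(L"Window", L"PosX", -1, iniPath));
    int y = static_cast<int>(GetPrivateProfileIntW(L"Window", L"PosY", -1, iniPath));
    if (x >= 0 && y >= 0)
        return POINT{x, y};      // where the Zenu UDI was last displayed
    POINT cursor{};
    GetCursorPos(&cursor);       // otherwise pop up at the current pointer position
    return cursor;
}

int main() {
    POINT p = popupPosition(L"zenu.ini");
    return (p.x >= 0 && p.y >= 0) ? 0 : 1;
}
```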
  • [0137]
    FIG. 15 illustrates the flow in connection with “creating buttons” as introduced at step 1310. The flow begins at a start step 1502. Steps 1504 through 1518 access the template to determine the designated number of buttons that make up the Zenu™ UDI. For example, the Zenu™ UDI of FIG. 8A comprises 12 buttons: two rows of 6 buttons each. In essence, this figure represents the creation of a small window corresponding to each button and the linking of those windows together to create a Zenu™ shell. Every time a new Zenu™ UDI session is initialized, i.e., the Zenu™ UDI is launched, small windows called “child windows” corresponding to each button of the Zenu™ UDI must be created to form the UDI, as shown at a step 1504. A first button is processed as shown at a step 1506. At a step 1508, it is then determined whether the child window is a button. If YES, a UDI button is created and it is assigned a subclass as a child window, at a step 1510. Next, a pointer to the button is stored in a list for future access, as shown at a step 1512. If an additional child window is to be processed, as determined at a step 1514, the next button is retrieved, as shown at a step 1516, and the process flows back to step 1508; otherwise the flow proceeds to step 1312, as shown by step 1518. Step 1312 applies various settings to each button just created, the details of which are described in connection with FIG. 16.
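    The following C++ fragment sketches steps 1504 through 1516 under the assumption that a parent UDI window already exists: one child window is created per button, subclassed, and recorded in a list for later access. The two-row layout and 48-pixel cell size are illustrative stand-ins for the positions a real template would supply.

```cpp
#include <windows.h>
#include <vector>

static WNDPROC g_defaultButtonProc = nullptr;

// Subclass procedure for a Zenu button child window: real code would paint the theme
// bitmap and forward clicks to the command engine; here it simply falls through to
// the original button procedure.
LRESULT CALLBACK UdiButtonProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp) {
    return CallWindowProcW(g_defaultButtonProc, hwnd, msg, wp, lp);
}

// Create one child window per button (steps 1504-1516) and keep them in a list
// for later access (step 1512).
std::vector<HWND> createUdiButtons(HWND parent, int count) {
    std::vector<HWND> buttons;
    for (int i = 0; i < count; ++i) {
        int x = (i % 6) * 48;   // illustrative 2 x 6 layout with 48-pixel cells
        int y = (i / 6) * 48;
        HWND b = CreateWindowExW(0, L"BUTTON", L"", WS_CHILD | WS_VISIBLE | BS_OWNERDRAW,
                                 x, y, 48, 48, parent,
                                 reinterpret_cast<HMENU>(static_cast<INT_PTR>(i + 1)),
                                 GetModuleHandleW(nullptr), nullptr);
        // Step 1510: subclass the child window so it behaves as a Zenu UDI button.
        g_defaultButtonProc = reinterpret_cast<WNDPROC>(
            SetWindowLongPtrW(b, GWLP_WNDPROC, reinterpret_cast<LONG_PTR>(UdiButtonProc)));
        buttons.push_back(b);
    }
    return buttons;
}
```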
  • [0138]
    FIG. 16 further illustrates the process of step 1312 for applying various settings to the buttons created in FIG. 15. Thus, the steps illustrated in FIG. 16 are performed for each button for which a region was defined in FIG. 15. The flow starts at a step 1602 and proceeds to get a first button for processing at a step 1604. At a step 1606 a theme name and parent template name are set for the button. Next, at a step 1608, the button configuration string from the .ini file is set. Next, at a step 1610, an application resource is set to the UDI template file. At a step 1612, the configuration string for the button is loaded from the template file. At a step 1614, the application resource is set to the UDI window's current theme file. At a step 1616, the configuration string for the button is loaded from the theme file. At a step 1618, the theme settings string is merged with the template settings string, thereby overriding the values in the template settings string (i.e., template settings are replaced by theme settings).
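    A minimal sketch of the merge order of steps 1612 through 1620 follows, assuming each configuration string can be represented as key/value pairs: template values provide the softest defaults, theme values override them, and .ini values override both. The key and value names are illustrative only.

```cpp
#include <iostream>
#include <map>
#include <string>

using Settings = std::map<std::string, std::string>;

// Later sources win: copy src over dst.
void mergeOver(Settings& dst, const Settings& src) {
    for (const auto& kv : src) dst[kv.first] = kv.second;
}

int main() {
    Settings templ = {{"Bitmap", "default.bmp"}, {"ToolTip", "Button"}};
    Settings theme = {{"Bitmap", "blue.bmp"}};
    Settings ini   = {{"ToolTip", "My shortcut"}};

    Settings button = templ;     // softest defaults (step 1612)
    mergeOver(button, theme);    // theme overrides template (step 1618)
    mergeOver(button, ini);      // user .ini overrides everything (step 1620)

    std::cout << button["Bitmap"] << " / " << button["ToolTip"] << "\n";
    // -> blue.bmp / My shortcut
}
```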
  • [0139]
    At a step 1620, the button configuration that was loaded from the .ini file is merged with the existing string, thereby overriding values with values that originated in the .ini file. In other words, the settings that are in the .ini file are used to replace the existing settings in the configuration string. At steps 1622 through 1662, the available settings for UDI buttons (buttons are sometimes referred to as action or click areas) that are defined in Table 1 are applied to the button. Thus, at a step 1622 a setting is applied to the button to thereby associate Microsoft® Windows Quicklaunch or Favorites with the Zenu™ UDI for easy access by the user.
  • [0140]
    At a step 1624 a transparency setting can be applied to the button. Transparency allows buttons, and the like, to be visible while at the same time allowing the underlying image to be partially visible. Various known transparency techniques can be employed, as would become apparent to a person skilled in the computer graphics art. At a step 1626 a resource bitmap is located, if so specified in the configuration string for the button in the theme file. If a bitmap is located, as determined at a step 1628, then the button bitmap is set to the located bitmap, at a step 1630. Otherwise, the application resource is set to the UDI window's template file and the associated bitmap resource is searched for, as shown at a step 1632. If a template file bitmap resource is located, as determined at a step 1634, flow proceeds to step 1630. Otherwise, the button is deleted, as shown at a step 1636. If the button is deleted, it is determined at a step 1638 whether further buttons remain to be processed. If so, a pointer to the next button is located, at a step 1640, and flow proceeds to step 1606.
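    The bitmap fallback of steps 1626 through 1636 can be sketched as follows, assuming bitmap resources are looked up by name in the theme and template resource .dlls; a null result corresponds to deleting the button. This fragment assumes the resource modules were loaded earlier.

```cpp
#include <windows.h>

// Find the bitmap to paint on a button: theme artwork first (step 1630), then the
// template's artwork (step 1632); a null result means the button is deleted (step 1636).
// Naming bitmap resources by string is an assumption for illustration.
HBITMAP findButtonBitmap(HMODULE themeDll, HMODULE templateDll, const wchar_t* name) {
    if (HBITMAP bmp = LoadBitmapW(themeDll, name))
        return bmp;                 // the theme supplies its own artwork
    if (HBITMAP bmp = LoadBitmapW(templateDll, name))
        return bmp;                 // fall back to the template's artwork
    return nullptr;                 // no artwork anywhere: delete the button
}
```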
  • [0141]
    After a bitmap is determined at either of step 1628 or step 1634, it is applied at step 1630. Next, a theme-specific tool tip is set at a step 1644. Then, at a step 1646, a “tweak” amount is set for button positioning. At a step 1648 an “auto repeat” feature is applied to the button if so desired. At a step 1650, a “check button” setting is applied to the button if so desired.
  • [0142]
    Next, at a step 1652, it is determined whether a theme-specific icon is specified for the button. If so, the theme-specific icon is applied to the button, at a step 1654. Otherwise, flow proceeds to a step 1656 to determine whether there is a bitmap to use as an icon. If so, the icon bitmap is applied to the button at a step 1658. Otherwise, flow proceeds to a step 1660.
  • [0143]
    At step 1660 it is determined whether the “A” setting indicates that the button is to accept Quicklaunch or Favorites features. If YES, the appropriate attributes are applied to the button at a step 1662. If not, flow proceeds to step 1638 so as to process any further buttons. Once all buttons are processed, flow continues back to step 1314 as shown at a step 1642.
  • [0144]
    The collection of styles that can be applied to a button (attributes, properties, or the like, e.g., a bitmap, a font, a tool tip, a flyover characteristic) has a particular precedence. Such characteristics are defined in the configuration file. If no such characteristic is found in the configuration file, the theme is searched. If the characteristic is found in the theme, it is applied. If it is found in neither the configuration file nor the theme, the template is searched. If no such characteristic is located in the template, a predetermined default is applied. Thus, soft defaults as well as overrides at the template level, theme level, and user configuration level are available according to this embodiment of the present invention. The flow of FIG. 16 follows this iteration to determine what attributes to apply to the buttons. For example, the “set tweak amount” for positioning at step 1646 searches the configuration file and the theme to determine whether a modification has been made to the position of the button. In the case in which Zenu™ buttons are nested (in other words, clicking a button opens another level of buttons), each level of buttons has a different parent template and a theme associated with that level. Thus, the position of the buttons of a particular level is determined by their template, and their appearance is determined by the theme of that level.
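    The precedence just described can be sketched as a simple ordered lookup, shown below with plain containers standing in for the configuration file, theme, and template; the key names and defaults are illustrative only.

```cpp
#include <initializer_list>
#include <iostream>
#include <map>
#include <string>

using Layer = std::map<std::string, std::string>;

// Resolve one button attribute: configuration file, then theme, then template,
// then a hard-coded default.
std::string resolveAttribute(const std::string& key,
                             const Layer& config, const Layer& theme,
                             const Layer& templ, const std::string& fallback) {
    for (const Layer* layer : {&config, &theme, &templ}) {
        auto it = layer->find(key);
        if (it != layer->end()) return it->second;   // first layer that defines it wins
    }
    return fallback;                                 // softest default
}

int main() {
    Layer config, theme{{"Font", "Tahoma"}}, templ{{"Font", "Arial"}, {"Tweak", "0"}};
    std::cout << resolveAttribute("Font", config, theme, templ, "MS Sans Serif") << "\n"; // Tahoma
    std::cout << resolveAttribute("Tweak", config, theme, templ, "0") << "\n";            // 0
}
```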
  • [0145]
    Turning again to the “wait for an event” step 1314, two events can occur: a “button click”, which is described in connection with FIGS. 17 and 18, or a “dropped file on button” event, which is described in connection with FIG. 19.
  • [0146]
    FIGS. 17 and 18 describe the process that occurs when a button is clicked. This process begins at a step 1702, and proceeds to determine whether the UDI is in button configuration mode, at a step 1704. If so, the current settings for the button are displayed in the configuration window at a step 1706. The process then enters the “wait for event” mode, at a step 1716 (which is equivalent to the wait-for-event step 1314). If not in the configuration mode, flow proceeds to a step 1708, which determines whether there is a user-defined button command for the button that applies to the current theme. If so, the command is executed at a step 1710, and flow then proceeds to step 1716. If no user-defined button command exists, flow proceeds to a step 1712 to determine whether there is a built-in command for the button in the theme currently applied to the UDI, as specified in the theme file. If YES, that command is executed at step 1710. If not, flow proceeds to a step 1714 to determine whether a default command for the button is found in the default template file. If so, the default command is executed at step 1710. Otherwise, flow proceeds to step 1716 to wait for another event.
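    A minimal sketch of the lookup order of FIG. 17 follows, assuming the user-defined, theme built-in, and template default commands can each be represented as a map from button identifier to command string; the first source that defines a command for the clicked button is the one executed.

```cpp
#include <initializer_list>
#include <iostream>
#include <map>
#include <optional>
#include <string>

using Commands = std::map<int, std::string>;   // button id -> command string

// Steps 1708-1714: user-defined command first, then theme built-in, then template default.
std::optional<std::string> commandForClick(int buttonId,
                                           const Commands& userDefined,
                                           const Commands& themeBuiltIn,
                                           const Commands& templateDefault) {
    for (const Commands* src : {&userDefined, &themeBuiltIn, &templateDefault}) {
        auto it = src->find(buttonId);
        if (it != src->end()) return it->second;   // first match is executed (step 1710)
    }
    return std::nullopt;                           // nothing to run: wait for the next event
}

int main() {
    Commands user{{3, "open:notepad.exe"}};
    Commands theme{{3, "plugin:volume"}, {4, "theme:tools"}};
    Commands templ;
    auto cmd = commandForClick(3, user, theme, templ);
    std::cout << (cmd ? *cmd : "<none>") << "\n";  // open:notepad.exe
}
```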
  • [0147]
    FIG. 18 illustrates the process for executing a command, as performed at step 1710. The execute command process begins at a step 1802. The command string is parsed at a step 1804. The syntax of the command string is as described above in connection with Tables 1 and 2. Next, the first command is evaluated at a step 1806. If the command is a plug-in, as determined at a step 1808, the plug-in .dll is loaded to create a plug-in object and the command is executed, at a step 1810. If the command is not a plug-in, it is determined whether the command specifies opening a template or theme, at a step 1812. If so, the theme, template, or both are opened, at a step 1814. Next, it is determined whether the command was to close a template, at a step 1816. If so, the specified template is closed, at a step 1818. Following steps 1810, 1814, and 1818, or if the result of step 1816 is NO, it is next determined whether a command has been executed, at a step 1820. If YES, it is then determined whether there is another command in the string to process, at a step 1822. If so, the next command is obtained, at a step 1826, and flow proceeds to evaluate that command, at step 1808. If the result of the query at step 1820 is NO, a shell execute command is performed, at a step 1824, and control then proceeds to step 1822. If no other commands are to be executed in the string, flow proceeds to step 1828 to wait for an event, which is the equivalent of the “wait for event” step 1314.
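    The dispatcher of FIG. 18 can be sketched as follows. The semicolon-separated “verb:argument” syntax, the plugin:/open:/close: verbs, and the bare LoadLibrary call standing in for plug-in loading are illustrative assumptions, not the actual command syntax of Tables 1 and 2.

```cpp
#include <windows.h>
#include <sstream>
#include <string>
#include <vector>

// Step 1804: split the command string into individual commands (assumed ';'-separated).
std::vector<std::string> parseCommands(const std::string& commandString) {
    std::vector<std::string> out;
    std::stringstream ss(commandString);
    for (std::string part; std::getline(ss, part, ';');)
        if (!part.empty()) out.push_back(part);
    return out;
}

void executeCommandString(const std::string& commandString) {
    for (const std::string& cmd : parseCommands(commandString)) {
        if (cmd.rfind("plugin:", 0) == 0) {
            // Step 1810: load the plug-in .dll; real code would then locate and call
            // the plug-in's entry point (e.g., via GetProcAddress).
            HMODULE plugin = LoadLibraryA(cmd.substr(7).c_str());
            if (!plugin) continue;
        } else if (cmd.rfind("open:", 0) == 0 || cmd.rfind("close:", 0) == 0) {
            // Steps 1814/1818: switch or close a template/theme (application logic).
        } else {
            // Step 1824: no built-in command matched, so shell-execute the string.
            ShellExecuteA(nullptr, "open", cmd.c_str(), nullptr, nullptr, SW_SHOWNORMAL);
        }
    }
}
```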
  • [0148]
    FIG. 19 illustrates the process for handling a “dropped file on button event.” Flow begins at a step 1902. Next, a short-cut is created to the file that is dropped on the button, and that short-cut is placed in a Zenu™ short-cut directory, at a step 1904. Next, a short-cut icon for the file association is placed on the button and is modified according to the theme/layer specific characteristics, at a step 1906. Finally, the button command is edited based on the current theme, so that when the button is clicked the file is opened if that theme is currently applied, at a step 1908. The wait for event step is entered again at a step 1910.
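    A minimal sketch of FIG. 19 follows, assuming the per-theme button command is persisted in the .ini file; the section and key names and the “open:” verb are illustrative, and the short-cut and icon handling of steps 1904 and 1906 are only noted in comments.

```cpp
#include <windows.h>
#include <string>

// Rebind a button after a file is dropped on it, scoped to the current theme.
void handleFileDroppedOnButton(int buttonId, const std::wstring& droppedFile,
                               const std::wstring& currentTheme,
                               const std::wstring& iniPath) {
    // Steps 1904/1906 would create a short-cut in the Zenu short-cut directory and
    // place its icon on the button; here only the new command is persisted.
    std::wstring key = currentTheme + L".Button" + std::to_wstring(buttonId);
    std::wstring cmd = L"open:" + droppedFile;
    // Step 1908: clicking the button under this theme now opens the dropped file.
    WritePrivateProfileStringW(L"Commands", key.c_str(), cmd.c_str(), iniPath.c_str());
}
```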
  • [0000]
    IV. Example Computer System and Computer Program Product Implementations
  • [0149]
    The Zenu™ UDI of the present invention can be implemented using hardware, software, or a combination thereof, and may be implemented in one or more computer systems or other processing systems. In fact, in one embodiment, the invention is directed toward one or more computer systems capable of carrying out the functionality described herein. An example of a computer system 2000 is shown in FIG. 20. The computer system 2000 includes one or more processors, such as processor 2004. Processor 2004 can support various operating systems such as Microsoft® Windows, Unix, Linux, or the like. The processor 2004 is connected to a communication infrastructure 2006 (e.g., a communications bus, cross-over bar, or network). Various software embodiments are described in terms of this exemplary computer system. After reading this description, it will become apparent to a person skilled in the relevant art(s) how to implement the invention using other computer systems and/or computer architectures.
  • [0150]
    Computer system 2000 can include a display interface 2002 that forwards graphics, text, and other data from the communication infrastructure 2006 (or from a frame buffer not shown) for display on the display device 2030.
  • [0151]
    Computer system 2000 also includes a main memory 2008, preferably random access memory (RAM), and can also include a secondary memory 2010. The secondary memory 2010 can include, for example, a hard disk drive 2012 and/or a removable storage drive 2014, representing a floppy disk drive, a magnetic tape drive, an optical disk drive, or the like. The removable storage drive 2014 reads from and/or writes to a removable storage unit 2018 in a well known manner. Removable storage unit 2018 represents a floppy disk, magnetic tape, optical disk, or the like, which is read by and written to by removable storage drive 2014. As will be appreciated, the removable storage unit 2018 includes a computer usable storage medium having stored therein computer software and/or data.
  • [0152]
    In alternative embodiments, secondary memory 2010 can include other similar means for allowing computer programs or other instructions to be loaded into computer system 2000. Such means can include, for example, a removable storage unit 2022 and an interface 2020. Examples of such can include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, and other removable storage units 2022 and interfaces 2020 which allow software and data to be transferred from the removable storage unit 2022 to computer system 2000.
  • [0153]
    Computer system 2000 can also include a communications interface 2024. Communications interface 2024 allows software and data to be transferred between computer system 2000 and external devices. Examples of communications interface 2024 can include a modem, a network interface (such as an Ethernet card), a communications port, a PCMCIA slot and card, infrared, radio frequency (RF), or the like. Software and data transferred via communications interface 2024 are in the form of signals 2028 which can be electronic, electromagnetic, optical or other signals capable of being received by communications interface 2024. These signals 2028 are provided to communications interface 2024 via a communications path (i.e., channel) 2026. This channel 2026 carries signals 2028 and can be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, an RF link and other communications channels.
  • [0154]
    In this document, the terms “computer program medium” and “computer usable medium” are used to generally refer to media such as removable storage drive 2014, a hard disk installed in hard disk drive 2012, and signals 2028. These computer program products are means for providing software to computer system 2000. The invention is directed to such computer program products.
  • [0155]
    Computer programs (also called computer control logic) are stored in main memory 2008 and/or secondary memory 2010. Computer programs can also be received via communications interface 2024. Such computer programs, when executed, enable the computer system 2000 to perform the features of the present invention as discussed herein. In particular, the computer programs, when executed, enable the processor 2004 to perform the features of the present invention. Accordingly, such computer programs represent controllers or modules of the computer system 2000.
  • [0156]
    In an embodiment where the invention is implemented using software, the software can be stored in a computer program product and loaded into computer system 2000 using removable storage drive 2014, hard drive 2012 or communications interface 2024. The control logic or modules (software), when executed by the processor 2004, causes the processor 2004 to perform the functions of the invention as described herein.
  • [0157]
    In another embodiment, the invention is implemented primarily in hardware using, for example, hardware components such as application specific integrated circuits (ASICs). Implementation of the hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s).
  • [0158]
    In yet another embodiment, the invention is implemented using a combination of both hardware and software.
  • [0000]
    V. Conclusion
  • [0159]
    While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. This is especially true in light of technology and terms within the relevant art(s) that may be later developed.
  • [0160]
    The present invention has been described above with the aid of functional building blocks or modules (see FIGS. 11 and 20, for example) illustrating the performance of specified functions and relationships thereof. The boundaries of these functional building blocks have been defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Any such alternate boundaries are thus within the scope and spirit of the claimed invention. One skilled in the art will recognize that these functional building blocks can be implemented by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Classifications
U.S. Classification: 715/765, 715/826, 715/779, 715/834
International Classification: G06F9/44, G06F3/048, G06F3/033
Cooperative Classification: G06F9/4443, G06F3/04817, G06F3/0482
European Classification: G06F3/0482, G06F3/0481H, G06F9/44W
Legal Events
Date: 29 Aug 2005
Code: AS
Event: Assignment
Owner name: CHANGE TOOLS, INC., ALABAMA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEAVITT, JOSEPH M.;MILLS, SCOTT A.;REEL/FRAME:016465/0941;SIGNING DATES FROM 20020102 TO 20020104