US20080235735A1 - Scaling and Layout Methods and Systems for Handling One-To-Many Objects - Google Patents

Scaling and Layout Methods and Systems for Handling One-To-Many Objects

Info

Publication number
US20080235735A1
Authority
US
United States
Prior art keywords
items
displayed
groups
user interface
item
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/134,486
Inventor
Frank J. Wroblewski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
IDHL Holdings Inc
Original Assignee
Wroblewski Frank J
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wroblewski Frank J filed Critical Wroblewski Frank J
Priority to US12/134,486 priority Critical patent/US20080235735A1/en
Publication of US20080235735A1 publication Critical patent/US20080235735A1/en
Assigned to MULTIPLIER CAPITAL, LP reassignment MULTIPLIER CAPITAL, LP SECURITY AGREEMENT Assignors: HILLCREST LABORATORIES, INC.
Assigned to IDHL HOLDINGS, INC. reassignment IDHL HOLDINGS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HILLCREST LABORATORIES, INC.
Assigned to HILLCREST LABORATORIES, INC. reassignment HILLCREST LABORATORIES, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: MULTIPLIER CAPITAL, LP
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42212Specific keyboard arrangements
    • H04N21/42213Specific keyboard arrangements for facilitating data entry
    • H04N21/42214Specific keyboard arrangements for facilitating data entry using alphanumerical characters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4722End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/482End-user interface for program selection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H04N5/45Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen

Definitions

  • The present invention describes a framework for organizing, selecting and launching media items. Part of that framework involves the design and operation of graphical user interfaces with the basic building blocks of point, click, scroll, hover and zoom and, more particularly, graphical user interfaces associated with media items which can be used with a three-dimensional (hereinafter “3D”) pointing remote.
  • Originally, the television was tuned to the desired channel by adjusting a tuner knob, and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as “channel surfing” whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.
  • The number of buttons on these universal remote units was typically greater than the number of buttons on either the TV remote unit or the VCR remote unit individually. This added number of buttons and functionality makes it very difficult to control anything but the simplest aspects of a TV or VCR without hunting for exactly the right button on the remote. Many times, these universal remotes do not provide enough buttons to access many levels of control or features unique to certain TVs. In these cases, the original device remote unit is still needed, and the original hassle of handling multiple remotes remains due to user interface issues arising from the complexity of aggregation. Some remote units have addressed this problem by adding “soft” buttons that can be programmed with the expert commands.
  • These “soft” buttons sometimes have accompanying LCD displays to indicate their action. These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons.
  • In “moded” universal remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues, including sending commands to the wrong device, forcing the user to look at the remote to make sure that it is in the right mode, and providing no simplification of the integration of multiple devices.
  • The most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
  • Early electronic program guides (EPGs) provided what was essentially an electronic replica of the printed media guides.
  • Cable service operators have provided analog EPGs wherein a dedicated channel displays a slowly scrolling grid of the channels and their associated programs over a certain time horizon, e.g., the next two hours. Scrolling through even one hundred channels in this way can be tedious and is not feasibly scalable to include significant additional content deployment, e.g., video-on-demand.
  • More sophisticated digital EPGs have also been developed.
  • In digital EPGs, program schedule information, and optionally applications/system software, is transmitted to dedicated EPG equipment, e.g., a digital set-top box (STB).
  • Digital EPGs provide more flexibility in designing the user interface for media systems due to their ability to provide local interactivity and to interpose one or more interface layers between the user and the selection of the media items to be viewed.
  • An example of such an interface can be found in U.S. Pat. No. 6,421,067 to Kamen et al., the disclosure of which is incorporated here by reference.
  • FIG. 2 depicts a GUI described in the '067 patent. Therein, according to the Kamen et al. description, a first column 190 lists program channels, a second column 191 depicts programs currently playing, a third column 192 depicts programs playing in the next half-hour, and a fourth column 193 depicts programs playing in the half hour after that.
  • The baseball bat icon 121 spans columns 191 and 192, thereby indicating that the baseball game is expected to continue into the time slot corresponding to column 192. The text block 111, however, does not extend into column 192, indicating that the football game is not expected to extend into the time slot corresponding to column 192.
  • A pictogram 194 indicates that after the football game, ABC will be showing a horse race.
  • The interfaces described above suffer from, among other drawbacks, an inability to easily scale between large collections of media items and small collections of media items.
  • Interfaces which rely on lists of items may work well for small collections of media items, but are tedious to browse for large collections of media items.
  • Interfaces which rely on hierarchical navigation may be more speedy to traverse than list interfaces for large collections of media items, but are not readily adaptable to small collections of media items.
  • Users tend to lose interest in selection processes wherein the user has to move through three or more layers in a tree structure. For all of these cases, current remote units make this selection process even more tedious by forcing the user to repeatedly depress the up and down buttons to navigate the list or hierarchies.
  • Even when selection-skipping controls such as page up and page down are available, the user usually has to look at the remote to find these special buttons or be trained to know that they even exist.
  • Systems and methods according to the present invention address these needs and others by providing a user interface displayed on a screen with a plurality of control elements, at least some of the plurality of control elements having at least one alphanumeric character displayed thereon.
  • The layout of the plurality of groups on the user interface is based on a first number of groups which are displayed, and a layout of the displayed items within a group is based on a second number of items displayed within that group.
  • A method for laying out items in a user interface includes the steps of: laying out a plurality of groups of items within a group display space, the groups being laid out within the display space in a pattern which varies as a function of the number of the plurality of groups; and laying out, for each of the plurality of groups, a plurality of items within an item display space associated with a respective one of the plurality of groups, the items being laid out within a respective item display space in a pattern which varies as a function of the number of the plurality of items.
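  • As a minimal sketch of this two-level method (in Python; every name below is invented for illustration and none of it comes from the patent), a group-level rule chosen by the number of groups can be paired with an item-level rule chosen by the number of items in each group:
```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]


@dataclass
class Rect:
    """Axis-aligned display region; x, y is the top-left corner."""
    x: float
    y: float
    width: float
    height: float


def lay_out(group_sizes: List[int],
            group_space: Rect,
            group_rule: Callable[[int, Rect], List[Rect]],
            item_rule: Callable[[int, Rect], List[Point]]) -> Dict[int, List[Point]]:
    """Two-level layout: the pattern of groups depends on how many groups
    there are, and each group's item pattern depends on how many items
    that group holds."""
    item_spaces = group_rule(len(group_sizes), group_space)   # one item display space per group
    return {
        index: item_rule(count, space)                        # item center points within that space
        for index, (count, space) in enumerate(zip(group_sizes, item_spaces))
    }
```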
  • FIG. 1 depicts a conventional remote control unit for an entertainment system
  • FIG. 2 depicts a conventional graphical user interface for an entertainment system
  • FIG. 3 depicts an exemplary media system in which exemplary embodiments of the present invention (both display and remote control) can be implemented;
  • FIG. 4 shows a system controller of FIG. 3 in more detail
  • FIGS. 5-8 depict a graphical user interface for a media system according to an exemplary embodiment of the present invention.
  • FIGS. 9-13 depict a zoomable graphical user interface according to another exemplary embodiment of the present invention.
  • FIG. 14( a ) illustrates a user interface for searching and displaying search results in a graphical layout according to exemplary embodiments of the present invention
  • FIG. 14( b ) illustrates an abstraction of a user interface for searching and displaying search results in a graphical layout according to exemplary embodiments of the present invention
  • FIGS. 15(a)-(n) illustrate groups containing items with overlap according to exemplary embodiments of the present invention
  • FIGS. 16( a ) and 16 ( b ) illustrate a hoverzoom effect according to exemplary embodiments of the present invention
  • FIGS. 17( a - h ) illustrate groups containing items without overlap according to exemplary embodiments of the present invention
  • FIG. 18 illustrates groups with vertical overlapping in a user interface according to exemplary embodiments of the present invention.
  • FIG. 19 illustrates groups with vertical overlapping in a user interface according to exemplary embodiments of the present invention.
  • an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 3 .
  • the I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components.
  • the I/O bus 210 may include an appropriate number of independent audio “patch” cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or infrared or radio frequency transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.
  • the media system 200 includes a television/monitor 212 , a video cassette recorder (VCR) 214 , digital video disk (DVD) recorder/playback device 216 , audio/video tuner 218 and compact disk player 220 coupled to the I/O bus 210 .
  • the VCR 214 , DVD 216 and compact disk player 220 may be single disk or single cassette devices, or alternatively may be multiple disk or multiple cassette devices. They may be independent units or integrated together.
  • the media system 200 includes a microphone/speaker system 222 , video camera 224 and a wireless I/O control device 226 .
  • the wireless I/O control device 226 is a media system remote control unit that supports 3D pointing, has a minimal number of buttons to support navigation, and communicates with the entertainment system 200 through RF signals.
  • wireless I/O control device 226 can be a 3D pointing device which uses a gyroscope or other mechanism to define both a screen position and a motion vector to determine the particular command desired.
  • a set of buttons can also be included on the wireless I/O device 226 to initiate the “click” primitive described below as well as a “back” button.
  • In another exemplary embodiment, the wireless I/O control device 226 is a media system remote control unit which communicates with the components of the entertainment system 200 through IR signals.
  • In such an embodiment, the wireless I/O control device 226 may be an IR remote control device similar in appearance to a typical entertainment system remote control, with the added feature of a track-ball or other navigational mechanism which allows a user to position a cursor on a display of the entertainment system 200.
  • the entertainment system 200 also includes a system controller 228 .
  • the system controller 228 operates to store and display entertainment system data available from a plurality of entertainment system data sources and to control a wide variety of features associated with each of the system components.
  • system controller 228 is coupled, either directly or indirectly, to each of the system components, as necessary, through I/O bus 210 .
  • In addition to, or in place of, the I/O bus 210, the system controller 228 can be configured with a wireless communication transmitter (or transceiver), which is capable of communicating with the system components via IR signals or RF signals. Regardless of the control medium, the system controller 228 is configured to control the media components of the media system 200 via a graphical user interface described below.
  • media system 200 may be configured to receive media items from various media sources and service providers.
  • media system 200 receives media input from and, optionally, sends information to, any or all of the following sources: cable broadcast 230 , satellite broadcast 232 (e.g., via a satellite dish), very high frequency (VHF) or ultra high frequency (UHF) radio frequency communication of the broadcast television networks 234 (e.g., via an aerial antenna), telephone network 236 and cable modem 238 (or another source of Internet content).
  • FIG. 4 is a block diagram illustrating an embodiment of an exemplary system controller 228 according to the present invention.
  • System controller 228 can, for example, be implemented as a set-top box and includes, for example, a processor 300 , memory 302 , a display controller 304 , other device controllers (e.g., associated with the other components of system 200 ), one or more data storage devices 308 and an I/O interface 310 . These components communicate with the processor 300 via bus 312 .
  • processor 300 can be implemented using one or more processing units.
  • Memory device(s) 302 may include, for example, DRAM or SRAM, ROM, some of which may be designated as cache memory, which store software to be run by processor 300 and/or data usable by such programs, including software and/or data associated with the graphical user interfaces described below.
  • Display controller 304 is operable by processor 300 to control the display of monitor 212 to, among other things, display GUI screens and objects as described below. Zoomable GUIs according to exemplary embodiments of the present invention provide resolution independent zooming, so that monitor 212 can provide displays at any resolution.
  • Device controllers 306 provide an interface between the other components of the media system 200 and the processor 300 .
  • Data storage 308 may include one or more of a hard disk drive, a floppy disk drive, a CD-ROM device, or other mass storage device.
  • Input/output interface 310 may include one or more of a plurality of interfaces including, for example, a keyboard interface, an RF interface, an IR interface and a microphone/speech interface. According to one exemplary embodiment of the present invention, I/O interface 310 will include an interface for receiving location information associated with movement of a wireless pointing device.
  • Generation and control of a graphical user interface according to exemplary embodiments of the present invention to display media item selection information is performed by the system controller 228 in response to the processor 300 executing sequences of instructions contained in the memory 302 .
  • Such instructions may be read into the memory 302 from other computer-readable mediums such as data storage device(s) 308 or from a computer connected externally to the media system 200 .
  • Execution of the sequences of instructions contained in the memory 302 causes the processor to generate graphical user interface objects and controls, among other things, on monitor 212 .
  • hard-wire circuitry may be used in place of or in combination with software instructions to implement the present invention.
  • control frameworks described herein overcome these limitations and are, therefore, intended for use with televisions, albeit not exclusively. It is also anticipated that the revolutionary control frameworks, graphical user interfaces and/or various algorithms described herein will find applicability to interfaces which may be used with computers and other non-television devices.
  • the terms “television” and “TV” are used in this specification to refer to a subset of display devices, whereas the terms “GUI”, “GUI screen”, “display” and “display screen” are intended to be generic and refer to television displays, computer displays and any other display device. More specifically, the terms “television” and “TV” are intended to refer to the subset of display devices which are able to display television signals (e.g., NTSC signals, PAL signals or SECAM signals) without using an adapter to translate television signals into another format (e.g., computer video formats).
  • TV refers to a subset of display devices that are generally viewed from a distance of several feet or more (e.g., sofa to a family room TV) whereas computer displays are generally viewed close-up (e.g., chair to a desktop monitor).
  • a user interface displays selectable items which can be grouped by category. A user points a remote unit at the category or categories of interest and depresses the selection button to zoom in or the “back” button to zoom back.
  • each zoom in, or zoom back, action by a user results in a change in the magnification level and/or context of the selectable items rendered by the user interface on the screen.
  • each change in magnification level can be consistent, i.e., the changes in magnification level are provided in predetermined steps.
  • Exemplary embodiments of the present invention also provide for user interfaces which incorporate several visual techniques to achieve scaling to the very large. These techniques involve a combination of building blocks and techniques that achieve both scalability and ease-of-use, in particular techniques which supply an easy and fast selection experience regardless of the size(s) of the media item collection(s) being browsed.
  • the user interface is largely a visual experience.
  • exemplary embodiments of the present invention make use of the capability of the user to remember the location of objects within the visual environment. This is achieved by providing a stable, dependable location for user interface selection items, which is at the same time pleasing to the user and efficiently uses the allocated display space.
  • Each object or item has a location in the zoomable layout, which location can be selected according to layout rules described below with respect to FIGS. 14-19 .
  • User interfaces provide visual mnemonics that help the user remember the location of items of interest.
  • visual mnemonics include pan and zoom animations, transition effects which generate a geographic sense of movement across the user interface's virtual surface and consistent zooming functionality, among other things which will become more apparent based on the examples described below.
  • FIG. 5 portrays the zoomable GUI at a high level e.g., a second “most zoomed out” state.
  • the interface displays a set of shapes 500 . Displayed within each shape 500 are text 502 and/or a picture 504 that describe the group of media item selections accessible via that portion of the GUI. As shown in FIG. 5 , the shapes 500 are rectangles, and text 502 and/or picture 504 describe the genre of the media.
  • GUI grouping could represent other aspects of the media selections available to the user e.g., artist, year produced, area of residence for the artist, length of the item, or any other characteristic of the selection.
  • the shapes used to outline the various groupings in the GUI need not be rectangles.
  • Shrunk down versions of album covers and other icons could be used to provide further navigational hints to the user in lieu of or in addition to text 502 and/or picture 504 within the shape groupings 500 .
  • a background portion of the GUI 506 can be displayed as a solid color or be a part of a picture such as a map to aid the user in remembering the spatial location of genres so as to make future uses of the interface require less reading.
  • the selection pointer (cursor) 508 follows the movements of an input device and indicates the location to zoom in on when the user presses the button on the device (not shown in FIG. 5 ).
  • the input device can be a 3D pointing device, e.g., the 3D pointing device described in U.S. patent application Ser. No. 11/119,663, filed on May 2, 2005, entitled “3D Pointing Devices and Methods”, the disclosure of which is incorporated here by reference and which is hereafter referred to as the “'663 application”, coupled with a graphical user interface that supports the point, click, scroll, hover and zoom building blocks which are described in more detail below.
  • One feature of this exemplary input device that is beneficial for use in conjunction with the present invention is that its buttons can be configured such that one button serves as a ZOOM IN (select) button and another serves as a ZOOM OUT (back) button.
  • the present invention simplifies this aspect of the GUI by greatly reducing the number of buttons, etc., that a user is confronted with in making his or her media item selection.
  • An additional preferred, but not required, feature of input devices according to exemplary embodiments of the present invention is that they provide “3D pointing” capability for the user.
  • 3D pointing is used in this specification to refer to the ability of a user to freely move the input device in three (or more) dimensions in the air in front of the display screen and the corresponding ability of the user interface to translate those motions directly into movement of a cursor on the screen.
  • 3D pointing differs from conventional computer mouse pointing techniques which use a surface other than the display screen, e.g., a desk surface or mousepad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display screen.
  • Use of 3D pointing in control frameworks according to exemplary embodiments of the present invention further simplifies the user's selection experience, while at the same time providing an opportunity to introduce gestures as distinguishable inputs to the interface.
  • a gesture can be considered as a recognizable pattern of movement over time which pattern can be translated into a GUI command, e.g., a function of movement in the x, y, z, yaw, pitch and roll dimensions or any subcombination thereof.
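  • For illustration only (the patent does not define any particular recognizer, and every name, threshold and command mapping below is an assumption), such a pattern could be detected by reducing a window of motion samples to a GUI command:
```python
from typing import List, NamedTuple, Optional


class MotionSample(NamedTuple):
    """One sample of device movement in the x, y, z, yaw, pitch and roll dimensions."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float


def classify_gesture(samples: List[MotionSample], threshold: float = 0.5) -> Optional[str]:
    """Reduce a window of samples to a GUI command by looking at the net
    movement over time; the specific mapping used here is purely illustrative."""
    if len(samples) < 2:
        return None
    dx = samples[-1].x - samples[0].x
    d_roll = samples[-1].roll - samples[0].roll
    if abs(dx) > threshold and abs(dx) >= abs(d_roll):
        return "PAN_RIGHT" if dx > 0 else "PAN_LEFT"
    if abs(d_roll) > threshold:
        return "ZOOM_IN" if d_roll > 0 else "ZOOM_OUT"
    return None
```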
  • Any suitable input device can be used in conjunction with zoomable GUIs according to the present invention.
  • Other examples of suitable input devices include, but are not limited to, trackballs, touchpads, conventional TV remote control devices, speech input, any devices which can communicate/translate a user's gestures into GUI commands, or any combination thereof. It is intended that each aspect of the GUI functionality described herein can be actuated in frameworks according to the present invention using at least one of a gesture and a speech command. Alternate implementations include using cursor and/or other remote control keys or even speech input to identify items for selection.
  • FIG. 6 shows a zoomed in view of Genre 3 that would be displayed if the user selects Genre 3 from FIG. 5 , e.g., by moving the cursor 508 over the area encompassed by the rectangle surrounding Genre 3 on display 212 and depressing a button on the input device.
  • the interface can animate the zoom from FIG. 5 to FIG. 6 so that it is clear to the user that a zoom occurred.
  • An example of such an animated zoom/transition effect is described below.
  • the unselected genres 515 that were adjacent to Genre 3 in the zoomed out view of FIG. 5 are still adjacent to Genre 3 in the zoomed in view, but are clipped by the edge of the display 212 . These unselected genres can be quickly navigated to by selection of them with selection pointer 508 . It will be appreciated, however, that other exemplary embodiments of the present invention can omit clipping neighboring objects and, instead, present only the unclipped selections.
  • Each of the artist groups, e.g., group 512 can contain images of shrunk album covers, a picture of the artist or customizable artwork by the user in the case that the category contains playlists created by the user.
  • FIG. 7 shows a further zoomed in view in response to a user selection of Artist 3 via positioning of cursor 508 and actuation of the input device, in which images of album covers 520 come into view.
  • the unselected, adjacent artists (artists # 2 , 6 and 7 in this example) are shown towards the side of the zoomed in display, and the user can click on these with selection pointer 508 to pan to these artist views.
  • artist information 524 can be displayed as an item in the artist group.
  • This information may contain, for example, the artist's picture, biography, trivia, discography, influences, links to web sites and other pertinent data.
  • Each of the album images 520 can contain a picture of the album cover and, optionally, textual data.
  • the graphical user interface can display a picture which is selected automatically by the interface or preselected by the user.
  • the interface zooms into the album cover as shown in FIG. 8 .
  • The album cover can fade or morph into a view that contains items such as the artist and title of the album 530, a list of tracks 532, further information about the album 536, a smaller version of the album cover 528, and controls 534 to play back the content, modify the categorization, link to the artist's web page, or find any other information about the selection.
  • Neighboring albums 538 are shown that can be selected using selection pointer 508 to cause the interface to bring them into view.
  • alternative embodiments of the present invention can, for example, zoom in to only display the selected object, e.g., album 5 , and omit the clipped portions of the unselected objects, e.g., albums 4 and 6 .
  • This final zoom provides an example of semantic zooming, wherein certain GUI elements are revealed that were not previously visible at the previous zoom level.
  • Various techniques for performing semantic zooming according to exemplary embodiments of the present invention are provided below.
  • this exemplary embodiment of a graphical user interface provides for navigation of a music collection.
  • Interfaces according to the present invention can also be used for video collections such as for DVDs, VHS tapes, other recorded media, video-on-demand, video segments and home movies.
  • Other audio uses include navigation of radio shows, instructional tapes, historical archives, and sound clip collections.
  • Print or text media such as news stories and electronic books can also be organized and accessed using this invention.
  • zoomable graphical user interfaces provide users with the capability to browse a large (or small) number of media items rapidly and easily.
  • This capability is attributable to many characteristics of interfaces according to exemplary embodiments of the present invention including, but not limited to: (1) the use of images as all or part of the selection information for a particular media item, (2) the use of zooming to rapidly provide as much or as little information as a user needs to make a selection and (3) the use of several GUI techniques which combine to give the user the sense that the entire interface resides on a single plane, such that navigation of the GUI can be accomplished, and remembered, by way of the user's sense of direction.
  • GUI screen refers to a set of GUI objects rendered on one or more display units at the same time.
  • a GUI screen may be rendered on the same display which outputs media items, or it may be rendered on a different display.
  • the display can be a TV display, computer monitor or any other suitable GUI output device.
  • Another GUI effect which enhances the user's sense of GUI screen connectivity is the panning animation effect, which is invoked when a zoom is performed or when the user selects an adjacent object at the same zoom level as the currently selected object.
  • The zoom-in process is animated to convey the shifting of the point-of-view (POV) center from point 550 to point 552.
  • This panning animation can be provided for every GUI change, e.g., from a change in zoom level or a change from one object to another object on the same GUI zoom level.
  • a panning animation would occur which would give the user the visual impression of “moving” left or west.
  • Exemplary embodiments of the present invention employ such techniques to provide a consistent sense of directional movement between GUI screens, which enables users to more rapidly navigate the GUI, both between zoom levels and between media items at the same zoom level.
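  • A hedged sketch of such a panning animation, interpolating the point-of-view center between two locations on the GUI's virtual surface (the easing curve, frame count and coordinates are all assumptions):
```python
def pan_frames(start, end, frame_count=30):
    """Yield interpolated view centers between two points on the GUI's virtual
    surface, using a smoothstep ease-in/ease-out so the pan reads as a smooth
    geographic movement rather than a jump."""
    for i in range(frame_count + 1):
        t = i / frame_count
        s = t * t * (3.0 - 2.0 * t)  # smoothstep easing
        yield (start[0] + (end[0] - start[0]) * s,
               start[1] + (end[1] - start[1]) * s)


# Example: animate a shift of the point-of-view center (coordinates assumed).
for cx, cy in pan_frames((0.0, 0.0), (640.0, 120.0), frame_count=4):
    print(f"render view centered at ({cx:.1f}, {cy:.1f})")
```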
  • A startup GUI screen 1400 displays a plurality of organizing objects which operate as media group representations.
  • The purely exemplary media group representations of home video, movies, TV, sports, radio, music and news could, of course, be replaced by different, more or fewer media group representations.
  • The GUI according to this exemplary embodiment will then display a plurality of images each grouped into a particular category or genre. For example, if the “movie” icon on the startup GUI screen 1400 is selected, the GUI screen of FIG. 10 can then be displayed. Therein, a large number, e.g., 120 or more, selection objects are displayed. These selection objects can be categorized into particular group(s), e.g., action, classics, comedy, drama, family and new releases. Those skilled in the art will appreciate that more or fewer categories could be provided.
  • the media item images can be cover art associated with each movie selection. Although the size of the blocks in FIG. 10 is too small to permit detailed illustration of this relatively large group of selection item images, in implementation, the level of magnification of the images is such that the identity of the movie can be discerned by its associated image, even if some or all of the text may be too small to be easily read.
  • the cursor (not shown in FIG. 10 ) can then be disposed over a group of the movie images and the input device actuated to provide a selection indication for one of the groups.
  • the user selects the drama group and the graphical user interface then displays a zoomed version of the drama group of images as seen in FIG. 11 .
  • A transition effect can also be displayed as the GUI shifts from the GUI screen of FIG. 10 to the GUI screen of FIG. 11, e.g., the GUI may pan the view from the center of the GUI screen of FIG. 10 to the center of the drama group of images during or prior to the zoom. Note that although the zoomed version of the drama group of FIG. 11 only displays a subset of the total number of images in the drama group, this zoomed version can alternatively contain all of the images in the selected group.
  • the choice of whether or not to display all of the images in a selected group in any given zoomed in version of a GUI screen can be made based upon, for example, the number of media items in a group and a minimum desirable magnification level for a media item for a particular zoom level.
  • This latter characteristic of GUIs according to the present invention can be predetermined by the system designer/service provider or can be user customizable via software settings in the GUI.
  • The number of media items in a group and the minimum and/or maximum magnification levels can be configurable by either or both of the service provider and the end user.
  • Such features enable users with, for example, poor eyesight to increase the magnification level of the media items being displayed. Conversely, users with especially keen eyesight may decrease the level of magnification, thereby increasing the number of media items displayed on a GUI screen at any one time and decreasing browsing time.
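  • One way to implement that trade-off is sketched below (the function name, parameters and numbers are assumptions, not the patent's):
```python
def items_to_display(total_items: int, area_width: float, area_height: float,
                     min_item_size: float) -> int:
    """Decide how many of a group's items to show at the current zoom level,
    given a minimum desirable on-screen item size. Raising min_item_size
    (e.g., for a user who prefers larger images) reduces the number of items
    shown per screen; lowering it shows more items and shortens browsing."""
    columns = max(1, int(area_width // min_item_size))
    rows = max(1, int(area_height // min_item_size))
    return min(total_items, columns * rows)


print(items_to_display(120, 1280.0, 520.0, min_item_size=96.0))  # -> 65, i.e. a subset is shown
print(items_to_display(120, 1280.0, 520.0, min_item_size=72.0))  # -> 119, i.e. nearly all items
```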
  • Another transition effect which can be employed in graphical user interfaces according to the present invention is referred to herein as the “shoe-to-detail” view effect.
  • this transition effect takes a zoomed out image and simultaneously shrinks and translates the zoomed out image into a smaller view, i.e., the next higher level of magnification.
  • the transition from the magnification level used in the GUI screen of FIG. 10 to the greater magnification level used in the GUI screen of FIG. 11 results in additional details being revealed by the GUI for the images which are displayed in the zoomed in version of FIG. 11 .
  • the GUI selectively reveals or hides details at each zoom level based upon whether or not those details would display well at the currently selected zoom level.
  • exemplary embodiments of the present invention provide for a configurable zoom level parameter that specifies a transition point between when to show the full image and when to show a version of the image with details that are withheld.
  • The transition point can be based upon an internal, resolution-independent depiction of the image rather than the resolution of the TV/monitor 212.
  • GUIs according to the present invention are consistent regardless of the resolution of the display device being used in the media system.
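  • A minimal sketch of such a transition-point test, assuming the configurable parameter reduces to a single scale threshold measured in the resolution-independent depiction (names and values are illustrative):
```python
def detail_level(virtual_scale: float, transition_point: float = 2.0) -> str:
    """Semantic-zoom sketch: virtual_scale is the item's current size relative
    to its base size, measured in the internal resolution-independent depiction
    rather than in display pixels, so the same decision is made on any monitor.
    transition_point stands in for the configurable zoom level parameter."""
    return "full_details" if virtual_scale >= transition_point else "image_only"


for scale in (0.5, 1.0, 2.5):
    print(scale, "->", detail_level(scale))
```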
  • An additional amount of magnification for a particular image can be provided by passing the cursor over that image. This feature can be seen in FIG. 12, wherein the cursor has rolled over the image for the movie “Apollo 13”. Although not depicted in FIG. 12, such additional magnification could, for example, make more legible the quote “Houston, we have a problem” which appears on the cover art of the associated media item, as compared to the corresponding image in the GUI screen of FIG. 12, which is at a lower level of magnification. User selection of this image, e.g., by depressing a button on the input device, can result in a further zoom to display the details shown in FIG. 13.
  • This GUI screen includes GUI control objects including, for example, button control objects for buying the movie, watching a trailer or returning to the previous GUI screen (which could also be accomplished by depressing the ZOOM OUT button on the input device).
  • Hyperlinks can also be used to allow the user to jump to, for example, GUI screens associated with the related movies identified in the lower right hand corner of the GUI screen of FIG. 13. For example, film titles under the heading “Filmography” can be implemented as hyperlinks which, when actuated by the user via the input device, will cause the GUI to display a GUI screen corresponding to that of FIG. 13 for the indicated movie.
  • a transition effect can also be employed when a user actuates a hyperlink. Since the hyperlinks may be generated at very high magnification levels, simply jumping to the linked media item may cause the user to lose track of where he or she is in the media item selection “map”. Accordingly, exemplary embodiments of the present invention provide a transition effect to aid in maintaining the user's sense of geographic position when a hyperlink is actuated.
  • One exemplary transition effect which can be employed for this purpose is a hop transition.
  • In the first phase, the GUI zooms out and pans in the direction of the item pointed to by the hyperlink. Zooming out and panning continues until both the destination image and the origination image are viewable by the user. Using the example of FIG. 13, the first phase of the hyperlink hop effect would include zooming out and panning toward the image of “Saving Private Ryan” until both the image for “Saving Private Ryan” and the image for “Apollo 13” were visible to the user.
  • At this point, the transition effect has provided the user with the visual impression of being moved upwardly in an arc toward the destination image.
  • The second phase of the transition effect gives the user the visual impression of zooming in and panning, e.g., along the other half of the arc, to the destination image.
  • The hop time, i.e., the amount of time during which both phases one and two of this transition effect are displayed, can be fixed as between any two hyperlinked image items.
  • Alternatively, the hop time may vary, e.g., based on the distance traveled over the GUI.
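  • The timing could be realized along these lines (a sketch only; the constants and function names are assumptions):
```python
import math


def hop_time(origin, destination, base_seconds=0.6, seconds_per_unit=0.0005):
    """Variant in which the hop duration grows with the distance traveled over
    the GUI's virtual surface; the constants are illustrative only. A fixed
    hop time would simply ignore the distance term."""
    return base_seconds + seconds_per_unit * math.dist(origin, destination)


def hop_phases(origin, destination):
    """Split the hop into its two phases: zoom out and pan until both the
    origination and destination images are visible, then zoom in on the
    destination (the second half of the arc)."""
    total = hop_time(origin, destination)
    return [("zoom_out_and_pan", total / 2.0), ("zoom_in_on_destination", total / 2.0)]


print(hop_phases((100.0, 100.0), (1500.0, 400.0)))
```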
  • A graphical layout deals with, for example, the number, size and specific arrangement of items on, e.g., a TV screen.
  • Layouts are generally composed of two opposing factors, equilibrium and form. Equilibrium is achieved when objects are uniformly and symmetrically distributed which is the lowest energy state. The mind strives for equilibrium when trying to deal with complexity; however, a layout in total equilibrium is usually considered boring. To add interest to a layout, equilibrium is perturbed by introducing form.
  • Centricity pertains to the central location while eccentricity pertains to locations in the layout which are offset from the center.
  • A user interface providing a searching mechanism for searching among selectable media items can generate a displayed screen such as that illustrated in FIG. 14(a).
  • This exemplary embodiment of the present invention incorporates image overlapping and a black border added to each image, although these effects could also be used independently of one another.
  • The border helps the eye delineate each object within an overlapping layout. Overlapping increases the perception of “belonging” that is desirable for displaying a unified group, and adds a three-dimensional (3D) effect to the layout.
  • The overlapping feature also provides additional freedom in placement rules by making better use of the space between images and groups and allowing the displayed images to be scaled significantly larger than they could be if placed separately.
  • The exemplary GUI screen 2000 depicted in FIG. 14(a) contains a text entry widget including a plurality of control elements 2004, with at least some of the control elements 2004 being drawn as keys or buttons having alphanumeric characters 2014 thereon, and other control elements 2004 being drawn on the interface as having non-alphanumeric characters 2016 which can be used, e.g., to control character entry.
  • The control elements 2004 are laid out in two horizontal rows across the interface, although other configurations may be used.
  • When a control element 2004 having an alphanumeric character is actuated, the corresponding alphanumeric input is displayed in the textbox 2002, disposed above the text entry widget, and one or more groups of displayed items related to the alphanumeric input provided via the control element(s) can be displayed on the interface, e.g., below the text entry widget.
  • The GUI screen depicted in FIG. 14(a) can be used to search for selectable media items, and to graphically display the results of the search on a GUI screen, in a manner that is useful, efficient and pleasing to the user. (Note that in the illustrated example of FIG. 14(a), the displayed movie cover images below the text entry widget simply represent a test pattern and are not necessarily related to the input letter “g” as they could be in an implementation, e.g., the displayed movie covers could be only those whose movie titles start with the letter “g”.)
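  • As a sketch of the filtering behavior the parenthetical mentions as one possible implementation (titles matched against the typed text), with all names and sample data assumed:
```python
def matching_titles(titles, typed_text):
    """Return the selectable items whose titles start with the text entered so
    far via the on-screen control elements (case-insensitive prefix match)."""
    prefix = typed_text.lower()
    return [title for title in titles if title.lower().startswith(prefix)]


catalog = ["Gladiator", "Goodfellas", "Apollo 13", "Saving Private Ryan"]  # assumed sample data
print(matching_titles(catalog, "g"))  # ['Gladiator', 'Goodfellas']
```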
  • The layout of the four groups 2006, 2008, 2010 and 2012 of displayed items within the region of the user interface allocated for displaying search results is based on the number of groups which are displayed, and the layout of displayed items (in this example, images of movie covers) within each group is based upon the number of items displayed within each group.
  • The term “group layout” refers to the layout of groups of displayed items within a group display area.
  • In this example, the group display area is the portion of the display screen between the bottom of the text entry widget and the bottom of the display screen (with suitable margins).
  • The term “item layout” refers to the layout of items within each group and, more specifically, within an item display area associated with each group.
  • The four displayed groups of items 2006, 2008, 2010 and 2012 have a group layout within a group display area 2020 that is substantially trapezoidal, e.g., connecting the center points of the groups 2006, 2008, 2010 and 2012 will form a trapezoid.
  • Left-center display group 2008 and right-center display group 2010 are raised (placed further from the bottom) within the group display area 2020 relative to the leftmost group 2006 and rightmost group 2012 .
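  • A sketch of such a trapezoidal layout for four groups; only the raised inner groups come from the description above, and the specific fractions are assumptions:
```python
def four_group_centers(area_width: float, area_height: float, raise_fraction: float = 0.15):
    """Place four group centers so that connecting them forms a trapezoid: the
    two inner groups sit higher (further from the bottom of the group display
    area) than the leftmost and rightmost groups. The fractions used here are
    assumptions, chosen only to illustrate the shape."""
    xs = [area_width * f for f in (0.125, 0.375, 0.625, 0.875)]  # evenly spaced columns
    outer_y = area_height * 0.55                                  # y measured downward from the top
    inner_y = outer_y - area_height * raise_fraction              # raised center groups
    return [(xs[0], outer_y), (xs[1], inner_y), (xs[2], inner_y), (xs[3], outer_y)]


print(four_group_centers(1280.0, 500.0))
```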
  • FIG. 14(b) also depicts the item display areas associated with each respective group, i.e., the areas within which the items associated with each group are laid out. As seen in FIG. 14(a), these regions need not be explicitly displayed on the GUI screen 2000 (although according to other exemplary embodiments, item display area boundaries may be displayed). Moreover, those skilled in the art will appreciate that the item display areas (as well as the group display area) need not be rectangular in shape, as in the example of FIG. 14(b), but can be any desired shape.
  • The groups' displayed items are laid out according to item layout rules. An exemplary set of overlapping item layouts is shown in FIGS. 15(a)-(n), and exemplary rules for displaying these item layouts are described below. Each rule is based, at least in part, on the number of items within the group.
  • Group of 2 items: For a group consisting of two displayed items 3010 and 3012, illustrated in FIG. 15(c), the two displayed items are laid out in the item display area by aligning the center points of the two items on a diagonal within the item display region 3004. This is shown conceptually in FIG. 15(d), wherein a center point 3014 of item 3010 is disposed up and to the left of the center point 3008 of the item display region 3004 on diagonal 3015, while a center point 3016 of item 3012 is disposed down and to the right of center point 3008.
  • Group of 3 items: For a group consisting of three displayed items 3018, 3020 and 3022, illustrated in FIG. 15(e), the three displayed items are laid out in the item display area by aligning the center points of the three items on the circumference of a circle within the item display region 3004. This is shown conceptually in FIG. 15(f), wherein a center point 3024 of item 3018 is disposed above the center point 3008 of the item display region 3004 on the circumference of circle 3030, while a center point 3026 of item 3020 is disposed down and to the left of center point 3008 and a center point 3028 of item 3022 is disposed down and to the right of center point 3008.
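  • The three-item placement just described could be computed as points on a circle, for example as follows (the radius, start angle and y-down screen convention are assumptions):
```python
import math


def circle_layout(item_count: int, region_center, radius: float):
    """Place item center points on the circumference of a circle around the
    item display region's center. With item_count == 3, a start angle of
    -90 degrees and y increasing downward as on a screen, the points land
    above, down-right and down-left of the center, matching the three-item
    rule above."""
    cx, cy = region_center
    points = []
    for i in range(item_count):
        angle = math.radians(-90.0 + i * 360.0 / item_count)
        points.append((cx + radius * math.cos(angle), cy + radius * math.sin(angle)))
    return points


print(circle_layout(3, (400.0, 300.0), 60.0))
```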
  • Group of 4 items: For a group consisting of four displayed items 3032, 3034, 3036 and 3038, illustrated in FIG. 15(g), the four displayed items are laid out in the item display area by aligning the center points of the four items on the corners of a rhombus within the item display region 3004. This is shown conceptually in FIG. 15(h).
  • Group of 5 items: For a group consisting of five displayed items 3050, 3052, 3054, 3056 and 3058, illustrated in FIG. 15(i), the five displayed items are laid out in the item display area 3004 by placing the center points of the five items on the right half of the circumference of an ellipse within the item display region 3004. This is shown conceptually in FIG. 15(j), wherein a center point 3060 of item 3050 is disposed up and to the left of the center point 3008 of the item display region 3004 on the right half of the circumference of an ellipse 3070, a center point 3062 of item 3052 is disposed above the center point 3008, a center point 3064 of item 3054 is disposed to the right of the center point 3008, a center point 3066 of item 3056 is disposed below the center point 3008, and a center point 3068 of item 3058 is disposed down and to the left of the center point 3008.
  • Group of 6 items: For a group consisting of six displayed items 3072, 3074, 3076, 3078, 3080 and 3082, illustrated in FIG. 15(k), the six displayed items are laid out in the item display area 3004 by creating a grid such that three items are arranged in an upper row above three items arranged in a lower row. An upper leftmost item 3072 is aligned above a lower leftmost item 3082 and a top edge of the upper leftmost item is higher than a top edge of an upper rightmost item.
  • the upper rightmost item 3076 is aligned above a lower rightmost item 3080
  • the lower rightmost item 3080 is aligned below the upper rightmost item 3076 and a bottom edge of the lower leftmost item is lower than a bottom edge of the lower rightmost item.
  • An upper center item 3074 is left of the center point of the group and overlaps both items in the upper row.
  • a lower center item 3078 is right of the center point of the group and overlaps both items in the lower row and overlaps the upper center item 3074 .
  • Group of 7 Items For a group consisting of seven displayed items, 3084 , 3086 , 3088 , 3090 , 3092 , 3094 and 3096 , illustrated in FIG. 15( l ), the seven displayed items are laid out in the item display area 3004 by creating a grid such that one item 3090 is in the center, three items are arranged in an upper row above the center item and three items are arranged below the center item in a lower row.
  • An upper leftmost item 3084 is slightly higher than an upper rightmost item 3088 and is aligned with a lower leftmost item 3096 .
  • The upper rightmost item 3088 is slightly lower than the upper leftmost item 3084 and is aligned with a lower rightmost item 3092 .
  • The lower rightmost item 3092 is in line with both the upper rightmost item 3088 and the lower leftmost item 3096 .
  • The top edge of the middle item in the upper row 3086 is higher than the top edges of both the upper leftmost item and the upper rightmost item.
  • The bottom edge of the middle item in the lower row 3094 is lower than the bottom edges of both the lower leftmost item 3096 and the lower rightmost item 3092 .
  • The center item 3090 can overlap all other items, and the middle items in either row overlap the other items in that row. The center item 3090 is left of the center point of the item display area 3004 , while the middle items are right of the center point of the item display area 3004 .
  • Group of 8 Items For a group consisting of eight displayed items, 3098 , 3100 , 3102 , 3104 , 3106 , 3108 , 3110 and 3112 , illustrated in FIG. 15( m ), the eight displayed items are laid out in the item display area 3004 by creating a grid such that two items are arranged in a middle row, three items are arranged in an upper row above the middle row and three items are arranged in a lower row below the middle row.
  • An upper leftmost item 3112 is aligned with both an upper rightmost item 3100 and a lower leftmost item 3108 .
  • The upper rightmost item 3100 is aligned with both the upper leftmost item 3112 and a lower rightmost item 3104 .
  • The lower rightmost item 3104 is aligned with both the upper rightmost item 3100 and the lower leftmost item 3108 .
  • A top edge of a middle item 3098 in the upper row is higher than the top edges of both the upper leftmost item and the upper rightmost item.
  • A bottom edge of a middle item 3106 in the lower row is lower than the bottom edges of both the lower leftmost item and the lower rightmost item. Overlapping occurs between the middle row and all other rows, and the center items in the upper and lower rows overlap the other items in their respective rows. Similar overlapping layout rules and algorithms can be applied to groups having more items, e.g., up to and including sixteen items, an example of which is shown in FIG. 15( n ).
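  • The count-based placements above (diagonal, circle, rhombus, ellipse and the small grids) can be encoded as a table of per-count layout functions. The following Python sketch is purely illustrative: the function name, offsets and radii are assumptions chosen to mirror the described geometry for groups of two to five items, not values taken from this disclosure.

```python
import math

def layout_item_centers(n, region_w, region_h):
    """Return illustrative center points (x, y) for n overlapping items,
    expressed in the coordinates of the item display region (y grows down)."""
    cx, cy = region_w / 2.0, region_h / 2.0
    if n == 2:
        # Centers of the two items aligned on a diagonal through the region center.
        d = min(region_w, region_h) * 0.18                 # assumed offset
        return [(cx - d, cy - d), (cx + d, cy + d)]
    if n == 3:
        # Centers on the circumference of a circle: one above the region center,
        # one down-and-left, one down-and-right.
        r = min(region_w, region_h) * 0.22                 # assumed radius
        angles = [-90.0, 150.0, 30.0]
        return [(cx + r * math.cos(math.radians(a)),
                 cy + r * math.sin(math.radians(a))) for a in angles]
    if n == 4:
        # Centers on the corners of a rhombus around the region center.
        dx, dy = region_w * 0.22, region_h * 0.18          # assumed half-diagonals
        return [(cx, cy - dy), (cx + dx, cy), (cx, cy + dy), (cx - dx, cy)]
    if n == 5:
        # Centers around an ellipse: up-left, above, right, below and down-left
        # of the region center.
        a_ax, b_ax = region_w * 0.25, region_h * 0.30      # assumed semi-axes
        angles = [-135.0, -90.0, 0.0, 90.0, 135.0]
        return [(cx + a_ax * math.cos(math.radians(t)),
                 cy + b_ax * math.sin(math.radians(t))) for t in angles]
    raise NotImplementedError("groups of six or more use the grid rules above")
```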
  • The layout rules described above are recursively applied to display four groups of search results, e.g., selectable media items represented by movie cover images.
  • The four groups include a leftmost group 2006 , a left-center group 2008 , a right-center group 2010 and a rightmost group 2012 .
  • When the user interface software prepares to display these groups on the GUI screen 2000 , it recursively applies the above-described rules to determine an appropriate display layout for the groups, as well as for the items within each group.
  • The left-center group 2008 has three items which are arranged such that the center points of the items are located on the circumference of a circle and each of the three items overlaps the others.
  • The other groups 2006 , 2010 and 2012 have their items laid out by applying the rule associated with the number of items in the respective group, as sketched below.
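  • As a rough illustration of this recursion, the sketch below first asks a group-level rule for the sub-regions of the screen and then asks an item-level rule for the center points inside each sub-region. The Region type, the rule tables and the example entries are hypothetical names used only for this sketch.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]

@dataclass
class Region:
    x: float
    y: float
    w: float
    h: float

# Hypothetical rule tables keyed by count: one returns sub-regions for groups,
# the other returns item center points inside a single group's region.
GROUP_RULES: Dict[int, Callable[[Region], List[Region]]] = {
    1: lambda r: [r],
    4: lambda r: [Region(r.x + i * r.w / 4, r.y, r.w / 4, r.h) for i in range(4)],
}
ITEM_RULES: Dict[int, Callable[[Region], List[Point]]] = {
    1: lambda r: [(r.x + r.w / 2, r.y + r.h / 2)],
    3: lambda r: [(r.x + r.w / 2, r.y + r.h / 4),
                  (r.x + r.w / 4, r.y + 3 * r.h / 4),
                  (r.x + 3 * r.w / 4, r.y + 3 * r.h / 4)],
}

def layout_screen(screen: Region, groups: List[List[str]]) -> Dict[str, Point]:
    """Apply the count-based rules recursively: place the group regions first,
    then place each group's items inside its own region."""
    placements: Dict[str, Point] = {}
    for group, region in zip(groups, GROUP_RULES[len(groups)](screen)):
        for item_id, center in zip(group, ITEM_RULES[len(group)](region)):
            placements[item_id] = center
    return placements

# Example: four groups of three movie covers each across a 1280x720 screen.
# print(layout_screen(Region(0, 0, 1280, 720),
#                     [[f"cover-{g}-{i}" for i in range(3)] for g in range(4)]))
```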
  • A hover zoom effect can be used in conjunction with overlapping images to allow the user to view the portion of the image obscured within the collage layout.
  • The group layout illustrated in FIG. 16( a ) involves a group of eight overlapping items.
  • When the user points to an image, it rises to the top and is scaled to a larger size ("hover zoom").
  • In FIG. 16( b ), the user has moved a cursor (not shown), or otherwise indicated an initial selection of the item 4002 (associated with the movie "The Pianist"), causing that image to rise above the item 4004 which was obscuring it in its originally displayed state and increasing the size of the item 4002 , as sketched below.
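  • The sketch below illustrates one plausible implementation of such a hover zoom: track which item the cursor is over, raise it above its overlapping neighbors, and scale it about its own center. The Item fields, the hit test, and the 1.3x factor are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Item:
    item_id: str
    x: float           # center x
    y: float           # center y
    w: float
    h: float
    z: int = 0         # draw order; higher values are drawn on top
    scale: float = 1.0

def contains(item: Item, px: float, py: float) -> bool:
    """Simple rectangular hit test against the item's current (scaled) bounds."""
    return (abs(px - item.x) <= item.w * item.scale / 2 and
            abs(py - item.y) <= item.h * item.scale / 2)

def apply_hover_zoom(items: List[Item], cursor_x: float, cursor_y: float,
                     zoom: float = 1.3) -> Optional[Item]:
    """Raise the hovered item above the collage and enlarge it, restoring
    every other item to its original size."""
    for item in items:
        item.scale = 1.0
    hovered = None
    for item in sorted(items, key=lambda i: i.z, reverse=True):
        if contains(item, cursor_x, cursor_y):
            hovered = item
            break
    if hovered is not None:
        hovered.z = max(i.z for i in items) + 1
        hovered.scale = zoom
    return hovered
```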
  • Overlap need not be used in displaying groups of items, e.g., when fewer items are displayed.
  • Another set of rules and algorithms can be applied for laying out items without use of overlap as depicted in FIGS. 17( a )-( h ).
  • Exemplary rules for displaying a single group with non-overlapping items based on the number of items in the group are presented below in Table 1.
  • TABLE 1
  • 1 item (FIG. 17(a)): The item 5004 is scaled to a size such that a second item of the same size and border area will not fit in the item display region 5002.
  • 2 items (FIG. 17(b)): The two items, 5006 and 5008, are scaled to fit side by side and aligned on a diagonal such that a third item will not fit in the item display region 5002.
  • 4 items (FIG. 17(d)): The four items, 5016, 5018, 5020 and 5022, are scaled and arranged in a circle such that a fifth item will not fit in the item display region 5002.
  • 5 items (FIG. 17(e)): The five items, 5024, 5026, 5028, 5030 and 5032, are arranged in a pyramid such that a sixth item will not fit in the item display region 5002.
  • 6 items (FIG. 17(f)): The six items, 5034, 5036, 5038, 5040, 5042 and 5044, are scaled in two rows forming a rectangle such that a seventh item will not fit in the item display region 5002.
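  • Read this way, each row of Table 1 is a scaling constraint: the items are made as large as possible while still guaranteeing that one additional item of the same size and border would not fit. The Python fragment below sketches that constraint for the two-item, side-by-side case of FIG. 17(b); the border width and aspect-ratio handling are assumptions.

```python
def side_by_side_size(region_w: float, region_h: float,
                      item_aspect: float, border: float = 4.0):
    """Largest (width, height) for two bordered items placed side by side so
    that they span the item display region; because the pair consumes the
    full width, a third item of the same size cannot also fit (a simplified
    reading of the FIG. 17(b) rule). item_aspect is width / height."""
    width = (region_w - 3 * border) / 2.0      # two items plus three border gaps
    height = width / item_aspect
    if height + 2 * border > region_h:         # also respect the region height
        height = region_h - 2 * border
        width = height * item_aspect
    return width, height
```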
  • An example is illustrated in FIG. 18 .
  • Horizontal overlapping is also used for the displayed results, e.g., offsetting the images from a vertical center of each group to add some eccentricity to the layout.
  • Image size affects the percentage of overlap.
  • The percentage of overlap used in generating GUI screens according to exemplary embodiments of the present invention can be a function of screen size, the number of items being displayed and/or user preference. Compare the layout of FIG. 19 with that of FIG. 18 .
  • The displayed items in FIG. 18 have a 50% horizontal overlap, while the (larger) displayed items in FIG. 19 have a 30% horizontal overlap. Another difference is that the items in FIG. 18 are stacked top to bottom (i.e., the topmost image in each vertical stack covers a portion of the second topmost image in each vertical stack, etc.), whereas the items in FIG. 19 are stacked bottom to top.
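  • Because the overlap percentage is described as a function of screen size, the number of items and user preference, it can be modeled as a small policy function. The thresholds below merely echo the 30% and 50% figures of FIGS. 18 and 19 and are otherwise assumed values.

```python
from typing import Optional

def horizontal_overlap(screen_width_px: int, items_per_row: int,
                       user_preference: Optional[float] = None) -> float:
    """Fraction of each item's width hidden behind its neighbor.

    Hypothetical policy: an explicit user preference wins; otherwise use less
    overlap (larger visible images) on wide screens with few items per row,
    and more overlap when many items must share a narrower screen.
    """
    if user_preference is not None:
        return max(0.0, min(0.8, user_preference))
    if screen_width_px >= 1920 and items_per_row <= 8:
        return 0.30   # comparable to the FIG. 19 layout
    return 0.50       # comparable to the FIG. 18 layout
```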
  • Systems and methods for processing data to generate layouts on user interfaces can be performed by one or more processors executing sequences of instructions contained in a memory device. Such instructions may be read into the memory device from other computer-readable mediums such as secondary data storage device(s). Execution of the sequences of instructions contained in the memory device causes the processor to operate, for example, as described above. In alternative embodiments, hard-wire circuitry may be used in place of or in combination with software instructions to implement the present invention.
  • The layout rules described herein can be encoded algorithmically in software and applied recursively, e.g., to recursively lay out groups within a group display area and items within an item display area.
  • The layout rules can be applied recursively to generate a layout for each layer.
  • Items are laid out in a display space in a manner which provides a pleasing appearance to the user, while at the same time making efficient use of limited display (e.g., TV screen) space to display more (and larger) images per layout.
  • A maximum of 128 movie covers can be shown on a single GUI screen. If a search result or particular GUI screen should display more than 128 movie cover images in such an exemplary user interface according to the present invention, then a scroll mechanism can be added to the interface to allow the user to scroll down beyond an initial display of 128 items, as sketched below.
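  • The 128-item ceiling with a scroll fallback amounts to a paging step performed before layout. The sketch below uses a hypothetical constant and helper name; it is an illustration of the idea rather than the interface's actual implementation.

```python
from typing import List, Tuple

MAX_ITEMS_PER_SCREEN = 128   # assumed ceiling matching the example above

def paginate_results(results: List[str]) -> Tuple[List[List[str]], bool]:
    """Split a result set into screen-sized pages and report whether a scroll
    mechanism is needed to reach items beyond the initial display."""
    pages = [results[i:i + MAX_ITEMS_PER_SCREEN]
             for i in range(0, len(results), MAX_ITEMS_PER_SCREEN)]
    return (pages or [[]]), len(results) > MAX_ITEMS_PER_SCREEN
```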

Abstract

Systems and methods according to the present invention provide layout structures and methods for user interfaces.

Description

    RELATED APPLICATIONS
  • This application is related to, and claims priority from, U.S. patent application Ser. No. 11/325,768, filed Jan. 5, 2006, which is related to, and claims priority from, U.S. Provisional Patent Application No. 60/641,421, filed on Jan. 5, 2005, entitled “Scaling and Layout Methods and Systems for Handling One-to-Many Objects”, the disclosure of which is incorporated here by reference.
  • BACKGROUND
  • The present invention describes a framework for organizing, selecting and launching media items. Part of that framework involves the design and operation of graphical user interfaces with the basic building blocks of point, click, scroll, hover and zoom and, more particularly, graphical user interfaces associated with media items which can be used with a three-dimensional (hereinafter "3D") pointing remote.
  • Technologies associated with the communication of information have evolved rapidly over the last several decades. Television, cellular telephony, the Internet and optical communication techniques (to name just a few things) combine to inundate consumers with available information and entertainment options. Taking television as an example, the last three decades have seen the introduction of cable television service, satellite television service, pay-per-view movies and video-on-demand. Whereas television viewers of the 1960s could typically receive perhaps four or five over-the-air TV channels on their television sets, today's TV watchers have the opportunity to select from hundreds and potentially thousands of channels of shows and information. Video-on-demand technology, currently used primarily in hotels and the like, provides the potential for in-home entertainment selection from among thousands of movie titles. Digital video recording (DVR) equipment such as that offered by TiVo, Inc., 2160 Gold Street, Alviso, Calif. 95002, further expands the available choices.
  • The technological ability to provide so much information and content to end users provides both opportunities and challenges to system designers and service providers. One challenge is that while end users typically prefer having more choices rather than fewer, this preference is counterweighted by their desire that the selection process be both fast and simple. Unfortunately, the development of the systems and interfaces by which end users access media items has resulted in selection processes which are neither fast nor simple. Consider again the example of television programs. When television was in its infancy, determining which program to watch was a relatively simple process primarily due to the small number of choices. One would consult a printed guide which was formatted, for example, as series of columns and rows which showed the correspondence between (1) nearby television channels, (2) programs being transmitted on those channels and (3) date and time. The television was tuned to the desired channel by adjusting a tuner knob and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as “channel surfing” whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.
  • Despite the fact that the number of channels and amount of viewable content has dramatically increased, the generally available user interface and control device options and framework for televisions has not changed much over the last 30 years. Printed guides are still the most prevalent mechanism for conveying programming information. The multiple button remote control with simple up and down arrows is still the most prevalent channel/content selection mechanism. The reaction of those who design and implement the TV user interface to the increase in available media content has been a straightforward extension of the existing selection procedures and interface objects. Thus, the number of rows and columns in the printed guides has been increased to accommodate more channels. The number of buttons on the remote control devices has been increased to support additional functionality and content handling, e.g., as shown in FIG. 1. However, this approach has significantly increased both the time required for a viewer to review the available information and the complexity of actions required to implement a selection. Arguably, the cumbersome nature of the existing interface has hampered commercial implementation of some services, e.g., video-on-demand, since consumers are resistant to new services that will add complexity to an interface that they view as already too slow and complex.
  • In addition to increases in bandwidth and content, the user interface bottleneck problem is being exacerbated by the aggregation of technologies. Consumers are reacting positively to having the option of buying integrated systems rather than a number of segregable components. A good example of this trend is the combination television/VCR/DVD in which three previously independent components are frequently sold today as an integrated unit. This trend is likely to continue, potentially with an end result that most if not all of the communication devices currently found in the household being packaged as an integrated unit, e.g., a television/VCR/DVD/internet access/radio/stereo unit. Even those who buy separate components desire seamless control of and interworking between them. With this increased aggregation comes the potential for more complexity in the user interface. For example, when so-called “universal” remote units were introduced, e.g., to combine the functionality of TV remote units and VCR remote units, the number of buttons on these universal remote units was typically more than the number of buttons on either the TV remote unit or VCR remote unit individually. This added number of buttons and functionality makes it very difficult to control anything but the simplest aspects of a TV or VCR without hunting for exactly the right button on the remote. Many times, these universal remotes do not provide enough buttons to access many levels of control or features unique to certain TVs. In these cases, the original device remote unit is still needed, and the original hassle of handling multiple remotes remains due to user interface issues arising from the complexity of aggregation. Some remote units have addressed this problem by adding “soft” buttons that can be programmed with the expert commands. These soft buttons sometimes have accompanying LCD displays to indicate their action. These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons. In these “moded” universal remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues including sending commands to the wrong device, forcing the user to look at the remote to make sure that it is in the right mode, and it does not provide any simplification to the integration of multiple devices. The most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
  • Some attempts have also been made to modernize the screen interface between end users and media systems. Electronic program guides (EPGs) have been developed and implemented to replace the afore-described media guides. Early EPGs provided what was essentially an electronic replica of the printed media guides. For example, cable service operators have provided analog EPGs wherein a dedicated channel displays a slowly scrolling grid of the channels and their associated programs over a certain time horizon, e.g., the next two hours. Scrolling through even one hundred channels in this way can be tedious and is not feasibly scalable to include significant additional content deployment, e.g., video-on-demand. More sophisticated digital EPGs have also been developed. In digital EPGs, program schedule information, and optionally applications/system software, is transmitted to dedicated EPG equipment, e.g., a digital set-top box (STB). Digital EPGs provide more flexibility in designing the user interface for media systems due to their ability to provide local interactivity and to interpose one or more interface layers between the user and the selection of the media items to be viewed. An example of such an interface can be found in U.S. Pat. No. 6,421,067 to Kamen et al., the disclosure of which is incorporated here by reference. FIG. 2 depicts a GUI described in the '067 patent. Therein, according to the Kamen et al. patent, a first column 190 lists program channels, a second column 191 depicts programs currently playing, a column 192 depicts programs playing in the next half-hour, and a fourth column 193 depicts programs playing in the half hour after that. The baseball bat icon 121 spans columns 191 and 192, thereby indicating that the baseball game is expected to continue into the time slot corresponding to column 192. However, text block 111 does not extend through into column 192. This indicates that the football game is not expected to extend into the time slot corresponding to column 192. As can be seen, a pictogram 194 indicates that after the football game, ABC will be showing a horse race. The icons shown in FIG. 2 can be actuated using a cursor, not shown, to implement various features, e.g., to download information associated with the selected programming. Other digital EPGs and related interfaces are described, for example, in U.S. Pat. Nos. 6,314,575, 6,412,110, and 6,577,350, the disclosures of which are also incorporated here by reference.
  • However, the interfaces described above suffer from, among other drawbacks, an inability to easily scale between large collections of media items and small collections of media items. For example, interfaces which rely on lists of items may work well for small collections of media items, but are tedious to browse for large collections of media items. Interfaces which rely on hierarchical navigation (e.g., tree structures) may be more speedy to traverse than list interfaces for large collections of media items, but are not readily adaptable to small collections of media items. Additionally, users tend to lose interest in selection processes wherein the user has to move through three or more layers in a tree structure. For all of these cases, current remote units make this selection process even more tedious by forcing the user to repeatedly depress the up and down buttons to navigate the list or hierarchies. When selection skipping controls are available such as page up and page down, the user usually has to look at the remote to find these special buttons or be trained to know that they even exist.
  • Organizing frameworks, techniques and systems which simplify the control and screen interface between users and media systems as well as accelerate the selection process have been described in U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”, the disclosure of which is incorporated here by reference and which is hereafter referred to as the “'432 application”. Such frameworks permit service providers to take advantage of the increases in available bandwidth to end user equipment by facilitating the supply of a large number of media items and new services to the user.
  • Thus, it would be desirable to provide interfaces which supply an easy and fast selection experience regardless of the size(s) of the media item collection(s) being browsed. One objective associated with such interfaces is to lay out the items in a manner which provides a pleasing appearance to the user. Another objective is to make better use of limited display (e.g., TV screen) space to display more and larger images per layout. Yet another objective is to automatically provide layouts of multiple groups having the same or varying sizes.
  • SUMMARY
  • Systems and methods according to the present invention address these needs and others by providing a user interface displayed on a screen with a plurality of control elements, at least some of the plurality of control elements having at least one alphanumeric character displayed thereon. The user interface also includes a text box for displaying alphanumeric characters entered using the plurality of control elements and a plurality of groups of displayed items. The layout of the plurality of groups on the user interface is based on a first number of groups which are displayed, and a layout of the displayed items within a group is based on a second number of items displayed within that group.
  • According to one exemplary embodiment of the present invention, a method for laying out items in a user interface includes the steps of: laying out a plurality of groups of items within a group display space, the groups being laid out within the display space in a pattern which varies as a function of the number of the plurality of groups, and laying out, for each of the plurality of groups, a plurality of items within an item display space associated with a respective one of the plurality of groups, the items being laid out within a respective item display space in a pattern which varies as a function of the number of the plurality of items.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings illustrate exemplary embodiments of the present invention, wherein:
  • FIG. 1 depicts a conventional remote control unit for an entertainment system;
  • FIG. 2 depicts a conventional graphical user interface for an entertainment system;
  • FIG. 3 depicts an exemplary media system in which exemplary embodiments of the present invention (both display and remote control) can be implemented;
  • FIG. 4 shows a system controller of FIG. 3 in more detail;
  • FIGS. 5-8 depict a graphical user interface for a media system according to an exemplary embodiment of the present invention;
  • FIGS. 9-13 depict a zoomable graphical user interface according to another exemplary embodiment of the present invention;
  • FIG. 14( a) illustrates a user interface for searching and displaying search results in a graphical layout according to exemplary embodiments of the present invention;
  • FIG. 14( b) illustrates an abstraction of a user interface for searching and displaying search results in a graphical layout according to exemplary embodiments of the present invention;
  • FIGS. 15( a-n) illustrate groups containing items with overlap according to exemplary embodiments of the present invention;
  • FIGS. 16( a) and 16(b) illustrate a hover zoom effect according to exemplary embodiments of the present invention;
  • FIGS. 17( a-h) illustrate groups containing items without overlap according to exemplary embodiments of the present invention;
  • FIG. 18 illustrates groups with vertical overlapping in a user interface according to exemplary embodiments of the present invention; and
  • FIG. 19 illustrates groups with vertical overlapping in a user interface according to exemplary embodiments of the present invention.
  • DETAILED DESCRIPTION
  • The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims.
  • In order to provide some context for this discussion, an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 3. Those skilled in the art will appreciate, however, that the present invention is not restricted to implementation in this type of media system and that more or fewer components can be included therein. Therein, an input/output (I/O) bus 210 connects the system components in the media system 200 together. The I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components. For example, the I/O bus 210 may include an appropriate number of independent audio “patch” cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or infrared or radio frequency transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.
  • In this exemplary embodiment, the media system 200 includes a television/monitor 212, a video cassette recorder (VCR) 214, digital video disk (DVD) recorder/playback device 216, audio/video tuner 218 and compact disk player 220 coupled to the I/O bus 210. The VCR 214, DVD 216 and compact disk player 220 may be single disk or single cassette devices, or alternatively may be multiple disk or multiple cassette devices. They may be independent units or integrated together. In addition, the media system 200 includes a microphone/speaker system 222, video camera 224 and a wireless I/O control device 226. According to exemplary embodiments of the present invention, the wireless I/O control device 226 is a media system remote control unit that supports 3D pointing, has a minimal number of buttons to support navigation, and communicates with the entertainment system 200 through RF signals. For example, wireless I/O control device 226 can be a 3D pointing device which uses a gyroscope or other mechanism to define both a screen position and a motion vector to determine the particular command desired. A set of buttons can also be included on the wireless I/O device 226 to initiate the “click” primitive described below as well as a “back” button. In another exemplary embodiment, wireless I/O control device 226 is a media system remote control unit, which communicates with the components of the entertainment system 200 through IR signals. In yet another embodiment, wireless I/O control device 226 may be an IR remote control device similar in appearance to a typical entertainment system remote control with the added feature of a track-ball or other navigational mechanisms which allows a user to position a cursor on a display of the entertainment system 200.
  • The entertainment system 200 also includes a system controller 228. According to one exemplary embodiment of the present invention, the system controller 228 operates to store and display entertainment system data available from a plurality of entertainment system data sources and to control a wide variety of features associated with each of the system components. As shown in FIG. 3, system controller 228 is coupled, either directly or indirectly, to each of the system components, as necessary, through I/O bus 210. In one exemplary embodiment, in addition to or in place of I/O bus 210, system controller 228 is configured with a wireless communication transmitter (or transceiver), which is capable of communicating with the system components via IR signals or RF signals. Regardless of the control medium, the system controller 228 is configured to control the media components of the media system 200 via a graphical user interface described below.
  • As further illustrated in FIG. 3, media system 200 may be configured to receive media items from various media sources and service providers. In this exemplary embodiment, media system 200 receives media input from and, optionally, sends information to, any or all of the following sources: cable broadcast 230, satellite broadcast 232 (e.g., via a satellite dish), very high frequency (VHF) or ultra high frequency (UHF) radio frequency communication of the broadcast television networks 234 (e.g., via an aerial antenna), telephone network 236 and cable modem 238 (or another source of Internet content). Those skilled in the art will appreciate that the media components and media sources illustrated and described with respect to FIG. 3 are purely exemplary and that media system 200 may include more or fewer of both. For example, other types of inputs to the system include AM/FM radio and satellite radio.
  • FIG. 4 is a block diagram illustrating an embodiment of an exemplary system controller 228 according to the present invention. System controller 228 can, for example, be implemented as a set-top box and includes, for example, a processor 300, memory 302, a display controller 304, other device controllers (e.g., associated with the other components of system 200), one or more data storage devices 308 and an I/O interface 310. These components communicate with the processor 300 via bus 312. Those skilled in the art will appreciate that processor 300 can be implemented using one or more processing units. Memory device(s) 302 may include, for example, DRAM or SRAM, ROM, some of which may be designated as cache memory, which store software to be run by processor 300 and/or data usable by such programs, including software and/or data associated with the graphical user interfaces described below. Display controller 304 is operable by processor 300 to control the display of monitor 212 to, among other things, display GUI screens and objects as described below. Zoomable GUIs according to exemplary embodiments of the present invention provide resolution independent zooming, so that monitor 212 can provide displays at any resolution. Device controllers 306 provide an interface between the other components of the media system 200 and the processor 300. Data storage 308 may include one or more of a hard disk drive, a floppy disk drive, a CD-ROM device, or other mass storage device. Input/output interface 310 may include one or more of a plurality of interfaces including, for example, a keyboard interface, an RF interface, an IR interface and a microphone/speech interface. According to one exemplary embodiment of the present invention, I/O interface 310 will include an interface for receiving location information associated with movement of a wireless pointing device.
  • Generation and control of a graphical user interface according to exemplary embodiments of the present invention to display media item selection information is performed by the system controller 228 in response to the processor 300 executing sequences of instructions contained in the memory 302. Such instructions may be read into the memory 302 from other computer-readable mediums such as data storage device(s) 308 or from a computer connected externally to the media system 200. Execution of the sequences of instructions contained in the memory 302 causes the processor to generate graphical user interface objects and controls, among other things, on monitor 212. In alternative embodiments, hard-wire circuitry may be used in place of or in combination with software instructions to implement the present invention. As mentioned in the Background section, conventional interface frameworks associated with the television industry are severely limited in their ability to provide users with a simple and yet comprehensive selection experience. Accordingly, control frameworks described herein overcome these limitations and are, therefore, intended for use with televisions, albeit not exclusively. It is also anticipated that the revolutionary control frameworks, graphical user interfaces and/or various algorithms described herein will find applicability to interfaces which may be used with computers and other non-television devices. In order to distinguish these various applications of exemplary embodiments of the present invention, the terms “television” and “TV” are used in this specification to refer to a subset of display devices, whereas the terms “GUI”, “GUI screen”, “display” and “display screen” are intended to be generic and refer to television displays, computer displays and any other display device. More specifically, the terms “television” and “TV” are intended to refer to the subset of display devices which are able to display television signals (e.g., NTSC signals, PAL signals or SECAM signals) without using an adapter to translate television signals into another format (e.g., computer video formats). In addition, the terms “television” and “TV” refer to a subset of display devices that are generally viewed from a distance of several feet or more (e.g., sofa to a family room TV) whereas computer displays are generally viewed close-up (e.g., chair to a desktop monitor).
  • Having described an exemplary media system which can be used to implement control frameworks including zoomable graphical interfaces according to the present invention, several examples of such interfaces will now be described. Those skilled in the art will, however, appreciate that layout techniques and mechanisms according to exemplary embodiments of the present invention are not limited to usage in a zoomable user interface and can also be applied to user interfaces which do not use zooming mechanisms. According to some exemplary embodiments of the present invention, a user interface displays selectable items which can be grouped by category. A user points a remote unit at the category or categories of interest and depresses the selection button to zoom in or the “back” button to zoom back. Each zoom in, or zoom back, action by a user results in a change in the magnification level and/or context of the selectable items rendered by the user interface on the screen. According to exemplary embodiments, each change in magnification level can be consistent, i.e., the changes in magnification level are provided in predetermined steps. Exemplary embodiments of the present invention also provide for user interfaces which incorporate several visual techniques to achieve scaling to the very large. These techniques involve a combination of building blocks and techniques that achieve both scalability and ease-of-use, in particular techniques which supply an easy and fast selection experience regardless of the size(s) of the media item collection(s) being browsed.
  • The user interface is largely a visual experience. In such an environment exemplary embodiments of the present invention make use of the capability of the user to remember the location of objects within the visual environment. This is achieved by providing a stable, dependable location for user interface selection items, which is at the same time pleasing to the user and efficiently uses the allocated display space. Each object or item has a location in the zoomable layout, which location can be selected according to layout rules described below with respect to FIGS. 14-19. Once the user has found an object of interest it is natural to remember which direction was taken to locate the object. If that object is of particular interest it is likely that the user will re-visit the item more than once, which will reinforce the user's memory of the path to the object. User interfaces according to exemplary embodiments of the present invention provide visual mnemonics that help the user remember the location of items of interest. Such visual mnemonics include pan and zoom animations, transition effects which generate a geographic sense of movement across the user interface's virtual surface and consistent zooming functionality, among other things which will become more apparent based on the examples described below.
  • Referring first to FIGS. 5-8, an exemplary control framework including a zoomable graphical user interface according to an exemplary embodiment of the present invention is described for use in displaying and selecting musical media items. FIG. 5 portrays the zoomable GUI at a high level e.g., a second “most zoomed out” state. Therein, the interface displays a set of shapes 500. Displayed within each shape 500 are text 502 and/or a picture 504 that describe the group of media item selections accessible via that portion of the GUI. As shown in FIG. 5, the shapes 500 are rectangles, and text 502 and/or picture 504 describe the genre of the media. However, those skilled in the art will appreciate that this first viewed GUI grouping could represent other aspects of the media selections available to the user e.g., artist, year produced, area of residence for the artist, length of the item, or any other characteristic of the selection. Also, the shapes used to outline the various groupings in the GUI need not be rectangles. Shrunk down versions of album covers and other icons could be used to provide further navigational hints to the user in lieu of or in addition to text 502 and/or picture 504 within the shape groupings 500. A background portion of the GUI 506 can be displayed as a solid color or be a part of a picture such as a map to aid the user in remembering the spatial location of genres so as to make future uses of the interface require less reading. The selection pointer (cursor) 508 follows the movements of an input device and indicates the location to zoom in on when the user presses the button on the device (not shown in FIG. 5).
  • According to one exemplary embodiment of the present invention, the input device can be a 3D pointing device, e.g., the 3D pointing device described in U.S. patent application Ser. No. 11/119,663, filed on May 2, 2005, entitled “3D Pointing Devices and Methods”, the disclosure of which is incorporated here by reference and which is hereafter referred to as the “'663 application”, coupled with a graphical user interface that supports the point, click, scroll, hover and zoom building blocks which are described in more detail below. One feature of this exemplary input device that is beneficial for use in conjunction with the present invention is that it can be implemented with only two buttons and a scroll wheel, i.e., three input actuation objects. One of the buttons can be configured as a ZOOM IN (select) button and one can be configured as a ZOOM OUT (back) button. Compared with the conventional remote control units, e.g., that shown in FIG. 1, the present invention simplifies this aspect of the GUI by greatly reducing the number of buttons, etc., that a user is confronted with in making his or her media item selection. An additional preferred, but not required, feature of input devices according to exemplary embodiments of the present invention is that they provide “3D pointing” capability for the user. The phrase “3D pointing” is used in this specification to refer to the ability of a user to freely move the input device in three (or more) dimensions in the air in front of the display screen and the corresponding ability of the user interface to translate those motions directly into movement of a cursor on the screen. Thus “3D pointing” differs from conventional computer mouse pointing techniques which use a surface other than the display screen, e.g., a desk surface or mousepad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display screen. Use of 3D pointing in control frameworks according to exemplary embodiments of the present invention further simplifies the user's selection experience, while at the same time providing an opportunity to introduce gestures as distinguishable inputs to the interface. A gesture can be considered as a recognizable pattern of movement over time which pattern can be translated into a GUI command, e.g., a function of movement in the x, y, z, yaw, pitch and roll dimensions or any subcombination thereof. Those skilled in the art will appreciate, however that any suitable input device can be used in conjunction with zoomable GUIs according to the present invention. Other examples of suitable input devices include, but are not limited to, trackballs, touchpads, conventional TV remote control devices, speech input, any devices which can communicate/translate a user's gestures into GUI commands, or any combination thereof. It is intended that each aspect of the GUI functionality described herein can be actuated in frameworks according to the present invention using at least one of a gesture and a speech command. Alternate implementations include using cursor and/or other remote control keys or even speech input to identify items for selection.
  • FIG. 6 shows a zoomed in view of Genre 3 that would be displayed if the user selects Genre 3 from FIG. 5, e.g., by moving the cursor 508 over the area encompassed by the rectangle surrounding Genre 3 on display 212 and depressing a button on the input device. The interface can animate the zoom from FIG. 5 to FIG. 6 so that it is clear to the user that a zoom occurred. An example of such an animated zoom/transition effect is described below. Once the shape 516 that contains Genre 3 occupies most of the screen on display 212, the interface reveals the artists that have albums in the genre. In this example, seven different artists and/or their works are displayed. The unselected genres 515 that were adjacent to Genre 3 in the zoomed out view of FIG. 5 are still adjacent to Genre 3 in the zoomed in view, but are clipped by the edge of the display 212. These unselected genres can be quickly navigated to by selection of them with selection pointer 508. It will be appreciated, however, that other exemplary embodiments of the present invention can omit clipping neighboring objects and, instead, present only the unclipped selections. Each of the artist groups, e.g., group 512, can contain images of shrunk album covers, a picture of the artist or customizable artwork by the user in the case that the category contains playlists created by the user.
  • A user may then select one of the artist groups for further review and/or selection. FIG. 7 shows a further zoomed in view in response to a user selection of Artist 3 via positioning of cursor 508 and actuation of the input device, in which images of album covers 520 come into view. As with the transition from the GUI screen of FIG. 5 and FIG. 6, the unselected, adjacent artists ( artists # 2, 6 and 7 in this example) are shown towards the side of the zoomed in display, and the user can click on these with selection pointer 508 to pan to these artist views. In this portion of the interface, in addition to the images 520 of album covers, artist information 524 can be displayed as an item in the artist group. This information may contain, for example, the artist's picture, biography, trivia, discography, influences, links to web sites and other pertinent data. Each of the album images 520 can contain a picture of the album cover and, optionally, textual data. In the case that the album image 520 includes a user created playlist, the graphical user interface can display a picture which is selected automatically by the interface or preselected by the user.
  • Finally, when the user selects an album cover image 520 from within the group 521, the interface zooms into the album cover as shown in FIG. 8. As the zoom progresses, the album cover can fade or morph into a view that contains items such as the artist and title of the album 530, a list of tracks 532, further information about the album 536, a smaller version of the album cover 528, and controls 534 to play back the content, modify the categorization, link to the artists web page, or find any other information about the selection. Neighboring albums 538 are shown that can be selected using selection pointer 508 to cause the interface to bring them into view. As mentioned above, alternative embodiments of the present invention can, for example, zoom in to only display the selected object, e.g., album 5, and omit the clipped portions of the unselected objects, e.g., albums 4 and 6. This final zoom provides an example of semantic zooming, wherein certain GUI elements are revealed that were not previously visible at the previous zoom level. Various techniques for performing semantic zooming according to exemplary embodiments of the present invention are provided below.
  • As illustrated in the FIGS. 5-8 and the description, this exemplary embodiment of a graphical user interface provides for navigation of a music collection. Interfaces according to the present invention can also be used for video collections such as for DVDs, VHS tapes, other recorded media, video-on-demand, video segments and home movies. Other audio uses include navigation of radio shows, instructional tapes, historical archives, and sound clip collections. Print or text media such as news stories and electronic books can also be organized and accessed using this invention.
  • As will be apparent to those skilled in the art from the foregoing description, zoomable graphical user interfaces according to the present invention provide users with the capability to browse a large (or small) number of media items rapidly and easily. This capability is attributable to many characteristics of interfaces according to exemplary embodiments of the present invention including, but not limited to: (1) the use of images as all or part of the selection information for a particular media item, (2) the use of zooming to rapidly provide as much or as little information as a user needs to make a selection and (3) the use of several GUI techniques which combine to give the user the sense that the entire interface resides on a single plane, such that navigation of the GUI can be accomplished, and remembered, by way of the user's sense of direction. This latter aspect of GUIs according to the present invention can be accomplished by, among other things, linking the various GUI screens together “geographically” by maintaining as much GUI object continuity from one GUI screen to the next, e.g., by displaying edges of neighboring, unselected objects around the border of the current GUI screen. Alternatively, if a cleaner view is desired, and other GUI techniques provide sufficient geographic feedback, then the clipped objects can be omitted. As used in this text, the phrase “GUI screen” refers to a set of GUI objects rendered on one or more display units at the same time. A GUI screen may be rendered on the same display which outputs media items, or it may be rendered on a different display. The display can be a TV display, computer monitor or any other suitable GUI output device.
  • Another GUI effect which enhances the user's sense of GUI screen connectivity is the panning animation effect which is invoked when a zoom is performed or when the user selects an adjacent object at the same zoom level as the currently selected object. Returning to the example of FIG. 5, as the user is initially viewing this GUI screen, his or her point-of-view is centered about point 550. However, when he or she selects Genre 3 for zooming in, his or her point-of-view will shift to point 552. According to exemplary embodiments of the present invention, the zoom in process is animated to convey the shifting of the POV center from point 550 to point 552. This panning animation can be provided for every GUI change, e.g., from a change in zoom level or a change from one object to another object on the same GUI zoom level. Thus if, for example, a user situated in the GUI screen of FIG. 6 selected the leftmost unselected genre 515 (Genre 2), a panning animation would occur which would give the user the visual impression of "moving" left or west. Exemplary embodiments of the present invention employ such techniques to provide a consistent sense of directional movement between GUI screens, which enables users to more rapidly navigate the GUI, both between zoom levels and between media items at the same zoom level.
  • These capabilities of graphical user interfaces according to the present invention, as well as the usefulness of more sophisticated layouts and algorithms for generating such layouts, will become even more apparent upon review of another exemplary embodiment described below with respect to FIGS. 9-13. Therein, a startup GUI screen 1400 displays a plurality of organizing objects which operate as media group representations. The purely exemplary media group representations of home video, movies, TV, sports, radio, music and news could, of course include different, more or fewer media group representations. Upon actuation of one of these icons by a user, the GUI according to this exemplary embodiment will then display a plurality of images each grouped into a particular category or genre. For example, if the “movie” icon in FIG. 9 was actuated by a user, the GUI screen of FIG. 10 can then be displayed. Therein, a large number, e.g., 120 or more, selection objects are displayed. These selection objects can be categorized into particular group(s), e.g., action, classics, comedy, drama, family and new releases. Those skilled in the art will appreciate that more or fewer categories could be provided. In this exemplary embodiment, the media item images can be cover art associated with each movie selection. Although the size of the blocks in FIG. 10 is too small to permit detailed illustration of this relatively large group of selection item images, in implementation, the level of magnification of the images is such that the identity of the movie can be discerned by its associated image, even if some or all of the text may be too small to be easily read.
  • The cursor (not shown in FIG. 10) can then be disposed over a group of the movie images and the input device actuated to provide a selection indication for one of the groups. In this example the user selects the drama group and the graphical user interface then displays a zoomed version of the drama group of images as seen in FIG. 11. As with the previous embodiment, a transition effect can also be displayed as the GUI shifts from the GUI screen of FIG. 10 to the GUI screen of FIG. 11, e.g., the GUI may pan the view from the center of the GUI screen of FIG. 10 to the center of the drama group of images during or prior to the zoom. Note that although the zoomed version of the drama group of FIG. 11 only displays a subset of the total number of images in the drama group, this zoomed version can alternatively contain all of the images in the selected group. The choice of whether or not to display all of the images in a selected group in any given zoomed in version of a GUI screen can be made based upon, for example, the number of media items in a group and a minimum desirable magnification level for a media item for a particular zoom level. This latter characteristic of GUIs according to the present invention can be predetermined by the system designer/service provider or can be user customizable via software settings in the GUI. For example, the number of media items in a group and the minimum and/or maximum magnification levels can be configurable by either or both of the service provider or the end user. Such features enable those users with, for example, poor eyesight, to increase the magnification level of media items being displayed. Conversely, users with especially keen eyesight may decrease the level of magnification, thereby increasing the number of media items displayed on a GUI screen at any one time and decreasing browsing time.
  • One exemplary transition effect which can be employed in graphical user interfaces according to the present invention is referred to herein as the "shoe-to-detail" view effect. When actuated, this transition effect takes a zoomed out image and simultaneously shrinks and translates the zoomed out image into a smaller view, i.e., the next higher level of magnification. The transition from the magnification level used in the GUI screen of FIG. 10 to the greater magnification level used in the GUI screen of FIG. 11 results in additional details being revealed by the GUI for the images which are displayed in the zoomed in version of FIG. 11. The GUI selectively reveals or hides details at each zoom level based upon whether or not those details would display well at the currently selected zoom level. Unlike a camera zoom, which attempts to resolve details regardless of their visibility to the unaided eye, exemplary embodiments of the present invention provide for a configurable zoom level parameter that specifies a transition point between when to show the full image and when to show a version of the image with details that are withheld. The transition point can be based upon an internal resolution independent depiction of the image rather than the resolution of TV/Monitor 212. In this way, GUIs according to the present invention are consistent regardless of the resolution of the display device being used in the media system.
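  • One way to realize such a transition point is to choose an item's representation from a resolution-independent scale value rather than from the physical display resolution. The thresholds and representation names in the sketch below are assumptions used only to illustrate the idea.

```python
def representation_for_scale(scale: float,
                             title_threshold: float = 0.5,
                             detail_threshold: float = 1.0) -> str:
    """Pick how much of an item to render at a resolution-independent scale:
    only the cover image when small, image plus title at intermediate scales,
    and the full detail view once the item is large enough to show it well."""
    if scale < title_threshold:
        return "image-only"
    if scale < detail_threshold:
        return "image-with-title"
    return "full-detail"
```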
  • In this exemplary embodiment, an additional amount of magnification for a particular image can be provided by passing the cursor over a particular image. This feature can be seen in FIG. 12, wherein the cursor has rolled over the image for the movie “Apollo 13”. Although not depicted in FIG. 12, such additional magnification could, for example, make more legible the quote “Houston, we have a problem” which appears on the cover art of the associated media item as compared to the corresponding image in the GUI screen of FIG. 12 which is at a lower level of magnification. User selection of this image, e.g., by depressing a button on the input device, can result in a further zoom to display the details shown in FIG. 13. This provides yet another example of semantic zooming as it was previously described since various information and control elements are present in the GUI screen of FIG. 13 that were not available in the GUI screen of FIG. 12. For example, information about the movie “Apollo 13” including, among other things, the movie's runtime, price and actor information is shown. Those skilled in the art will appreciate that other types of information could be provided here. Additionally, this GUI screen includes GUI control objects including, for example, button control objects for buying the movie, watching a trailer or returning to the previous GUI screen (which could also be accomplished by depressing the ZOOM OUT button on the input device). Hyperlinks can also be used to allow the user to jump to, for example, GUI screens associated with the related movies identified in the lower right hand corner of the GUI screen of FIG. 13 or information associated with the actors in this movie. In this example, some or all of the film titles under the heading “Filmography” can be implemented as hyperlinks which, when actuated by the user via the input device, will cause the GUI to display a GUI screen corresponding to that of FIG. 13 for the indicated movie.
  • A transition effect can also be employed when a user actuates a hyperlink. Since the hyperlinks may be generated at very high magnification levels, simply jumping to the linked media item may cause the user to lose track of where he or she is in the media item selection “map”. Accordingly, exemplary embodiments of the present invention provide a transition effect to aid in maintaining the user's sense of geographic position when a hyperlink is actuated. One exemplary transition effect which can be employed for this purpose is a hop transition. In an initial phase of the transition effect, the GUI zooms out and pans in the direction of the item pointed to by the hyperlink. Zooming out and panning continues until both the destination image and the origination image are viewable by the user. Using the example of FIG. 13 once again, if the user selects the hyperlink for “Saving Private Ryan”, then the first phase of the hyperlink hop effect would include zooming out and panning toward the image of “Saving Private Ryan” until both the image for “Saving Private Ryan” and “Apollo 13” were visible to the user. At this point, the transition effect has provided the user with the visual impression of being moved upwardly in an arc toward the destination image. Once the destination image is in view, the second phase of the transition effect gives the user the visual impression of zooming in and panning to, e.g., on the other half of the arc, the destination image. The hop time, i.e., the amount of time both phases one and two of this transition effect are displayed, can be fixed as between any two hyperlinked image items. Alternatively, the hop time may vary, e.g., based on the distance traveled over the GUI. For example, the hop time can be parameterized as HopTime=A log(zoomed-in scale level/hop apex scale level)+B(distance between hyperlinked media items)+C, where A, B and C are suitably selected constant values.
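  • The hop-time parameterization translates directly into code. In the sketch below the constants A, B and C and the use of Euclidean distance between the two items are illustrative assumptions; only the form of the formula comes from the description above.

```python
import math
from typing import Tuple

def hop_time(zoomed_in_scale: float, hop_apex_scale: float,
             origin_xy: Tuple[float, float], destination_xy: Tuple[float, float],
             a: float = 0.4, b: float = 0.001, c: float = 0.3) -> float:
    """HopTime = A*log(zoomed-in scale / hop apex scale)
               + B*(distance between hyperlinked media items) + C."""
    distance = math.hypot(destination_xy[0] - origin_xy[0],
                          destination_xy[1] - origin_xy[1])
    return a * math.log(zoomed_in_scale / hop_apex_scale) + b * distance + c
```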
  • Scaling, Layout and Searching
  • Given the potentially huge amount of content to be accessed using the afore-described (and other) user interfaces and systems, the need to lay out objects on a display in a fashion that is pleasing to the eye as well as efficient with respect to space is becoming more important. A graphical layout deals with, for example, the number, size and specific arrangement of items on, e.g., a TV screen. Layouts are generally composed of two opposing factors, equilibrium and form. Equilibrium is achieved when objects are uniformly and symmetrically distributed, which is the lowest energy state. The mind strives for equilibrium when trying to deal with complexity; however, a layout in total equilibrium is usually considered boring. To add interest to a layout, equilibrium is perturbed by introducing form. Also, when considering layouts, it is often useful to understand and use the concepts of centricity and eccentricity. Centricity pertains to the central location of a layout, while eccentricity pertains to locations in the layout which are offset from the center. When viewing a layout, there is a perceived tension, or force, between these focal points. A layout that is pleasing to the user is achieved with a mixture of equilibrium and form to combine simplicity and interest coupled with some amount of tension. These concepts are used in exemplary embodiments of the present invention when displaying groups of selectable media items, such as images of movie covers, as part of a user interface displayed on, e.g., a TV screen.
  • According to one exemplary embodiment of the present invention, a user interface providing a searching mechanism for searching among selectable media items can generate a displayed screen such as that illustrated in FIG. 14(a). For multiple groups of displayed items, it may be desirable that a user perceive each group as a cohesive whole. It may further be desirable to minimize the amount of border space and maximize an image size associated with each displayed item. Accordingly, this exemplary embodiment of the present invention incorporates image overlapping and a black border added to each image, although these effects could also be used independently of one another. The border helps the eye delineate each object within an overlapping layout. Overlapping increases the perception of ‘belonging’ that is desirable for displaying a unified group and adds a three-dimensional (3D) effect to the layout. With overlapping, a sufficient surface view of the image remains uncovered to gain the users' attention and achieve recognition. In fact, the parts of the images that are hidden through overlapping can serve to heighten the curiosity of the users. When a user positions a cursor over a particular displayed item, the hover effect described above will then reveal the entire image as well as increase its size. In addition, the overlapping feature provides additional freedom in placement rules by making better use of the space between images and groups and allowing the displayed images to be scaled significantly larger than they could be if placed separately.
  • The exemplary GUI screen 2000 depicted in FIG. 14(a) contains a text entry widget including a plurality of control elements 2004, with at least some of the control elements 2004 being drawn as keys or buttons having alphanumeric characters 2014 thereon, and other control elements 2004 being drawn on the interface as having non-alphanumeric characters 2016 which can be used, e.g., to control character entry. In this example, the control elements 2004 are laid out in two horizontal rows across the interface, although other configurations may be used.
  • Upon actuating a control element 2004, e.g., by clicking a button on a 3D pointer, the corresponding alphanumeric input is displayed in the textbox 2002, disposed above the text entry widget, and one or more groups of displayed items related to the alphanumeric input provided via the control element(s) can be displayed on the interface, e.g., below the text entry widget. Thus, the GUI screen depicted in FIG. 14(a) according to one exemplary embodiment of the present invention can be used to search for selectable media items, and graphically display the results of the search on a GUI screen, in a manner that is useful, efficient and pleasing to the user. (Note that in the illustrated example of FIG. 14(a), although the letter “g” is illustrated as being displayed in the text box 2002, the displayed movie cover images below the text entry widget simply represent a test pattern and are not necessarily related to the input letter “g”, as they would be in an actual implementation where, e.g., the displayed movie covers could be only those whose movie titles start with the letter “g”.) In particular, the layout of the four groups 2006, 2008, 2010 and 2012 of displayed items within the region of the user interface allocated for search results to be displayed is based on the number of groups which are displayed, and the layout of displayed items (in this example, images of movie covers) within each group is based upon the number of items displayed within each group. These layouts according to exemplary embodiments of the present invention can be governed by layout rules and algorithmically implemented in the user interface.
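As one concrete reading of this search behavior, the sketch below filters a catalog of titles by the characters entered so far and splits the matches into up to four display groups. The function name, the prefix-matching criterion and the plain chunking into groups are all assumptions for illustration; the description does not prescribe a particular matching or grouping scheme.

```python
def search_groups(catalog, typed_text, max_groups=4):
    """Return up to max_groups lists of titles matching the text entered so far.

    `catalog` is any iterable of title strings; matching here is a simple
    case-insensitive prefix test on the title.
    """
    prefix = typed_text.lower()
    matches = sorted(t for t in catalog if t.lower().startswith(prefix))
    if not matches:
        return []
    n_groups = min(max_groups, len(matches))
    group_size = -(-len(matches) // n_groups)   # ceiling division
    return [matches[i:i + group_size] for i in range(0, len(matches), group_size)]

# e.g. typing "g" might yield groups such as [["Gandhi", "Ghost"], ["Gladiator"]]
```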
  • To illustrate how groups and items within groups can be laid out on user interfaces according to exemplary embodiments of the present invention, exemplary rules and algorithms for laying out selectable media items (but which can be equally applied to other displayed items) will now be described. To aid in understanding the application of these rules, some terminology is first defined. The phrase “group layout” refers to the layout of groups of displayed items within a group display area. In the example of FIG. 14(a), the group display area is the portion of the display screen between the bottom of the text entry widget and the bottom of the display screen (with suitable margins). The phrase “item layout” refers to the layout of items within each group and, more specifically, within an item display area associated with each group.
  • To render each of these concepts more concrete, consider the abstraction of the GUI screen 2000 of FIG. 14(a) illustrated in FIG. 14(b). Therein, the four displayed groups of items 2006, 2008, 2010 and 2012 have a group layout within a group display area 2020 that is substantially trapezoidal, e.g., connecting the center points of the groups 2006, 2008, 2010 and 2012 will form a trapezoid. Left-center display group 2008 and right-center display group 2010 are raised (placed further from the bottom) within the group display area 2020 relative to the leftmost group 2006 and rightmost group 2012. The rectangles shown in FIG. 14(b) as surrounding the display groups 2006-2012 represent exemplary item display areas for each respective group, i.e., the areas within which the items associated with each group are laid out. As seen in FIG. 14(a), these regions need not be explicitly displayed on the GUI screen 2000 (although according to other exemplary embodiments, item display area boundaries may be displayed). Moreover, those skilled in the art will appreciate that the item display areas (as well as the group display area) need not be rectangular in shape, as in the example of FIG. 14(b), but can be any desired shape. Within each item display area, the groups' displayed items are laid out according to item layout rules. An exemplary set of overlapping item layouts is shown in FIGS. 15(a)-(n), and exemplary rules for displaying these item layouts are described below. Each rule is based, at least in part, on the number of items within the group.
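To make the distinction between the group layout and the item layout concrete, the following sketch models the group display area and the per-group item display areas as simple rectangles. The Rect and Group structures, the field names and the specific offsets are invented for illustration; only the two-step idea (carve the group display area into item display areas, with the two center groups raised as in FIG. 14(b)) comes from the description.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Rect:
    x: float   # left edge
    y: float   # top edge
    w: float   # width
    h: float   # height

@dataclass
class Group:
    items: List[str]                   # e.g. movie cover image identifiers
    item_area: Optional[Rect] = None   # assigned by the group layout step

def layout_groups(groups: List[Group], group_area: Rect) -> None:
    """Group layout: carve the group display area into one item display area
    per group. The center groups are raised relative to the outer groups, so
    that connecting the group centers suggests the trapezoid of FIG. 14(b).
    Laying out items within each item display area is a separate second step."""
    n = len(groups)
    slot_w = group_area.w / n
    for i, g in enumerate(groups):
        raised = 0 < i < n - 1
        top = group_area.y if raised else group_area.y + 0.1 * group_area.h
        g.item_area = Rect(x=group_area.x + i * slot_w + 0.05 * slot_w,
                           y=top,
                           w=0.9 * slot_w,
                           h=0.85 * group_area.h)
```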
  • Group of 1 Item—Starting with FIG. 15(a), for a group consisting of one displayed item 3002, the item is placed in the item display region 3004 offset from a center of the item display region. For example, as shown in FIG. 15(b), this can be accomplished by rendering displayed item 3002 such that its center point 3006 is offset upwardly and to the left of the center point 3008 of the item display region 3004. Additionally, the item 3002 is scaled at the offset position such that a second item of the same size would not fit within the item display area 3004.
  • Group of 2 Items—For a group consisting of two displayed items 3010 and 3012, illustrated in FIG. 15(c), the two displayed items are laid out in the item display area by aligning the center points of the two items on a diagonal within the item display region 3004. This is shown conceptually in FIG. 15(d), wherein a center point 3014 of item 3010 is disposed up and to the left of the center point 3008 of the item display region 3004 on diagonal 3015, while a center point 3016 of item 3012 is disposed down and to the right of center point 3008.
  • Group of 3 Items—For a group consisting of three displayed items 3018, 3020 and 3022, illustrated in FIG. 15(e), the three displayed items are laid out in the item display area by aligning the center points of the three items on the circumference of a circle within the item display region 3004. This is shown conceptually in FIG. 15(f), wherein a center point 3024 of item 3018 is disposed above the center point 3008 of the item display region 3004 on the circumference of circle 3030, while a center point 3026 of item 3020 is disposed down and to the left of center point 3008 and a center point 3028 of item 3022 is disposed down and to the right of center point 3008.
  • Group of 4 Items—For a group consisting of four displayed items, 3032, 3034, 3036 and 3038, illustrated in FIG. 15(g), the four displayed items are laid out in the item display area by aligning the center points of the four items on the corners of a rhombus within the item display region 3004. This is shown conceptually in FIG. 15(h), wherein a center point 3040 of item 3032 is disposed up and to the left of the center point 3008 of the item display region 3004 on rhombus 3048, while a center point 3042 of item 3034 is disposed up and to the right of the center point 3008, while a center point 3044 of item 3036 is disposed below and to the left of center point 3008 and while a center point 3046 of item 3038 is disposed below and to the right of center point 3008.
  • Group of 5 Items—For a group consisting of five displayed items 3050, 3052, 3054, 3056 and 3058, illustrated in FIG. 15(i), the five displayed items are laid out in the item display area 3004 by placing the center points of the five items on the right half of the circumference of an ellipse within the item display region 3004. This is shown conceptually in FIG. 15(j), wherein a center point 3060 of item 3050 is disposed up and to the left of the center point 3008 of the item display region 3004 on the right half of the circumference of an ellipse 3070, while a center point 3062 of item 3052 is disposed above the center point 3008, while a center point 3064 of item 3054 is disposed to the right of the center point 3008, while a center point 3066 of item 3056 is disposed below the center point 3008 and while a center point 3068 of item 3058 is disposed down and to the left of the center point 3008.
  • Group of 6 Items—For a group consisting of six displayed items, 3072, 3074, 3076, 3078, 3080 and 3082, illustrated in FIG. 15(k), the six displayed items are laid out in the item display area 3004 by creating a grid such that three items are arranged in an upper row above three items arranged in a lower row. An upper leftmost item 3072 is aligned above a lower leftmost item 3082, and a top edge of the upper leftmost item is higher than a top edge of an upper rightmost item. The upper rightmost item 3076 is aligned above a lower rightmost item 3080, and a bottom edge of the lower leftmost item is lower than a bottom edge of the lower rightmost item. An upper center item 3074 is left of the center point of the group and overlaps both items in the upper row. A lower center item 3078 is right of the center point of the group and overlaps both items in the lower row and overlaps the upper center item 3074.
  • Group of 7 Items—For a group consisting of seven displayed items, 3084, 3086, 3088, 3090, 3092, 3094 and 3096, illustrated in FIG. 15(l), the seven displayed items are laid out in the item display area 3004 by creating a grid such that one item 3090 is in the center, three items are arranged in an upper row above the center item and three items are arranged below the center item in a lower row. An upper leftmost item 3084 is slightly higher than an upper rightmost item 3088 and is aligned with a lower leftmost item 3096. The upper rightmost item 3088 is slightly lower than the upper leftmost item 3084 and is aligned with a lower rightmost item 3092. The lower rightmost item 3092 is in line with both the upper rightmost item 3088 and the lower leftmost item 3096. The top edge of the middle item in the upper row 3086 is higher than the top edges of both the upper leftmost item and the upper rightmost item. The bottom edge of the middle item in the lower row 3094 is lower than a bottom edge of both the lower leftmost item 3096 and the lower rightmost item 3092. The center item 3090 can overlap all other items, the middle items in either row overlap the other items in that row, the center item 3090 is left of the center point of the item display area 3004 and the middle items are right of the center point of the item display area 3004.
  • Group of 8 Items—For a group consisting of eight displayed items, 3098, 3100, 3102, 3104, 3106, 3108, 3110 and 3112, illustrated in FIG. 15(m), the eight displayed items are laid out in the item display area 3004 by creating a grid such that two items are arranged in a middle row, three items are arranged in an upper row above the middle row and three items are arranged in a lower row below the middle row. An upper leftmost item 3112 is aligned with both an upper rightmost item 3100 and a lower leftmost item 3108. The upper rightmost item 3100 is aligned with both the upper leftmost item 3112 and a lower rightmost item 3104. The lower rightmost item 3104 is aligned with both the upper rightmost item 3100 and the lower leftmost item 3108. A top edge of a middle item 3098 in the upper row is higher than the top edges of both the upper leftmost item and the upper rightmost item. A bottom edge of a middle item 3106 in the lower row is lower than the bottom edge of both the lower leftmost item and the lower rightmost item. Overlapping occurs between the middle row and all other rows, and the center items in either the upper row or the lower row overlap the other items in their respective row. Similar overlapping layout rules and algorithms can be applied to groups having more items, e.g., up to and including sixteen items, an example of which is shown in FIG. 15(n).
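The per-count rules above can be thought of as a dispatch on the number of items in a group that produces one center point per item. The sketch below encodes the rules for one to five items only; the function name, the specific offsets, radii and angles are illustrative guesses, not values taken from the figures (screen coordinates are assumed, with y increasing downward).

```python
import math

def item_centers(n: int, cx: float, cy: float, r: float):
    """Return center points for n overlapping items in an item display area
    centered at (cx, cy); r is roughly half the area's smaller dimension."""
    if n == 1:
        # Single item: offset up and to the left of the area's center.
        return [(cx - 0.2 * r, cy - 0.2 * r)]
    if n == 2:
        # Two items: centers on a diagonal through the area's center.
        return [(cx - 0.35 * r, cy - 0.35 * r), (cx + 0.35 * r, cy + 0.35 * r)]
    if n == 3:
        # Three items: centers on the circumference of a circle
        # (one above center, one down-left, one down-right).
        return [(cx + 0.5 * r * math.cos(a), cy + 0.5 * r * math.sin(a))
                for a in (-math.pi / 2, 5 * math.pi / 6, math.pi / 6)]
    if n == 4:
        # Four items: centers on the corners of a rhombus around the center
        # (approximated here by four symmetric corner points).
        return [(cx - 0.4 * r, cy - 0.3 * r), (cx + 0.4 * r, cy - 0.3 * r),
                (cx - 0.4 * r, cy + 0.3 * r), (cx + 0.4 * r, cy + 0.3 * r)]
    if n == 5:
        # Five items: centers along half the circumference of an ellipse.
        return [(cx + 0.5 * r * math.cos(a), cy + 0.8 * r * math.sin(a))
                for a in (-2.2, -1.1, 0.0, 1.1, 2.2)]
    raise NotImplementedError(
        "rules for 6-16 items follow the grid layouts of FIGS. 15(k)-(n)")
```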
  • Having described exemplary layout rules according to one exemplary embodiment of the present invention, consider again FIG. 14(a). In this example, the layout rules described above are recursively applied to display four groups of search results, e.g., selectable media items represented by movie cover images. The four groups include a leftmost group 2006, a left-center group 2008, a right-center group 2010 and a rightmost group 2012. When the user interface software prepares to display these groups on the GUI screen 2000, it recursively applies the above-described rules to determine an appropriate display layout for the groups, as well as for the items within each group. For example, the left-center group 2008 has three items which are arranged such that the center points of the items are located on the circumference of a circle and the three items overlap. Similarly, the other groups 2006, 2010 and 2012 have their items laid out by applying the rule associated with the number of items in the respective group.
  • As mentioned above, a hover zoom effect can be used in conjunction with overlapping images to allow the user to view the portion of the image obscured within the collage layout. Consider, for example, the group layout illustrated in FIG. 16(a), which involves a group of eight overlapping items. When the user points to an image, it rises to the top and is scaled to a larger size (“hover zoom”). Thus, in FIG. 16(b), the user has moved a cursor (not shown in FIG. 16(b)), or otherwise indicated an initial selection of the item 4002 (associated with the movie “The Pianist”), causing that image to rise above the item 4004 which was obscuring it in its originally displayed state and increasing the size of the item 4002.
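The hover zoom behavior can be sketched as a small state change on the pointed-to item: bring it to the front of the draw order and enlarge it, restoring the others to their resting state. The class and field names below are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DisplayedItem:
    image_id: str
    scale: float = 1.0
    z_order: int = 0

def apply_hover_zoom(items: List[DisplayedItem], hovered: DisplayedItem,
                     zoom_factor: float = 1.4) -> None:
    """Raise the hovered item above its overlapping neighbors and enlarge it;
    every other item returns to its resting scale and keeps its stacking order."""
    top = max(it.z_order for it in items) + 1
    for it in items:
        if it is hovered:
            it.scale = zoom_factor
            it.z_order = top     # drawn last, so no neighbor obscures it
        else:
            it.scale = 1.0
```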
  • According to another exemplary embodiment of the present invention, overlap need not be used in displaying groups of items, e.g., when fewer items are displayed. Thus, another set of rules and algorithms can be applied for laying out items without use of overlap as depicted in FIGS. 17(a)-(h). Exemplary rules for displaying a single group with non-overlapping items based on the number of items in the group are presented below in Table 1.
  • TABLE 1

    Number of Items    Layout Rule
    1    FIG. 17(a) -- The item 5004 is placed to the right or left of the center point of the item display region 5002. The item 5004 is scaled to a size such that a second item of the same size and border area will not fit in the item display region 5002.
    2    FIG. 17(b) -- The two items, 5006 and 5008, are scaled to fit side by side and aligned on a diagonal such that a third item will not fit in the item display region 5002.
    3    FIG. 17(c) -- The three items, 5010, 5012 and 5014, are scaled to fit side by side and aligned on a diagonal such that a fourth item will not fit in the item display region 5002.
    4    FIG. 17(d) -- The four items, 5016, 5018, 5020 and 5022, are scaled and arranged in a circle such that a fifth item will not fit in the item display region 5002.
    5    FIG. 17(e) -- The five items, 5024, 5026, 5028, 5030 and 5032, are arranged in a pyramid such that a sixth item will not fit in the item display region 5002.
    6    FIG. 17(f) -- The six items, 5034, 5036, 5038, 5040, 5042 and 5044, are scaled in two rows forming a rectangle such that a seventh item will not fit in the item display region 5002.
    7    FIG. 17(g) -- The seven items, 5046, 5048, 5050, 5052, 5054, 5056 and 5058, are arranged in a pyramid such that an eighth item will not fit in the item display region 5002.
    8    FIG. 17(h) -- The eight items, 5060, 5062, 5064, 5066, 5070, 5072, 5074 and 5076, are scaled in two rows forming a rectangle such that a ninth item will not fit in the item display region 5002.
  • Numerous other variations of user interface layouts are contemplated by exemplary embodiments of the present invention. For example, the layout rules and algorithms described above can be used without displaying a text entry box and/or a text entry widget. An example is illustrated in FIG. 18. Therein, horizontal overlapping is also used for the displayed results, e.g., offsetting the images from a vertical center of each group to add some eccentricity to the layout. Image size affects the percentage of overlap. The percentage of overlap used in generating GUI screens according to exemplary embodiments of the present invention can be a function of screen size, number of items being displayed and/or user preference. Compare the layout of FIG. 19 with that of FIG. 18. The displayed items in FIG. 18 have a 50% horizontal overlap, while the (larger) displayed items in FIG. 19 have a 30% horizontal overlap. Another difference is that the items in FIG. 18 are stacked top to bottom (i.e., the topmost image in each vertical stack covers a portion of the second topmost image in each vertical stack, etc.), whereas the items in FIG. 19 are stacked bottom to top.
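Since the passage ties the overlap percentage to screen size, item count and user preference, that relationship can be captured in a small sizing function. The formula below is an invented placeholder that merely guarantees the row fits; the 70% cap and the preference weighting are assumptions.

```python
def overlap_fraction(screen_w: float, item_w: float, n_items: int,
                     user_pref: float = 0.5) -> float:
    """Pick a horizontal overlap fraction so that n_items of width item_w fit
    within screen_w; larger items or more items force more overlap, and
    user_pref (0 = minimal, 1 = maximal) nudges the result within that range."""
    if n_items <= 1:
        return 0.0
    # Fraction of each item that must be covered for the whole row to fit.
    needed = max(0.0, 1.0 - (screen_w - item_w) / (item_w * (n_items - 1)))
    # Cap the overlap so each cover stays recognizable.
    return min(0.7, needed + 0.1 * user_pref)

# Ten 300-pixel covers on a 1920-pixel screen need about 40% overlap,
# nudged to 45% by the default preference.
print(overlap_fraction(screen_w=1920, item_w=300, n_items=10))
```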
  • Systems and methods for processing data to generate layouts on user interfaces according to exemplary embodiments of the present invention can be performed by one or more processors executing sequences of instructions contained in a memory device. Such instructions may be read into the memory device from other computer-readable media such as secondary data storage device(s). Execution of the sequences of instructions contained in the memory device causes the processor to operate, for example, as described above. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the present invention. The layout rules described herein can be encoded algorithmically in software and applied recursively, e.g., to recursively lay out groups within a group display area and items within an item display area. Although the examples described above refer to two “layers” within each layout, i.e., a group layer and an item layer, those skilled in the art will appreciate that more (or fewer) than two layers can be implemented. For three or more layers, the layout rules can be applied recursively to generate a layout for each layer. According to these exemplary embodiments of the present invention, items are laid out in a display space in a manner which provides a pleasing appearance to the user, while at the same time making efficient use of limited display (e.g., TV screen) space to display more (and larger) images per layout.
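A layered layout of this kind can be expressed as a single recursive routine that applies the rule for the current layer's child count and then recurses into each child. The node structure and the `rules` lookup below are assumed interfaces, sketched only to show the recursion.

```python
def layout_node(node, area, rules):
    """Recursively lay out a tree of display layers.

    `node` is expected to expose a `children` list (groups at one layer,
    items at the next, and so on) and receives its assigned `area`;
    `rules[n]` maps a child count n to a function that splits an area into
    n sub-areas, i.e. that layer's layout rule.
    """
    node.area = area
    children = getattr(node, "children", [])
    if not children:
        return                      # leaf: an individual displayed item
    split = rules[len(children)]
    for child, sub_area in zip(children, split(area)):
        layout_node(child, sub_area, rules)
```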
  • Even when using techniques such as overlapping and scaling to reduce the amount of space which each item requires on the user interface, there are still potential limits to the number of items which can be displayed on a single GUI screen while providing enough graphic detail for each item to be pleasing to the user. Naturally, such potential limits depend upon numerous implementation details, such as the type of items being displayed, the amount of space allocated for item display, the screen resolution, etc. For example, movie covers have an aspect ratio which is different from that of standard TV images and still further different from that of high definition TV images. Sometimes ‘show cards’, which also have different aspect ratios, can be used in place of still images as items on the user interface. Music album covers have yet another aspect ratio. For each set of parameters, a different potential limit on the number of items being displayed can be established. For example, where movie cover images are used on a user interface which is intended to be shown on a high definition television screen, a maximum of 128 movie covers can be shown on a single GUI screen. If a search result or particular GUI screen should display more than 128 movie cover images in such an exemplary user interface according to the present invention, then a scroll mechanism can be added to the interface to allow the user to scroll down beyond an initial display of 128 items.
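A simple way to enforce such a per-screen cap is to page the result set and expose a scroll control only when the results overflow one screen. The sketch below assumes the 128-cover figure from the example above; the function and constant names are illustrative.

```python
MAX_ITEMS_PER_SCREEN = 128   # e.g. movie covers on a high definition screen

def paginate(results, page=0):
    """Return the slice of results for one GUI screen, plus a flag telling the
    interface whether a scroll control is needed to reach further items."""
    start = page * MAX_ITEMS_PER_SCREEN
    visible = results[start:start + MAX_ITEMS_PER_SCREEN]
    needs_scroll = len(results) > start + len(visible)
    return visible, needs_scroll

# Example: 300 matching covers -> first screen shows 128 and offers scrolling.
covers = [f"cover_{i}" for i in range(300)]
first_screen, scroll = paginate(covers)
print(len(first_screen), scroll)
```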
  • The above-described exemplary embodiments are intended to be illustrative in all respects, rather than restrictive, of the present invention. Thus the present invention is capable of many variations in detailed implementation that can be derived from the description contained herein by a person skilled in the art. All such variations and modifications are considered to be within the scope and spirit of the present invention as defined by the following claims. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items.

Claims (18)

1. A user interface displayed on a screen comprising:
a plurality of control elements, at least some of said plurality of control elements having at least one alphanumeric character displayed thereon;
a textbox for displaying alphanumeric characters entered using said plurality of control elements; and
a plurality of groups of displayed items,
wherein a layout of said plurality of groups on said user interface is based on a first number of groups which are displayed, and wherein a layout of said displayed items within a group is based on a second number of said items displayed within that group.
2. The user interface of claim 1, wherein said plurality of control elements are laid out in two horizontal rows.
3. The user interface of claim 1, wherein said text box is displayed above said plurality of control elements and said groups of displayed items are displayed below said plurality of control elements.
4. The user interface of claim 1, wherein said displayed items are images.
5. The user interface of claim 4, wherein said images are movie covers.
6. The user interface of claim 1, wherein said layout of groups involves placing a center point of each group at a location on said user interface, which location is determined based on the number of groups to be displayed.
7. The user interface of claim 6, wherein each of said displayed items associated with one of said groups is displayed within a rectangular region disposed at a respective center point.
8. The user interface of claim 1, wherein said number of groups changes based upon search results entered using said plurality of control elements.
9. The user interface of claim 1, wherein said groups do not overlap and at least some of said items within each group do overlap.
10. A method for laying out items in a user interface comprising the steps of:
laying out a plurality of groups of items within a group display space, said groups being laid out within said display space in a pattern which varies as a function of the number of said plurality of groups; and
laying out, for each of said plurality of groups, a plurality of items within an item display space associated with a respective one of said plurality of groups, said items being laid out within a respective item display space in a pattern which varies as a function of the number of said plurality of items.
11. The method of claim 10, further comprising the step of:
performing said steps of laying out said plurality of groups and plurality of items in accordance with at least one set of layout rules.
12. The method of claim 11, wherein said layout rules are applied recursively to said steps of laying out said plurality of groups and said plurality of items.
13. The method of claim 10, wherein said user interface is displayed on a television and said items are movie cover images, wherein selection of one of said movie cover images by a user results in additional information being displayed on said user interface for a movie associated therewith.
14. The method of claim 10, further comprising the step of:
providing a plurality of control elements on said user interface, at least some of said plurality of control elements having at least one alphanumeric character displayed thereon; and
providing a textbox for displaying alphanumeric characters entered using said plurality of control elements on said user interface.
15. The method of claim 10, wherein said layout of groups involves placing a center point of each group at a location on said user interface, which location is determined based on the number of groups to be displayed.
16. The method of claim 15, wherein each of said displayed items associated with one of said groups is displayed within a rectangular region disposed at a respective center point.
17. The method of claim 14, wherein said number of groups changes based upon search results entered using said plurality of control elements.
18. The method of claim 10, wherein said groups do not overlap and at least some of said items within each group do overlap.
US12/134,486 2005-01-05 2008-06-06 Scaling and Layout Methods and Systems for Handling One-To-Many Objects Abandoned US20080235735A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/134,486 US20080235735A1 (en) 2005-01-05 2008-06-06 Scaling and Layout Methods and Systems for Handling One-To-Many Objects

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US64142105P 2005-01-05 2005-01-05
US11/325,768 US7386806B2 (en) 2005-01-05 2006-01-05 Scaling and layout methods and systems for handling one-to-many objects
US12/134,486 US20080235735A1 (en) 2005-01-05 2008-06-06 Scaling and Layout Methods and Systems for Handling One-To-Many Objects

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/325,768 Continuation US7386806B2 (en) 2005-01-05 2006-01-05 Scaling and layout methods and systems for handling one-to-many objects

Publications (1)

Publication Number Publication Date
US20080235735A1 true US20080235735A1 (en) 2008-09-25

Family

ID=36648158

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/325,768 Active 2026-04-28 US7386806B2 (en) 2005-01-05 2006-01-05 Scaling and layout methods and systems for handling one-to-many objects
US12/134,486 Abandoned US20080235735A1 (en) 2005-01-05 2008-06-06 Scaling and Layout Methods and Systems for Handling One-To-Many Objects

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/325,768 Active 2026-04-28 US7386806B2 (en) 2005-01-05 2006-01-05 Scaling and layout methods and systems for handling one-to-many objects

Country Status (6)

Country Link
US (2) US7386806B2 (en)
EP (1) EP1834477A4 (en)
JP (1) JP2008527539A (en)
KR (1) KR101190462B1 (en)
CN (1) CN101484869B (en)
WO (1) WO2006074266A2 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070048713A1 (en) * 2005-08-12 2007-03-01 Microsoft Corporation Media player service library
US20100035682A1 (en) * 2008-07-01 2010-02-11 Yoostar Entertainment Group, Inc. User interface systems and methods for interactive video systems
US20100079681A1 (en) * 2008-09-30 2010-04-01 Echostar Technologies Llc Systems and methods for graphical control of symbol-based features in a television receiver
WO2010080934A1 (en) * 2009-01-07 2010-07-15 David Colter Method and apparatus for user interface movement scheme
CN102098469A (en) * 2009-12-15 2011-06-15 索尼公司 Information processing apparatus, information processing method and program
US20110307783A1 (en) * 2010-06-11 2011-12-15 Disney Enterprises, Inc. System and method enabling visual filtering of content
WO2012018358A1 (en) * 2010-08-04 2012-02-09 Copia Interactive, Llc Method of and system for browsing and displaying items from a collection
US8397262B2 (en) 2008-09-30 2013-03-12 Echostar Technologies L.L.C. Systems and methods for graphical control of user interface features in a television receiver
US20130145321A1 (en) * 2011-12-02 2013-06-06 Kabushiki Kaisha Toshiba Information processing apparatus, method of controlling display and storage medium
US8473979B2 (en) 2008-09-30 2013-06-25 Echostar Technologies L.L.C. Systems and methods for graphical adjustment of an electronic program guide
US8572651B2 (en) 2008-09-22 2013-10-29 EchoStar Technologies, L.L.C. Methods and apparatus for presenting supplemental information in an electronic programming guide
US8582957B2 (en) 2008-09-22 2013-11-12 EchoStar Technologies, L.L.C. Methods and apparatus for visually displaying recording timer information
US8635547B2 (en) 2009-01-09 2014-01-21 Sony Corporation Display device and display method
US8763045B2 (en) 2008-09-30 2014-06-24 Echostar Technologies L.L.C. Systems and methods for providing customer service features via a graphical user interface in a television receiver
US8793735B2 (en) 2008-09-30 2014-07-29 EchoStar Technologies, L.L.C. Methods and apparatus for providing multiple channel recall on a television receiver
US9100614B2 (en) 2008-10-31 2015-08-04 Echostar Technologies L.L.C. Graphical interface navigation based on image element proximity
US9357262B2 (en) 2008-09-30 2016-05-31 Echostar Technologies L.L.C. Systems and methods for graphical control of picture-in-picture windows

Families Citing this family (142)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6735253B1 (en) * 1997-05-16 2004-05-11 The Trustees Of Columbia University In The City Of New York Methods and architecture for indexing and editing compressed video over the world wide web
US7143434B1 (en) 1998-11-06 2006-11-28 Seungyup Paek Video description system and method
AU2002351310A1 (en) 2001-12-06 2003-06-23 The Trustees Of Columbia University In The City Of New York System and method for extracting text captions from video and generating video summaries
WO2006096612A2 (en) 2005-03-04 2006-09-14 The Trustees Of Columbia University In The City Of New York System and method for motion estimation and mode decision for low-complexity h.264 decoder
KR100735558B1 (en) * 2005-10-18 2007-07-04 삼성전자주식회사 Apparatus and method for displaying pointer
AU2005239672B2 (en) * 2005-11-30 2009-06-11 Canon Kabushiki Kaisha Sortable collection browser
KR100725411B1 (en) * 2006-02-06 2007-06-07 삼성전자주식회사 User interface for content browsing, method for the providing the user interface, and content browsing apparatus
US7536654B2 (en) * 2006-02-06 2009-05-19 Microsoft Corporation Photo browse and zoom
JP4677352B2 (en) * 2006-02-17 2011-04-27 キヤノン株式会社 Recording / reproducing apparatus and recording / reproducing method
JP2007304666A (en) 2006-05-08 2007-11-22 Sony Computer Entertainment Inc Information output system and information output method
JP2007304667A (en) * 2006-05-08 2007-11-22 Sony Computer Entertainment Inc User interface device, user interface method and program
US20070279416A1 (en) * 2006-06-06 2007-12-06 Cobb Glenn A Enabling and Rendering Business Components in an Interactive Data Visualization Tool
JP4844270B2 (en) * 2006-07-21 2011-12-28 ソニー株式会社 Display control apparatus, recording medium, display control method, and display control program
KR101335351B1 (en) * 2006-10-18 2013-12-03 삼성전자주식회사 Method for providing Program Image Information of Digital Broadcasting Receiving Device
US20090262074A1 (en) * 2007-01-05 2009-10-22 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
US8020441B2 (en) 2008-02-05 2011-09-20 Invensense, Inc. Dual mode sensing for vibratory gyroscope
US8952832B2 (en) 2008-01-18 2015-02-10 Invensense, Inc. Interfacing application programs and motion sensors of a device
US7796872B2 (en) 2007-01-05 2010-09-14 Invensense, Inc. Method and apparatus for producing a sharp image from a handheld device containing a gyroscope
US8508039B1 (en) 2008-05-08 2013-08-13 Invensense, Inc. Wafer scale chip scale packaging of vertically integrated MEMS sensors with electronics
US7934423B2 (en) 2007-12-10 2011-05-03 Invensense, Inc. Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US20090265671A1 (en) * 2008-04-21 2009-10-22 Invensense Mobile devices with motion gesture recognition
US8047075B2 (en) 2007-06-21 2011-11-01 Invensense, Inc. Vertically integrated 3-axis MEMS accelerometer with electronics
US8462109B2 (en) 2007-01-05 2013-06-11 Invensense, Inc. Controlling and accessing content using motion processing on mobile devices
US8141424B2 (en) 2008-09-12 2012-03-27 Invensense, Inc. Low inertia frame for detecting coriolis acceleration
US8250921B2 (en) 2007-07-06 2012-08-28 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US20080189627A1 (en) * 2007-02-07 2008-08-07 Microsoft Corporation Execution of application based on task selection
US7721312B2 (en) * 2007-03-19 2010-05-18 Sony Corporation System and method for scrolling through TV video icons by category
US8082512B2 (en) * 2007-08-03 2011-12-20 Microsoft Corporation Fractal display advertising on computer-driven screens
JP5050727B2 (en) * 2007-08-22 2012-10-17 ソニー株式会社 Image display device
KR101397541B1 (en) * 2007-09-05 2014-05-27 주식회사 알티캐스트 Method and apparatus for controlling scene structure in a digital broadcast receiver
KR101182286B1 (en) * 2007-09-19 2012-09-14 삼성전자주식회사 Remote controller for sensing motion, image display apparatus controlling pointer by the remote controller, and methods thereof
JP5121367B2 (en) * 2007-09-25 2013-01-16 株式会社東芝 Apparatus, method and system for outputting video
US7797402B2 (en) 2007-09-26 2010-09-14 Aq Media, Inc. Audio-visual navigation and communication dynamic memory architectures
US20090128581A1 (en) * 2007-11-20 2009-05-21 Microsoft Corporation Custom transition framework for application state transitions
JP4670860B2 (en) * 2007-11-22 2011-04-13 ソニー株式会社 Recording / playback device
US20090144776A1 (en) * 2007-11-29 2009-06-04 At&T Knowledge Ventures, L.P. Support for Personal Content in a Multimedia Content Delivery System and Network
US8775953B2 (en) * 2007-12-05 2014-07-08 Apple Inc. Collage display of image projects
US20090158337A1 (en) * 2007-12-13 2009-06-18 Mobitv, Inc. Mosaic video content selection mechanism
US20090177538A1 (en) * 2008-01-08 2009-07-09 Microsoft Corporation Zoomable advertisements with targeted content
JP4535150B2 (en) * 2008-03-18 2010-09-01 ソニー株式会社 Image processing apparatus and method, program, and recording medium
DE102008017846A1 (en) * 2008-04-08 2009-10-29 Siemens Aktiengesellschaft Method and user interface for the graphical representation of medical data
WO2009126785A2 (en) 2008-04-10 2009-10-15 The Trustees Of Columbia University In The City Of New York Systems and methods for image archaeology
JP5530425B2 (en) * 2008-05-01 2014-06-25 プライマル フュージョン インコーポレイテッド Method, system, and computer program for dynamic generation of user-driven semantic networks and media integration
US20090307086A1 (en) * 2008-05-31 2009-12-10 Randy Adams Systems and methods for visually grouping links to documents
WO2009155281A1 (en) 2008-06-17 2009-12-23 The Trustees Of Columbia University In The City Of New York System and method for dynamically and interactively searching media data
US20100058173A1 (en) * 2008-08-28 2010-03-04 Kabushiki Kaisha Toshiba Display processing apparatus, display processing method, and computer program product
JP4675995B2 (en) * 2008-08-28 2011-04-27 株式会社東芝 Display processing apparatus, program, and display processing method
US8892560B2 (en) * 2008-08-29 2014-11-18 Adobe Systems Incorporated Intuitive management of electronic files
US8671069B2 (en) 2008-12-22 2014-03-11 The Trustees Of Columbia University, In The City Of New York Rapid image annotation via brain state decoding and visual pattern mining
KR20100075014A (en) * 2008-12-24 2010-07-02 삼성전자주식회사 Program information displaying method and display apparatus using the same
US9152300B2 (en) * 2008-12-31 2015-10-06 Tivo Inc. Methods and techniques for adaptive search
US9037999B2 (en) * 2008-12-31 2015-05-19 Tivo Inc. Adaptive search result user interface
US20100192181A1 (en) * 2009-01-29 2010-07-29 At&T Intellectual Property I, L.P. System and Method to Navigate an Electonic Program Guide (EPG) Display
KR101368612B1 (en) 2009-02-24 2014-02-27 이베이 인크. Systems and methods for providing multi-directional visual browsing
JP5388631B2 (en) * 2009-03-03 2014-01-15 株式会社東芝 Content presentation apparatus and method
US20100225815A1 (en) * 2009-03-05 2010-09-09 Vishal Vincent Khatri Systems methods and apparatuses for rendering user customizable multimedia signals on a display device
JP4852119B2 (en) * 2009-03-25 2012-01-11 株式会社東芝 Data display device, data display method, and data display program
KR101570696B1 (en) * 2009-05-29 2015-11-20 엘지전자 주식회사 Apparatus for displaying image and method for operating the same
US9237296B2 (en) * 2009-06-01 2016-01-12 Lg Electronics Inc. Image display apparatus and operating method thereof
KR101551212B1 (en) 2009-06-02 2015-09-18 엘지전자 주식회사 Apparatus for displaying image and method for operating the same
US9578271B2 (en) * 2009-08-18 2017-02-21 Sony Corporation Integrated user interface for internet-enabled TV
US8458169B2 (en) * 2009-09-25 2013-06-04 Apple Inc. Mini-form view for data records
US8135221B2 (en) * 2009-10-07 2012-03-13 Eastman Kodak Company Video concept classification using audio-visual atoms
US8601510B2 (en) * 2009-10-21 2013-12-03 Westinghouse Digital, Llc User interface for interactive digital television
KR101164353B1 (en) * 2009-10-23 2012-07-09 삼성전자주식회사 Method and apparatus for browsing and executing media contents
CN102065227A (en) * 2009-11-17 2011-05-18 新奥特(北京)视频技术有限公司 Method and device for horizontally and vertically aligning object in graph and image processing
US8839128B2 (en) 2009-11-25 2014-09-16 Cooliris, Inc. Gallery application for content viewing
JP5985991B2 (en) * 2010-02-19 2016-09-06 トムソン ライセンシングThomson Licensing Media content space navigation
CN102202208A (en) * 2010-03-23 2011-09-28 华为终端有限公司 Information interaction method and interface control system
JP5533165B2 (en) 2010-04-09 2014-06-25 ソニー株式会社 Information processing apparatus, information processing method, and program
US8957920B2 (en) 2010-06-25 2015-02-17 Microsoft Corporation Alternative semantics for zoom operations in a zoomable scene
JP2012060236A (en) * 2010-09-06 2012-03-22 Sony Corp Image processing apparatus, image processing method, and computer program
KR101753141B1 (en) * 2010-09-07 2017-07-04 삼성전자 주식회사 Display apparatus and displaying method of contents
JP4922446B2 (en) * 2010-09-13 2012-04-25 株式会社東芝 Electronic device, control method of electronic device
KR101730422B1 (en) * 2010-11-15 2017-04-26 엘지전자 주식회사 Image display apparatus and method for operating the same
US20120166953A1 (en) * 2010-12-23 2012-06-28 Microsoft Corporation Techniques for electronic aggregation of information
US9671825B2 (en) 2011-01-24 2017-06-06 Apple Inc. Device, method, and graphical user interface for navigating through an electronic document
US9030405B2 (en) 2011-02-04 2015-05-12 Invensense, Inc. High fidelity remote controller device for digital living room
US10353566B2 (en) * 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US9588679B2 (en) 2011-09-12 2017-03-07 Microsoft Technology Licensing, Llc Virtual viewport and fixed positioning with optical zoom
US8493411B2 (en) * 2011-09-30 2013-07-23 Google, Inc. Methods and apparatus for extensions to directed graphs with minimal and maximal constraints are encoded by arcs in opposite directions
JP5983983B2 (en) * 2011-10-03 2016-09-06 ソニー株式会社 Information processing apparatus and method, and program
US9639614B2 (en) * 2011-10-04 2017-05-02 Microsoft Technology Licensing, Llc Maximizing content item information on a search engine results page
USD731507S1 (en) * 2011-11-17 2015-06-09 Axell Corporation Display screen with animated graphical user interface
USD731504S1 (en) * 2011-11-17 2015-06-09 Axell Corporation Display screen with graphical user interface
CN103150153B (en) * 2011-12-06 2016-03-02 阿里巴巴集团控股有限公司 A kind of method for designing of user interface and device
CN103186638B (en) * 2011-12-31 2015-12-02 北大方正集团有限公司 A kind of picture layout method and device
CN102547466B (en) * 2012-02-27 2014-08-13 中国科学院计算技术研究所 Interactive method and system of intelligent television
USD716825S1 (en) 2012-03-06 2014-11-04 Apple Inc. Display screen or portion thereof with graphical user interface
USD760750S1 (en) * 2012-08-31 2016-07-05 Apple Inc. Display screen or portion thereof with graphical user interface
US9311310B2 (en) * 2012-10-26 2016-04-12 Google Inc. System and method for grouping related photographs
US9292160B2 (en) * 2012-11-30 2016-03-22 Verizon and Redbox Digital Entertainment Services, LLC Systems and methods for presenting media program accessibility information
US9165566B2 (en) 2013-01-24 2015-10-20 Microsoft Technology Licensing, Llc Indefinite speech inputs
CN103257786B (en) * 2013-04-28 2018-07-27 东莞宇龙通信科技有限公司 A kind of terminal interface display methods and terminal
USD755843S1 (en) * 2013-06-10 2016-05-10 Apple Inc. Display screen or portion thereof with graphical user interface
US9930417B2 (en) * 2013-07-11 2018-03-27 Time Warner Cable Enterprises Llc Video browser
US10545657B2 (en) 2013-09-03 2020-01-28 Apple Inc. User interface for manipulating user interface objects
US10080060B2 (en) 2013-09-10 2018-09-18 Opentv, Inc. Systems and methods of displaying content
USD766279S1 (en) * 2013-12-26 2016-09-13 Sony Corporation Display panel or screen with graphical user interface
US9720564B1 (en) 2014-01-17 2017-08-01 Beats Music, Llc Systems and methods for determining user preferences using a graphical user interface
GB2522453A (en) 2014-01-24 2015-07-29 Barco Nv Dynamic display layout
WO2015148644A1 (en) * 2014-03-25 2015-10-01 Touchtunes Music Corporation Digital jukebox device with improved user interfaces, and associated methods
JP6328797B2 (en) 2014-05-30 2018-05-23 アップル インコーポレイテッド Transition from using one device to using another device
US10073590B2 (en) 2014-09-02 2018-09-11 Apple Inc. Reduced size user interface
CN106662966B (en) 2014-09-02 2020-08-18 苹果公司 Multi-dimensional object rearrangement
USD783033S1 (en) * 2014-10-10 2017-04-04 Travelport, Lp Display screen with graphical user interface
CN105808182B (en) 2015-01-15 2019-09-17 财团法人工业技术研究院 Display control method and system, advertisement cut judgment means, image and sound processing unit
KR20160097867A (en) * 2015-02-10 2016-08-18 삼성전자주식회사 Image display apparatus and method for displaying image
JPWO2016157860A1 (en) * 2015-03-27 2018-01-11 パナソニックIpマネジメント株式会社 Recording / playback apparatus and program information display method
US9652125B2 (en) 2015-06-18 2017-05-16 Apple Inc. Device, method, and graphical user interface for navigating media content
USD785022S1 (en) * 2015-06-25 2017-04-25 Adp, Llc Display screen with a graphical user interface
US9928029B2 (en) * 2015-09-08 2018-03-27 Apple Inc. Device, method, and graphical user interface for providing audiovisual feedback
US9990113B2 (en) 2015-09-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control
USD811419S1 (en) 2015-09-18 2018-02-27 Sap Se Display screen or portion thereof with graphical user interface
CN105187935A (en) * 2015-09-30 2015-12-23 北京奇虎科技有限公司 Method and device for displaying application information
CA168164S (en) 2015-10-22 2019-07-17 Gamblit Gaming Llc Display screen with a graphical user interface
US10637986B2 (en) 2016-06-10 2020-04-28 Apple Inc. Displaying and updating a set of application views
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
USD831047S1 (en) * 2016-07-15 2018-10-16 Kazark, Inc. Display screen with graphical user interface
CN106514071B (en) * 2016-12-06 2018-04-10 南京熊猫电子股份有限公司 A kind of robot welding swings establishing method
USD847196S1 (en) * 2017-02-07 2019-04-30 Mitsubishi Electric Corporation Display screen with animated graphical user interface
US20180275856A1 (en) * 2017-03-23 2018-09-27 Vizio Inc Systems and methods for zooming a selected portion of the graphics layer of a display screen
CN107085567B (en) * 2017-04-25 2020-08-14 深圳铂睿智恒科技有限公司 Control method and system for intelligent terminal data layout display
USD842879S1 (en) * 2017-07-11 2019-03-12 Google Llc Display screen with transitional graphical user interface
USD845320S1 (en) * 2017-07-17 2019-04-09 Google Llc Display screen with transitional graphical user interface
USD861715S1 (en) * 2017-07-24 2019-10-01 Facebook, Inc. Display screen with graphical user interface for a feature collection advertisement
USD858546S1 (en) * 2017-07-24 2019-09-03 Facebook, Inc. Display screen with a transitional graphical user interface for a product collection advertisement
USD863344S1 (en) * 2018-04-08 2019-10-15 Go Gladys, Inc. Display screen with animated graphical user interface
USD877175S1 (en) * 2018-06-04 2020-03-03 Apple Inc. Electronic device with graphical user interface
USD883319S1 (en) 2018-10-29 2020-05-05 Apple Inc. Electronic device with graphical user interface
CN109408189B (en) * 2018-11-02 2022-07-12 北京字节跳动网络技术有限公司 Dynamic adjustment method, device, equipment and medium for client interface layout
JP7183778B2 (en) * 2018-12-26 2022-12-06 セイコーエプソン株式会社 Display method and display device
US10845979B2 (en) * 2019-01-10 2020-11-24 Tcl Research America, Inc. Method and system for digital content display and interaction
CA186708S (en) * 2019-03-26 2020-12-10 Tertzakian Peter Display screen
CA186709S (en) * 2019-03-26 2020-08-31 Tertzakian Peter Display screen
TWI713360B (en) * 2019-09-09 2020-12-11 三竹資訊股份有限公司 Device and method of video and audio playback frame in watchlist view on a tv
US11714928B2 (en) * 2020-02-27 2023-08-01 Maxon Computer Gmbh Systems and methods for a self-adjusting node workspace
CN113766293B (en) * 2020-06-05 2023-03-21 北京字节跳动网络技术有限公司 Information display method, device, terminal and storage medium
US11360634B1 (en) 2021-05-15 2022-06-14 Apple Inc. Shared-content session user interfaces
US11907605B2 (en) 2021-05-15 2024-02-20 Apple Inc. Shared-content session user interfaces
USD984461S1 (en) * 2021-06-04 2023-04-25 Apple Inc. Display screen or portion thereof with graphical user interface
USD1012109S1 (en) 2022-04-25 2024-01-23 Sap Se Display screen or portion thereof with graphical user interface

Citations (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4745402A (en) * 1987-02-19 1988-05-17 Rca Licensing Corporation Input device for a display system using phase-encoded signals
US5045843A (en) * 1988-12-06 1991-09-03 Selectech, Ltd. Optical pointing device
US5341466A (en) * 1991-05-09 1994-08-23 New York University Fractal computer user centerface with zooming capability
US5359348A (en) * 1992-05-21 1994-10-25 Selectech, Ltd. Pointing device having improved automatic gain control and information reporting
US5515488A (en) * 1994-08-30 1996-05-07 Xerox Corporation Method and apparatus for concurrent graphical visualization of a database search and its search history
US5553217A (en) * 1993-09-23 1996-09-03 Ricoh Company, Ltd. Document layout using tiling
US5615346A (en) * 1994-12-23 1997-03-25 International Business Machines Corporation Method and system for a piano bar browser of information sets
US5638523A (en) * 1993-01-26 1997-06-10 Sun Microsystems, Inc. Method and apparatus for browsing information in a computer database
US5671342A (en) * 1994-11-30 1997-09-23 Intel Corporation Method and apparatus for displaying information relating to a story and a story indicator in a computer system
US5685002A (en) * 1993-09-29 1997-11-04 Minolta Co., Ltd. Image processing system capable of generating a multi-picture image
US5745710A (en) * 1993-05-24 1998-04-28 Sun Microsystems, Inc. Graphical user interface for selection of audiovisual programming
US5790121A (en) * 1996-09-06 1998-08-04 Sklar; Peter Clustering user interface
US5793438A (en) * 1995-11-13 1998-08-11 Hyundai Electronics America Electronic program guide with enhanced presentation
US5796395A (en) * 1996-04-02 1998-08-18 Wegener Internet Projects Bv System for publishing and searching interests of individuals
US5835156A (en) * 1996-08-14 1998-11-10 Samsung Electroncis, Ltd. Television graphical user interface employing remote random access pointing device
US5912612A (en) * 1997-10-14 1999-06-15 Devolpi; Dean R. Multi-speed multi-direction analog pointing device
US5940072A (en) * 1996-08-15 1999-08-17 Samsung Information Systems America Graphics decompression using system ROM indexing in TV set top box
US5955988A (en) * 1996-08-14 1999-09-21 Samsung Electronics Co., Ltd. Graphical user interface for establishing installation location for satellite based television system
US5963916A (en) * 1990-09-13 1999-10-05 Intouch Group, Inc. Network apparatus and method for preview of music products and compilation of market data
US5978043A (en) * 1996-08-14 1999-11-02 Samsung Electronics Co., Ltd. TV graphical user interface that provides customized lists of programming
US5982369A (en) * 1997-04-21 1999-11-09 Sony Corporation Method for displaying on a screen of a computer system images representing search results
US6002394A (en) * 1995-10-02 1999-12-14 Starsight Telecast, Inc. Systems and methods for linking television viewers with advertisers and broadcasters
US6005578A (en) * 1997-09-25 1999-12-21 Mindsphere, Inc. Method and apparatus for visual navigation of information objects
US6016144A (en) * 1996-08-14 2000-01-18 Samsung Electronics Co., Ltd. Multi-layered television graphical user interface
US6034684A (en) * 1997-11-24 2000-03-07 Sony Corporation Identification of data items on a screen display using landmark and grid line graphical objects
US6035323A (en) * 1997-10-24 2000-03-07 Pictra, Inc. Methods and apparatuses for distributing a collection of digital media over a network with automatic generation of presentable media
US6037933A (en) * 1996-11-13 2000-03-14 Samsung Electronics Co., Ltd. TV graphical user interface for providing user access to preset time periods of TV program information
US6049823A (en) * 1995-10-04 2000-04-11 Hwang; Ivan Chung-Shung Multi server, interactive, video-on-demand television system utilizing a direct-access-on-demand workgroup
US6057831A (en) * 1996-08-14 2000-05-02 Samsung Electronics Co., Ltd. TV graphical user interface having cursor position indicator
US6088031A (en) * 1997-07-21 2000-07-11 Samsung Electronics Co., Ltd. Method and device for controlling selection of a menu item from a menu displayed on a screen
US6092076A (en) * 1998-03-24 2000-07-18 Navigation Technologies Corporation Method and system for map display in a navigation application
US6154723A (en) * 1996-12-06 2000-11-28 The Board Of Trustees Of The University Of Illinois Virtual reality 3D interface system for data creation, viewing and editing
US6175362B1 (en) * 1997-07-21 2001-01-16 Samsung Electronics Co., Ltd. TV graphical user interface providing selection among various lists of TV channels
US6181333B1 (en) * 1996-08-14 2001-01-30 Samsung Electronics Co., Ltd. Television graphical user interface having channel and program sorting capabilities
US6191781B1 (en) * 1996-08-14 2001-02-20 Samsung Electronics, Ltd. Television graphical user interface that combines electronic program guide with graphical channel changer
US6195089B1 (en) * 1996-08-14 2001-02-27 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel changer icons
US6268849B1 (en) * 1998-06-30 2001-07-31 United Video Properties, Inc. Internet television program guide system with embedded real-time data
US6288719B1 (en) * 1998-10-26 2001-09-11 Eastman Kodak Company System and method of constructing a photo album
US6295646B1 (en) * 1998-09-30 2001-09-25 Intel Corporation Method and apparatus for displaying video data and corresponding entertainment data for multiple entertainment selection sources
US20010035875A1 (en) * 1996-01-11 2001-11-01 Kenji Suzuki Image edit device adapted to rapidly lay-out photographs into templates with means for preview and correction by user
US6314575B1 (en) * 1994-09-14 2001-11-06 Time Warner Entertainment Company, L.P. Telecasting service for providing video programs on demand with an interactive interface for facilitating viewer selection of video programs
US6330858B1 (en) * 1998-06-05 2001-12-18 Navigation Technologies Corporation Method and system for scrolling a map display in a navigation application
US6349257B1 (en) * 1999-09-15 2002-02-19 International Business Machines Corporation System for personalized mobile navigation information
US20020051208A1 (en) * 1998-01-08 2002-05-02 Xerox Corporation Method for image layout using energy minimization
US6385542B1 (en) * 2000-10-18 2002-05-07 Magellan Dis, Inc. Multiple configurations for a vehicle navigation system
US20020059603A1 (en) * 2000-04-10 2002-05-16 Kelts Brett R. Interactive content guide for television programming
US6397387B1 (en) * 1997-06-02 2002-05-28 Sony Corporation Client and server system
US6400406B1 (en) * 1996-06-28 2002-06-04 Samsung Electronics, Co., Ltd. Device and method for displaying broadcast program guide in a programmed recording system
US20020075317A1 (en) * 2000-05-26 2002-06-20 Dardick Technologies System and method for an on-demand script-activated virtual keyboard
US6412110B1 (en) * 1996-08-06 2002-06-25 Starsight Telecast, Inc. Electronic program guide with interactive areas
US6411308B1 (en) * 1996-08-14 2002-06-25 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel control bars
US6415225B1 (en) * 1999-08-06 2002-07-02 Aisin Aw Co., Ltd. Navigation system and a memory medium
US6421067B1 (en) * 2000-01-16 2002-07-16 Isurftv Electronic programming guide
US6426761B1 (en) * 1999-04-23 2002-07-30 Internation Business Machines Corporation Information presentation system for a graphical user interface
US6429813B2 (en) * 1999-01-14 2002-08-06 Navigation Technologies Corp. Method and system for providing end-user preferences with a navigation system
US6434556B1 (en) * 1999-04-16 2002-08-13 Board Of Trustees Of The University Of Illinois Visualization of Internet search information
US20020112237A1 (en) * 2000-04-10 2002-08-15 Kelts Brett R. System and method for providing an interactive display interface for information objects
US6452609B1 (en) * 1998-11-06 2002-09-17 Supertuner.Com Web application for accessing media streams
US6529218B2 (en) * 1998-07-13 2003-03-04 Matsushita Electric Industrial Co., Ltd. Display control with movable or updatable auxiliary information
US6577350B1 (en) * 1998-12-21 2003-06-10 Sony Corporation Method and apparatus for displaying an electronic program guide
US20030160815A1 (en) * 2002-02-28 2003-08-28 Muschetto James Edward Method and apparatus for accessing information, computer programs and electronic communications across multiple computing devices using a graphical user interface
US20030167447A1 (en) * 2001-12-04 2003-09-04 Seiko Epson Corporation Layout editing program
US6621452B2 (en) * 1997-08-19 2003-09-16 Siemens Vdo Automotive Corporation Vehicle information system
US20040073922A1 (en) * 2001-02-28 2004-04-15 True Steven Ray System and method for distinguishing between identically titled programs
US6735777B1 (en) * 1998-10-28 2004-05-11 Samsung Electronics Co., Ltd. Method for controlling program guide for displaying broadcast program title
US20040117831A1 (en) * 1999-06-28 2004-06-17 United Video Properties, Inc. Interactive television program guide system and method with niche hubs
US6753849B1 (en) * 1999-10-27 2004-06-22 Ken Curran & Associates Universal remote TV mouse
US6765598B2 (en) * 1998-10-27 2004-07-20 Samsung Electronics Co., Ltd. Method and apparatus for enabling selection in an on-screen menu
US6775659B2 (en) * 1998-08-26 2004-08-10 Symtec Limited Methods and devices for mapping data files
US20040160462A1 (en) * 2003-02-13 2004-08-19 Lumapix Method and system for interactive region segmentation
US20040215657A1 (en) * 2003-04-22 2004-10-28 Drucker Steven M. Relationship view
US20040268393A1 (en) * 2003-05-08 2004-12-30 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US6842653B2 (en) * 2000-03-17 2005-01-11 Koninklijke Philips Electronics N.V. Method and apparatus for displaying a multi-level menu
US20050010868A1 (en) * 2003-07-11 2005-01-13 Schowtka Alexander K. System and method for automated product design
US20050188326A1 (en) * 2004-02-25 2005-08-25 Triworks Corp. Image assortment supporting device
US20050253806A1 (en) * 2004-04-30 2005-11-17 Hillcrest Communications, Inc. Free space pointing devices and methods
US20050262598A1 (en) * 1999-11-10 2005-11-24 Gaxiola Roberto A Proton transporters and uses in plants
US20050289593A1 (en) * 2004-05-26 2005-12-29 Skipjam Corp. Method and system for displaying and selecting content of an electronic program guide
US20060010395A1 (en) * 2004-07-09 2006-01-12 Antti Aaltonen Cute user interface
US20060026639A1 (en) * 2004-07-30 2006-02-02 Microsoft Corporation Interactive program information page and related methods
US7028050B1 (en) * 1999-04-15 2006-04-11 Canon Kabushiki Kaisha Data display apparatus and data display method
US20060123353A1 (en) * 2004-12-08 2006-06-08 Microsoft Corporation Method and system of taskbar button interfaces
US7337394B2 (en) * 2001-03-30 2008-02-26 Seiko Epson Corporation Digital content production system and digital content production program
US7343567B2 (en) * 2003-04-25 2008-03-11 Microsoft Corporation System and method for providing dynamic user information in an interactive display

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03206552A (en) 1989-10-17 1991-09-09 Sharp Corp Display processing system
US6005565A (en) 1997-03-25 1999-12-21 Sony Corporation Integrated search of electronic program guide, internet and other information resources
JP3636272B2 (en) * 1998-02-09 2005-04-06 富士通株式会社 Icon display method, apparatus thereof, and recording medium
JP4142175B2 (en) * 1998-10-20 2008-08-27 松下電器産業株式会社 Graphical user interface device
US6415226B1 (en) 1999-12-20 2002-07-02 Navigation Technologies Corp. Method and system for providing safe routes using a navigation system
EP1130502A1 (en) * 2000-02-29 2001-09-05 Sony Service Centre (Europe) N.V. Method and apparatus for inputting data
CN1596394A (en) * 2001-11-20 2005-03-16 Koninklijke Philips Electronics N.V. Method for entering a character sequence into an electronic device as well as an electronic device for performing said method
JP2004295159A (en) * 2003-02-07 2004-10-21 Sony Corp Icon display system and method, electronic equipment, and computer program
EP1620785A4 (en) * 2003-05-08 2011-09-07 Hillcrest Lab Inc A control framework with a zoomable graphical user interface for organizing, selecting and launching media items

Patent Citations (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4745402A (en) * 1987-02-19 1988-05-17 Rca Licensing Corporation Input device for a display system using phase-encoded signals
US5045843A (en) * 1988-12-06 1991-09-03 Selectech, Ltd. Optical pointing device
US5045843B1 (en) * 1988-12-06 1996-07-16 Selectech Ltd Optical pointing device
US5963916A (en) * 1990-09-13 1999-10-05 Intouch Group, Inc. Network apparatus and method for preview of music products and compilation of market data
US5341466A (en) * 1991-05-09 1994-08-23 New York University Fractal computer user centerface with zooming capability
US5359348A (en) * 1992-05-21 1994-10-25 Selectech, Ltd. Pointing device having improved automatic gain control and information reporting
US5638523A (en) * 1993-01-26 1997-06-10 Sun Microsystems, Inc. Method and apparatus for browsing information in a computer database
US5745710A (en) * 1993-05-24 1998-04-28 Sun Microsystems, Inc. Graphical user interface for selection of audiovisual programming
US5553217A (en) * 1993-09-23 1996-09-03 Ricoh Company, Ltd. Document layout using tiling
US5685002A (en) * 1993-09-29 1997-11-04 Minolta Co., Ltd. Image processing system capable of generating a multi-picture image
US5515488A (en) * 1994-08-30 1996-05-07 Xerox Corporation Method and apparatus for concurrent graphical visualization of a database search and its search history
US6314575B1 (en) * 1994-09-14 2001-11-06 Time Warner Entertainment Company, L.P. Telecasting service for providing video programs on demand with an interactive interface for facilitating viewer selection of video programs
US5671342A (en) * 1994-11-30 1997-09-23 Intel Corporation Method and apparatus for displaying information relating to a story and a story indicator in a computer system
US5615346A (en) * 1994-12-23 1997-03-25 International Business Machines Corporation Method and system for a piano bar browser of information sets
US6002394A (en) * 1995-10-02 1999-12-14 Starsight Telecast, Inc. Systems and methods for linking television viewers with advertisers and broadcasters
US6049823A (en) * 1995-10-04 2000-04-11 Hwang; Ivan Chung-Shung Multi server, interactive, video-on-demand television system utilizing a direct-access-on-demand workgroup
US5793438A (en) * 1995-11-13 1998-08-11 Hyundai Electronics America Electronic program guide with enhanced presentation
US20010035875A1 (en) * 1996-01-11 2001-11-01 Kenji Suzuki Image edit device adapted to rapidly lay-out photographs into templates with means for preview and correction by user
US5796395A (en) * 1996-04-02 1998-08-18 Wegener Internet Projects Bv System for publishing and searching interests of individuals
US6400406B1 (en) * 1996-06-28 2002-06-04 Samsung Electronics, Co., Ltd. Device and method for displaying broadcast program guide in a programmed recording system
US6412110B1 (en) * 1996-08-06 2002-06-25 Starsight Telecast, Inc. Electronic program guide with interactive areas
US20020129366A1 (en) * 1996-08-06 2002-09-12 Schein Steven Michael Electronic program guide with interactive areas
US6195089B1 (en) * 1996-08-14 2001-02-27 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel changer icons
US6181333B1 (en) * 1996-08-14 2001-01-30 Samsung Electronics Co., Ltd. Television graphical user interface having channel and program sorting capabilities
US6016144A (en) * 1996-08-14 2000-01-18 Samsung Electronics Co., Ltd. Multi-layered television graphical user interface
US5835156A (en) * 1996-08-14 1998-11-10 Samsung Electronics, Ltd. Television graphical user interface employing remote random access pointing device
US6191781B1 (en) * 1996-08-14 2001-02-20 Samsung Electronics, Ltd. Television graphical user interface that combines electronic program guide with graphical channel changer
US5955988A (en) * 1996-08-14 1999-09-21 Samsung Electronics Co., Ltd. Graphical user interface for establishing installation location for satellite based television system
US6057831A (en) * 1996-08-14 2000-05-02 Samsung Electronics Co., Ltd. TV graphical user interface having cursor position indicator
US5978043A (en) * 1996-08-14 1999-11-02 Samsung Electronics Co., Ltd. TV graphical user interface that provides customized lists of programming
US6411308B1 (en) * 1996-08-14 2002-06-25 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel control bars
US5940072A (en) * 1996-08-15 1999-08-17 Samsung Information Systems America Graphics decompression using system ROM indexing in TV set top box
US5790121A (en) * 1996-09-06 1998-08-04 Sklar; Peter Clustering user interface
US6037933A (en) * 1996-11-13 2000-03-14 Samsung Electronics Co., Ltd. TV graphical user interface for providing user access to preset time periods of TV program information
US6154723A (en) * 1996-12-06 2000-11-28 The Board Of Trustees Of The University Of Illinois Virtual reality 3D interface system for data creation, viewing and editing
US5982369A (en) * 1997-04-21 1999-11-09 Sony Corporation Method for displaying on a screen of a computer system images representing search results
US6397387B1 (en) * 1997-06-02 2002-05-28 Sony Corporation Client and server system
US6175362B1 (en) * 1997-07-21 2001-01-16 Samsung Electronics Co., Ltd. TV graphical user interface providing selection among various lists of TV channels
US6088031A (en) * 1997-07-21 2000-07-11 Samsung Electronics Co., Ltd. Method and device for controlling selection of a menu item from a menu displayed on a screen
US6621452B2 (en) * 1997-08-19 2003-09-16 Siemens Vdo Automotive Corporation Vehicle information system
US6005578A (en) * 1997-09-25 1999-12-21 Mindsphere, Inc. Method and apparatus for visual navigation of information objects
US5912612A (en) * 1997-10-14 1999-06-15 Devolpi; Dean R. Multi-speed multi-direction analog pointing device
US6035323A (en) * 1997-10-24 2000-03-07 Pictra, Inc. Methods and apparatuses for distributing a collection of digital media over a network with automatic generation of presentable media
US6034684A (en) * 1997-11-24 2000-03-07 Sony Corporation Identification of data items on a screen display using landmark and grid line graphical objects
US20020051208A1 (en) * 1998-01-08 2002-05-02 Xerox Corporation Method for image layout using energy minimization
US6092076A (en) * 1998-03-24 2000-07-18 Navigation Technologies Corporation Method and system for map display in a navigation application
US6330858B1 (en) * 1998-06-05 2001-12-18 Navigation Technologies Corporation Method and system for scrolling a map display in a navigation application
US6268849B1 (en) * 1998-06-30 2001-07-31 United Video Properties, Inc. Internet television program guide system with embedded real-time data
US6529218B2 (en) * 1998-07-13 2003-03-04 Matsushita Electric Industrial Co., Ltd. Display control with movable or updatable auxiliary information
US6775659B2 (en) * 1998-08-26 2004-08-10 Symtec Limited Methods and devices for mapping data files
US6295646B1 (en) * 1998-09-30 2001-09-25 Intel Corporation Method and apparatus for displaying video data and corresponding entertainment data for multiple entertainment selection sources
US6288719B1 (en) * 1998-10-26 2001-09-11 Eastman Kodak Company System and method of constructing a photo album
US6765598B2 (en) * 1998-10-27 2004-07-20 Samsung Electronics Co., Ltd. Method and apparatus for enabling selection in an on-screen menu
US6735777B1 (en) * 1998-10-28 2004-05-11 Samsung Electronics Co., Ltd. Method for controlling program guide for displaying broadcast program title
US6452609B1 (en) * 1998-11-06 2002-09-17 Supertuner.Com Web application for accessing media streams
US6577350B1 (en) * 1998-12-21 2003-06-10 Sony Corporation Method and apparatus for displaying an electronic program guide
US6429813B2 (en) * 1999-01-14 2002-08-06 Navigation Technologies Corp. Method and system for providing end-user preferences with a navigation system
US7028050B1 (en) * 1999-04-15 2006-04-11 Canon Kabushiki Kaisha Data display apparatus and data display method
US6434556B1 (en) * 1999-04-16 2002-08-13 Board Of Trustees Of The University Of Illinois Visualization of Internet search information
US6426761B1 (en) * 1999-04-23 2002-07-30 International Business Machines Corporation Information presentation system for a graphical user interface
US20040117831A1 (en) * 1999-06-28 2004-06-17 United Video Properties, Inc. Interactive television program guide system and method with niche hubs
US6415225B1 (en) * 1999-08-06 2002-07-02 Aisin Aw Co., Ltd. Navigation system and a memory medium
US6349257B1 (en) * 1999-09-15 2002-02-19 International Business Machines Corporation System for personalized mobile navigation information
US6753849B1 (en) * 1999-10-27 2004-06-22 Ken Curran & Associates Universal remote TV mouse
US20050262598A1 (en) * 1999-11-10 2005-11-24 Gaxiola Roberto A Proton transporters and uses in plants
US6421067B1 (en) * 2000-01-16 2002-07-16 Isurftv Electronic programming guide
US6842653B2 (en) * 2000-03-17 2005-01-11 Koninklijke Philips Electronics N.V. Method and apparatus for displaying a multi-level menu
US20020112237A1 (en) * 2000-04-10 2002-08-15 Kelts Brett R. System and method for providing an interactive display interface for information objects
US20020059603A1 (en) * 2000-04-10 2002-05-16 Kelts Brett R. Interactive content guide for television programming
US20020075317A1 (en) * 2000-05-26 2002-06-20 Dardick Technologies System and method for an on-demand script-activated virtual keyboard
US6385542B1 (en) * 2000-10-18 2002-05-07 Magellan Dis, Inc. Multiple configurations for a vehicle navigation system
US20040073922A1 (en) * 2001-02-28 2004-04-15 True Steven Ray System and method for distinguishing between identically titled programs
US7337394B2 (en) * 2001-03-30 2008-02-26 Seiko Epson Corporation Digital content production system and digital content production program
US20030167447A1 (en) * 2001-12-04 2003-09-04 Seiko Epson Corporation Layout editing program
US20030160815A1 (en) * 2002-02-28 2003-08-28 Muschetto James Edward Method and apparatus for accessing information, computer programs and electronic communications across multiple computing devices using a graphical user interface
US20040160462A1 (en) * 2003-02-13 2004-08-19 Lumapix Method and system for interactive region segmentation
US20040215657A1 (en) * 2003-04-22 2004-10-28 Drucker Steven M. Relationship view
US7343567B2 (en) * 2003-04-25 2008-03-11 Microsoft Corporation System and method for providing dynamic user information in an interactive display
US20040268393A1 (en) * 2003-05-08 2004-12-30 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US20050010868A1 (en) * 2003-07-11 2005-01-13 Schowtka Alexander K. System and method for automated product design
US20050188326A1 (en) * 2004-02-25 2005-08-25 Triworks Corp. Image assortment supporting device
US20050253806A1 (en) * 2004-04-30 2005-11-17 Hillcrest Communications, Inc. Free space pointing devices and methods
US20050289593A1 (en) * 2004-05-26 2005-12-29 Skipjam Corp. Method and system for displaying and selecting content of an electronic program guide
US20060010395A1 (en) * 2004-07-09 2006-01-12 Antti Aaltonen Cute user interface
US20060026639A1 (en) * 2004-07-30 2006-02-02 Microsoft Corporation Interactive program information page and related methods
US20060123353A1 (en) * 2004-12-08 2006-06-08 Microsoft Corporation Method and system of taskbar button interfaces

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070048713A1 (en) * 2005-08-12 2007-03-01 Microsoft Corporation Media player service library
US20100035682A1 (en) * 2008-07-01 2010-02-11 Yoostar Entertainment Group, Inc. User interface systems and methods for interactive video systems
US8572651B2 (en) 2008-09-22 2013-10-29 EchoStar Technologies, L.L.C. Methods and apparatus for presenting supplemental information in an electronic programming guide
US8582957B2 (en) 2008-09-22 2013-11-12 EchoStar Technologies, L.L.C. Methods and apparatus for visually displaying recording timer information
US8763045B2 (en) 2008-09-30 2014-06-24 Echostar Technologies L.L.C. Systems and methods for providing customer service features via a graphical user interface in a television receiver
US20100079681A1 (en) * 2008-09-30 2010-04-01 Echostar Technologies Llc Systems and methods for graphical control of symbol-based features in a television receiver
US9357262B2 (en) 2008-09-30 2016-05-31 Echostar Technologies L.L.C. Systems and methods for graphical control of picture-in-picture windows
US8397262B2 (en) 2008-09-30 2013-03-12 Echostar Technologies L.L.C. Systems and methods for graphical control of user interface features in a television receiver
US8937687B2 (en) * 2008-09-30 2015-01-20 Echostar Technologies L.L.C. Systems and methods for graphical control of symbol-based features in a television receiver
US8473979B2 (en) 2008-09-30 2013-06-25 Echostar Technologies L.L.C. Systems and methods for graphical adjustment of an electronic program guide
US8793735B2 (en) 2008-09-30 2014-07-29 EchoStar Technologies, L.L.C. Methods and apparatus for providing multiple channel recall on a television receiver
US9100614B2 (en) 2008-10-31 2015-08-04 Echostar Technologies L.L.C. Graphical interface navigation based on image element proximity
WO2010080934A1 (en) * 2009-01-07 2010-07-15 David Colter Method and apparatus for user interface movement scheme
US8635547B2 (en) 2009-01-09 2014-01-21 Sony Corporation Display device and display method
US20110145860A1 (en) * 2009-12-15 2011-06-16 Yuan Wei Information processing apparatus, information processing method and program
CN102098469A (en) * 2009-12-15 2011-06-15 索尼公司 Information processing apparatus, information processing method and program
US8789098B2 (en) * 2009-12-15 2014-07-22 Sony Corporation Information processing apparatus, information processing method and program
US9185326B2 (en) * 2010-06-11 2015-11-10 Disney Enterprises, Inc. System and method enabling visual filtering of content
US20110307783A1 (en) * 2010-06-11 2011-12-15 Disney Enterprises, Inc. System and method enabling visual filtering of content
US8566747B2 (en) 2010-08-04 2013-10-22 Copia Interactive, Llc Method of and system for browsing and displaying items from a collection
WO2012018358A1 (en) * 2010-08-04 2012-02-09 Copia Interactive, Llc Method of and system for browsing and displaying items from a collection
US20130145321A1 (en) * 2011-12-02 2013-06-06 Kabushiki Kaisha Toshiba Information processing apparatus, method of controlling display and storage medium

Also Published As

Publication number Publication date
WO2006074266A2 (en) 2006-07-13
KR20070092262A (en) 2007-09-12
US20060150215A1 (en) 2006-07-06
KR101190462B1 (en) 2012-10-11
JP2008527539A (en) 2008-07-24
EP1834477A2 (en) 2007-09-19
CN101484869A (en) 2009-07-15
EP1834477A4 (en) 2016-12-28
WO2006074266A3 (en) 2008-11-20
CN101484869B (en) 2014-11-26
US7386806B2 (en) 2008-06-10

Similar Documents

Publication Publication Date Title
US7386806B2 (en) Scaling and layout methods and systems for handling one-to-many objects
US20180113589A1 (en) Systems and Methods for Node Tracking and Notification in a Control Framework Including a Zoomable Graphical User Interface
US8046705B2 (en) Systems and methods for resolution consistent semantic zooming
US8432358B2 (en) Methods and systems for enhancing television applications using 3D pointing
US8521587B2 (en) Systems and methods for placing advertisements
US7834849B2 (en) Control framework with a zoomable graphical user interface for organizing selecting and launching media items
US8555165B2 (en) Methods and systems for generating a zoomable graphical user interface
KR100994011B1 (en) A control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US20060176403A1 (en) Distributed software construction for user interfaces
US20060262116A1 (en) Global navigation objects in user interfaces

Legal Events

Date Code Title Description
AS Assignment
Owner name: MULTIPLIER CAPITAL, LP, MARYLAND
Free format text: SECURITY AGREEMENT;ASSIGNOR:HILLCREST LABORATORIES, INC.;REEL/FRAME:037963/0405
Effective date: 20141002

STCB Information on status: application discontinuation
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION

AS Assignment
Owner name: IDHL HOLDINGS, INC., DELAWARE
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HILLCREST LABORATORIES, INC.;REEL/FRAME:042747/0445
Effective date: 20161222

AS Assignment
Owner name: HILLCREST LABORATORIES, INC., DELAWARE
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MULTIPLIER CAPITAL, LP;REEL/FRAME:043339/0214
Effective date: 20170606