US20080218523A1 - System and method for navigation of display data - Google Patents
- Publication number
- US20080218523A1 (application US11/684,482 / US68448207A)
- Authority
- US
- United States
- Prior art keywords
- sections
- section
- display
- graphic
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
- G06F3/04892—Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- the present invention relates, in general, to systems and methods for aiding a user in navigating through data presented on a display.
- display data may be presented on a display for illustrating various types of information to a user.
- maps, photographs, videos, and/or other graphic data are commonly presented via a display of an electronic device.
- Users often desire to navigate through the display data, such as by panning, zooming in, and/or zooming out through the display data.
- a user may desire to pan through the map to find a location of interest (e.g., a particular portion of a city), and then the user may desire to zoom in on the location of interest to view greater detail about such location (e.g., street names, etc. in the particular portion of the city).
- the screen is typically large enough to display a section of a map that is large enough to make out details, such as street names, terrain features, and the like, as well as show a sufficient amount of area around a specific area of interest.
- the user has many options to effectively interact with the map.
- To zoom into a specific area, a user can repeatedly select that area with a zoom tool.
- the user may also get an area of interest to show up in the center of the device display by clicking on that part of the map while in a “click to re-center” mode.
- a user may select and drag that part of the map to bring it into the center of the display screen.
- a mobile telephone may only provide a 5-way input interface that includes 4 directional inputs (e.g., left, right, up, and down buttons) and 1 selection input (e.g., an OK button). This may further increase the user's difficulty in navigating through display data.
- map and direction providers present a requested map to a user on a mobile phone or PDA connected to the Internet.
- Examples of such mobile mapping and directions applications are Google, Inc.'s GOOGLE™ Maps Mobile (GMM) and Verizon Wireless' VZ NAVIGATOR℠.
- With GMM, the user downloads the GMM software to the particular mobile phone or PDA, which then interacts, via the Internet or a wireless provider system, with the map databases operated by Google, Inc.
- a portion of the map is generally downloaded to the user's device, with the particular area of interest being centered on the small screen of the device.
- the GMM application provides a user interface (UI) for the user to interact with the map by panning in four directions, zooming in, zooming out, re-centering, and the like.
- the limitation of small panning movements makes it difficult to quickly look at the area surrounding the current view if the user desires to get a sense of where the current view is in relation to the larger area of the map.
- the zoom feature inserts a rectangle over the area in the middle of the screen that either is to be zoomed into or indicates the area from which the display was zoomed out.
- the rectangle loosely frames the area on the screen that has been, or is to be, expanded or zoomed into.
- the rectangle is only placed onto the display screen after the user indicates to perform one of the zoom directions. Thus, there is no indication to the user in advance of activating the feature as to what may happen when it is activated.
- the present invention and its various embodiments are directed to systems, methods, and computer program products for navigating through data for display on a display.
- a navigation application is provided which is operable (e.g., computer-executable) to aid a user in navigating through display data, such as through a graphic being displayed on a display.
- the navigation application presents visual indicators (or cues) on the display to divide the display data into a plurality of sections. For instance, grid lines may be overlaid on the display data to divide the display data into a plurality of sections defined by such grid lines.
- various navigation functions, such as panning, zooming in, and zooming out
- the displayed visual indicators may be used to aid a user in understanding the navigation operation being performed.
- animated movement of the visual indicators may be performed to provide a visual reference to the user regarding the navigation through the display data.
- animated movement of the visual indicators may provide a visual indication of the performance of such navigation functions as panning, zooming in, and/or zooming-out through the display data, while aiding the user in maintaining some reference as to the location within the overall display data to which the display has navigated.
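The animated movement of visual indicators described above can be sketched as simple interpolation of an indicator's position over a fixed number of frames. This is an illustrative sketch, not the patent's implementation; the function name and frame scheme are assumptions.

```python
# Illustrative sketch: linearly interpolate a visual indicator's
# position across animation frames, giving the user a moving
# reference during a navigation operation such as a pan.

def animate_positions(start, end, frames):
    """Yield intermediate positions from start to end, inclusive."""
    for i in range(frames + 1):
        t = i / frames
        yield start + (end - start) * t

# A grid line sliding from y=100 to y=160 over 4 animation frames:
steps = list(animate_positions(100.0, 160.0, 4))
# steps == [100.0, 115.0, 130.0, 145.0, 160.0]
```

A real implementation would apply the same interpolation to every grid line and to the underlying display data in lockstep, so the indicators and content move together.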
- a first set of visual indicators such as grid lines, boxes, or the like, are layered over the portion of the document displayed on a display.
- the user selects a particular navigation task, which selection signal is received by the navigation application.
- the navigation application determines a section of interest from one of the multiple sections visually dividing the document portion based on the particular navigation task selected.
- a second set of visual indicators such as shading, coloring, or the like, is then layered over the portion of the document defined by all of the sections except for the section of interest.
- the navigation application will then animate movement of the document portion and both sets of visual indicators on the device display according to the particular navigation task selected.
- the navigation application analyzes the display data and determines the visual indicators to display based on the display data. For example, in one embodiment, the visual indicators are determined to divide the display data into a plurality of sections, wherein the sections may be of different sizes depending on the concentration of display data presented at the corresponding portion of the display. For instance, a greater number of smaller sections may be formed for areas of the display which contain highly concentrated display data, whereas a fewer number of larger sections may be formed for areas of the display that contain less concentrated display data. In this manner, in certain embodiments the visual indicators generated by the navigation application may dynamically vary depending on the display data being presented on the display. In other embodiments, the navigation application may generate an arrangement of visual indicators that is not dependent on the display data being presented.
- the navigation application is employed to navigate to data that is not currently being displayed on the display.
- display data exceeds the size of a given display screen. For instance, a user may pan to a portion of the display data that is not currently visible on the display.
- the display data to which the user desires to navigate may reside above, below, or to one side of the currently-displayed data being presented on the display.
- the display data to which the user desires to navigate may become visible only as the user zooms in or zooms out on the currently-displayed data.
- navigation of display data may involve navigating to data that is not currently visible on the display.
- certain embodiments of the present invention enable a user to perform such navigation of display data in a manner that aids the user in recognizing how, in reference to the currently-displayed display data, a given navigation function moves to another portion of the display data (e.g., to a portion of the display data previously not visible on the display screen).
- the navigation application of certain embodiments of the present invention may be employed for navigating through any of various types of display data, such as map data (as may be presented by a mapping application), photographic data, video data, video-game data, etc.
- Such navigation application may be integrated as part of a presentation or viewing application that generates and/or presents the display data to a user, and/or the navigation application may be used in conjunction with such presentation or viewing application for allowing a user to navigate through the display data output by such presentation or viewing application to a display.
- exemplary techniques employed by embodiments of the navigation application may be particularly advantageous for use in navigating through display data in certain environments.
- the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented on a small-screen display, such as a small-screen display of a mobile telephone, PDA, portable media player, digital camera, etc.
- the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented by a system in which user input for navigation control is limited. For instance, many electronic devices, such as mobile telephones, often have limited support for user input for controlling navigation through display data.
- the user input may be limited to directional input (e.g., up, down, left, and right) and selection input (e.g., an “OK” button).
- Various other input device configurations provide limited user input for navigating through display data.
- a user may desire to use a more limited subset of inputs, such as the directional inputs (e.g., up, down, left, and right arrows on a keyboard) and selection input (e.g., Enter key on the keyboard) to navigate through display data, wherein certain embodiments of the navigation application may be employed in any such environment to assist a user's navigation through display data.
- FIG. 1A is a diagram illustrating a mobile phone operating a map application that uses a navigation system with a panning feature configured according to one embodiment of the present invention
- FIG. 1B is a diagram illustrating the mobile phone of FIG. 1A operating the panning feature configured according to one embodiment of the present invention
- FIG. 1C is a diagram of the mobile phone of FIG. 1A which illustrates a section in the center of the phone's display;
- FIG. 1D is a diagram illustrating the mobile phone of FIG. 1A after completing the panning animation of the panning feature configured according to one embodiment of the present invention
- FIG. 1E is a diagram illustrating the mobile phone of FIG. 1D operating another panning feature according to one embodiment of the present invention
- FIG. 2A is a diagram illustrating the mobile phone of FIG. 1A operating the zoom-in feature configured according to one embodiment of the present invention
- FIG. 2B illustrates the display of the mobile phone of FIG. 1A after the user activates the navigation feature
- FIG. 2C is a diagram illustrating the mobile phone of FIG. 1A after the selected section has been fully enlarged or magnified to fit the entirety of the display;
- FIG. 2D is a diagram illustrating the mobile phone of FIG. 1A operating the zoom-in feature configured according to one embodiment of the present invention
- FIG. 3A is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention
- FIG. 3B is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention
- FIG. 3C is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention
- FIG. 3D is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention
- FIG. 4A is a diagram illustrating a media device including a view application including a navigation feature configured according to one embodiment of the present invention
- FIG. 4B is a diagram illustrating a media device including a view application including a navigation feature configured according to one embodiment of the present invention
- FIG. 4C is a diagram illustrating a media device including a view application including a navigation feature configured according to one embodiment of the present invention
- FIG. 5 is a flowchart illustrating an operational flow of a navigation application according to one embodiment of the present invention.
- FIG. 6 illustrates an exemplary computer system adapted to use embodiments of the present invention.
- FIG. 1A is a diagram illustrating mobile phone 10 operating a map application that uses a navigation application configured according to one embodiment of the present invention.
- the navigation application may support one or more navigation functions, such as panning, zooming-in, and/or zooming-out to enable a user to navigate through display data, such as a map presented by the map application.
- FIGS. 1A-1E illustrate an exemplary panning function that is supported by the navigation application.
- Mobile phone 10 includes display 100 with map 101 displayed thereon.
- Mobile phone 10 also includes navigation pad 102 , which provides a 5-way input interface for the user to interact with the content on display 100 .
- the navigation application of the map application causes a first set of visual indicators or dividers, such as grid lines 103 - 106 , to be overlaid on top of map 101 .
- Grid lines 103 - 106 divide display 100 into nine sections having an aspect ratio equivalent to that of display 100 .
- grid lines 103 - 106 may be rendered in any different number of ways, such that the lines may be visible to the user but not so dark so as to interfere with perception of the underlying content. Moreover, in various additional and/or alternative embodiments of the present invention, the positions of grid lines 103 - 106 may also be editable by a user.
- display 100 may be divided into a different number of sections which would not have to share the same or similar aspect ratio of display 100 .
- the present invention is not limited to merely the example embodiment illustrated in FIG. 1A .
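The nine-section layout of FIG. 1A, in which each section shares the display's aspect ratio, can be computed as follows. This is a hypothetical sketch; the function name and the 240x320 display size are assumptions for illustration.

```python
# Hypothetical sketch: divide a display into a 3x3 grid of sections,
# each sharing the display's aspect ratio, as described for FIG. 1A.

def grid_sections(width, height, rows=3, cols=3):
    """Return (x, y, w, h) rectangles for each grid section, row-major."""
    w, h = width / cols, height / rows
    return [(c * w, r * h, w, h) for r in range(rows) for c in range(cols)]

sections = grid_sections(240, 320)   # an assumed small phone display
center = sections[4]                 # the middle section of the nine
```

Because each section is one third of the display in each dimension, a section's width-to-height ratio equals the display's, which is what lets a zoomed-in section exactly fill the screen.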
- FIG. 1B is a diagram illustrating mobile phone 10 operating the panning function according to one embodiment of the present invention.
- the panning function places a second set of visual indicators, such as shading, over each of the display sections defined by grid lines 103 - 106 ( FIG. 1A ), except for section 108 . This shading de-emphasizes each of the sections created by grid lines 103 - 106 , except for section 108 , which was determined by the panning function to be the section of interest.
- FIG. 1C is a diagram of mobile phone 10 which illustrates section 108 in the center of display 100 as a result of the above-described operation in FIG. 1B .
- the map application with the panning function configured according to one embodiment of the present invention takes section 108 at the top of display 100 ( FIG. 1B ) and animates it from that position to the center position, as illustrated in FIG. 1C .
- the portion of the map that had been above the top of display 100 (and thus not originally visible on the display) is thereby brought into view.
- the panning function places a second set of visual indicators, such as shading, over each of the display sections defined by the grid lines 103 - 106 (in FIG. 1D ), except for section 110 .
- This shading de-emphasizes each of the sections created by grid lines 103 - 106 , except for section 110 , which was determined by the panning function to be the section of interest.
- this is the section (i.e., section 110 ) that will be moved to the center of the display, and then the shading is removed, in a manner similar to that described above when the user panned up.
- a result of such panning to the right is shown on display 100 in FIG. 2A .
- the addition of the grid lines and shading provides sets of visual indicators that convey information to the user regarding what will happen once the user selects a particular navigation function, and then gives feedback to the user as the navigation function executes, helping the user maintain context.
- the shading also provides feedback that focuses the user's attention on the selected section, allowing him or her to more easily follow the execution of the navigation function in context with the original state of the selected section.
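The panning step described above (pick the section of interest from the pan direction, shade the other sections, then translate the map so the chosen section lands at the center) can be sketched as below. The direction-to-section index mapping and all names are assumptions, not the patent's code.

```python
# Hedged sketch of the panning step: sections of a 3x3 grid are
# indexed 0..8 row-major, with 4 as the center section.

DIRECTION_TO_SECTION = {"up": 1, "down": 7, "left": 3, "right": 5}

def pan_offset(direction, section_w, section_h):
    """Return (dx, dy) translating the section of interest to center."""
    idx = DIRECTION_TO_SECTION[direction]
    row, col = divmod(idx, 3)
    # Shift the map so the chosen section lands where the center was.
    return ((1 - col) * section_w, (1 - row) * section_h)

# Shade every section except the section of interest:
shaded = [i for i in range(9) if i != DIRECTION_TO_SECTION["up"]]
dx, dy = pan_offset("up", 80, 106)   # panning up moves the content down
```

The returned offset would then be fed to an animation routine so the map, grid lines, and shading all slide together, as the embodiment describes.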
- FIGS. 2A-2D illustrate an exemplary zoom-in function that is supported by the navigation application according to one embodiment of the present invention.
- FIG. 2A is a diagram illustrating mobile phone 10 operating the navigation application according to one embodiment of the present invention.
- the navigation system in the map application operating in mobile phone 10 provides for zooming in or magnifying to view a more detailed level of map 101 .
- With the area of interest centered at section 201 , defined by grid lines 103 - 106 , the user activates navigation pad 102 at point 200 .
- FIG. 2B illustrates display 100 of mobile phone 10 after the user activates point 200 of navigation pad 102 .
- Upon activation of the zoom-in function, shading is rendered over the other display sections created by grid lines 103 - 106 , except for section 201 .
- Section 201 also begins to enlarge, thus magnifying the portion of map 101 that is displayed in section 201 .
- the shading provides a visual indication to the user of the steps taking place during the zoom-in. First, it highlights that the area of interest is found in section 201 , because that section remains un-shaded; second, as the map application animates the enlargement or magnification of the portion of map 101 within section 201 , the shaded portions appear to the user to be moving off of the visible area of display 100 , thus providing the user with a context of enlarging or zooming in from the original point of view.
- FIG. 2C is a diagram illustrating mobile phone 10 after section 201 has been fully enlarged or magnified to fit the entirety of display 100 .
- When mobile phone 10 completes animating section 201 to its complete size, it appears on display 100 without grid lines 103 - 106 , thus indicating to the user that section 201 has been fully zoomed into.
- FIG. 2D is a diagram that illustrates the next step of the zoom-in function according to this exemplary embodiment, in which grid lines 103 - 106 are re-rendered on top of enlarged section 201 , dividing it into nine additional sections.
- the user may select to continue navigating through the display data by, as examples, zooming into one of the new sections, may choose to pan around map 101 at the new, more magnified level, or may choose to zoom out to view a less magnified level.
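The zoom-in animation's end state, in which one grid section is enlarged to fill the display, amounts to a scale-and-translate transform. The sketch below is an assumed formulation (names and the example numbers are illustrative); with a 3x3 grid of display-shaped sections, the final magnification is exactly 3x.

```python
# Assumed sketch of the zoom-in target transform: enlarge one grid
# section (x, y, w, h) until it exactly fills the display.

def zoom_in_transform(section, display_w, display_h):
    """Return (scale, dx, dy) mapping the section onto the full display."""
    x, y, w, h = section
    scale = display_w / w   # equals display_h / h for a display-shaped section
    # Translate so the section's origin moves to the display origin.
    return scale, -x * scale, -y * scale

scale, dx, dy = zoom_in_transform((80, 100, 80, 100), 240, 300)
# scale == 3.0; the section's top-left corner maps to (0, 0)
```

Interpolating from the identity transform to this target over several frames reproduces the enlargement animation, with the shaded neighbors sliding off the visible area as described.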
- FIGS. 3A-3D illustrate an exemplary zoom-out function that is supported by the navigation application according to one embodiment of the present invention.
- FIG. 3A is a diagram illustrating mobile phone 10 operating the navigation application according to one embodiment of the present invention.
- the navigation application may further provide for zooming out to view a less detailed or de-magnified level of map 101 .
- the navigation application provides for a zoom-out function when the user activates soft key 300 .
- FIG. 3B is a diagram that illustrates the beginning of the zoom-out function after the user selects soft key 300 . Because the entire content of display 100 will be reduced, grid lines 103 - 106 ( FIG. 3A ) are removed from the original display and instead are moved outside of the visible area of display 100 , such that their new intersections form section 301 as the new center section, which, as illustrated in FIG. 3A at the beginning of the zoom-out function, encompasses the entire content of display 100 .
- the underlying map image remains constant while the grid lines 103 - 106 are animated and moved so that a larger portion of the image is contained in the center block 301 .
- the center block 301 is expanded beyond that shown in FIG. 3B to encompass the entire display data that is visible on display 100 in FIG. 3B .
- the navigation application animates the reduction process by shrinking the part of map 101 within section 301 .
- the additional sections formed by grid lines 103 - 106 are added to display 100 in order to maintain the context of the map portion displayed in section 301 being a contiguous part of the whole map 101 .
- These additional sections outside of section 301 are added to display 100 with shading to provide a visual indicator to the user that the context of the operation is zooming out to reveal a less detailed, less magnified part of map 101 .
- the additional parts of map 101 that correspond to the surrounding area of map 101 shown in section 301 are also rendered on display 100 .
- the un-shaded section 301 is animated to become smaller and smaller while the shaded area overlaying the remaining sections formed by grid lines 103 - 106 and their corresponding parts of map 101 become larger in relation to display 100 .
- FIG. 3C is a diagram illustrating map 101 after zooming out from the previous display of section 301 (of FIG. 3B ).
- Section 301 is now visible as the center section of the nine sections of display 100 formed by grid lines 103 - 106 , which have now been returned to their typical locations on display 100 .
- the shading is still covering the other sections as an indicator to the user of the particular feature process that has been executed.
- the navigation application running on mobile phone 10 removes the shading and displays map 101 in its new view aspect, as illustrated in the diagram of mobile phone 10 in FIG. 3D .
- the visual indicators defining the center section encompass a larger area of the display (e.g., the full display 100 ), such as in FIG. 3B wherein the grid lines are expanded outward to result in the center section 301 including all of the map data currently shown on display 100 .
- the visual indicators are then animated and moved back to their original size, shrinking the display data contained therein, thereby resulting in the map data of display 100 shown in FIG. 3B being contained within the re-sized center section 301 in FIG. 3C .
- the adjacent display data that was not visible on display 100 in FIG. 3B becomes visible in FIGS. 3C-3D , as a result of this zoom-out operation.
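The zoom-out geometry described above can be sketched as an animation of the center section's rectangle: it begins spanning the full display (FIG. 3B) and shrinks back to one third of the display in each dimension (FIG. 3C), with the old view contained inside it. This is an illustrative formulation under assumed names, not the patent's code.

```python
# Illustrative sketch of the zoom-out animation: the center section's
# rectangle as a function of animation progress t in [0, 1].

def zoom_out_center_rect(display_w, display_h, t):
    """Center-section rect (x, y, w, h) at progress t.

    t=0: the center section spans the full display (grid lines pushed
         outside the visible area).
    t=1: the center section is back to one third of the display.
    """
    w = display_w * (1 - 2 * t / 3)   # interpolate width from W down to W/3
    h = display_h * (1 - 2 * t / 3)
    x = (display_w - w) / 2           # keep the rectangle centered
    y = (display_h - h) / 2
    return x, y, w, h

start = zoom_out_center_rect(240, 300, 0.0)   # full display
end = zoom_out_center_rect(240, 300, 1.0)     # ~ (80, 100, 80, 100)
```

As the rectangle shrinks, the surrounding sections (rendered shaded) grow into the newly exposed area, which matches the description of adjacent map data becoming visible.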
- the grid lines disclosed as part of the various embodiments of the present invention may be rendered on the display screen spaced equally, defining nine equal sections of the display. However, as noted above, the grid lines may also be rendered in such a way that they generate more or fewer than nine sections, and may generate sections of varying sizes.
- embodiments of the present invention are not limited in application for navigation through a map, but may additionally or alternatively be employed for navigating through any type of display data (e.g., document, image, photograph, video games, etc.).
- display data e.g., document, image, photograph, video games, etc.
- embodiments of the present invention are not limited in application for use on a mobile telephone, but may additionally or alternatively be employed on other types of electronic devices, including without limitation personal computers, laptop computers, PDAs, portable media players, digital cameras, video cameras, gaming devices (e.g., portable video gaming devices), etc.
- exemplary techniques employed by embodiments of the navigation application may be particularly advantageous for use in navigating through display data in certain environments.
- the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented on a small-screen display, such as small-screen display 100 of mobile telephone 10 shown in FIGS. 1-3 .
- the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented by a system in which user input for navigation control is limited. For instance, many electronic devices, such as mobile telephone 10 , often have limited support for user input for controlling navigation through display data.
- the user input may be limited to directional input (e.g., up, down, left, and right) and selection input (e.g., an “OK” button), such as that provided by 5-way interface 102 of mobile telephone 10 (shown in FIG. 1A ), and button 300 of mobile telephone 10 shown in FIG. 3A .
- Various other input device configurations provide limited user input mechanisms for navigating through display data (such as the navigation wheel 416 provided by device 40 in FIG. 4A , discussed below).
- a user may desire to use a more limited subset of input mechanisms, such as the directional inputs (e.g., up, down, left, and right arrows on a keyboard) and selection input (e.g., Enter key on the keyboard) to perform navigation functions for navigating through display data, wherein certain embodiments of the navigation application may be employed in any such environment to assist a user's navigation through display data.
- the navigation application analyzes the display data and determines the visual indicators to display based on the display data.
- the visual indicators are determined to divide the display data into a plurality of sections, wherein the sections may be of different sizes depending on the concentration of display data presented at the corresponding portion of the display. For instance, a greater number of smaller sections may be formed for areas of the display which contain highly concentrated display data, whereas a fewer number of larger sections may be formed for areas of the display that contain less concentrated display data.
- the visual indicators generated by the navigation application may dynamically vary depending on the display data being presented on the display.
- FIG. 4A is a diagram illustrating media device 40 (e.g., portable media player, such as an iPodTM, a digital camera, etc.) including a view application with a navigation application configured according to one embodiment of the present invention executing thereon.
- Media device 40 may be any type of media device, whether used primarily as a personal music player, personal video player, gaming device, digital camera, or the like.
- media device 40 is displaying one section or portion of a larger document, such as a photograph. The actual photograph is not illustrated in FIG. 4A in order to present clearer detail of the operation of the illustrated embodiment of the present invention.
- the view application, which allows the user to view the photograph on media device 40 , renders grid lines on display 400 that define sections 401 - 415 .
- the navigation application in this example receives input from navigation wheel 416 and function buttons 417 - 418 to allow the user to issue navigation signals to navigate around the photograph.
- the navigation application contains logic that analyzes the graphic image, such as the photograph displayed on media device 40 , to determine the plurality of sections to be defined by the grid lines, based at least in part on the concentrations of the display data being presented. For instance, in this example, the logic analyzes the graphic image and determines the high-data areas (e.g., areas that include substantial variations in colors, multiple edges, and the like), as distinguished from low-data areas (e.g., areas that repeat the same color or have very little variation in pixel data). Based on the analysis of the graphic image, such as the photograph, the view application generates grid lines and grid sections that may allow a more granular ability to navigate the more high-data areas, while maintaining a minimum navigation ability of the low-data areas.
- the generation of the grid lines for defining the sections may dynamically vary based on the underlying data concentrations.
- the section sizes may dynamically vary in relation to the concentration of underlying data being displayed.
- the navigation application has created additional smaller (or finer) sections in the area covered by sections 405 - 412 .
- the part of the photograph beneath sections 405 - 412 is more high-data than the parts beneath the other sections, in this example.
- FIG. 4B is a diagram illustrating media device 40 after the user selects to zoom into section 406 .
- the navigation application renders shading over each of the sections on display 400 except section 406 to indicate to the user that the area of interest is section 406.
- the navigation application then animates the zooming process by making section 406 larger until it fits within display 400.
- FIG. 4C is a diagram illustrating media device 40 when the zooming function has been completed.
- the shading is no longer present, because all of the shaded regions have been moved off of the visible area of display 400, and a new set of grid lines is overlaid on the content of section 406, thus creating new sections 419 - 427.
- the new sections 419 - 427 may be of varying size depending on their respective underlying display data concentrations.
- FIG. 5 is a flowchart illustrating an exemplary operational flow of a navigation application according to one embodiment of the present invention.
- a first set of visual indicators is rendered over a graphic displayed on an electronic display, where the visual indicators visually divide the graphic into a plurality of sections.
- the electronic display is a small-screen display on which the graphic is displayed, such as is commonly included in electronic devices such as mobile telephones, PDAs, digital cameras, portable media players, video cameras, portable gaming devices, etc.
- the graphic being displayed is a subpart of a larger graphic that is too large to fit completely on the display.
- a signal is received, in step 501 , to execute a navigation function.
- Such navigation function may comprise a panning function, zoom-in function, or zoom-out function, as examples.
- the signal may be received by the navigation application in response to user input to an electronic device indicating a desired navigation function to be initiated to navigate to a desired display of the graphic data.
- a section of interest is determined from the plurality of sections, in step 502 , according to the selected navigation function.
- a second set of visual indicators is layered over the parts of the graphic defined by each unselected section.
- movement of the graphic and both sets of visual indicators is animated on the device display according to the particular navigation function selected.
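The operational flow above (steps 500-502, then shading and animation) can be summarized in a small sketch. The class and method names here are illustrative assumptions; the patent describes the flow abstractly and prescribes no particular API.

```python
# Sketch of the FIG. 5 operational flow: render grid indicators, accept a
# navigation signal, determine the section of interest, shade the unselected
# sections, and animate. All names are illustrative assumptions.

class NavigationApp:
    def __init__(self, sections):
        self.sections = list(sections)   # e.g. ["401", ..., "409"]
        self.shaded = set()
        self.events = []                 # record of what was "rendered"

    def render_grid(self):
        # Step 500: a first set of visual indicators divides the graphic.
        self.events.append(("grid", tuple(self.sections)))

    def on_signal(self, function, target):
        # Step 501: a signal selects a navigation function (pan/zoom).
        # Step 502: determine the section of interest for that function.
        interest = self.select_section(function, target)
        # Second set of indicators: shade every unselected section.
        self.shaded = {s for s in self.sections if s != interest}
        # Animate the graphic and both sets of indicators.
        self.events.append(("animate", function, interest))
        return interest

    def select_section(self, function, target):
        if function == "zoom-out":
            return None          # area of interest spans all sections
        return target            # pan / zoom-in focus on one section

app = NavigationApp([f"40{n}" for n in range(1, 10)])
app.render_grid()
chosen = app.on_signal("zoom-in", "406")
```

In this sketch, selecting a zoom-in on section "406" leaves that section un-shaded while the remaining eight sections receive the second set of indicators.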
- various elements of embodiments of the present invention are, in essence, the software code defining the operations of those elements.
- the executable instructions or software code may be obtained from a readable medium (e.g., a hard drive media, optical media, EPROM, EEPROM, tape media, cartridge media, flash memory, ROM, memory stick, and/or the like) or communicated via a data signal from a communication medium (e.g., the Internet).
- readable media can include any medium that can store or transfer information.
- FIG. 6 illustrates an exemplary computer system 600 on which the navigation application may be implemented according to one embodiment of the present invention.
- Central processing unit (CPU) 601 is coupled to system bus 602 .
- CPU 601 may be any general-purpose CPU. The present invention is not restricted by the architecture of CPU 601 (or other components of exemplary system 600 ) as long as CPU 601 (and other components of system 600 ) supports the inventive operations as described herein.
- CPU 601 may execute the various logical instructions according to embodiments of the present invention. For example, CPU 601 may execute machine-level instructions according to the exemplary operational flow described above in conjunction with FIG. 5 .
- Computer system 600 also preferably includes random access memory (RAM) 603 , which may be SRAM, DRAM, SDRAM, or the like.
- Computer system 600 preferably includes read-only memory (ROM) 604 which may be PROM, EPROM, EEPROM, or the like.
- RAM 603 and ROM 604 hold user and system data and programs, as is well known in the art.
- Computer system 600 also preferably includes input/output (I/O) adapter 605, communications adapter 611, user interface adapter 608, and display adapter 609.
- I/O adapter 605, user interface adapter 608, and/or communications adapter 611 may, in certain embodiments, enable a user to interact with computer system 600 in order to input information, such as to indicate a desired navigation function to be performed for navigating through display data.
- I/O adapter 605 preferably connects storage device(s) 606, such as one or more of a hard drive, compact disc (CD) drive, floppy disk drive, or tape drive, to computer system 600.
- the I/O adapter 605 is also connected to a printer (not shown), which would allow the system to print paper copies of information such as documents, photographs, articles, and the like.
- the printer may be a dot matrix printer, a laser printer, a fax machine, a scanner, or a copier machine.
- the storage devices may be utilized when RAM 603 is insufficient for the memory requirements associated with storing data for operations of the navigation application.
- Communications adapter 611 is preferably adapted to couple computer system 600 to network 612 , which may enable information to be input to and/or output from system 600 via such network 612 (e.g., the Internet or other wide-area network, a local-area network, a public or private switched telephony network, a wireless network, any combination of the foregoing).
- an application generating display data may execute remote from computer system 600 and such display data may be input to system 600 via network 612 from a remote computer, and/or navigation commands may be output and communicated via network 612 to a remote computer.
- User interface adapter 608 couples user input devices, such as keyboard 613 and pointing device 607 to computer system 600 .
- Display adapter 609 is driven by CPU 601 to control the display on display device 610 to, for example, display
- the present invention is not limited to the architecture of system 600 .
- any suitable processor-based device may be utilized for implementing the exemplary embodiments of the navigation application described above, including without limitation personal computers, laptop computers, computer workstations, multi-processor servers, mobile telephones, PDAs, portable media players, etc.
- embodiments of the present invention may be implemented on application specific integrated circuits (ASICs) or very large scale integrated (VLSI) circuits.
- persons of ordinary skill in the art may utilize any number of suitable structures capable of executing logical operations according to the embodiments of the present invention.
Description
- The present invention relates, in general, to systems and methods for aiding a user in navigating through data for display on a display.
- Commonly today, display data may be presented on a display for illustrating various types of information to a user. For instance, maps, photographs, videos, and/or other graphic data is commonly presented via a display of an electronic device. Users often desire to navigate through the display data, such as by panning, zooming in, and/or zooming out through the display data. As an example, when viewing a map being displayed, a user may desire to pan through the map to find a location of interest (e.g., a particular portion of a city), and then the user may desire to zoom in on the location of interest to view greater detail about such location (e.g., street names, etc. in the particular portion of the city). Challenges arise in enabling a user to navigate efficiently through the display data, particularly in a manner that aids the user in not becoming “lost” within the display data. That is, it becomes desirable to aid a user in navigating through display data in a manner that the user can understand where within the display data he/she is navigating.
- Particular navigation challenges are presented when displays are small and/or when the user input available for controlling the navigation is limited. Many devices provide small displays and/or limited user input for controlling navigation. For instance, electronic devices such as mobile phones, personal digital assistants (PDAs), and the like, often have small screen displays wherein a user may desire to navigate through information, such as a map, a large spreadsheet, a large graphic, or the like, that exceeds the display area of the screen. In such case, only a small portion of the information may be presented at a given time on the small screen display, and it becomes desirable to assist a user in navigating through the information while maintaining a sense of how the information fits together. On desktop computers, the screen is typically large enough to display a section of a map that is large enough to make out details, such as street names, terrain features, and the like, as well as show a sufficient amount of area around a specific area of interest. With the larger screen and a pointing device, the user has many options to effectively interact with the map. To zoom into a specific area, a user can continually select that area with a zoom tool. The user may also get an area of interest to show up in the center of the device display by clicking on that part of the map while in a “click to re-center” mode. Alternatively, a user may select and drag that part of the map to bring it into the center of the display screen.
- Having much smaller screens, mobile phone and PDA users will typically need to zoom closer into a map, a graphic, a spreadsheet, or the like, to make out details such as street names, illustration details, cell entries, and the like. It generally takes many steps of panning and zooming to get a particular area of interest to show up at the desired size and position on the display screen. At such a detailed view, the user may not easily be able to look at the area surrounding the area of interest represented on the screen without executing many additional panning and zooming steps, which may cause the user to lose context of the area of interest the user initially desired to see. Overall, instead of feeling like holding a portable, foldable map in one's hands, the resulting experience is more like interacting with a wall-sized map by looking through a cardboard tube and then walking closer to or farther from the wall to zoom in and out. Further, many such electronic devices provide only limited user input ability for controlling the navigation. For instance, a mobile telephone may only provide a 5-way input interface that includes 4 directional inputs (e.g., left, right, up, and down buttons) and 1 selection input (e.g., an OK button). This may further increase the user's difficulty in navigating through display data.
- In the mobile map application space, map and direction providers present a requested map to a user on a mobile phone or PDA connected to the Internet. Examples of such mobile mapping and directions applications are Google, Inc.'s GOOGLE™ Maps Mobile (GMM), and Verizon Wireless' VZ NAVIGATORSM. In GMM, the user downloads the GMM software to the particular mobile phone or PDA, which then interacts via the Internet or wireless provider system with the map databases operated by Google, Inc. In response to a request from a user, a portion of the map is generally downloaded to the user's device, with the particular area of interest being centered on the small screen of the device. The GMM application provides a user interface (UI) for the user to interact with the map by panning in four directions, zooming in, zooming out, re-centering, and the like. When panning around the map, only small movements are made in any of the selected directions. More of the map is downloaded to accommodate this panning. However, the limitation of small panning movements makes it difficult to quickly look at the area surrounding the current view if the user desires to get a sense of where the current view is in relation to the larger area of the map.
- While the small panning steps make such localized panning more difficult, larger panning steps would not necessarily solve this difficulty in a desirable manner. If the UI simply panned further with each key press, the user would tend to lose track of where they are on the map if it moves too far from its previous position. Therefore, when the user selects to pan in any particular direction, only a very small amount of distance is moved, in order to preserve the user's context in interacting with the subject map. However, even with limiting the amount of movement between each series of pans, the user's experience may be tenuous because there is also nothing that conveys what is happening to the user as the user operates the interface controls.
- In one feature of GMM, the zoom feature, GMM inserts a rectangle over the area in the middle of the screen that either is to be zoomed into or from which the display was zoomed out. The rectangle loosely frames the area on the screen that has been or is to be expanded or zoomed into. However, the rectangle is only placed onto the display screen after the user indicates to perform one of the zoom directions. Thus, there is no indication to the user, in advance of activating the feature, as to what will happen when it is activated.
- The present invention and its various embodiments are directed to systems, methods, and computer program products for navigating through data for display on a display. A navigation application is provided which is operable (e.g., computer-executable) to aid a user in navigating through display data, such as through a graphic being displayed on a display. In certain embodiments, the navigation application presents visual indicators (or cues) on the display to divide the display data into a plurality of sections. For instance, grid lines may be overlaid on the display data to divide the display data into a plurality of sections defined by such grid lines. Further, various navigation functions, such as panning, zooming-in, and zooming-out, may be supported by the navigation application, wherein the displayed visual indicators may be used to aid a user in understanding the navigation operation being performed. For instance, animated movement of the visual indicators may be performed to provide a visual reference to the user regarding the navigation through the display data. Thus, for example, animated movement of the visual indicators may provide a visual indication of the performance of such navigation functions as panning, zooming-in, and/or zooming-out through the display data, while aiding the user in maintaining some reference as to the location within the overall display data to which the display has navigated.
- In one exemplary embodiment, a first set of visual indicators, such as grid lines, boxes, or the like, is layered over the portion of the document displayed on a display. The user selects a particular navigation task, which selection signal is received by the navigation application. The navigation application determines a section of interest from one of the multiple sections visually dividing the document portion based on the particular navigation task selected. A second set of visual indicators, such as shading, coloring, or the like, is then layered over the portion of the document defined by all of the sections except for the section of interest. The navigation application will then animate movement of the document portion and both sets of visual indicators on the device display according to the particular navigation task selected.
- In certain embodiments, the navigation application analyzes the display data and determines the visual indicators to display based on the display data. For example, in one embodiment, the visual indicators are determined to divide the display data into a plurality of sections, wherein the sections may be of different sizes depending on the concentration of display data presented at the corresponding portion of the display. For instance, a greater number of smaller sections may be formed for areas of the display which contain highly concentrated display data, whereas a fewer number of larger sections may be formed for areas of the display that contain less concentrated display data. In this manner, in certain embodiments the visual indicators generated by the navigation application may dynamically vary depending on the display data being presented on the display. In other embodiments, the navigation application may generate an arrangement of visual indicators that is not dependent on the display data being presented.
- In many cases, the navigation application is employed to navigate to data that is not currently being displayed on the display, because the display data exceeds the size of a given display screen. For instance, a user may pan to a portion of the display data that is not currently visible on the display. For example, the display data to which the user desires to navigate may reside above, below, or to one side of the currently-displayed data being presented on the display. As another example, the display data to which the user desires to navigate may become visible only as the user zooms in or zooms out on the currently-displayed data. Thus, navigation of display data may involve navigating to data that is not currently visible on the display. As described further herein, certain embodiments of the present invention enable a user to perform such navigation of display data in a manner that aids the user in recognizing how, in reference to the currently-displayed display data, a given navigation function moves to another portion of the display data (e.g., to a portion of the display data previously not visible on the display screen).
- The navigation application of certain embodiments of the present invention may be employed for navigating through any of various types of display data, such as map data (as may be presented by a mapping application), photographic data, video data, video-game data, etc. Such navigation application may be integrated as part of a presentation or viewing application that generates and/or presents the display data to a user, and/or the navigation application may be used in conjunction with such presentation or viewing application for allowing a user to navigate through the display data output by such presentation or viewing application to a display.
- While embodiments of the present invention are not limited in application to any particular type of data or display device, exemplary techniques employed by embodiments of the navigation application may be particularly advantageous for use in navigating through display data in certain environments. For instance, the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented on a small-screen display, such as a small-screen display of a mobile telephone, PDA, portable media player, digital camera, etc. Additionally or alternatively, the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented by a system in which user input for navigation control is limited. For instance, many electronic devices, such as mobile telephones, often have limited support for user input for controlling navigation through display data. For example, the user input may be limited to directional input (e.g., up, down, left, and right) and selection input (e.g., an “OK” button). Various other input device configurations provide limited user input for navigating through display data. Additionally, in some instances, even though a device, such as a personal computer, may provide great flexibility to a user in inputting navigation commands (e.g., using a mouse, etc.), a user may desire to use a more limited subset of inputs, such as the directional inputs (e.g., up, down, left, and right arrows on a keyboard) and selection input (e.g., Enter key on the keyboard) to navigate through display data, wherein certain embodiments of the navigation application may be employed in any such environment to assist a user's navigation through display data.
- The foregoing has outlined rather broadly the features and technical advantages of the present invention in order that the detailed description of the invention that follows may be better understood. Additional features and advantages of the invention will be described hereinafter which form the subject of the claims of the invention. It should be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present invention. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the invention as set forth in the appended claims. The novel features which are believed to be characteristic of the invention, both as to its organization and method of operation, together with further objects and advantages, will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present invention.
- For a more complete understanding of the present invention, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
- FIG. 1A is a diagram illustrating a mobile phone operating a map application that uses a navigation system with a panning feature configured according to one embodiment of the present invention;
- FIG. 1B is a diagram illustrating the mobile phone of FIG. 1A operating the panning feature configured according to one embodiment of the present invention;
- FIG. 1C is a diagram of the mobile phone of FIG. 1A which illustrates a section in the center of the phone's display;
- FIG. 1D is a diagram illustrating the mobile phone of FIG. 1A after completing the panning animation of the panning feature configured according to one embodiment of the present invention;
- FIG. 1E is a diagram illustrating the mobile phone of FIG. 1D operating another panning feature according to one embodiment of the present invention;
- FIG. 2A is a diagram illustrating the mobile phone of FIG. 1A operating the zoom-in feature configured according to one embodiment of the present invention;
- FIG. 2B illustrates the display of the mobile phone of FIG. 1A after the user activates the navigation feature;
- FIG. 2C is a diagram illustrating the mobile phone of FIG. 1A after the selected section has been fully enlarged or magnified to fit the entirety of the display;
- FIG. 2D is a diagram illustrating the mobile phone of FIG. 1A operating the zoom-in feature configured according to one embodiment of the present invention;
- FIG. 3A is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention;
- FIG. 3B is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention;
- FIG. 3C is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention;
- FIG. 3D is a diagram illustrating the mobile phone of FIG. 1A operating a zoom-out feature configured according to one embodiment of the present invention;
- FIG. 4A is a diagram illustrating a media device including a view application including a navigation feature configured according to one embodiment of the present invention;
- FIG. 4B is a diagram illustrating a media device including a view application including a navigation feature configured according to one embodiment of the present invention;
- FIG. 4C is a diagram illustrating a media device including a view application including a navigation feature configured according to one embodiment of the present invention;
- FIG. 5 is a flowchart illustrating an operational flow of a navigation application according to one embodiment of the present invention; and
- FIG. 6 illustrates an exemplary computer system adapted to use embodiments of the present invention.
- FIG. 1A is a diagram illustrating mobile phone 10 operating a map application that uses a navigation application configured according to one embodiment of the present invention. The navigation application may support one or more navigation functions, such as panning, zooming-in, and/or zooming-out, to enable a user to navigate through display data, such as a map presented by the map application. The example of FIGS. 1A-1E illustrates an exemplary panning function that is supported by the navigation application. Mobile phone 10 includes display 100 with map 101 displayed thereon. Mobile phone 10 also includes navigation pad 102, which provides a 5-way input interface for the user to interact with the content on display 100. The navigation application of the map application causes a first set of visual indicators or dividers, such as grid lines 103-106, to be overlaid on top of map 101. Grid lines 103-106 divide display 100 into nine sections having an aspect ratio equivalent to that of display 100. - It should be noted that in operation of additional and/or alternative embodiments of the present invention, grid lines 103-106 may be rendered in any of a number of different ways, such that the lines are visible to the user but not so dark as to interfere with perception of the underlying content. Moreover, in various additional and/or alternative embodiments of the present invention, the positions of grid lines 103-106 may also be editable by a user.
- It should further be noted that in additional and/or alternative embodiments of the present invention, display 100 may be divided into a different number of sections, which need not share the same or a similar aspect ratio as display 100. The present invention is not limited to the example embodiment illustrated in FIG. 1A.
- FIG. 1B is a diagram illustrating mobile phone 10 operating the panning function according to one embodiment of the present invention. When the user desires to pan up or move laterally up from map 101 displayed on display 100, he or she manipulates navigation pad 102 at point 107 (i.e., activates the up directional-input button 107). When the entry at point 107 is received, the panning function places a second set of visual indicators, such as shading, over each of the display sections defined by grid lines 103-106 (FIG. 1A), except for section 108. This shading de-emphasizes each of the sections created by grid lines 103-106, except for section 108, which was determined by the panning function to be the section of interest. As the top-most section of map 101 displayed on display 100, this is the section that will be moved to the center of the display. The user's attention is, therefore, drawn to section 108, indicating that some operation will occur with regard to section 108. FIG. 1C is a diagram of mobile phone 10 which illustrates section 108 in the center of display 100 as a result of the above-described operation in FIG. 1B. The map application, with the panning function configured according to one embodiment of the present invention, takes section 108 at the top of display 100 (FIG. 1B) and animates it from that position to the center position, as illustrated in FIG. 1C. The portion of the map that had been above the top of display 100 (and thus not originally visible on display 100 in FIG. 1A) is then rendered as the new top row as the map is panned down. Similarly, the bottom row that was originally visible on display 100 in FIG. 1A is moved off the bottom of display 100 as the map is panned down. When the panning animation and operation is complete, the visual pan indicator, i.e., the shading, is removed from the other grid sections to reveal the newly positioned map, as illustrated in FIG. 1D. - With reference now to
FIG. 1E, if the user next desires to pan to the right, he or she manipulates navigation pad 102 at point 109 (i.e., activates the right directional-input button 109). When the entry at point 109 is received, the panning function places a second set of visual indicators, such as shading, over each of the display sections defined by the grid lines 103-106 (in FIG. 1D), except for section 110. This shading de-emphasizes each of the sections created by grid lines 103-106, except for section 110, which was determined by the panning function to be the section of interest. As the right-most section of map 101 displayed on display 100, this is the section (i.e., section 110) that will be moved to the center of the display, and then the shading is removed in a manner similar to that described above when the user panned up. A result of such panning to the right is shown on display 100 in FIG. 2A. - It should be noted that the addition of the grid lines and shading provides sets of visual indicators that convey information to the user regarding what will happen once the user selects a particular navigation function, and then gives feedback to the user as the navigation function is executing in order to help the user maintain context. With the grid lines, once the user sees the operation of the navigation, he or she will understand what will happen with each navigation selection even before the navigation selection is made. The shading also provides feedback that focuses the user's attention on the selected section, allowing him or her to more easily follow the execution of the navigation function in context with the original state of the selected section. Thus, the user is provided with a natural and intuitive experience.
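For a 3x3 grid like the one formed by grid lines 103-106, the pan logic described above can be sketched as follows. The row-major index scheme and helper names are assumptions for illustration, not part of the patent.

```python
# Sketch of the pan step for a 3x3 grid of sections, indexed row-major
# 0..8 with section 4 at the center of the display. Panning moves the edge
# section in the chosen direction to the center, which is equivalent to
# shifting the underlying map one section over. Illustrative only.

PAN_TARGET = {"up": 1, "down": 7, "left": 3, "right": 5}

def section_of_interest(direction):
    """Return the grid index that the pan animation will center
    (e.g. panning up centers the top-middle section)."""
    return PAN_TARGET[direction]

def pan_offset(direction, section_w, section_h):
    """Pixel offset applied to the map so the section of interest ends up
    at the center (e.g. pan 'up' slides the map down one section height)."""
    row, col = divmod(PAN_TARGET[direction], 3)
    return ((1 - col) * section_w, (1 - row) * section_h)
```

Under this scheme, a pan to the right centers the right-middle section by sliding the map left by one section width, matching the movement of section 110 described above.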
- FIGS. 2A-2D illustrate an exemplary zoom-in function that is supported by the navigation application according to one embodiment of the present invention. FIG. 2A is a diagram illustrating mobile phone 10 operating the navigation application according to one embodiment of the present invention. In addition to panning around map 101 (in the manner discussed above with FIGS. 1A-1E), the navigation system in the map application operating in mobile phone 10 provides for zooming in or magnifying to view a more detailed level of map 101. With the area of interest centered at section 201, defined by grid lines 103-106, the user activates navigation pad 102 at point 200. FIG. 2B illustrates display 100 of mobile phone 10 after the user activates point 200 of navigation pad 102. Upon activation of the zoom-in function, shading is rendered over the other display sections created by grid lines 103-106, except for section 201. Section 201 also begins to enlarge, thus magnifying the portion of map 101 that is displayed in section 201. The shading provides visual indication to the user of the steps that are taking place on the zoom-in. First, it highlights that the area of interest is found in section 201 because it remains un-shaded, and second, as the map application animates the enlargement or magnification of the portion of map 101 within section 201, the shaded portions appear to the user to be moving off of the visible area of display 100, thus providing the user with a context of enlarging or zooming-in from the original point of view.
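The zoom-in animation described above, in which the selected section grows until it fills the display while the shaded sections slide off-screen, can be sketched as a simple rectangle interpolation. The linear easing, frame count, and function names are illustrative assumptions, not details from the patent.

```python
# Sketch of the zoom-in animation: the selected section's rectangle is
# linearly interpolated from its grid position to the full display, so the
# shaded neighbours appear to slide off-screen as it grows.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def zoom_in_frames(section, display, frames=10):
    """Yield (x, y, w, h) rectangles animating section -> full display."""
    sx, sy, sw, sh = section
    dw, dh = display
    for i in range(frames + 1):
        t = i / frames
        yield (lerp(sx, 0, t), lerp(sy, 0, t),
               lerp(sw, dw, t), lerp(sh, dh, t))
```

For instance, animating the center section of a 240x240 display starts at its 80x80 grid cell and ends at the full 240x240 screen.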
- FIG. 2C is a diagram illustrating mobile phone 10 after section 201 has been fully enlarged or magnified to fit the entirety of display 100. When mobile phone 10 completes animating section 201 to its complete size, it appears on display 100 without grid lines 103-106, thus indicating to the user that section 201 has been fully zoomed into. FIG. 2D is a diagram that illustrates the next step of the zoom-in function according to this exemplary embodiment, in which grid lines 103-106 are replaced on top of enlarged section 201, dividing it into nine additional sections. Here, the user may select to continue navigating through the display data by, as examples, zooming into one of the new sections, panning around map 101 at the new, more magnified level, or zooming out to view a less magnified level.
FIGS. 3A-3D illustrate an exemplary zoom-out function that is supported by the navigation application according to one embodiment of the present invention. FIG. 3A is a diagram illustrating mobile phone 10 operating the navigation application according to one embodiment of the present invention. In addition to providing a zoom-in function that magnifies parts of map 101 (as in the example discussed above with FIGS. 2A-2D), the navigation application may further provide for zooming out to view a less detailed or de-magnified level of map 101. According to the exemplary embodiment shown in FIG. 3A, the navigation application provides for a zoom-out function when the user activates soft key 300. Instead of focusing on a single section defined by grid lines 103-106, the zoom-out function's area of interest is the content displayed in a plurality of the displayed sections, such as all of the sections defining display 100 in this example. FIG. 3B is a diagram that illustrates the beginning of the zoom-out function after the user selects soft key 300. Because the entire content of display 100 will be reduced, grid lines 103-106 (FIG. 3A) are removed from the original display and instead are moved outside of the visible area of display 100, such that their new intersections form section 301 as the new center section, which, as illustrated in FIG. 3A at the beginning of the zoom-out function, encompasses the entire content of display 100. In this example, the underlying map image remains constant while grid lines 103-106 are animated and moved so as to make a larger portion of the image be contained in the center block 301. In this example, the center block 301 is expanded beyond that shown in FIG. 3B to encompass the entire display data that is visible on display 100 in FIG. 3B. Further, in this exemplary embodiment, the navigation application animates the reduction process by shrinking the part of
map 101 within section 301. As section 301 is reduced, the additional sections formed by grid lines 103-106 are added to display 100 in order to maintain the context of the map portion displayed in section 301 being a contiguous part of the whole map 101. These additional sections outside of section 301 are added to display 100 with shading to provide a visual indicator to the user that the context of the operation is zooming out to reveal a less detailed, less magnified part of map 101. Also, beneath the shading that is provided, the additional parts of map 101 that correspond to the surrounding area of the portion of map 101 shown in section 301 are also rendered on display 100. The un-shaded section 301 is animated to become smaller and smaller, while the shaded area overlaying the remaining sections formed by grid lines 103-106, and their corresponding parts of map 101, becomes larger in relation to display 100.
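Under the assumption of a uniform 3x3 grid, the zoom-out geometry described above reduces to simple arithmetic: the visible map extent grows by the number of grid divisions per axis, and the previously visible content shrinks into the center cell. The helper `zoom_out_step` below is a hypothetical illustration of that relationship, not code from the patent.

```python
def zoom_out_step(display_w: float, display_h: float, divisions: int = 3):
    """Model one zoom-out step on a uniform divisions x divisions grid.

    Returns (map_scale, center_cell), where map_scale is the factor by
    which the visible map extent grows per axis, and center_cell is the
    display rectangle (x, y, w, h) into which the previously visible
    content is shrunk at the end of the animation.
    """
    cell_w = display_w / divisions
    cell_h = display_h / divisions
    center = divisions // 2  # index of the middle row/column
    center_cell = (center * cell_w, center * cell_h, cell_w, cell_h)
    return divisions, center_cell
```

So for a 120x160 display with a 3x3 grid, one zoom-out step triples the visible map extent per axis, and the old view ends up in the 40x(160/3) center cell, as in FIG. 3C.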
FIG. 3C is a diagram illustrating map 101 after zooming out from the previous display of section 301 (of FIG. 3B). Section 301 is now visible as the center section of the nine sections of display 100 formed by grid lines 103-106, which grid lines have now been returned to their typical location on display 100. The shading still covers the other sections as an indicator to the user of the particular feature process that has been executed. After a predetermined time displaying the shading to the user, the navigation application running on mobile phone 10 removes the shading and displays map 101 in its new view aspect, as illustrated in the diagram of mobile phone 10 in FIG. 3D. Thus, in the exemplary embodiment of zooming out shown in
FIGS. 3A-3D, the visual indicators (e.g., grid lines) surrounding a center section of display data are animated and moved over the underlying display data until the visual indicators defining the center section encompass a larger area of the display (e.g., the full display 100), such as in FIG. 3B, wherein the grid lines are expanded outward so that the center section 301 includes all of the map data currently shown on display 100. The visual indicators are then animated and moved back to their original size, shrinking the display data contained therein, thereby resulting in the map data of display 100 shown in FIG. 3B being contained within the re-sized center section 301 in FIG. 3C. And, as discussed above, the adjacent display data that was not visible on display 100 in FIG. 3B becomes visible in FIGS. 3C-3D as a result of this zoom-out operation. The grid lines disclosed as part of the various embodiments of the present invention may be rendered on the display screen spaced equally and may define 9 equal sections of the display. However, as noted above, the grid lines may also be rendered in such a way that they generate more or fewer than 9 sections, and may generate sections of varying sizes.
- While the above examples show navigation through map data presented by a mapping application, embodiments of the present invention are not limited in application to navigation through a map, but may additionally or alternatively be employed for navigating through any type of display data (e.g., document, image, photograph, video games, etc.). Further, while the above examples show the navigation application as being employed on a
mobile telephone 10, embodiments of the present invention are not limited in application to use on a mobile telephone, but may additionally or alternatively be employed on other types of electronic devices, including without limitation personal computers, laptop computers, PDAs, portable media players, digital cameras, video cameras, gaming devices (e.g., portable video gaming devices), etc. - While embodiments of the present invention are not limited in application to any particular type of data or display device, exemplary techniques employed by embodiments of the navigation application may be particularly advantageous for use in navigating through display data in certain environments. For instance, the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented on a small-screen display, such as small-
screen display 100 of mobile telephone 10 shown in FIGS. 1-3. Additionally or alternatively, the navigation application of certain embodiments may be particularly advantageous for use in navigating through display data presented by a system in which user input for navigation control is limited. For instance, many electronic devices, such as mobile telephone 10, often have limited support for user input for controlling navigation through display data. For example, the user input may be limited to directional input (e.g., up, down, left, and right) and selection input (e.g., an “OK” button), such as that provided by 5-way interface 102 of mobile telephone 10 (shown in FIG. 1A) and button 300 of mobile telephone 10 shown in FIG. 3A. Various other input device configurations provide limited user input mechanisms for navigating through display data (such as the navigation wheel 416 provided by device 40 in FIG. 4A, discussed below). Additionally, in some instances, even though a device, such as a personal computer, may provide great flexibility to a user in inputting navigation commands (e.g., may enable input using a mouse, etc.), a user may desire to use a more limited subset of input mechanisms, such as the directional inputs (e.g., up, down, left, and right arrows on a keyboard) and selection input (e.g., the Enter key on the keyboard), to perform navigation functions for navigating through display data. Certain embodiments of the navigation application may be employed in any such environment to assist a user's navigation through display data. - In certain embodiments, the navigation application analyzes the display data and determines the visual indicators to display based on the display data. For example, in one embodiment, the visual indicators are determined to divide the display data into a plurality of sections, wherein the sections may be of different sizes depending on the concentration of display data presented at the corresponding portion of the display.
For instance, a greater number of smaller sections may be formed for areas of the display which contain highly concentrated display data, whereas a smaller number of larger sections may be formed for areas of the display that contain less concentrated display data. In this manner, in certain embodiments the visual indicators generated by the navigation application may dynamically vary depending on the display data being presented on the display.
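One concrete way to realize the concentration-dependent sectioning described above is a recursive split: regions whose estimated data concentration exceeds a threshold are subdivided into smaller sections, while quiet regions keep a single large section. The quadtree-style split, the `concentration` callback, and the threshold below are illustrative assumptions; the patent describes grid lines and varying section sizes, not any particular subdivision algorithm.

```python
def adaptive_sections(rect, concentration, threshold, min_size):
    """Return a list of (x, y, w, h) sections covering `rect`.

    `concentration(rect) -> float` estimates how much display data the
    region contains; dense regions are split into four quadrants until
    they fall below `threshold` or reach `min_size` per axis.
    """
    x, y, w, h = rect
    dense = concentration(rect) > threshold
    if not dense or w <= min_size or h <= min_size:
        return [rect]
    hw, hh = w / 2, h / 2
    sections = []
    for qx, qy in ((x, y), (x + hw, y), (x, y + hh), (x + hw, y + hh)):
        sections.extend(adaptive_sections((qx, qy, hw, hh),
                                          concentration, threshold, min_size))
    return sections
```

A uniformly quiet image yields one section per region, while a busy region is covered by progressively finer sections, mirroring the finer grid over sections 405-412 in FIG. 4A below.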
FIG. 4A is a diagram illustrating media device 40 (e.g., a portable media player such as an iPod™, a digital camera, etc.) including a view application with a navigation application configured according to one embodiment of the present invention executing thereon. Media device 40 may be any type of media device, whether used primarily as a personal music player, personal video player, gaming device, digital camera, or the like. In the present example, media device 40 is displaying one section or portion of a larger document, such as a photograph. The actual photograph is not illustrated in FIG. 4A in order to present the operation of the illustrated embodiment of the present invention in clearer detail. The view application, which allows the user to view the photograph on media device 40, renders grid lines on display 400 that define sections 401-415. The navigation application in this example receives input from navigation wheel 416 and function buttons 417-418 to allow the user to issue navigation signals to navigate around the photograph. - In this exemplary embodiment, the navigation application contains logic that analyzes the graphic image, such as the photograph displayed on
media device 40, to determine the plurality of sections to be defined by the grid lines based at least in part on the concentrations of the display data being presented. For instance, in this example, the logic analyzes the graphic image and determines the high-data areas, e.g., areas that include substantial variations in colors, multiple edges, and the like, as distinguished from low-data areas, e.g., areas that repeat the same color or have very little variation in pixel data. Based on the analysis of the graphic image, such as the photograph, the view application generates grid lines and grid sections that may allow a more granular ability to navigate the high-data areas, while maintaining a minimum navigation ability in the low-data areas. Thus, the generation of the grid lines for defining the sections may dynamically vary based on the underlying data concentrations. In other words, the section sizes may dynamically vary in relation to the concentration of the underlying data being displayed. In the example illustrated in FIG. 4A, the navigation application has created additional smaller (or finer) sections in the area covered by sections 405-412. The part of the photograph beneath sections 405-412 is more high-data than the other sections, in this example. - In operation, suppose the user desires to zoom into the portion of the photograph within
section 406. The user rolls his or her finger around navigation wheel 416, which is touch-sensitive, to cycle between each of sections 401-415. When the user stops on section 406, he or she clicks the center button of navigation wheel 416 to select to zoom into section 406. FIG. 4B is a diagram illustrating media device 40 after the user selects to zoom into section 406. On selection of the zoom feature, the navigation application renders shading over each of the sections on display 41 except section 406 to indicate to the user that the area of interest is section 406. The navigation application then animates the zooming process by making section 406 larger until it fits within display 41. During this animation, the shaded regions of sections 401-405 and 407-415, along with those sections, are moved off of the visible region of display 41. FIG. 4C is a diagram illustrating media device 40 when the zooming function has been completed. On completion, the shading is no longer present, because all of the shaded regions have been moved off of the visible area of display 400, and a new set of grid lines is overlaid on the content of section 406, thus creating new sections 419-427. Again, the new sections 419-427 may be of varying size depending on their respective underlying display data concentrations.
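The high-data versus low-data classification discussed above can be approximated with a simple statistic: a region whose pixel intensities vary widely scores high, while a near-uniform region scores low. Plain intensity variance is used here as a hedged stand-in for the richer color-variation and edge analysis the text mentions; the function names and threshold are illustrative.

```python
def region_variance(pixels) -> float:
    """Variance of pixel intensities in a region (0 = perfectly flat)."""
    values = [p for row in pixels for p in row]
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def is_high_data(pixels, threshold: float = 100.0) -> bool:
    """Classify a region as high-data when its variance exceeds threshold.

    High-data regions would receive finer grid sections; low-data
    regions keep coarse sections, per the scheme described above.
    """
    return region_variance(pixels) > threshold
```

A flat block of identical pixels classifies as low-data, while a checkered block of extreme values classifies as high-data and would be subdivided further.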
FIG. 5 is a flowchart illustrating an exemplary operational flow of a navigation application according to one embodiment of the present invention. In operational block 500, a first set of visual indicators is rendered over a graphic displayed on an electronic display, where the visual indicators visually divide the graphic into a plurality of sections. In certain embodiments, the electronic display is a small-screen display on which the graphic is displayed, such as is commonly included in such electronic devices as mobile telephones, PDAs, digital cameras, portable media players, video cameras, portable gaming devices, etc. Thus, in many cases, the graphic being displayed is a subpart of a larger graphic that is too large to fit completely on the display. - A signal is received, in
step 501, to execute a navigation function. Such a navigation function may comprise a panning function, zoom-in function, or zoom-out function, as examples. The signal may be received by the navigation application in response to user input to an electronic device indicating a desired navigation function to be initiated to navigate to a desired display of the graphic data. - A section of interest is determined from the plurality of sections, in
step 502, according to the selected navigation function. In step 503, a second set of visual indicators is layered over the parts of the graphic defined by each unselected section. In step 504, movement of the graphic and both sets of visual indicators is animated on the device display according to the particular navigation function selected. - When implemented via computer-executable instructions, various elements of embodiments of the present invention are in essence the software code defining the operations of such various elements. The executable instructions or software code may be obtained from a readable medium (e.g., hard drive media, optical media, EPROM, EEPROM, tape media, cartridge media, flash memory, ROM, memory stick, and/or the like) or communicated via a data signal from a communication medium (e.g., the Internet). In fact, readable media can include any medium that can store or transfer information.
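The operational flow of FIG. 5 (blocks 500 through 504) can be traced with a small illustrative function; the string trace below stands in for actual rendering and animation, which the patent leaves to the implementation, and all names are assumptions.

```python
def navigation_flow(sections, selected, function):
    """Trace blocks 500-504 of FIG. 5 for one navigation operation."""
    steps = []
    # Block 500: render the first set of visual indicators (the grid).
    steps.append(f"render grid over graphic: {len(sections)} sections")
    # Block 501: receive the signal naming the navigation function.
    steps.append(f"received signal: {function}")
    # Block 502: determine the section of interest.
    steps.append(f"section of interest: {selected}")
    # Block 503: layer the second set of indicators (shading) over the rest.
    shaded = [s for s in sections if s != selected]
    steps.append(f"shade {len(shaded)} unselected sections")
    # Block 504: animate graphic and both indicator sets per the function.
    steps.append(f"animate {function}")
    return steps
```

For a nine-section grid with section "4" selected and a zoom-in requested, the trace shows eight sections shaded before the animation step, matching the behavior shown in FIGS. 2A-2D.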
FIG. 6 illustrates an exemplary computer system 600 on which the navigation application may be implemented according to one embodiment of the present invention. Central processing unit (CPU) 601 is coupled to system bus 602. CPU 601 may be any general-purpose CPU. The present invention is not restricted by the architecture of CPU 601 (or other components of exemplary system 600) as long as CPU 601 (and other components of system 600) supports the inventive operations as described herein. CPU 601 may execute the various logical instructions according to embodiments of the present invention. For example, CPU 601 may execute machine-level instructions according to the exemplary operational flow described above in conjunction with FIG. 5.
Computer system 600 also preferably includes random access memory (RAM) 603, which may be SRAM, DRAM, SDRAM, or the like. Computer system 600 preferably includes read-only memory (ROM) 604, which may be PROM, EPROM, EEPROM, or the like. RAM 603 and ROM 604 hold user and system data and programs, as is well known in the art.
Computer system 600 also preferably includes input/output (I/O) adapter 605, communications adapter 611, user interface adapter 608, and display adapter 609. I/O adapter 605, user interface adapter 608, and/or communications adapter 611 may, in certain embodiments, enable a user to interact with computer system 600 in order to input information, such as to indicate a desired navigation function to be performed for navigating through display data. - I/
O adapter 605 preferably connects storage device(s) 606, such as one or more of a hard drive, compact disc (CD) drive, floppy disk drive, tape drive, etc., to computer system 600. The I/O adapter 605 is also connected to a printer (not shown), which would allow the system to print paper copies of information such as documents, photographs, articles, and the like. Note that the printer may be a printer (e.g., dot matrix, laser, and the like), a fax machine, a scanner, or a copier machine. The storage devices may be utilized when RAM 603 is insufficient for the memory requirements associated with storing data for operations of the navigation application. Communications adapter 611 is preferably adapted to couple computer system 600 to network 612, which may enable information to be input to and/or output from system 600 via such network 612 (e.g., the Internet or other wide-area network, a local-area network, a public or private switched telephony network, a wireless network, or any combination of the foregoing). For instance, an application generating display data may execute remote from computer system 600, and such display data may be input to system 600 via network 612 from a remote computer, and/or navigation commands may be output and communicated via network 612 to a remote computer. User interface adapter 608 couples user input devices, such as keyboard 613 and pointing device 607, to computer system 600. Display adapter 609 is driven by CPU 601 to control the display on display device 610 to, for example, display the underlying data and navigation indicators (e.g., grid lines) provided by the navigation application according to certain embodiments of the present invention. - It shall be appreciated that the present invention is not limited to the architecture of
system 600. For example, any suitable processor-based device may be utilized for implementing the exemplary embodiments of the navigation application described above, including without limitation personal computers, laptop computers, computer workstations, multi-processor servers, mobile telephones, PDAs, portable media players, etc. Moreover, embodiments of the present invention may be implemented on application-specific integrated circuits (ASICs) or very large scale integrated (VLSI) circuits. In fact, persons of ordinary skill in the art may utilize any number of suitable structures capable of executing logical operations according to the embodiments of the present invention. - Although the present invention and its advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims. Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the disclosure of the present invention, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present invention. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.
Claims (50)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/684,482 US7750825B2 (en) | 2007-03-09 | 2007-03-09 | System and method for navigation of display data |
PCT/US2008/053950 WO2008112383A2 (en) | 2007-03-09 | 2008-02-14 | System and method for navigation of display data |
CN2008800075778A CN101652741B (en) | 2007-03-09 | 2008-02-14 | System and method for navigation of display data |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/684,482 US7750825B2 (en) | 2007-03-09 | 2007-03-09 | System and method for navigation of display data |
Publications (2)
Publication Number | Publication Date |
---|---|
US20080218523A1 (en) | 2008-09-11 |
US7750825B2 US7750825B2 (en) | 2010-07-06 |
Family
ID=39741182
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/684,482 Active 2028-06-23 US7750825B2 (en) | 2007-03-09 | 2007-03-09 | System and method for navigation of display data |
Country Status (3)
Country | Link |
---|---|
US (1) | US7750825B2 (en) |
CN (1) | CN101652741B (en) |
WO (1) | WO2008112383A2 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080231642A1 (en) * | 2004-10-27 | 2008-09-25 | Hewlett-Packard Development Company, L.P. | Data Distribution System and Method Therefor |
US20090009535A1 (en) * | 2007-07-02 | 2009-01-08 | Taro Iwamoto | Display processing device and display control method |
US20090315917A1 (en) * | 2008-06-19 | 2009-12-24 | Fuji Xerox Co., Ltd. | Information display apparatus, information displaying method, and computer readable medium |
US20100097399A1 (en) * | 2008-10-20 | 2010-04-22 | Research In Motion Limited | Method and system for rendering of labels |
US20100225649A1 (en) * | 2009-03-06 | 2010-09-09 | Casio Computer Co., Ltd. | Graph display control apparatus and graph display control method |
US20120054663A1 (en) * | 2010-08-24 | 2012-03-01 | Lg Electronics Inc. | Mobile terminal and method of setting an application indicator therein |
US8552992B1 (en) * | 2008-06-30 | 2013-10-08 | Amazon Technologies, Inc. | Systems and methods for textual input using multi-directional input devices |
US20140078184A1 (en) * | 2012-09-14 | 2014-03-20 | Canon Kabushiki Kaisha | Apparatus, method, and program |
US20150141042A1 (en) * | 2012-06-27 | 2015-05-21 | Ntt Docomo, Inc. | Mobile terminal, system and method |
US20150212712A1 (en) * | 2004-03-02 | 2015-07-30 | Microsoft Corporation | Advanced navigation techniques for portable devices |
US9170111B2 (en) * | 2010-12-07 | 2015-10-27 | Tomtom International B.V. | Mapping or navigation apparatus and method of operation thereof |
US20160224226A1 (en) * | 2010-12-01 | 2016-08-04 | Sony Corporation | Display processing apparatus for performing image magnification based on face detection |
JP2016212431A (en) * | 2016-07-14 | 2016-12-15 | キヤノン株式会社 | Display control device, display control method, and program |
US20170223176A1 (en) * | 2007-06-29 | 2017-08-03 | Apple Inc. | Portable multifunction device with animated user interface transitions |
US20200004842A1 (en) * | 2018-06-27 | 2020-01-02 | Uber Technologies, Inc. | Visual search system for finding trip destination |
US10620780B2 (en) | 2007-09-04 | 2020-04-14 | Apple Inc. | Editing interface |
US10852914B2 (en) | 2010-12-20 | 2020-12-01 | Apple Inc. | Device, method, and graphical user interface for navigation of concurrently open software applications |
US11016643B2 (en) | 2019-04-15 | 2021-05-25 | Apple Inc. | Movement of user interface object with user-specified content |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
US20220351444A1 (en) * | 2019-09-24 | 2022-11-03 | XVI Inc. | Animation production method |
US11907497B2 (en) * | 2005-11-07 | 2024-02-20 | Google Llc | Multiple views of a geographic area on a mobile device |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090244095A1 (en) * | 2008-04-01 | 2009-10-01 | Research In Motion Limited | Run-time label cache for efficient map labeling |
KR20100065744A (en) * | 2008-12-08 | 2010-06-17 | 엔에이치엔(주) | Method and apparatus for transcoding web page to be suitable for mobile device |
JP5652097B2 (en) * | 2010-10-01 | 2015-01-14 | ソニー株式会社 | Image processing apparatus, program, and image processing method |
CN102157005B (en) * | 2011-04-19 | 2012-11-21 | 无锡永中软件有限公司 | Document view drawing method applied to small-screen equipment |
US11069066B2 (en) | 2019-08-21 | 2021-07-20 | Adobe Inc. | Dynamically change tracker speed, switch crop rectangles, and display invisible corners via zoom-loupes |
US10901589B1 (en) | 2019-08-21 | 2021-01-26 | Adobe Inc. | Automatic zoom-loupe creation, selection, layout, and rendering based on interaction with crop rectangle |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4630065A (en) * | 1982-05-17 | 1986-12-16 | Honda Giken Kogyo Kabushiki Kaisha | Current location indication apparatus for use in an automotive vehicle |
US5945985A (en) * | 1992-10-27 | 1999-08-31 | Technology International, Inc. | Information system for interactive access to geographic information |
US6460000B1 (en) * | 1999-07-12 | 2002-10-01 | Anritsu Corporation | Measurement data display apparatus |
US6476831B1 (en) * | 2000-02-11 | 2002-11-05 | International Business Machine Corporation | Visual scrolling feedback and method of achieving the same |
US6538670B1 (en) * | 1999-01-25 | 2003-03-25 | Sanyo Electric Company, Ltd. | Pointing method |
US6559868B2 (en) * | 1998-03-05 | 2003-05-06 | Agilent Technologies, Inc. | Graphically relating a magnified view to a simultaneously displayed main view in a signal measurement system |
US20030189553A1 (en) * | 2000-06-13 | 2003-10-09 | Michael Goren | Rapid entry of data and information on a reduced size input area |
US6750886B1 (en) * | 2000-01-26 | 2004-06-15 | Donald B. Bergstedt | Method and software for displaying information on a display area of a screen of an electronic device |
US6956590B1 (en) * | 2001-02-28 | 2005-10-18 | Navteq North America, Llc | Method of providing visual continuity when panning and zooming with a map display |
US7031728B2 (en) * | 2004-09-21 | 2006-04-18 | Beyer Jr Malcolm K | Cellular phone/PDA communication system |
US20060161868A1 (en) * | 2005-01-19 | 2006-07-20 | Microsoft Corporation | Dynamic stacking and expansion of visual items |
US7088266B2 (en) * | 2002-10-18 | 2006-08-08 | Nissan Motor Co., Ltd. | Map image display device |
US20060195801A1 (en) * | 2005-02-28 | 2006-08-31 | Ryuichi Iwamura | User interface with thin display device |
US20060224315A1 (en) * | 2005-03-31 | 2006-10-05 | Denso Corporation | Map display device |
US7296232B1 (en) * | 2002-04-01 | 2007-11-13 | Microsoft Corporation | Calendar control for selection of time periods to filter data |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7317449B2 (en) | 2004-03-02 | 2008-01-08 | Microsoft Corporation | Key-based advanced navigation techniques |
-
2007
- 2007-03-09 US US11/684,482 patent/US7750825B2/en active Active
-
2008
- 2008-02-14 CN CN2008800075778A patent/CN101652741B/en active Active
- 2008-02-14 WO PCT/US2008/053950 patent/WO2008112383A2/en active Application Filing
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150212712A1 (en) * | 2004-03-02 | 2015-07-30 | Microsoft Corporation | Advanced navigation techniques for portable devices |
US20080231642A1 (en) * | 2004-10-27 | 2008-09-25 | Hewlett-Packard Development Company, L.P. | Data Distribution System and Method Therefor |
US8184128B2 (en) * | 2004-10-27 | 2012-05-22 | Hewlett-Packard Development Company, L. P. | Data distribution system and method therefor |
US11907497B2 (en) * | 2005-11-07 | 2024-02-20 | Google Llc | Multiple views of a geographic area on a mobile device |
US10761691B2 (en) * | 2007-06-29 | 2020-09-01 | Apple Inc. | Portable multifunction device with animated user interface transitions |
US20170223176A1 (en) * | 2007-06-29 | 2017-08-03 | Apple Inc. | Portable multifunction device with animated user interface transitions |
US20200363919A1 (en) * | 2007-06-29 | 2020-11-19 | Apple Inc. | Portable multifunction device with animated user interface transitions |
US11507255B2 (en) * | 2007-06-29 | 2022-11-22 | Apple Inc. | Portable multifunction device with animated sliding user interface transitions |
US20090009535A1 (en) * | 2007-07-02 | 2009-01-08 | Taro Iwamoto | Display processing device and display control method |
US8743150B2 (en) * | 2007-07-02 | 2014-06-03 | Alpine Electronics, Inc. | Display processing device and display control method |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
US10620780B2 (en) | 2007-09-04 | 2020-04-14 | Apple Inc. | Editing interface |
US11010017B2 (en) | 2007-09-04 | 2021-05-18 | Apple Inc. | Editing interface |
US11861138B2 (en) | 2007-09-04 | 2024-01-02 | Apple Inc. | Application menu user interface |
US11604559B2 (en) | 2007-09-04 | 2023-03-14 | Apple Inc. | Editing interface |
US8576252B2 (en) * | 2008-06-19 | 2013-11-05 | Fuji Xerox Co., Ltd. | Information display apparatus, information displaying method, and computer readable medium |
US20090315917A1 (en) * | 2008-06-19 | 2009-12-24 | Fuji Xerox Co., Ltd. | Information display apparatus, information displaying method, and computer readable medium |
US8552992B1 (en) * | 2008-06-30 | 2013-10-08 | Amazon Technologies, Inc. | Systems and methods for textual input using multi-directional input devices |
US8400478B2 (en) * | 2008-10-20 | 2013-03-19 | Research In Motion Limited | Method and system for rendering of labels |
US20100097399A1 (en) * | 2008-10-20 | 2010-04-22 | Research In Motion Limited | Method and system for rendering of labels |
US8624930B2 (en) | 2008-10-20 | 2014-01-07 | Blackberry Limitied | Method and system for rendering of labels |
US8542254B2 (en) * | 2009-03-06 | 2013-09-24 | Casio Computer Co., Ltd. | Graph display control apparatus and graph display control method |
US20100225649A1 (en) * | 2009-03-06 | 2010-09-09 | Casio Computer Co., Ltd. | Graph display control apparatus and graph display control method |
US9052927B2 (en) * | 2010-08-24 | 2015-06-09 | Lg Electronics Inc. | Mobile terminal and method of setting an application indicator therein |
US20120054663A1 (en) * | 2010-08-24 | 2012-03-01 | Lg Electronics Inc. | Mobile terminal and method of setting an application indicator therein |
US20160224226A1 (en) * | 2010-12-01 | 2016-08-04 | Sony Corporation | Display processing apparatus for performing image magnification based on face detection |
US10642462B2 (en) * | 2010-12-01 | 2020-05-05 | Sony Corporation | Display processing apparatus for performing image magnification based on touch input and drag input |
US9170111B2 (en) * | 2010-12-07 | 2015-10-27 | Tomtom International B.V. | Mapping or navigation apparatus and method of operation thereof |
US11487404B2 (en) | 2010-12-20 | 2022-11-01 | Apple Inc. | Device, method, and graphical user interface for navigation of concurrently open software applications |
US10852914B2 (en) | 2010-12-20 | 2020-12-01 | Apple Inc. | Device, method, and graphical user interface for navigation of concurrently open software applications |
US11880550B2 (en) | 2010-12-20 | 2024-01-23 | Apple Inc. | Device, method, and graphical user interface for navigation of concurrently open software applications |
US20150141042A1 (en) * | 2012-06-27 | 2015-05-21 | Ntt Docomo, Inc. | Mobile terminal, system and method |
US20180165795A1 (en) * | 2012-09-14 | 2018-06-14 | Canon Kabushiki Kaisha | Apparatus, method, and program |
US10672105B2 (en) | 2012-09-14 | 2020-06-02 | Canon Kabushiki Kaisha | Display control apparatus changing size of plurality of objects displayed on display device, control method of the display control apparatus, and computer executable instructions for causing a computer to execute the control method |
US20140078184A1 (en) * | 2012-09-14 | 2014-03-20 | Canon Kabushiki Kaisha | Apparatus, method, and program |
JP2014059369A (en) * | 2012-09-14 | 2014-04-03 | Canon Inc | Display control device, display control method, and program |
JP2016212431A (en) * | 2016-07-14 | 2016-12-15 | キヤノン株式会社 | Display control device, display control method, and program |
US11507606B2 (en) | 2018-06-27 | 2022-11-22 | Uber Technologies, Inc. | Visual search system for finding trip destination |
US10990615B2 (en) * | 2018-06-27 | 2021-04-27 | Uber Technologies, Inc. | Visual search system for finding trip destination |
US20200004842A1 (en) * | 2018-06-27 | 2020-01-02 | Uber Technologies, Inc. | Visual search system for finding trip destination |
US11016643B2 (en) | 2019-04-15 | 2021-05-25 | Apple Inc. | Movement of user interface object with user-specified content |
US20220351444A1 (en) * | 2019-09-24 | 2022-11-03 | XVI Inc. | Animation production method |
Also Published As
Publication number | Publication date |
---|---|
CN101652741A (en) | 2010-02-17 |
CN101652741B (en) | 2011-11-09 |
US7750825B2 (en) | 2010-07-06 |
WO2008112383A3 (en) | 2009-06-04 |
WO2008112383A2 (en) | 2008-09-18 |
Similar Documents
Publication | Title
---|---
US7750825B2 (en) | System and method for navigation of display data
US9026938B2 (en) | Dynamic detail-in-context user interface for application access and content access on electronic displays
JP6170972B2 (en) | Method and computer-readable recording medium for gallery application for content display
US7317449B2 (en) | Key-based advanced navigation techniques
Robbins et al. | ZoneZoom: map navigation for smartphones with recursive view segmentation
US7327349B2 (en) | Advanced navigation techniques for portable devices
US20180024719A1 (en) | User interface systems and methods for manipulating and viewing digital documents
US7260789B2 (en) | Method of real-time incremental zooming
US11567624B2 (en) | Techniques to modify content and view content on mobile devices
US8635549B2 (en) | Directly assigning desktop backgrounds
US7213214B2 (en) | Graphical user interface with zoom for detail-in-context presentations
Elmqvist et al. | Melange: space folding for multi-focus interaction
JP2000512787A (en) | Method and apparatus for visualizing a hierarchical information structure on a two-dimensional screen based on nodes interconnected at an edge through fish-eye representation of nodes
US8640055B1 (en) | Condensing hierarchies in user interfaces
US8566359B1 (en) | Unfolding sparse data sets
GB2504085A (en) | Displaying maps and data sets on image display interfaces
JP2020507174A (en) | How to navigate the panel of displayed content
CN114327208B (en) | Legend display method and device, storage medium and terminal
JP2004506995A (en) | Enlarging and editing parts of an image in the context of the image
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: ADOBE SYSTEMS INCORPORATED, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ZUVERINK, DAVID;REEL/FRAME:018991/0165. Effective date: 20070306
STCF | Information on status: patent grant | Free format text: PATENTED CASE
FPAY | Fee payment | Year of fee payment: 4
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552). Year of fee payment: 8
AS | Assignment | Owner name: ADOBE INC., CALIFORNIA. Free format text: CHANGE OF NAME;ASSIGNOR:ADOBE SYSTEMS INCORPORATED;REEL/FRAME:048525/0042. Effective date: 20181008
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 12