Publication number: US 20060195259 A1
Publication type: Application
Application number: US 11/276,555
Publication date: 31 Aug 2006
Filing date: 6 Mar 2006
Priority date: 26 Feb 2003
Also published as: DE602004005427D1, DE602004005427T2, DE602004006733D1, DE602004006733T2, DE602004010084D1, DE602004010084T2, DE602004013732D1, EP1599702A1, EP1599702B1, EP1599702B2, EP1599703A1, EP1599703B1, EP1608935A1, EP1608935B1, EP1611416A1, EP1611416B1, EP1811269A2, EP1811269A3, EP2264405A2, EP2264405A3, EP2264405B1, US7606663, US7737951, US7769540, US7925437, US8019531, US8620584, US9367239, US20060173615, US20060192769, US20070005233, US20070103445, US20070150173, US20070150179, US20090326803, US20110144904, WO2004076976A1, WO2004076977A1, WO2004076978A1, WO2004076979A1
Inventors: Ayal Pinkus, Edwin Neef, Sven-Erik Jurgens, Mark Gretton
Original Assignee: TomTom B.V.
Navigation Device with Touch Screen : Waypoints
US 20060195259 A1
Abstract
A navigation device programmed with a map database and software that enables a route to be planned between two user-defined places. The device may also be programmed to display on a touch sensitive display a main navigation mode screen showing a map and to allow a user to set a desired location as a location to be stored in device memory by touching the screen, for example at the desired location shown on the map. The navigation device is especially advantageous for an in-car navigation device since it allows the user to easily and reliably input the current location as a waypoint, that is, a reference point for future navigation, even while the device is mounted in a vehicle.
Claims (16)
1. A navigation device programmed with a map database and software that enables a route to be planned between two user-defined places, wherein the device is further programmed to be able to display on a touch sensitive display a main navigation mode screen showing a map and to allow a user to set a desired location as a location to be stored in device memory by touching the screen within a zone large enough to be reliably selected by a single finger.
2. The device of claim 1 programmed to enable the user to set the current location as the location to be stored in device memory by touching the screen at the current location as shown on the map.
3. The device of claim 1 programmed to enable the user to set the current location as a waypoint by touching the screen at the current location as shown on the map.
4. The device of claim 3 in which the device is programmed so that touching the screen once or twice stores the desired location in device memory.
5. The device of claim 4 programmed so that a location stored in device memory by the user touching the screen is marked on the map with an icon.
6. The device of claim 5 in which the icon is a Point of Interest icon.
7. The device of claim 5 programmed so that the icon label can be annotated.
8. The device of claim 1 wherein said zone large enough to be reliably selected by a single finger is a square zone having an area of at least 0.7 cm2.
9. A method of enabling a user to interact with a navigation device programmed with a map database and software that enables a route to be planned between two user-defined places, wherein the method comprises the steps of:
(a) displaying on a touch sensitive display a main navigation mode screen showing a map;
(b) allowing a user to set a desired location as a location to be stored in device memory by touching the screen within a zone large enough to be reliably selected by a single finger.
10. The method of claim 9 in which the user sets the current location as the location to be stored in device memory by touching the screen at the current location as shown on the map.
11. The method of claim 9 in which the user sets the current location as a waypoint by touching the screen at the current location as shown on the map.
12. The method of claim 10 in which the user sets the current location as a waypoint by touching the screen at the current location as shown on the map.
13. The method of claim 9 in which the action of touching the screen once or twice stores the desired location in device memory.
14. The method of claim 9 further comprising the steps of (c) storing a location in device memory by the user touching the screen at the desired location and (d) marking that location on the map with an icon.
15. The method of claim 14 in which the icon is a Point of Interest icon.
16. The method of claim 13 comprising the further step of annotating the icon with a label.
Description
    CROSS REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application is a continuation of, and claims the priority of, pending U.S. application Ser. No. 10/546,741 filed on Aug. 25, 2005 entitled “Navigation Device with Touch Screen”, the contents of which are hereby incorporated by reference. The parent application claims the priority of PCT Application No. PCT/GB2004/000803 filed on Feb. 26, 2004; GB Patent Application No. 0304358.5 filed on Feb. 26, 2003; and GB Patent Application No. 0305175.2 filed on Mar. 7, 2003, the entire contents of all of which are hereby incorporated by reference.
  • FIELD OF THE INVENTION
  • [0002]
    This invention relates to touch screen controlled navigation devices that can display navigation data, and in particular to in-car navigation devices.
  • BACKGROUND OF THE INVENTION
  • [0003]
    GPS based devices are well known and are widely employed as in-car navigation systems. Reference may be made to the Navigator series software from the present assignee, TomTom B. V. This is software that, when running on a PDA (such as a Compaq iPaq) connected to an external GPS receiver, enables a user to input to the PDA a start and destination address. The software then calculates the best route between the two end-points and displays instructions on how to navigate that route. By using the positional information derived from the GPS receiver, the software can determine at regular intervals the position of the PDA (typically mounted on the dashboard of a vehicle) and can display the current position of the vehicle on a map and display (and speak) appropriate navigation instructions (e.g. ‘turn left in 100 m’). Graphics depicting the actions to be accomplished (e.g. a left arrow indicating a left turn ahead) can be displayed in a status bar and also be superimposed over the applicable junctions/turnings etc in the roads shown in the map itself. Reference may also be made to devices that integrate a GPS receiver into a computing device programmed with a map database and that can generate navigation instructions on a display. The term ‘navigation device’ refers to a device that enables a user to navigate to a pre-defined destination. The device may have an internal system for receiving location data, such as a GPS receiver, or may merely be connectable to a receiver that can receive location data.
  • [0004]
    PDAs often employ touch screens to enable a user to select menu options or enter text/numbers using a virtual keyboard. Generally, touch input is meant to occur using a thin stylus since the size of individual virtual keys or other selectable items is relatively small. When navigating from a screen relating to one function or type of functions in an application to a different function or type of functions, the presumption is that stylus selection of virtual keys, control panels, check boxes etc. will be undertaken, since the related touch control zones are relatively small.
  • [0005]
    However, with some individual applications, such as a calculator application, each numeric key may be large enough to be selectable using a finger, as opposed to the stylus. Where a large number of keys needs to be displayed at the same time (e.g. for a QWERTY or other format virtual keyboard with all alphabet letters), a far smaller virtual keyboard has to be used, and individual keys then have to be selected with the stylus. Hence, prior art devices may mix large, numeric keys available on one screen with much smaller keys on a different screen, even though the keys are of equal importance. Core functions cannot be said to be uniformly and consistently designed for effective and reliable finger operation, because the assumption is that users will operate a stylus on most occasions.
  • SUMMARY OF THE INVENTION
  • [0006]
    In a first aspect of this invention, there is a navigation device programmed with a map database and software that enables a route to be planned between two user-defined places.
  • [0007]
    The device may be further programmed to be able to display on a touch sensitive display a main navigation mode screen showing a map and to allow a user to set a desired location as a location to be stored in device memory by touching the screen within a zone large enough to be reliably selected by a single finger.
  • [0008]
    This is especially advantageous for an in-car navigation device since it allows the user to easily and reliably input the current location as a waypoint, that is, a reference point for future navigation, even whilst the device is mounted in a vehicle.
  • [0009]
    These and other features of the invention will be more fully understood by reference to the following drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0010]
    FIG. 1 is a screen shot from a navigation device implementing the present invention; the screen shot shows a plan map view and a status bar running along the bottom of the display;
  • [0011]
    FIG. 2 is a screen shot from the navigation device implementing a 3-D view;
  • [0012]
    FIG. 3 is a screen shot from the navigation device showing various route planning functions that enable a user to require the device to plot a new route to the destination that (i) is an alternative route; (ii) avoids a roadblock immediately ahead; (iii) avoids predefined roads; or (iv) is a reversion to the original route;
  • [0013]
    FIG. 4 is a screen shot from the navigation device showing a virtual ABCD format keyboard.
  • DETAILED DESCRIPTION
  • [0014]
    The present invention may be implemented in software from TomTom B. V. called Navigator. Navigator software runs on a touch screen (i.e. stylus controlled) Pocket PC powered PDA device, such as the Compaq iPaq. It provides a GPS based navigation system when the PDA is coupled with a GPS receiver. The combined PDA and GPS receiver system is designed to be used as an in-vehicle navigation system. The invention may also be implemented in any other arrangement of navigation device, such as one with an integral GPS receiver/computer/display, or a device designed for non-vehicle use (e.g. for walkers) or vehicles other than cars (e.g. aircraft). The navigation device may implement any kind of position sensing technology and is not limited to GPS; it can hence be implemented using other kinds of GNSS (global navigation satellite system) such as the European Galileo system. Equally, it is not limited to satellite based location/velocity systems but can equally be deployed using ground-based beacons or any other kind of system that enables the device to determine its geographic location.
  • [0015]
    Navigator software, when running on a PDA, results in a navigation device that causes the normal navigation mode screen shown in FIG. 1 to be displayed. This view provides driving instructions using a combination of text, symbols, voice guidance and a moving map. Key user interface elements are the following: a 2-D map 1 occupies most of the screen. The map shows the user's car and its immediate surroundings, rotated in such a way that the direction in which the car is moving is always “up”. Running across the bottom quarter of the screen is the status bar 2. The current location of the device, as determined by the device itself using conventional GPS location finding, and its orientation (as inferred from its direction of travel) are depicted by an arrow 3. The route calculated by the device (using route calculation algorithms stored in device memory as applied to map data stored in a map database in device memory) is shown as darkened path 4 superimposed with arrows giving the travel direction. On the darkened path 4, all major actions (e.g. turning corners, crossroads, roundabouts etc.) are schematically depicted by arrows 5 overlaying the path 4. The status bar 2 also includes at its left hand side a schematic 6 depicting the next action (here, a right turn). The status bar 2 also shows the distance to the next action (i.e. the right turn, here 220 meters), as extracted from a database of the entire route calculated by the device (i.e. a list of all roads and related actions defining the route to be taken). Status bar 2 also shows the name of the current road 8, the estimated time before arrival 9 (here 2 minutes and 40 seconds), the actual estimated arrival time 10 (11.36 am) and the distance to the destination 11 (1.4 km). The GPS signal strength is shown in a mobile-phone style signal strength indicator 12.
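    The status bar figures above (distance to the next action, distance to the destination, estimated time before arrival and estimated arrival time) can all be derived from the stored route. The following is a minimal sketch of that derivation, not code from the Navigator product; the Segment structure, the field names and the constant-speed assumption for the ETA are illustrative assumptions only.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class Segment:
    road_name: str
    length_m: float            # length of this stretch of road
    action: Optional[str]      # action at the end of the stretch, e.g. "turn right", or None

def status_bar_fields(remaining_route: List[Segment], avg_speed_mps: float, now: datetime) -> dict:
    """Derive the status bar figures from the remaining segments of the calculated route."""
    dist_to_action, next_action = 0.0, None
    for seg in remaining_route:
        dist_to_action += seg.length_m
        if seg.action:                       # first segment that ends in an action
            next_action = seg.action
            break
    dist_to_destination = sum(seg.length_m for seg in remaining_route)
    time_left = timedelta(seconds=dist_to_destination / avg_speed_mps)
    return {
        "next_action": next_action,                        # e.g. "turn right"
        "distance_to_action_m": dist_to_action,            # e.g. 220
        "distance_to_destination_m": dist_to_destination,  # e.g. 1400
        "time_before_arrival": time_left,                  # e.g. 0:02:40
        "estimated_arrival": now + time_left,              # e.g. 11.36 am
    }
```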
  • [0016]
    If the user touches the centre of the screen 13, then a navigation screen menu is displayed; from this menu, other core navigation functions within the Navigator application can be initiated or controlled. Allowing core navigation functions to be selected from a menu screen that is itself very readily called up (e.g. one step away from the map display to the menu screen) greatly simplifies the user interaction and makes it faster and easier.
  • [0017]
    The area of the touch zone which needs to be touched by a user is far larger than in most stylus based touch screen systems. It is designed to be large enough to be reliably selected by a single finger without special accuracy, i.e. to mimic the real-life conditions for a driver when controlling a vehicle: he or she will have little time to look at a highly detailed screen with small control icons, and still less time to accurately press one of those small control icons. Hence, using a very large touch screen area associated with a given soft key (or hidden soft key, as in the centre of the screen 13) is a deliberate design feature of this implementation. Unlike other stylus based applications, this design feature is consistently deployed throughout Navigator to select core functions that are likely to be needed by a driver whilst actually driving. Hence, whenever the user is given the choice of selecting on-screen icons (e.g. control icons, or keys of a virtual keyboard to enter a destination address), the design of those icons/keys is kept simple and the associated touch screen zones are expanded to such a size that each icon/key can unambiguously be finger selected. In practice, the associated touch screen zone will be of the order of at least 0.7 cm2 and will typically be a square zone. In normal navigation mode, the device displays a map. Touching the map (i.e. the touch sensitive display) once (or twice in a different implementation) near to the screen centre (or any part of the screen in another implementation) will then call up a navigation menu (see FIG. 3) with large icons corresponding to various navigation functions, such as the option to calculate an alternative route; re-calculate the route so as to avoid the next section of road (useful when faced with an obstruction or heavy congestion); or recalculate the route so as to avoid specific, listed roads.
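    As a rough illustration of the zone sizing, the 0.7 cm2 minimum can be converted to pixels once the screen density is known, and the hidden soft key at the screen centre is then a simple hit test against that square. This is a hedged sketch, not the Navigator implementation; the function names and the 48 pixels-per-cm figure (taken from the keyboard dimensions given later) are assumptions used only for illustration.

```python
import math

def zone_side_pixels(area_cm2: float, pixels_per_cm: float) -> int:
    """Side length, in pixels, of a square touch zone with the given physical area."""
    return max(1, round(math.sqrt(area_cm2) * pixels_per_cm))

def in_centre_zone(x: int, y: int, screen_w: int, screen_h: int, side_px: int) -> bool:
    """Hit test for a hidden soft key centred on the screen (the map-touch menu zone)."""
    cx, cy = screen_w // 2, screen_h // 2
    half = side_px // 2
    return abs(x - cx) <= half and abs(y - cy) <= half

# A 0.7 cm^2 square on a 240x320 display at about 48 pixels per cm is ~40 px on a side,
# comfortably larger than a typical stylus-sized control.
side = zone_side_pixels(0.7, 48)
print(side, in_centre_zone(120, 160, 240, 320, side))   # a tap at the centre registers
```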
  • [0018]
    The actual physical structure of the device itself may be fundamentally no different from any conventional handheld computer, other than the integral GPS receiver or a GPS data feed from an external GPS receiver. Hence, memory stores the route calculation algorithms, map database and user interface software; a microprocessor interprets and processes user input (e.g. using a device touch screen to input the start and destination addresses and all other control inputs) and deploys the route calculation algorithms to calculate the optimal route. ‘Optimal’ may refer to criteria such as shortest time or shortest distance, or some other user-related factors.
  • [0019]
    More specifically, the user inputs his start position and required destination in the normal manner into the Navigator software running on the PDA using a virtual keyboard. The user then selects the manner in which a travel route is calculated: various modes are offered, such as a ‘fast’ mode that calculates the route very rapidly, but the route might not be the shortest; a ‘full’ mode that looks at all possible routes and locates the shortest, but takes longer to calculate, etc. Other options are possible, with a user defining a route that is scenic, e.g. one that passes the most POIs (points of interest) marked as views of outstanding beauty, or passes the most POIs of possible interest to children, or uses the fewest junctions, etc.
  • [0020]
    Roads themselves are described in the map database that is part of Navigator (or is otherwise accessed by it) running on the PDA as lines, i.e. vectors (e.g. start point, end point and direction for a road, with an entire road being made up of many hundreds of such sections, each uniquely defined by start point/end point/direction parameters). A map is then a set of such road vectors, plus points of interest (POIs), plus road names, plus other geographic features like park boundaries, river boundaries etc., all of which are defined in terms of vectors. All map features (e.g. road vectors, POIs etc.) are defined in a co-ordinate system that corresponds or relates to the GPS co-ordinate system, enabling a device's position as determined through a GPS system to be located onto the relevant road shown in a map.
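    To make the vector map model concrete, the sketch below shows one plausible representation of a road section and a simple nearest-segment lookup for locating a GPS fix onto a road. It is illustrative only: the RoadVector structure and the flat-earth distance approximation are assumptions for the example, not the format actually used by the map database.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]            # (latitude, longitude), in the GPS co-ordinate system

@dataclass
class RoadVector:
    start: Point                       # start point of the road section
    end: Point                         # end point of the road section
    road_name: str

def nearest_road(position: Point, roads: List[RoadVector]) -> RoadVector:
    """Snap a GPS fix to the closest road section (flat-earth approximation)."""
    def dist_to_segment(p: Point, a: Point, b: Point) -> float:
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return min(roads, key=lambda r: dist_to_segment(position, r.start, r.end))
```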
  • [0021]
    Route calculation uses complex algorithms that are part of the Navigator software. The algorithms are applied to score large numbers of potential different routes. The Navigator software then evaluates them against the user defined criteria (or device defaults), such as a full mode scan, with scenic route, past museums, and no speed camera. The route which best meets the defined criteria is then calculated by a processor in the PDA and then stored in a database in RAM as a sequence of vectors, road names and actions to be done at vector end-points (e.g. corresponding to pre-determined distances along each road of the route, such as after 100 meters, turn left into street x).
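    One simple way to picture the evaluation step is as a weighted score computed for each candidate route, with the best-scoring route kept. The sketch below only illustrates that idea under assumed field names and weights; the actual Navigator algorithms are not disclosed here.

```python
def score_route(route: dict, prefs: dict) -> float:
    """Score one candidate route against user-defined criteria; lower is better."""
    score = prefs.get("time_weight", 1.0) * route["duration_s"]
    score += prefs.get("distance_weight", 0.0) * route["length_m"]
    score -= prefs.get("scenic_weight", 0.0) * route["scenic_poi_count"]   # reward scenic POIs
    if prefs.get("avoid_speed_cameras") and route["speed_camera_count"] > 0:
        score += 1e9                     # effectively rule out routes with speed cameras
    return score

def best_route(candidates: list, prefs: dict) -> dict:
    """Pick the candidate that best meets the defined criteria (or device defaults)."""
    return min(candidates, key=lambda r: score_route(r, prefs))
```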
  • [0022]
    Finger UI Design Approach
  • [0023]
    The present invention associates a touch activation zone for each of a core set of functions; this zone is large enough to be reliably selected by a single finger without special accuracy. This mimics the real-life conditions for a driver when controlling a vehicle; he or she will have little time to look at a highly detailed screen with small control icons, and still less time to accurately press one of those small control icons.
  • [0024]
    This UI design feature is consistently deployed throughout Navigator 2.0 in relation to all of the defined core functions, and not just in an ad hoc manner where the screen design happens to permit a large control icon to be displayed: hence, whenever the user is given the choice of selecting certain on-screen options relating to core functions (e.g. control icons, or keys of a virtual keyboard to enter a destination address), the design of those icons/keys is kept simple and the associated touch screen zones are expanded to such a size that each icon/key can unambiguously be finger selected. Further, whenever a screen includes selectable graphical options (e.g. icons, names, check boxes etc.), each of these options is linked to a non-overlapping touch input zone that is large enough to be reliably activated using a finger.
  • [0025]
    Hence, the device will not present to the user at different times a mix of selectable graphical options relating to core functions, some being large enough to be reliably activated with a finger and some being too small for that and requiring stylus activation. The key point is that the user interaction design has been based on analysing what core functions might need to be activated by a driver whilst still driving, and ensuring that these can be activated by selecting options (e.g. large graphical icons) linked to unusually large touch screen activation areas. Prior art approaches to UI design have failed to consistently identify core functions and treat them in this manner.
  • [0026]
    In practice, the associated touch screen zone will be a minimum of 0.7 cm2 (far larger than normal touch screen activation zones) and will typically be square.
  • [0027]
    Examples of the core functions which consistently employ this approach are:
      • (i) moving between the highest level in the menu hierarchy to the next level down;
      • (ii) tasking away from the normal navigation mode screen;
      • (iii) selecting options that initiate route recalculation functions;
      • (iv) setting the current location as a location to be marked on a map.
  • [0032]
    This approach can be illustrated in several contexts. First, access to functions that enable alternative routes to be calculated is facilitated by placing a menu of graphical icons for those functions (or any other kind of way or option to allow selection of the functions, such as lists, check boxes etc.) on a menu screen that is easily accessed from the main navigation screen, i.e. the screen that is displayed during actual or simulated/preview navigation (FIG. 1 or 2). As noted above, in normal navigation mode, the device displays an animated map that shows the location of the navigation device as the journey progresses. Touching the map (i.e. the touch sensitive display) once (or twice in a different implementation) near to the screen centre (or any part of the screen in another implementation) will then call up a Recalculate menu (see FIG. 3) with large icons corresponding to various route recalculation functions, such as the option to calculate an alternative route; re-calculate the route so as to avoid the next section of road (useful when faced with an obstruction or heavy congestion); and recalculate the route so as to avoid specific, listed roads. These alternative route functions are initiated by touching the appropriate icon in the Recalculate menu screen (which is one user interaction, such as a screen touch, away from the normal mode navigation screen). Other route recalculation functions may be reached at a deeper level in the menu structure. However, all can be reached by selecting options such as graphical icons, lists or check boxes which are unambiguously associated with touch screen areas that are large enough to allow the user to select them with a fingertip whilst safely driving, typically at least 0.7 cm2 in area.
  • [0033]
    Virtual Keyboard
  • [0034]
    As noted above, a key feature is the use of large touch screen areas for each graphical icon that initiates a core function that a driver may need to deploy whilst driving. This approach is also used for the keys of the virtual keyboards (e.g. ABCD, as shown in FIG. 4, QWERTY etc. formats). Because the device can display a large alphabet keyboard, far larger than conventional screen based keyboards on PDAs, a user can input text more easily, without taking the device out of the cradle or off the dashboard, and even using a finger rather than the stylus.
  • [0035]
    The optimal dimensions on an iPaq (with 240×320 pixels, or 106 pixels per inch, 48 pixels per cm) are:
  • [0036]
    QWERTY/AZERTY Keyboard Images:
  • [0037]
    Horizontal spacing: 25 pixels centre to centre (button to button)
  • [0038]
    Vertical spacing: 32 pixels centre to centre (button to button)
  • [0039]
    ABC Keyboard Image:
  • [0040]
    Horizontal spacing: 40 pixels centre to centre
  • [0041]
    Vertical spacing: 32 pixels centre to centre
  • [0042]
    NOTE: The numeric keyboard image is mixed (has both small and big keys). Also, some keys might be 1 pixel smaller in width than other keys (for aesthetics), therefore the centre to centre might be different from key to key.
  • [0043]
    The individual key size in pixels is (width, height):
  • [0044]
    36×28 (ABC keyboard image)
  • [0045]
    21×28 (QWERTY/AZERTY keyboard image)
  • [0046]
    46×28 (arrow keys on QWERTY/AZERTY keyboard images)
  • [0047]
    70×28 (space/back keys on QWERTY/AZERTY keyboard images)
  • [0048]
    NOTE: Some keys might be 1 pixel smaller in width than other keys (for aesthetics)
  • [0049]
    The total image sizes for different keyboards (width, height) are as follows:
  • [0050]
    240×155 (ABC keyboard image)
  • [0051]
    240×155 (QWERTY keyboard image)
  • [0052]
    240×155 (AZERTY keyboard image)
  • [0053]
    240×62 (2 line NUM/Arrow keys image)
  • [0054]
    240×31 (1 line Arrow key image)
  • [0055]
    NOTE: This includes white-space edges in the range of 1 to 3 pixels.
  • [0056]
    The above sizes enable a soft keyboard to be displayed that a user can readily operate with one finger when the device is mounted on a dashboard cradle while the car is being driven, and without being significantly distracted from driving.
  • [0057]
    Tolerances to the above sizes are approximately 25% (plus or minus).
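    The spacings and key sizes quoted above fully determine the keyboard geometry, so the hit rectangles for a keyboard image can be generated rather than hand-placed. The sketch below does this for the ABC keyboard using those figures (40 px horizontal pitch, 32 px vertical pitch, 36×28 px keys on a 240 px wide image); it is an illustrative reconstruction, not the layout code of the Navigator software, and the six-keys-per-row assumption is implied only by the 240 px image width.

```python
def abc_key_rects(keys_per_row: int = 6, h_pitch: int = 40, v_pitch: int = 32,
                  key_w: int = 36, key_h: int = 28) -> dict:
    """Generate (x, y, width, height) hit rectangles for an ABC-format keyboard image.

    Pitches and key sizes follow the figures quoted above; the row length is an assumption.
    """
    rects = {}
    for i, letter in enumerate("ABCDEFGHIJKLMNOPQRSTUVWXYZ"):
        row, col = divmod(i, keys_per_row)
        cx = h_pitch // 2 + col * h_pitch          # key centre, measured centre-to-centre
        cy = v_pitch // 2 + row * v_pitch
        rects[letter] = (cx - key_w // 2, cy - key_h // 2, key_w, key_h)
    return rects

# Each rectangle is roughly 36x28 px (about 0.75 cm x 0.58 cm at 48 px/cm), large enough
# for the fingertip selection described above.
print(abc_key_rects()["A"])   # (2, 2, 36, 28)
```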
  • [0058]
    Waypoints
  • [0059]
    If the driver passes a location of interest on the route (e.g. while driving), he can store the present location by a very simple action, such as a rapid double tap on a pre-defined zone of the screen, such as a 0.7 cm2 zone centred on the current vehicle location displayed by the device (or by issuing a voice command). This stores a marker in a database of waypoints; in essence the co-ordinates of the location of interest. This is another example of a core function (labelling the current location as a waypoint) that is activated using a touch screen area large enough to allow reliable finger selection even whilst the user is driving. The waypoint can be marked on the map itself with a POI (point of interest) icon. Later, the user can retrieve and use it (or even annotate and store it). For example, if marked as a POI on a map, the user could select the POI on the map, which would cause an annotation window to open, into which the user could input text (e.g. “great bookshop here”).
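    The waypoint behaviour described above amounts to detecting a rapid double tap inside the zone around the displayed vehicle position, recording the current co-ordinates, and allowing a label to be attached later. The sketch below illustrates that flow; the class, the 0.4 s double-tap window and the field names are assumptions made for the example, not details taken from the Navigator software.

```python
import time

class WaypointStore:
    """Stores tapped locations as waypoints; a rapid double tap marks the current position."""

    DOUBLE_TAP_WINDOW_S = 0.4            # illustrative threshold for a "rapid" double tap

    def __init__(self):
        self.waypoints = []              # each entry: {"lat": ..., "lon": ..., "label": ...}
        self._last_tap = float("-inf")

    def on_tap_in_location_zone(self, current_lat: float, current_lon: float) -> bool:
        """Call when a tap lands in the zone centred on the displayed vehicle location."""
        now = time.monotonic()
        is_double = (now - self._last_tap) < self.DOUBLE_TAP_WINDOW_S
        self._last_tap = now
        if is_double:                    # second tap of a double tap: store the waypoint
            self.waypoints.append({"lat": current_lat, "lon": current_lon, "label": ""})
        return is_double

    def annotate(self, index: int, text: str) -> None:
        """Attach a label entered in the annotation window, e.g. "great bookshop here"."""
        self.waypoints[index]["label"] = text
```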
  • [0060]
    Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claimed invention.
Classifications
U.S. Classification: 701/431
International Classification: G06F3/0481, G06F3/0488, G06F3/0482, G01C21/32, G06F1/16, G01C21/36
Cooperative Classification: G06F3/0482, G06F3/04817, G06Q10/02, G01C21/362, G06F3/04886, G06F1/1626
European Classification: G01C21/36D5, G06F3/0482, G06F3/0488T, G06F3/0481H, G06Q10/02, G06F1/16P3
Legal Events
11 Apr 2006, AS, Assignment
Owner name: TOMTOM B.V., NETHERLANDS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: PINKUS, AYAL; NEEF, EDWIN; JURGENS, SVEN-ERIK; AND OTHERS; SIGNING DATES FROM 20060320 TO 20060407; REEL/FRAME: 017455/0682
3 Nov 2006, AS, Assignment
Owner name: TOMTOM INTERNATIONAL B.V., NETHERLANDS
Free format text: CHANGE OF NAME; ASSIGNOR: TOMTOM B.V.; REEL/FRAME: 018478/0152
Effective date: 20050513